Science.gov

Sample records for conceptual interoperability model

  1. Smart Grid Interoperability Maturity Model

    SciTech Connect

    Widergren, Steven E.; Levinson, Alex; Mater, J.; Drummond, R.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  2. Maturity model for enterprise interoperability

    NASA Astrophysics Data System (ADS)

    Guédria, Wided; Naudet, Yannick; Chen, David

    2015-01-01

    Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.

  3. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374

  4. Maturity Model for Advancing Smart Grid Interoperability

    SciTech Connect

    Knight, Mark; Widergren, Steven E.; Mater, J.; Montgomery, Austin

    2013-10-28

    Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.

  5. Smart Grid Interoperability Maturity Model Beta Version

    SciTech Connect

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; Longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  6. Modelling and approaching pragmatic interoperability of distributed geoscience data

    NASA Astrophysics Data System (ADS)

    Ma, Xiaogang

    2010-05-01

    , intention, procedure, consequence, etc.) of local pragmatic contexts and thus context-dependent. Eliminating these elements will inevitably lead to information loss in semantic mediation between local ontologies. Correspondingly, the understanding and effect of exchanged data in a new context may differ from those in its original context. Another problem is the dilemma of how to find a balance between flexibility and standardization of local ontologies, because ontologies are not fixed but continuously evolving. It is commonly recognized that we cannot use a unified ontology to replace all local ontologies, because they are context-dependent and need flexibility. However, without coordination through standards, freely developed local ontologies and databases will require enormous mediation work between them. Finding a balance between standardization and flexibility for evolving ontologies therefore requires, in a practical sense, negotiations (i.e. conversations, agreements and collaborations) between different local pragmatic contexts. The purpose of this work is to set up a computer-friendly model representing local pragmatic contexts (i.e. geodata sources) and to propose a practical semantic negotiation procedure for approaching pragmatic interoperability between them. Information agents, objective facts and subjective dimensions are reviewed as elements of a conceptual model for representing pragmatic contexts. The author uses them to outline a practical semantic negotiation procedure for approaching pragmatic interoperability of distributed geodata. The proposed conceptual model and semantic negotiation procedure were encoded with Description Logic and then applied to analyze and manipulate semantic negotiations between different local ontologies within the National Mineral Resources Assessment (NMRA) project of China, which involves multi-source and multi-subject geodata sharing.

  7. Ontological Model for EHR Interoperability.

    PubMed

    Bouanani-Oukhaled, Zahra; Verdier, Christine; Dupuy-Chessa, Sophie; Fouladi, Karan; Breda, Laurent

    2016-01-01

    The main purpose of this paper is to design a data model for Electronic Health Records whose main goal is to enable cooperation of various heterogeneous health information systems. We investigate the interest of the meta-ontologies proposed in [1] by instantiating them with real data. We tested the feasibility of our model on real anonymous medical data provided by the Médibase Systèmes company. PMID:27350489

  8. Interoperability challenges in river discharge modelling

    NASA Astrophysics Data System (ADS)

    Santoro, Mattia; Schlummer, Manuela; Andres, Volker; Jirka, Simon; Looser, Ulrich; Mladek, Richard; Pappenberger, Florian; Strauch, Adrian; Utech, Michael; Zsoter, Ervin

    2014-05-01

    River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related areas such as water resources assessment and management, as well as flood protection and disaster mitigation. Observations of river discharge are very important for the calibration and validation of hydrological or coupled land, atmosphere and ocean models. This requires the use of data from different scientific domains (Water, Weather, etc.). Typically, such data are provided using different technological solutions and formats. This complicates the integration of new hydrological data sources into application systems. Therefore, a considerable effort is often spent on data access issues instead of the actual scientific question. In the context of the FP7 funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), the "River Discharge" use scenario was developed in order to combine river discharge observations data from the Global Runoff Data Center (GRDC) database and model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) predicting river discharge based on weather forecast information. In this presentation we describe interoperability solutions which were adopted in order to address the technological challenges of the "River Discharge" use scenario: 1) Development of a Hydrology Profile for the OGC SOS 2.0 standard; 2) Enhancement of the GEO DAB (Discovery and Access Broker) to support the use scenario: 2.1) Develop new interoperability arrangements for GRDC and ECMWF capacities; 2.2) Select multiple time series for comparison. The development of the above functionalities and tools aims to respond to the need of Water and Weather scientists to assess river discharge forecasting models.
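
    As a hedged illustration of the kind of standards-based request described above, the sketch below issues an OGC SOS 2.0 GetObservation call (KVP binding) for a discharge time series in Python; the endpoint URL, offering identifier and observed-property URI are invented placeholders, not the actual GEOWOW/GRDC service.

      # Sketch of an OGC SOS 2.0 GetObservation request (KVP binding) for a
      # river discharge time series. Endpoint and identifiers are hypothetical.
      import requests

      SOS_ENDPOINT = "https://example.org/sos/service"  # placeholder URL

      params = {
          "service": "SOS",
          "version": "2.0.0",
          "request": "GetObservation",
          "offering": "GRDC_station_6335020",                      # assumed offering id
          "observedProperty": "urn:ogc:def:phenomenon:discharge",  # assumed property URI
          "temporalFilter": "om:phenomenonTime,2013-01-01/2013-12-31",
      }

      resp = requests.get(SOS_ENDPOINT, params=params, timeout=30)
      resp.raise_for_status()
      print(resp.text[:500])  # O&M/WaterML-style XML would be parsed downstream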

  9. Model and Interoperability using Meta Data Annotations

    NASA Astrophysics Data System (ADS)

    David, O.

    2011-12-01

    Software frameworks and architectures are in need of meta data to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format, consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing meta data, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings, however the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach for meta data representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as Annotations or Attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code. Since models and
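
    OMS3 itself uses Java annotations; purely as an illustrative sketch of the same declarative idea in Python, the snippet below attaches descriptive metadata to a component with a decorator rather than framework API calls. All names (the meta decorator, the snowmelt component, the attribute keys) are invented for illustration and are not part of OMS.

      # Sketch of annotation-style (declarative) component metadata, analogous
      # in spirit to OMS3 Java annotations. All names are hypothetical.
      def meta(**attrs):
          """Attach descriptive metadata to a component without framework API calls."""
          def wrap(obj):
              obj.__component_meta__ = attrs
              return obj
          return wrap

      @meta(description="Simple degree-day snowmelt process",
            inputs={"temp_c": "degC", "snow_mm": "mm"},
            outputs={"melt_mm": "mm"})
      def snowmelt(temp_c, snow_mm, ddf=3.0):
          """Potential melt with a degree-day factor (mm per degC per day)."""
          melt = max(0.0, temp_c) * ddf
          return min(melt, snow_mm)

      # A framework could read the metadata to assemble models, document
      # dependencies, or drive calibration, without the component calling it.
      print(snowmelt.__component_meta__["inputs"])
      print(snowmelt(2.5, 40.0))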

  10. A Patient Safety Information Model for Interoperability.

    PubMed

    Rodrigues, Jean Marie; Dhingra-Kumar, Neelam; Schulz, Stefan; Souvignet, Julien

    2016-01-01

    Current systems that target Patient Safety (PS) like mandatory reporting systems and specific vigilance reporting systems share the same information types but are not interoperable. Ten years ago, WHO embarked on an international project to standardize quality management information systems for PS. The goal is to support interoperability between different systems in a country and to expand international sharing of data on quality and safety management particularly for less developed countries. Two approaches have been used: (i) a bottom-up one starting with existing national PS reporting and international or national vigilance systems, and (ii) a top-down approach that uses the Patient Safety Categorial Structure (PS-CAST) and the Basic Formal Ontology (BFO) upper level ontology versions 1 and 2. The output is currently tested as an integrated information system for quality and PS management in four WHO member states. PMID:27139388

  11. Documenting Models for Interoperability and Reusability

    EPA Science Inventory

    Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration be...

  12. Conceptual IT model

    NASA Astrophysics Data System (ADS)

    Arnaoudova, Kristina; Stanchev, Peter

    2015-11-01

    The business processes are the key asset for every organization. The design of business process models is a foremost concern and target among an organization's functions. Business processes and their proper management are intensely dependent on the performance of software applications and technology solutions. The paper is an attempt to define a new conceptual model of an IT service provider; it can be examined as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.

  13. Advances in a distributed approach for ocean model data interoperability

    USGS Publications Warehouse

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
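
    As a hedged sketch of the kind of standards-based remote access the abstract describes, the snippet below opens an aggregated CF-compliant dataset over OPeNDAP with Python's xarray and subsets it lazily before download; the URL and variable name are placeholders rather than an actual IOOS testbed endpoint.

      # Sketch of remote access to a CF-compliant aggregated ocean model dataset
      # via OPeNDAP using xarray; the URL and variable name are placeholders.
      import xarray as xr

      OPENDAP_URL = "https://example.org/thredds/dodsC/ocean_model/aggregated"  # hypothetical

      ds = xr.open_dataset(OPENDAP_URL)      # lazy: only metadata is read here
      sst = ds["sea_surface_temperature"]    # assumed CF-named variable
      subset = sst.sel(time=slice("2013-07-01", "2013-07-02"))
      print(float(subset.mean()))            # data are fetched only for this subset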

  14. A Conceptual Framework to Enhance the Interoperability of Observatories among Countries, Continents and the World

    NASA Astrophysics Data System (ADS)

    Loescher, H.; Fundamental Instrument Unit

    2013-05-01

    , GEO-BON, NutNet, etc.) and domestically, (e.g., NSF-CZO, USDA-LTAR, DOE-NGEE, Soil Carbon Network, etc.), there is a strong and mutual desire to assure interoperability of data. Developing interoperability is the degree by which each of the following is mapped between observatories (entities), defined by linking i) science requirements with science questions, ii) traceability of measurements to nationally and internationally accepted standards, iii) how data product are derived, i.e., algorithms, procedures, and methods, and iv) the bioinformatics which broadly include data formats, metadata, controlled vocabularies, and semantics. Here, we explore the rationale and focus areas for interoperability, the governance and work structures, example projects (NSF-NEON, EU-ICOS, and AU-TERN), and the emergent roles of scientists in these endeavors.

  15. Evaluating Sustainability Models for Interoperability through Brokering Software

    NASA Astrophysics Data System (ADS)

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  16. A conceptual holding model for veterinary applications.

    PubMed

    Ferrè, Nicola; Kuhn, Werner; Rumor, Massimo; Marangon, Stefano

    2014-05-01

    Spatial references are required when geographical information systems (GIS) are used for the collection, storage and management of data. In the veterinary domain, the spatial component of a holding (of animals) is usually defined by coordinates, and no other relevant information needs to be interpreted or used for manipulation of the data in the GIS environment provided. Users trying to integrate or reuse spatial data organised in such a way, frequently face the problem of data incompatibility and inconsistency. The root of the problem lies in differences with respect to syntax as well as variations in the semantic, spatial and temporal representations of the geographic features. To overcome these problems and to facilitate the inter-operability of different GIS, spatial data must be defined according to a "schema" that includes the definition, acquisition, analysis, access, presentation and transfer of such data between different users and systems. We propose an application "schema" of holdings for GIS applications in the veterinary domain according to the European directive framework (directive 2007/2/EC--INSPIRE). The conceptual model put forward has been developed at two specific levels to produce the essential and the abstract model, respectively. The former establishes the conceptual linkage of the system design to the real world, while the latter describes how the system or software works. The result is an application "schema" that formalises and unifies the information-theoretic foundations of how to spatially represent a holding in order to ensure straightforward information-sharing within the veterinary community. PMID:24893036

  17. Interoperability in Collaborative Processes: Requirements Characterisation and Proof Approach

    NASA Astrophysics Data System (ADS)

    Roque, Matthieu; Chapurlat, Vincent

    Interoperability problems that can occur during collaboration between several enterprises can endanger that collaboration. Consequently, it is necessary to be able to anticipate these problems. The approach proposed in this paper is based on the specification of properties, representing interoperability requirements, and on their analysis on enterprise models. Due to the conceptual limits of existing modeling languages, formalizing these requirements and translating them into properties requires adding conceptual enrichments to these languages. Finally, the analysis of these properties on enriched enterprise models, by formal checking techniques, aims to provide tools for reasoning on enterprise models in order to detect interoperability problems in an anticipatory manner.

  18. Enabling Interoperation of High Performance, Scientific Computing Applications: Modeling Scientific Data with the Sets & Fields (SAF) Modeling System

    SciTech Connect

    Miller, M C; Reus, J F; Matzke, R P; Arrighi, W J; Schoof, L A; Hitt, R T; Espen, P K; Butler, D M

    2001-02-07

    This paper describes the Sets and Fields (SAF) scientific data modeling system. It is a revolutionary approach to interoperation of high performance, scientific computing applications based upon rigorous, math-oriented data modeling principles. Previous technologies have required all applications to use the same data structures and/or meshes to represent scientific data, or have led to an ever-expanding set of incrementally different data structures and/or meshes. SAF addresses this problem by providing a small set of mathematical building blocks--sets, relations and fields--out of which a wide variety of scientific data can be characterized. Applications literally model their data by assembling these building blocks. A short historical perspective, a conceptual model and an overview of SAF along with preliminary results from its use in a few ASCI codes are discussed.

  19. Fuzzy conceptual rainfall runoff models

    NASA Astrophysics Data System (ADS)

    Özelkan, Ertunga C.; Duckstein, Lucien

    2001-11-01

    A fuzzy conceptual rainfall-runoff (CRR) framework is proposed herein to deal with those parameter uncertainties of conceptual rainfall-runoff models that are related to data and/or model structure: every element of the rainfall-runoff model is assumed to be possibly uncertain, taken here as being fuzzy. First, the conceptual rainfall-runoff system is fuzzified and then different operational modes are formulated using fuzzy rules; second, the parameter identification aspect is examined using fuzzy regression techniques. In particular, bi-objective and tri-objective fuzzy regression models are applied in the case of linear conceptual rainfall-runoff models so that the decision maker may trade off prediction vagueness (uncertainty) against the embedding of outliers. For the non-linear models, a fuzzy least squares regression framework is applied to derive the model parameters. The methodology is illustrated using: (1) a linear conceptual rainfall-runoff model; (2) an experimental two-parameter model; and (3) a simplified version of the Sacramento soil moisture accounting model of the US National Weather Service river forecast system (SAC-SMA) known as the six-parameter model. It is shown that the fuzzy logic framework enables the decision maker to gain insight into the model sensitivity and the uncertainty stemming from the elements of the CRR model.
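
    For readers unfamiliar with fuzzy regression, a standard Tanaka-style possibilistic formulation for the linear case is sketched below; it is an illustrative example only and not necessarily the exact bi- or tri-objective models used in the paper. Coefficients are symmetric triangular fuzzy numbers with centers $a_j$ and spreads $c_j$, and the total spread is minimized subject to the observations being included at credibility level $h$:

      \begin{aligned}
      \min_{a,\; c \ge 0} \quad & \sum_{i=1}^{N} \sum_{j=0}^{p} c_j \,\lvert x_{ij}\rvert \\
      \text{s.t.} \quad & a^{\top} x_i + (1-h)\, c^{\top} \lvert x_i\rvert \;\ge\; y_i, \qquad i = 1,\dots,N, \\
                        & a^{\top} x_i - (1-h)\, c^{\top} \lvert x_i\rvert \;\le\; y_i, \qquad i = 1,\dots,N,
      \end{aligned}

    with $x_{i0} = 1$ so that $a_0$ and $c_0$ act as the fuzzy intercept.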

  20. CityGML - Interoperable semantic 3D city models

    NASA Astrophysics Data System (ADS)

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

    CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided in different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantic aspects of 3D city models, their structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulations, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has been adopted on a worldwide scale: tools from notable companies in the geospatial field provide CityGML interfaces. Many applications and projects use this standard. CityGML also has a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency responses, or energy-related applications as well as for visualizations, or they contribute to CityGML, improving its consistency and validity, or use CityGML, particularly its different Levels-of-Detail, as a source or target for generalizations. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research. Furthermore, its

  1. The ISO Edi Conceptual Model Activity and Its Relationship to OSI.

    ERIC Educational Resources Information Center

    Fincher, Judith A.

    1990-01-01

    The edi conceptual model is being developed to define common structures, services, and processes that syntax-specific standards like X12 and EDIFACT could adopt. Open Systems Interconnection (OSI) is of interest to edi because of its potential to help enable global interoperability across Electronic Data Interchange (EDI) functional groups. A…

  2. NADM Conceptual Model 1.0 -- A Conceptual Model for Geologic Map Information

    USGS Publications Warehouse

    North American Geologic Map Data Model (NADM) Steering Committee Data Model Design Team

    2004-01-01

    Executive Summary -- The NADM Data Model Design Team was established in 1999 by the North American Geologic Map Data Model Steering Committee (NADMSC) with the purpose of drafting a geologic map data model for consideration as a standard for developing interoperable geologic map-centered databases by state, provincial, and federal geological surveys. The model is designed to be a technology-neutral conceptual model that can form the basis for a web-based interchange format using evolving information technology (e.g., XML, RDF, OWL), and guide implementation of geoscience databases in a common conceptual framework. The intended purpose is to allow geologic information sharing between geologic map data providers and users, independent of local information system implementation. The model emphasizes geoscience concepts and relationships related to information presented on geologic maps. Design has been guided by an informal requirements analysis, documentation of existing databases, technology developments, and other standardization efforts in the geoscience and computer-science communities. A key aspect of the model is the notion that representation of the conceptual framework (ontology) that underlies geologic map data must be part of the model, because this framework changes with time and understanding, and varies between information providers. The top level of the model distinguishes geologic concepts, geologic representation concepts, and metadata. The geologic representation part of the model provides a framework for representing the ontology that underlies geologic map data through a controlled vocabulary, and for establishing the relationships between this vocabulary and a geologic map visualization or portrayal. Top-level geologic classes in the model are Earth material (substance), geologic unit (parts of the Earth), geologic age, geologic structure, fossil, geologic process, geologic relation, and geologic event.

  3. Towards Model Driven Tool Interoperability: Bridging Eclipse and Microsoft Modeling Tools

    NASA Astrophysics Data System (ADS)

    Brunelière, Hugo; Cabot, Jordi; Clasen, Cauê; Jouault, Frédéric; Bézivin, Jean

    Successful application of model-driven engineering approaches requires interchanging a lot of relevant data among the tool ecosystem employed by an engineering team (e.g., requirements elicitation tools, several kinds of modeling tools, reverse engineering tools, development platforms and so on). Unfortunately, this is not a trivial task. Poor tool interoperability makes data interchange a challenge even among tools with a similar scope. This paper presents a model-based solution to overcome such interoperability issues. With our approach, the internal schema(s) (i.e., metamodel(s)) of each tool are made explicit and used as the basis for resolving syntactic and semantic differences between the tools. Once the corresponding metamodels are aligned, model-to-model transformations are (semi)automatically derived and executed to perform the actual data interchange. We illustrate our approach by bridging the Eclipse and Microsoft (DSL Tools and SQL Server Modeling) modeling tools.

  4. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  5. Model Breaking Points Conceptualized

    ERIC Educational Resources Information Center

    Vig, Rozy; Murray, Eileen; Star, Jon R.

    2014-01-01

    Current curriculum initiatives (e.g., National Governors Association Center for Best Practices and Council of Chief State School Officers 2010) advocate that models be used in the mathematics classroom. However, despite their apparent promise, there comes a point when models break, a point in the mathematical problem space where the model cannot,…

  6. Extending the Interoperability of Sensor and Sample Based Earth Observations using a Community Information Model (Invited)

    NASA Astrophysics Data System (ADS)

    Horsburgh, J. S.; Aufdenkampe, A. K.; Lehnert, K. A.; Mayorga, E.; Tarboton, D. G.; Zaslavsky, I.; Valentine, D. W.; Whitenack, T.

    2013-12-01

    from in situ sensors and from environmental samples, as well as data products directly derived from them. Using the existing CUAHSI HIS Observations Data Model (ODM), EarthChem's database structure, and the Open Geospatial Consortium's Observations & Measurements specification as starting points, our multidisciplinary, community-focused effort has been aimed at building consensus about the elements of the information model and addressing deficiencies in data interoperability both within and among existing geoscience cyberinfrastructures. The supporting software infrastructure we are developing includes storage, transfer, and catalog encodings of the information model and additional software tools aimed at improving data capture, validation, verification, sharing, and archival. We are using diverse data use cases from existing repositories and observatories to demonstrate how this advanced information model can support federation of earth observational data across multiple data publication systems within the geosciences. We anticipate that this information model and its prototype implementations can also serve as a common conceptual foundation for the next generation of geoscience cyberinfrastructure. In this presentation we describe our draft designs for the ODM2 information model and how ODM2 is foundational in achieving deeper interoperability across multiple disciplines and systems to support powerful data discovery, access, publication and analysis capabilities.

  7. Conceptual dynamical models for turbulence

    PubMed Central

    Majda, Andrew J.; Lee, Yoonsang

    2014-01-01

    Understanding the complexity of anisotropic turbulent processes in engineering and environmental fluid flows is a formidable challenge with practical significance because energy often flows intermittently from the smaller scales to impact the largest scales in these flows. Conceptual dynamical models for anisotropic turbulence are introduced and developed here which, despite their simplicity, capture key features of vastly more complicated turbulent systems. These conceptual models involve a large-scale mean flow and turbulent fluctuations on a variety of spatial scales with energy-conserving wave–mean-flow interactions as well as stochastic forcing of the fluctuations. Numerical experiments with a six-dimensional conceptual dynamical model confirm that these models capture key statistical features of vastly more complex anisotropic turbulent systems in a qualitative fashion. These features include chaotic statistical behavior of the mean flow with a sub-Gaussian probability distribution function (pdf) for its fluctuations whereas the turbulent fluctuations have decreasing energy and correlation times at smaller scales, with nearly Gaussian pdfs for the large-scale fluctuations and fat-tailed non-Gaussian pdfs for the smaller-scale fluctuations. This last feature is a manifestation of intermittency of the small-scale fluctuations where turbulent modes with small variance have relatively frequent extreme events which directly impact the mean flow. The dynamical models introduced here potentially provide a useful test bed for algorithms for prediction, uncertainty quantification, and data assimilation for anisotropic turbulent systems. PMID:24753605

  8. Conceptual dynamical models for turbulence.

    PubMed

    Majda, Andrew J; Lee, Yoonsang

    2014-05-01

    Understanding the complexity of anisotropic turbulent processes in engineering and environmental fluid flows is a formidable challenge with practical significance because energy often flows intermittently from the smaller scales to impact the largest scales in these flows. Conceptual dynamical models for anisotropic turbulence are introduced and developed here which, despite their simplicity, capture key features of vastly more complicated turbulent systems. These conceptual models involve a large-scale mean flow and turbulent fluctuations on a variety of spatial scales with energy-conserving wave-mean-flow interactions as well as stochastic forcing of the fluctuations. Numerical experiments with a six-dimensional conceptual dynamical model confirm that these models capture key statistical features of vastly more complex anisotropic turbulent systems in a qualitative fashion. These features include chaotic statistical behavior of the mean flow with a sub-Gaussian probability distribution function (pdf) for its fluctuations whereas the turbulent fluctuations have decreasing energy and correlation times at smaller scales, with nearly Gaussian pdfs for the large-scale fluctuations and fat-tailed non-Gaussian pdfs for the smaller-scale fluctuations. This last feature is a manifestation of intermittency of the small-scale fluctuations where turbulent modes with small variance have relatively frequent extreme events which directly impact the mean flow. The dynamical models introduced here potentially provide a useful test bed for algorithms for prediction, uncertainty quantification, and data assimilation for anisotropic turbulent systems. PMID:24753605

  9. CONCEPTUAL MODELS FOR WATERSHED AND REGIONAL ASSESSMENTS

    EPA Science Inventory

    Conceptual models, as defined here, describe and illustrate the relationships between ecological receptors, the stressors to which they may be exposed, and the potential sources of those stressors within a particular area or ecosystem. This document describes conceptual model...

  10. Conceptual Models for Search Engines

    NASA Astrophysics Data System (ADS)

    Hendry, D. G.; Efthimiadis, E. N.

    Search engines have entered popular culture. They touch people in diverse private and public settings and thus heighten the importance of such important social matters as information privacy and control, censorship, and equitable access. To fully benefit from search engines and to participate in debate about their merits, people necessarily appeal to their understandings for how they function. In this chapter we examine the conceptual understandings that people have of search engines by performing a content analysis on the sketches that 200 undergraduate and graduate students drew when asked to draw a sketch of how a search engine works. Analysis of the sketches reveals a diverse range of conceptual approaches, metaphors, representations, and misconceptions. On the whole, the conceptual models articulated by these students are simplistic. However, students with higher levels of academic achievement sketched more complete models. This research calls attention to the importance of improving students' technical knowledge of how search engines work so they can be better equipped to develop and advocate policies for how search engines should be embedded in, and restricted from, various private and public information settings.

  11. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard

    PubMed Central

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong

    2014-01-01

    Objectives: Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Methods: Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. Results: In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. Conclusions: A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models. PMID:24627817
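
    As a simplified, hedged sketch of metadata-driven validation of an extended record (the actual CCR+ validation files and the 188 registered metadata items are not reproduced; registry entries and field names below are invented), a record can be checked against constraints held in a metadata registry:

      # Simplified sketch of metadata-driven validation for an extended PHR record.
      # Registry entries and field names are invented for illustration only.
      REGISTRY = {
          "bloodPressureSystolic": {"type": float, "required": True},
          "medicationCode":        {"type": str,   "required": True},
          "lifestyleNote":         {"type": str,   "required": False},  # extended element
      }

      def validate(record):
          """Return a list of violations of the registered metadata constraints."""
          errors = []
          for name, meta in REGISTRY.items():
              if name not in record:
                  if meta["required"]:
                      errors.append("missing required element: " + name)
                  continue
              if not isinstance(record[name], meta["type"]):
                  errors.append(name + ": expected " + meta["type"].__name__)
          return errors

      print(validate({"bloodPressureSystolic": 120.0}))  # reports the missing medicationCode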

  12. Sierra Toolkit computational mesh conceptual model.

    SciTech Connect

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-03-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  13. A conceptual model for translating omic data into clinical action

    PubMed Central

    Herr, Timothy M.; Bielinski, Suzette J.; Bottinger, Erwin; Brautbar, Ariel; Brilliant, Murray; Chute, Christopher G.; Denny, Joshua; Freimuth, Robert R.; Hartzler, Andrea; Kannry, Joseph; Kohane, Isaac S.; Kullo, Iftikhar J.; Lin, Simon; Pathak, Jyotishman; Peissig, Peggy; Pulley, Jill; Ralston, James; Rasmussen, Luke; Roden, Dan; Tromp, Gerard; Williams, Marc S.; Starren, Justin

    2015-01-01

    Genomic, proteomic, epigenomic, and other “omic” data have the potential to enable precision medicine, also commonly referred to as personalized medicine. The volume and complexity of omic data are rapidly overwhelming human cognitive capacity, requiring innovative approaches to translate such data into patient care. Here, we outline a conceptual model for the application of omic data in the clinical context, called “the omic funnel.” This model parallels the classic “Data, Information, Knowledge, Wisdom pyramid” and adds context for how to move between each successive layer. Its goal is to allow informaticians, researchers, and clinicians to approach the problem of translating omic data from bench to bedside, by using discrete steps with clearly defined needs. Such an approach can facilitate the development of modular and interoperable software that can bring precision medicine into widespread practice. PMID:26430534

  14. A conceptual model for translating omic data into clinical action.

    PubMed

    Herr, Timothy M; Bielinski, Suzette J; Bottinger, Erwin; Brautbar, Ariel; Brilliant, Murray; Chute, Christopher G; Denny, Joshua; Freimuth, Robert R; Hartzler, Andrea; Kannry, Joseph; Kohane, Isaac S; Kullo, Iftikhar J; Lin, Simon; Pathak, Jyotishman; Peissig, Peggy; Pulley, Jill; Ralston, James; Rasmussen, Luke; Roden, Dan; Tromp, Gerard; Williams, Marc S; Starren, Justin

    2015-01-01

    Genomic, proteomic, epigenomic, and other "omic" data have the potential to enable precision medicine, also commonly referred to as personalized medicine. The volume and complexity of omic data are rapidly overwhelming human cognitive capacity, requiring innovative approaches to translate such data into patient care. Here, we outline a conceptual model for the application of omic data in the clinical context, called "the omic funnel." This model parallels the classic "Data, Information, Knowledge, Wisdom pyramid" and adds context for how to move between each successive layer. Its goal is to allow informaticians, researchers, and clinicians to approach the problem of translating omic data from bench to bedside, by using discrete steps with clearly defined needs. Such an approach can facilitate the development of modular and interoperable software that can bring precision medicine into widespread practice. PMID:26430534

  15. Using a single content model for eHealth interoperability and secondary use.

    PubMed

    Atalag, Koray

    2013-01-01

    This chapter describes a middle-out approach to eHealth interoperability, with strong oversight on public health and health research, enabled by a uniform and shared content model to which all health information exchange conforms. As described in New Zealand's Interoperability Reference Architecture, the content model borrows its top level organization from the Continuity of Care Record (CCR) standard and is underpinned by the openEHR formalism. This provides a canonical model for representing a variety of clinical information, and serves as reference when determining payload in health information exchange. The main premise of this approach is that since all exchanged data conforms to the same model, interoperability of clinical information can readily be achieved. Use of Archetypes ensures preservation of clinical context which is critical for secondary use. The content model is envisaged to grow incrementally by adding new or specialised archetypes as finer details are needed in real projects. The consistency and long term viability of this approach critically depends on effective governance which requires new models of collaboration, decision making and appropriate tooling to support the process. PMID:24018523

  16. The Open Physiology workflow: modeling processes over physiology circuitboards of interoperable tissue units

    PubMed Central

    de Bono, Bernard; Safaei, Soroush; Grenon, Pierre; Nickerson, David P.; Alexander, Samuel; Helvensteijn, Michiel; Kok, Joost N.; Kokash, Natallia; Wu, Alan; Yu, Tommy; Hunter, Peter; Baldock, Richard A.

    2015-01-01

    A key challenge for the physiology modeling community is to enable the searching, objective comparison and, ultimately, re-use of models and associated data that are interoperable in terms of their physiological meaning. In this work, we outline the development of a workflow to modularize the simulation of tissue-level processes in physiology. In particular, we show how, via this approach, we can systematically extract, parcellate and annotate tissue histology data to represent component units of tissue function. These functional units are semantically interoperable, in terms of their physiological meaning. In particular, they are interoperable with respect to [i] each other and with respect to [ii] a circuitboard representation of long-range advective routes of fluid flow over which to model long-range molecular exchange between these units. We exemplify this approach through the combination of models for physiology-based pharmacokinetics and pharmacodynamics to quantitatively depict biological mechanisms across multiple scales. Links to the data, models and software components that constitute this workflow are found at http://open-physiology.org/. PMID:25759670

  17. A Conceptual Data Model of Datum Systems

    PubMed Central

    McCaleb, Michael R.

    1999-01-01

    A new conceptual data model that addresses the geometric dimensioning and tolerancing concepts of datum systems, datums, datum features, datum targets, and the relationships among these concepts, is presented. Additionally, a portion of a related data model, Part 47 of STEP (ISO 10303-47), is reviewed and a comparison is made between it and the new conceptual data model.

  18. Harmonization and translation of crop modeling data to ensure interoperability

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Agricultural Model Intercomparison and Improvement Project (AgMIP, www.agmip.org) seeks to improve the capability of ecophysiological and economic models to describe the potential impacts of climate change on agricultural systems. AgMIP protocols emphasize the use of multiple models; consequentl...

  19. Conceptual and logical level of database modeling

    NASA Astrophysics Data System (ADS)

    Hunka, Frantisek; Matula, Jiri

    2016-06-01

    Conceptual and logical levels form the topmost levels of database modeling. Usually, ORM (Object Role Modeling) and ER diagrams are utilized to capture the corresponding schema. The final aim of business process modeling is to store its results in the form of a database solution. For this reason, value-oriented business process modeling, which utilizes ER diagrams to express the modeled entities and the relationships between them, is used. However, ER diagrams form the logical level of the database schema. To extend the possibilities of different business process modeling methodologies, the conceptual level of database modeling is needed. The paper deals with the REA value modeling approach to business process modeling using ER diagrams, and derives a conceptual model utilizing the ORM modeling approach. The conceptual model extends the possibilities of value modeling to other business modeling approaches.

  20. Personalized-Detailed Clinical Model for Data Interoperability Among Clinical Standards

    PubMed Central

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir

    2013-01-01

    Objective: Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that creates the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. Materials and Methods: We consider electronic health record (EHR) standards, openEHR, and HL7 CDA instances transformation using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transformation of instances among these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for HL7 CDA and openEHR standards. Results: For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mappings generation with openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. Conclusions: The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems. PMID:23875730

  1. Semantic Document Model to Enhance Data and Knowledge Interoperability

    NASA Astrophysics Data System (ADS)

    Nešić, Saša

    To enable document data and knowledge to be efficiently shared and reused across application, enterprise, and community boundaries, desktop documents should be completely open and queryable resources, whose data and knowledge are represented in a form understandable to both humans and machines. At the same time, these are the requirements that desktop documents need to satisfy in order to contribute to the visions of the Semantic Web. With the aim of achieving this goal, we have developed the Semantic Document Model (SDM), which turns desktop documents into Semantic Documents as uniquely identified and semantically annotated composite resources, that can be instantiated into human-readable (HR) and machine-processable (MP) forms. In this paper, we present the SDM along with an RDF and ontology-based solution for the MP document instance. Moreover, on top of the proposed model, we have built the Semantic Document Management System (SDMS), which provides a set of services that exploit the model. As an application example that takes advantage of SDMS services, we have extended MS Office with a set of tools that enables users to transform MS Office documents (e.g., MS Word and MS PowerPoint) into Semantic Documents, and to search local and distant semantic document repositories for document content units (CUs) over Semantic Web protocols.
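
    As a hedged illustration of what a machine-processable RDF instance of a content unit might look like (the namespace and property names below are invented placeholders, not the actual SDM ontology), the Python snippet uses rdflib to describe a uniquely identified, semantically annotated content unit:

      # Sketch of an RDF description of a semantically annotated document content
      # unit, in the spirit of the Semantic Document Model. Namespace and property
      # names are hypothetical.
      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import DC, RDF

      SDM = Namespace("http://example.org/sdm#")   # hypothetical vocabulary
      g = Graph()

      cu = URIRef("urn:uuid:3f1c2e7a-0000-4b8e-9f00-000000000001")  # content unit id
      g.add((cu, RDF.type, SDM.ContentUnit))
      g.add((cu, DC.title, Literal("Quarterly sales summary")))
      g.add((cu, SDM.partOfDocument, URIRef("urn:example:document:42")))
      g.add((cu, SDM.about, URIRef("http://example.org/topics/Sales")))

      print(g.serialize(format="turtle"))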

  2. Utilizing Model Interoperability and High Performance Computing to Enhance Dust Storm Simulation

    NASA Astrophysics Data System (ADS)

    Huang, Q.; Yang, C.; Xie, J.; Wu, H.; Li, J.

    2009-12-01

    Dust storm simulation and forecasting are of significant interest to public health, environmental sciences, and the Global Earth Observation System of Systems (GEOSS). To support improved public health decision making, model interoperability and high performance computing need to be leveraged to increase the forecasting resolution to the zip code level, which poses a significant computational challenge for dust storm simulations. This presentation reports our research in utilizing interoperability technologies and high performance computing (HPC) to enhance dust storm forecasting by facilitating model integration, data discovery, data access, and data utilization in an HPC environment in order to a) reduce the computing time, b) lengthen the forecast period, and c) ingest large amounts of geospatial data. The DREAM-eta-8p and NMM-dust dust storm simulation models are used to explore this approach. The coarse model (DREAM-eta-8p) is used to identify hotspots of higher predicted dust concentration, and its output serves as input for the fine-grain model (NMM-dust) over the hotspot areas. After ingesting the DREAM-eta output, NMM-dust can start its simulation. Experimental results are promising for a dust storm forecasting system. Acknowledgements: We would like to thank Drs. Karl Benedict and Bill Hudspeth of the University of New Mexico; Drs. William Sprigg, Goran Pejanovic, and Slobodan Nickovic of the University of Arizona; and Dr. John D. Evans and Ms. Myra J. Bambacus of NASA GSFC for the collaboration.
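
    A minimal sketch of the hotspot-nesting idea described above, using a synthetic coarse field and an invented concentration threshold (this is not the DREAM-eta/NMM-dust coupling code): coarse-grid cells exceeding the threshold define the sub-domain over which the fine-grain model would be run.

      # Select high-concentration "hotspot" sub-domains from a coarse dust forecast
      # to drive a finer-resolution nested run. Field, threshold and grid are synthetic.
      import numpy as np

      rng = np.random.default_rng(0)
      coarse_dust = rng.gamma(shape=2.0, scale=20.0, size=(40, 60))  # ug/m3, synthetic
      lats = np.linspace(25.0, 45.0, 40)
      lons = np.linspace(-115.0, -85.0, 60)

      THRESHOLD = 120.0  # assumed concentration threshold for a "hotspot"
      hot_i, hot_j = np.where(coarse_dust > THRESHOLD)

      if hot_i.size:
          # Bounding box of the hotspot cells, padded by one coarse cell.
          bbox = (lats[max(hot_i.min() - 1, 0)], lats[min(hot_i.max() + 1, 39)],
                  lons[max(hot_j.min() - 1, 0)], lons[min(hot_j.max() + 1, 59)])
          print("fine-grain sub-domain (lat_min, lat_max, lon_min, lon_max):", bbox)
      else:
          print("no hotspots above threshold; no nested run needed")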

  3. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability.

    PubMed

    Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A

    2008-02-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG). PMID:17512259

  4. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability

    PubMed Central

    Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.

    2008-01-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service Oriented Architecture (SSOA) for cancer research by the National Cancer Institute’s cancer Biomedical Informatics Grid (caBIG™). PMID:17512259

  5. The JSpOC Mission System (JMS) Common Data Model: Foundation for Net-Centric Interoperability for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Hutchison, M.; Kolarik, K.; Waters, J.

    2012-09-01

    The space situational awareness (SSA) data we access and use through existing SSA systems is largely provided in formats which cannot be readily understood by other systems (SSA or otherwise) without translation. As a result, while the data is useful for some known set of users, for other users it is not discoverable (no way to know it is there), accessible (if you did know, there is no way to electronically obtain the data) or machine-understandable (even if you did have access, the data exists in a format which cannot be readily ingested by your existing systems). Much of this existing data is unstructured, stored in non-standard formats which feed legacy systems. Data terms are not always unique, and calculations performed using legacy functions plugged into a service-oriented backbone can produce inconsistent results. The promise of data which is interoperable across systems and applications depends on a common data model as an underlying foundation for sharing information on a machine-to-machine basis. M2M interoperability is fundamental to performance, reducing or eliminating time-consuming translation and accelerating delivery to end users for final expert human analysis in support of mission fulfillment. A data model is common when it can be used by multiple programs and projects within a domain (e.g., C2 SSA). Model construction begins with known requirements and includes the development of conceptual and logical representations of the data. The final piece of the model is an implementable physical representation (e.g., XML schema) which can be used by developers to build working software components and systems. The JMS Common Data Model v1.0 was derived over six years from the National SSA Mission Threads under the direction of AFSPC/A5CN. The subsequent model became the A5CN approved JMS Requirements Model. The resulting logical and physical models have been registered in the DoD Metadata Registry under the C2 SSA Namespace and will be made available

  6. Linking Tectonics and Surface Processes through SNAC-CHILD Coupling: Preliminary Results Towards Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Choi, E.; Kelbert, A.; Peckham, S. D.

    2014-12-01

    We demonstrate that code coupling can be an efficient and flexible method for modeling complicated two-way interactions between tectonic and surface processes with SNAC-CHILD coupling as an example. SNAC is a deep earth process model (a geodynamic/tectonics model), built upon a scientific software framework called StGermain and also compatible with a model coupling framework called Pyre. CHILD is a popular surface process model (a landscape evolution model), interfaced to the CSDMS (Community Surface Dynamics Modeling System) modeling framework. We first present proof-of-concept but non-trivial results from a simplistic coupling scheme. We then report progress towards augmenting SNAC with a Basic Model Interface (BMI), a framework-agnostic standard interface developed by CSDMS that uses the CSDMS Standard Names as controlled vocabulary for model communication across domains. Newly interfaced to BMI, SNAC will be easily coupled with CHILD as well as other BMI-compatible models. In broader context, this work will test BMI as a general and easy-to-implement mechanism for sharing models between modeling frameworks and is a part of the NSF-funded EarthCube Building Blocks project, "Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks."
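
    The Basic Model Interface mentioned above standardizes a small set of control and data functions (initialize, update, finalize, get_value, and so on). The sketch below shows the shape of such a wrapper around a toy model core; the model class, variable name, and time step are assumptions for illustration, not SNAC's actual interface.

    ```python
    import numpy as np

    class ToyTectonicModel:
        """Stand-in for a tectonics model core; the physics here is purely illustrative."""
        def __init__(self):
            self.uplift_rate = np.full((4, 4), 1e-4)
        def step(self, dt):
            self.uplift_rate *= 1.0  # no-op placeholder for a real solver step

    class BmiToyTectonics:
        """BMI-style wrapper exposing the standard control functions."""
        def initialize(self, config_file=""):
            self._model, self._time, self._dt = ToyTectonicModel(), 0.0, 1.0
        def update(self):
            self._model.step(self._dt)
            self._time += self._dt
        def finalize(self):
            self._model = None
        def get_current_time(self):
            return self._time
        def get_value(self, name):
            assert name == "bedrock__uplift_rate"  # CSDMS-Standard-Names-style label, assumed here
            return self._model.uplift_rate.copy()

    bmi = BmiToyTectonics()
    bmi.initialize()
    bmi.update()
    print(bmi.get_current_time(), bmi.get_value("bedrock__uplift_rate").mean())
    bmi.finalize()
    ```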

  7. Formalizing Linguistic Conventions for Conceptual Models

    NASA Astrophysics Data System (ADS)

    Becker, Jörg; Delfmann, Patrick; Herwig, Sebastian; Lis, Łukasz; Stein, Armin

    A precondition for the appropriate analysis of conceptual models is not only their syntactic correctness but also their semantic comparability. Assuring comparability is challenging, especially when models are developed by different people. Empirical studies show that such models can vary heavily, especially in model element naming, even when they express the same issue. In contrast to most ontology-driven approaches, which propose resolving these differences ex post, we introduce an approach that avoids naming differences in conceptual models already during modeling. To this end, we formalize naming conventions by combining domain thesauri and phrase structures based on a linguistic grammar. This allows modelers to be guided automatically during the modeling process using standardized labels for model elements. Our approach is generic, making it applicable to any modeling language.
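
    A minimal sketch of guiding modelers toward standardized labels at modeling time, assuming a <verb> <domain object> phrase structure and a tiny thesaurus; the vocabulary and grammar are invented for this example, not the paper's formalization.

    ```python
    ALLOWED_VERBS = {"check", "create", "approve"}  # hypothetical phrase-structure slots
    DOMAIN_THESAURUS = {"invoice": "invoice", "bill": "invoice", "order": "order"}  # synonym -> preferred term

    def standardize_label(label):
        """Reject labels violating the phrase structure and resolve synonyms to preferred terms."""
        verb, _, rest = label.lower().partition(" ")
        if verb not in ALLOWED_VERBS or not rest:
            raise ValueError(f"label '{label}' violates the <verb> <object> convention")
        obj = DOMAIN_THESAURUS.get(rest)
        if obj is None:
            raise ValueError(f"unknown domain term '{rest}' in label '{label}'")
        return f"{verb} {obj}"

    print(standardize_label("Check bill"))  # -> 'check invoice', resolved during modeling rather than ex post
    ```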

  8. Software interoperability for energy simulation

    SciTech Connect

    Hitchcock, Robert J.

    2002-07-31

    This paper provides an overview of software interoperability as it relates to the energy simulation of buildings. The paper begins with a discussion of the difficulties in using sophisticated analysis tools like energy simulation at various stages in the building life cycle, and the potential for interoperability to help overcome these difficulties. An overview of the Industry Foundation Classes (IFC), a common data model for supporting interoperability under continuing development by the International Alliance for Interoperability (IAI) is then given. The process of creating interoperable software is described next, followed by specific details for energy simulation tools. The paper closes with the current status of, and future plans for, the ongoing efforts to achieve software interoperability.

  9. Buildings Interoperability Landscape

    SciTech Connect

    Hardin, Dave; Stephan, Eric G.; Wang, Weimin; Corbin, Charles D.; Widergren, Steven E.

    2015-12-31

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  10. Leading Generative Groups: A Conceptual Model

    ERIC Educational Resources Information Center

    London, Manuel; Sobel-Lojeski, Karen A.; Reilly, Richard R.

    2012-01-01

    This article presents a conceptual model of leadership in generative groups. Generative groups have diverse team members who are expected to develop innovative solutions to complex, unstructured problems. The challenge for leaders of generative groups is to balance (a) establishing shared goals with recognizing members' vested interests, (b)…

  11. Self-Presentation: A Conceptualization and Model.

    ERIC Educational Resources Information Center

    Schlenker, Barry R.

    This paper provides a conceptual definition and model of self-presentational behavior. Self-presentation is defined as the attempt to control self-relevant images before real or imagined others. Several aspects of the definition are discussed along with the notion that people's self-presentations represent the choice of the most desirable images…

  12. A Conceptual Model of Rhetorical Community.

    ERIC Educational Resources Information Center

    Ehrenhaus, Peter

    A conceptual model of the rhetorical community that addresses the sociodramatic processes through which social order evolves, is maintained, can change, and is threatened is presented in this paper. Following an introduction, the paper identifies the various uses of rhetorical vision and rhetorical community that are found in fantasy theme…

  13. Administrator Training and Development: Conceptual Model.

    ERIC Educational Resources Information Center

    Boardman, Gerald R.

    A conceptual model for an individualized training program for school administrators integrates processes, characteristics, and tasks through theory training and application. Based on an application of contingency theory, it provides a system matching up administrative candidates' needs in three areas (administrative process, administrative…

  14. Conceptual Models of Frontal Cyclones.

    ERIC Educational Resources Information Center

    Eagleman, Joe R.

    1981-01-01

    This discussion of weather models uses maps to illustrate the differences among three types of frontal cyclones (long wave, short wave, and troughs). Awareness of these cyclones can provide clues to atmospheric conditions which can lead toward accurate weather forecasting. (AM)

  15. A conceptual model for megaprogramming

    NASA Technical Reports Server (NTRS)

    Tracz, Will

    1990-01-01

    Megaprogramming is component-based software engineering and life-cycle management. Megaprogramming and its relationship to other research initiatives (common prototyping system/common prototyping language, domain specific software architectures, and software understanding) are analyzed. The desirable attributes of megaprogramming software components are identified and a software development model and resulting prototype megaprogramming system (library interconnection language extended by annotated Ada) are described.

  16. The conceptualization model problem—surprise

    NASA Astrophysics Data System (ADS)

    Bredehoeft, John

    2005-03-01

    The foundation of model analysis is the conceptual model. Surprise is defined as new data that renders the prevailing conceptual model invalid; as defined here it represents a paradigm shift. Limited empirical data indicate that surprises occur in 20-30% of model analyses. These data suggest that groundwater analysts have difficulty selecting the appropriate conceptual model. There is no ready remedy to the conceptual model problem other than (1) to collect as much data as is feasible, using all applicable methods—a complementary data collection methodology can lead to new information that changes the prevailing conceptual model, and (2) for the analyst to remain open to the fact that the conceptual model can change dramatically as more information is collected. In the final analysis, the hydrogeologist makes a subjective decision on the appropriate conceptual model. The conceptualization problem does not render models unusable. The problem introduces an uncertainty that often is not widely recognized. Conceptual model uncertainty is exacerbated in making long-term predictions of system performance.

  17. Interoperability in Personalized Adaptive Learning

    ERIC Educational Resources Information Center

    Aroyo, Lora; Dolog, Peter; Houben, Geert-Jan; Kravcik, Milos; Naeve, Ambjorn; Nilsson, Mikael; Wild, Fridolin

    2006-01-01

    Personalized adaptive learning requires semantic-based and context-aware systems to manage the Web knowledge efficiently as well as to achieve semantic interoperability between heterogeneous information resources and services. The technological and conceptual differences can be bridged either by means of standards or via approaches based on the…

  18. Semantically Interoperable XML Data.

    PubMed

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-09-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789
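
    As a rough illustration of semantic validation of annotated XML, the sketch below checks that a concept reference carried by a data element resolves against a controlled vocabulary; the attribute name and URIs are placeholders, not the framework's actual annotation scheme.

    ```python
    import xml.etree.ElementTree as ET

    VOCABULARY = {"http://example.org/onto#ImageAnnotation", "http://example.org/onto#Markup"}  # placeholder concepts

    doc = ET.fromstring(
        '<annotation conceptRef="http://example.org/onto#ImageAnnotation">'
        '<region shape="circle" radiusMm="4.2"/>'
        '</annotation>'
    )

    def semantically_valid(elem):
        """An element is semantically valid here if its conceptRef points to a known vocabulary term."""
        ref = elem.get("conceptRef")
        return ref is not None and ref in VOCABULARY

    print(semantically_valid(doc))  # True: the annotation resolves to a known concept
    ```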

  19. Semantically Interoperable XML Data

    PubMed Central

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-01-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789

  20. Uncertainty and the Conceptual Site Model

    NASA Astrophysics Data System (ADS)

    Price, V.; Nicholson, T. J.

    2007-12-01

    Our focus is on uncertainties in the underlying conceptual framework upon which all subsequent steps in numerical and/or analytical modeling efforts depend. Experienced environmental modelers recognize the value of selecting an optimal conceptual model from several competing site models, but usually do not formally explore possible alternative models, in part due to incomplete or missing site data, as well as relevant regional data for establishing boundary conditions. The value in and approach for developing alternative conceptual site models (CSM) is demonstrated by analysis of case histories. These studies are based on reported flow or transport modeling in which alternative site models are formulated using data that were not available to, or not used by, the original modelers. An important concept inherent to model abstraction of these alternative conceptual models is that it is "Far better an approximate answer to the right question, which is often vague, than the exact answer to the wrong question, which can always be made precise." (Tukey, 1962) The case histories discussed here illustrate the value of formulating alternative models and evaluating them using site-specific data: (1) Charleston Naval Site where seismic characterization data allowed significant revision of the CSM and subsequent contaminant transport modeling; (2) Hanford 300-Area where surface- and ground-water interactions affecting the unsaturated zone suggested an alternative component to the site model; (3) Savannah River C-Area where a characterization report for a waste site within the modeled area was not available to the modelers, but provided significant new information requiring changes to the underlying geologic and hydrogeologic CSM's used; (4) Amargosa Desert Research Site (ADRS) where re-interpretation of resistivity sounding data and water-level data suggested an alternative geologic model. Simple 2-D spreadsheet modeling of the ADRS with the revised CSM provided an improved

  1. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a

  2. Offline Interoperability, Cost Reduction and Reliability for Operational Procedures Using Meta-Modeling Technology

    NASA Astrophysics Data System (ADS)

    Poupart, E.; Jolly, G.; Percebois, C.; Bazex, P.; Palanque, P.; Basnyat, S.; Rabault, P.; Sabatier, L.; Walrawens, A.

    2008-08-01

    In this paper, we present CNES participation, through a case study, in a research project called DOMINO financed by the French National Research Agency (ANR) RNTL. The project started in March 2007 and will end in March 2009; it brings together academics (ENSIETA, IRISA, and IRIT) with industry and agencies (AIRBUS, CEA, CNES and SODIFRANCE). The project has two main goals: to develop reliable Model Driven Engineering (MDE) components and to build bridges with Domain Specific Languages (DSL). CNES participates through a case study on the reliable design of operational procedures and associated applications. There are two main objectives for this case study. The first is to improve "offline" interoperability, with the possibility of building import/export tools for any scripting procedure language by using meta-modeling technology. The second is to improve efficiency in the production, validation, and execution of scripting procedures using operational specifications. It is anticipated that this will result in reduced costs and improved reliability.

  3. A Conceptual Model of Learning Networks

    NASA Astrophysics Data System (ADS)

    Koper, Rob

    In the TENCompetence project a set of UML models (Booch et al. 1999) has been developed to specify the core concepts for Learning Network Services that support professional competence development. The three most important, high-level models are (a) the use case model, (b) the conceptual model, and (c) the domain model. The first model identifies the primary use cases we need in order to support professional competence development. The second model describes the concept of competence and competence development from a theoretical point of view. What is a competence? How does it relate to the cognitive system of an actor? How are competences developed? The third model is a UML Domain Model that defines, among other things, the components of a Learning Network and the concepts and relationships between the concepts in a Learning Network, and provides a starting point for the design of the overall architecture for Learning Network Services, including the data model.

  4. A contemporary conceptual model of hypochondriasis.

    PubMed

    Abramowitz, Jonathan S; Schwartz, Stefanie A; Whiteside, Stephen P

    2002-12-01

    Hypochondriasis (HC), which involves preoccupation with the fear of having a serious illness despite appropriate medical examination, is often encountered in medical settings. The most conspicuous feature of this disorder is seeking excessive reassurance from physicians, medical references, or self-inspection; however, many patients also fear they will receive upsetting information if evaluated and thus avoid consultations and remain preoccupied with physiologic events, believing they are physically ill. Thus, HC causes personal suffering for the patient and practical and cost management problems for professionals across fields of clinical practice. The past 2 decades have seen considerable improvement in the understanding and treatment of HC. In this article, we review a contemporary conceptual model of HC and an effective form of treatment called cognitive-behavioral therapy that is derived from this model. Recommendations for presenting this conceptualization to patients and encouraging proper treatment are also discussed. PMID:12479520

  5. Policy Issues in Accessibility and Interoperability of Scientific Data: Experiences from the Carbon Modeling Field

    NASA Astrophysics Data System (ADS)

    Kishor, P.; Peckham, S. D.; Gower, S. T.; Batzli, S.

    2010-12-01

    Large-scale terrestrial ecosystem modeling is highly parameterized and requires lots of historical data. Routine model runs can easily utilize hundreds of gigabytes, even terabytes, of data on tens, perhaps hundreds of parameters. It is a given that no one modeler can or does collect all the required data. All modelers depend upon other scientists, and governmental and research agencies, for their data needs. This is where data accessibility and interoperability become crucial for the success of the project. Having well-documented and quality data available in a timely fashion can greatly assist a project's progress, while the converse can bring the project to a standstill, leading to a large amount of wasted staff time and resources. Data accessibility is a complex issue -- at best, it is an unscientific composite of a variety of factors: technological, legal, cultural, semantic, and economic. In reality, it is a concept that most scientists only worry about when they need some data, and mostly never after their project is complete. The exigencies of the vetting, review and publishing processes overtake the long-term view of making one's own data available to others with the same ease and openness that was desired when seeking data from others. This presentation describes our experience with acquiring data for our carbon modeling efforts, dealing with federal, state and local agencies, a variety of data formats, some published, some not so easy to find, and documentation that ranges from excellent to non-existent. A set of indicators is proposed to place and determine the accessibility of scientific data -- those we are seeking and those we are producing -- in order to bring some transparency and clarity that can make data acquisition and sharing easier. The paper concludes with a proposal to utilize free, open and well-recognized data marks such as CC0 (CC-Zero), Public Domain Dedication License, and CC-BY created by Creative Commons that would advertise the

  6. Optimal combinations of specialized conceptual hydrological models

    NASA Astrophysics Data System (ADS)

    Kayastha, Nagendra; Lal Shrestha, Durga; Solomatine, Dimitri

    2010-05-01

    In hydrological modelling it is usual practice to use a single lumped conceptual model for hydrological simulations across all regimes. However, the simplicity of this modelling paradigm often leads to errors in representing the full complexity of the physical processes in the catchment. A solution could be to model various hydrological processes separately with differently parameterized models, and to combine them. Different hydrological models have varying performance in reproducing catchment response; generally the response cannot be represented precisely in all segments of the hydrograph, as some models perform well in simulating peak flows while others do well in capturing low flows. Better performance can be achieved if a model is applied to the catchment with different parameter sets that are calibrated using criteria favouring high or low flows. In this work we use a modular approach to simulate the hydrology of a catchment, wherein multiple models are applied to replicate the catchment response, each "specialist" model is calibrated according to a specific objective function chosen to force the model to capture certain aspects of the hydrograph, and the model outputs are combined using a so-called "fuzzy committee". Such a multi-model approach has previously been implemented in the development of data-driven and conceptual models (Fenicia et al., 2007), but its performance was considered only during the calibration period. In this study we tested an application to conceptual models in both the calibration and verification periods. In addition, we tested the sensitivity of the results to the different weightings used in the objective function formulations and to the membership functions used in the committee. The study was carried out for the Bagamati catchment in Nepal and the Brue catchment in the United Kingdom with a MATLAB-based implementation of the HBV model. A multi-objective evolutionary optimization genetic algorithm (Deb, 2001) was used to
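
    A minimal sketch of a fuzzy committee of two specialist models, assuming one is calibrated toward peak flows and the other toward low flows; the sigmoid membership function and its parameters are illustrative choices, not the study's settings.

    ```python
    import numpy as np

    def high_flow_membership(q, q_mid=20.0, steepness=0.3):
        """Soft degree of membership in the 'high flow' regime, based on flow magnitude."""
        return 1.0 / (1.0 + np.exp(-steepness * (q - q_mid)))

    def committee_output(q_high_specialist, q_low_specialist):
        """Blend the two specialists with weights that follow the (approximate) flow regime."""
        w = high_flow_membership(0.5 * (q_high_specialist + q_low_specialist))
        return w * q_high_specialist + (1.0 - w) * q_low_specialist

    q_peaks = np.array([5.0, 18.0, 60.0])      # specialist calibrated on an objective favouring peaks
    q_recession = np.array([4.2, 15.0, 48.0])  # specialist calibrated on an objective favouring low flows
    print(committee_output(q_peaks, q_recession))
    ```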

  7. Conceptual models for cumulative risk assessment.

    PubMed

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects. PMID:22021317

  8. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    ERIC Educational Resources Information Center

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  9. A Structural Equation Model of Conceptual Change in Physics

    ERIC Educational Resources Information Center

    Taasoobshirazi, Gita; Sinatra, Gale M.

    2011-01-01

    A model of conceptual change in physics was tested on introductory-level, college physics students. Structural equation modeling was used to test hypothesized relationships among variables linked to conceptual change in physics including an approach goal orientation, need for cognition, motivation, and course grade. Conceptual change in physics…

  10. Achieving control and interoperability through unified model-based systems and software engineering

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel

    2005-01-01

    Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.

  11. A core observational data model for enhancing the interoperability of ontologically annotated environmental data

    NASA Astrophysics Data System (ADS)

    Schildhauer, M.; Bermudez, L. E.; Bowers, S.; Dibner, P. C.; Gries, C.; Jones, M. B.; McGuinness, D. L.; Cao, H.; Cox, S. J.; Kelling, S.; Lagoze, C.; Lapp, H.; Madin, J.

    2010-12-01

    Research in the environmental sciences often requires accessing diverse data, collected by numerous data providers over varying spatiotemporal scales, incorporating specialized measurements from a range of instruments. These measurements are typically documented using idiosyncratic, disciplinary specific terms, and stored in management systems ranging from desktop spreadsheets to the Cloud, where the information is often further decomposed or stylized in unpredictable ways. This situation creates major informatics challenges for broadly discovering, interpreting, and merging the data necessary for integrative earth science research. A number of scientific disciplines have recognized these issues, and been developing semantically enhanced data storage frameworks, typically based on ontologies, to enable communities to better circumscribe and clarify the content of data objects within their domain of practice. There is concern, however, that cross-domain compatibility of these semantic solutions could become problematic. We describe here our efforts to address this issue by developing a core, unified Observational Data Model, that should greatly facilitate interoperability among the semantic solutions growing organically within diverse scientific domains. Observational Data Models have emerged independently from several distinct scientific communities, including the biodiversity sciences, ecology, evolution, geospatial sciences, and hydrology, to name a few. Informatics projects striving for data integration within each of these domains had converged on identifying "observations" and "measurements" as fundamental abstractions that provide useful "templates" through which scientific data can be linked— at the structural, composited, or even cell value levels— to domain terms stored in ontologies or other forms of controlled vocabularies. The Scientific Observations Network, SONet (http://sonet.ecoinformatics.org) brings together a number of these observational
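
    A minimal sketch of the observation/measurement abstraction, loosely in the spirit of OBOE-style observational models, where structural slots are linked to ontology terms; the class layout and URIs are placeholders for illustration.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Measurement:
        characteristic: str   # ontology URI of the measured characteristic
        standard: str         # ontology URI of the unit or protocol
        value: float

    @dataclass
    class Observation:
        entity: str           # ontology URI of the observed entity
        measurements: List[Measurement] = field(default_factory=list)

    obs = Observation(
        entity="http://example.org/onto#TreeStem",
        measurements=[Measurement("http://example.org/onto#Diameter",
                                  "http://example.org/onto#Centimetre", 12.7)],
    )
    print(obs)
    ```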

  12. A Conceptual Model of Referee Efficacy

    PubMed Central

    Guillén, Félix; Feltz, Deborah L.

    2010-01-01

    This paper presents a conceptual model of referee efficacy, defines the concept, proposes sources of referee specific efficacy information, and suggests consequences of having high or low referee efficacy. Referee efficacy is defined as the extent to which referees believe they have the capacity to perform successfully in their job. Referee efficacy beliefs are hypothesized to be influenced by mastery experiences, referee knowledge/education, support from significant others, physical/mental preparedness, environmental comfort, and perceived anxiety. In turn, referee efficacy beliefs are hypothesized to influence referee performance, referee stress, athlete rule violations, athlete satisfaction, and co-referee satisfaction. PMID:21713174

  13. Facet Modelling: An Approach to Flexible and Integrated Conceptual Modelling.

    ERIC Educational Resources Information Center

    Opdahl, Andreas L.; Sindre, Guttorm

    1997-01-01

    Identifies weaknesses of conceptual modelling languages for the problem domain of information systems (IS) development. Outlines an approach called facet modelling of real-world problem domains to deal with the complexity of contemporary analysis problems. Shows how facet models can be defined and visualized; discusses facet modelling in relation…

  14. Propulsion System Models for Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2014-01-01

    The conceptual design code NDARC (NASA Design and Analysis of Rotorcraft) was initially implemented to model conventional rotorcraft propulsion systems, consisting of turboshaft engines burning jet fuel, connected to one or more rotors through a mechanical transmission. The NDARC propulsion system representation has been extended to cover additional propulsion concepts, including electric motors and generators, rotor reaction drive, turbojet and turbofan engines, fuel cells and solar cells, batteries, and fuel (energy) used without weight change. The paper describes these propulsion system components, the architecture of their implementation in NDARC, and the form of the models for performance and weight. Requirements are defined for improved performance and weight models of the new propulsion system components. With these new propulsion models, NDARC can be used to develop environmentally-friendly rotorcraft designs.

  15. CONCEPTUAL MODELS FOR THE LASSEN HYDROTHERMAL SYSTEM.

    USGS Publications Warehouse

    Ingebritsen, S.E.; Sorey, M.L.

    1987-01-01

    The Lassen hydrothermal system, like a number of other systems in regions of moderate to great topographic relief, includes steam-heated features at higher elevations and high-chloride springs at lower elevations, connected to and fed by a single circulation system at depth. Two conceptual models for such systems are presented. They are similar in several ways: however, there are basic differences in terms of the nature and extent of vapor-dominated conditions beneath the steam-heated features. For some Lassen-like systems, these differences could have environmental and economic implications. Available data do not make it possible to establish a single preferred model for the Lassen system, and the actual system is complex enough that both models may apply to different parts of the system.

  16. Data Modeling & the Infrastructural Nature of Conceptual Tools

    ERIC Educational Resources Information Center

    Lesh, Richard; Caylor, Elizabeth; Gupta, Shweta

    2007-01-01

    The goal of this paper is to demonstrate the infrastructural nature of many modern conceptual technologies. The focus of this paper is on conceptual tools associated with elementary types of data modeling. We intend to show a variety of ways in which these conceptual tools not only express thinking, but also mold and shape thinking. And those ways…

  17. Detecting hydrological changes through conceptual model

    NASA Astrophysics Data System (ADS)

    Viola, Francesco; Caracciolo, Domenico; Pumo, Dario; Francipane, Antonio; Valerio Noto, Leonardo

    2015-04-01

    Natural changes and human modifications in hydrological systems coevolve and interact in a coupled and interlinked way. While climatic changes are stochastic and non-steady and affect hydrological systems, human-induced changes due to over-exploitation of soils and water resources modify the natural landscape, the water fluxes and their partitioning. Indeed, the traditional assumption of static systems in hydrological analysis, which has been adopted for a long time, fails whenever transient climatic conditions and/or land use changes occur. Time series analysis is a way to explore environmental changes together with societal changes; unfortunately, the inability to distinguish between causes restricts the scope of this method. To overcome this limitation, it is possible to couple time series analysis with a suitable hydrological model, such as a conceptual hydrological model, which offers a schematization of the complex dynamics acting within a basin. Assuming that model parameters represent morphological basin characteristics and that calibration is a way to detect the hydrological signature at a specific moment, it can be argued that calibrating the model over different time windows could be a method for detecting potential hydrological changes. In order to test the capability of a conceptual model to detect hydrological changes, this work presents different "in silico" experiments. A synthetic basin is forced with an ensemble of possible future scenarios generated with a stochastic weather generator able to simulate steady and non-steady climatic conditions. The experiments refer to a Mediterranean climate, which is characterized by marked seasonality, and consider the outcomes of the IPCC 5th report for describing climate evolution in the next century. In particular, in order to generate future climate change scenarios, a stochastic downscaling in space and time is carried out using realizations of an ensemble of General

  18. Turnaround Time Modeling for Conceptual Rocket Engines

    NASA Technical Reports Server (NTRS)

    Nix, Michael; Staton, Eric J.

    2004-01-01

    Recent years have brought about a paradigm shift within NASA and the Space Launch Community regarding the performance of conceptual design. Reliability, maintainability, supportability, and operability are no longer effects of design; they have moved to the forefront and are affecting design. A primary focus of this shift has been a planned decrease in vehicle turnaround time. Potentials for instituting this decrease include attacking the issues of removing, refurbishing, and replacing the engines after each flight. Nevertheless, it is important to understand the operational effects of an engine on turnaround time, ground support personnel, and equipment. One tool for visualizing this relationship involves the creation of a Discrete Event Simulation (DES). A DES model can be used to run a series of trade studies to determine if the engine is meeting its requirements, and, if not, what can be altered to bring it into compliance. Using DES, it is possible to look at the ways in which labor requirements, parallel maintenance versus serial maintenance, and maintenance scheduling affect the overall turnaround time. A detailed DES model of the Space Shuttle Main Engines (SSME) has been developed. Trades may be performed using the SSME Processing Model to see where maintenance bottlenecks occur, and what the benefits (if any) are of increasing the number of personnel or the number and location of facilities, in addition to the trades previously mentioned, all with the goal of optimizing the operational turnaround time and minimizing operational cost. The SSME Processing Model was developed in such a way that it can easily be used as a foundation for developing DES models of other operational or developmental reusable engines. Performing a DES on a developmental engine during the conceptual phase makes it easier to affect the design and make changes to bring about a decrease in turnaround time and costs.
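
    A minimal discrete-event sketch of the serial-versus-parallel maintenance trade described above, using the SimPy library; the three-engine setup and the task durations are invented for illustration, not values from the SSME Processing Model.

    ```python
    import simpy

    REMOVE_H, REFURB_H, REPLACE_H = 8, 40, 8   # illustrative task durations in hours

    def process_engine(env, crews, completion_times):
        with crews.request() as req:
            yield req                                         # wait for a free maintenance crew
            yield env.timeout(REMOVE_H + REFURB_H + REPLACE_H)
            completion_times.append(env.now)

    def turnaround(n_crews, n_engines=3):
        env = simpy.Environment()
        crews = simpy.Resource(env, capacity=n_crews)
        completion_times = []
        for _ in range(n_engines):
            env.process(process_engine(env, crews, completion_times))
        env.run()
        return max(completion_times)

    print(turnaround(1), turnaround(3))  # serial vs. parallel maintenance turnaround, in hours
    ```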

  19. Evaluating Conceptual Site Models with Multicomponent Reactive Transport Modeling

    NASA Astrophysics Data System (ADS)

    Dai, Z.; Heffner, D.; Price, V.; Temples, T. J.; Nicholson, T. J.

    2005-05-01

    Modeling ground-water flow and multicomponent reactive chemical transport is a useful approach for testing conceptual site models and assessing the design of monitoring networks. A graded approach with three conceptual site models is presented here with a field case of tetrachloroethene (PCE) transport and biodegradation near Charleston, SC. The first model assumed a one-layer homogeneous aquifer structure with semi-infinite boundary conditions, in which an analytical solution of the reactive solute transport can be obtained with BIOCHLOR (Aziz et al., 1999). Due to the over-simplification of the aquifer structure, this simulation cannot reproduce the monitoring data. In the second approach we used GMS to develop the conceptual site model, a layer-cake multi-aquifer system, and applied a numerical module (MODFLOW and RT3D within GMS) to solve the flow and reactive transport problem. The results were better than the first approach but still did not fit the plume well because the geological structures were still inadequately defined. In the third approach we developed a complex conceptual site model by interpreting log and seismic survey data with Petra and PetraSeis. We detected a major channel and a younger channel, through the PCE source area. These channels control the local ground-water flow direction and provide a preferential chemical transport pathway. Results using the third conceptual site model agree well with the monitoring concentration data. This study confirms that the bias and uncertainty from inadequate conceptual models are much larger than those introduced from an inadequate choice of model parameter values (Neuman and Wierenga, 2003; Meyer et al., 2004). Numerical modeling in this case provides key insight into the hydrogeology and geochemistry of the field site for predicting contaminant transport in the future. Finally, critical monitoring points and performance indicator parameters are selected for future monitoring to confirm system

  20. Dynamic Multicriteria Evaluation of Conceptual Hydrological Models

    NASA Astrophysics Data System (ADS)

    de Vos, N. J.; Rientjes, T. H.; Fenicia, F.; Gupta, H. V.

    2007-12-01

    Accurate and precise forecasts of river streamflows are crucial for successful management of water resources, particularly under the threat of hydrological extremes such as floods and droughts. Conceptual rainfall-runoff models are the most popular approach in flood forecasting. However, the calibration and evaluation of such models is often oversimplified by the use of performance statistics that largely ignore the dynamic character of a watershed system. This research aims to find novel ways of model evaluation by identifying periods of hydrologic similarity and customizing evaluation within each period using multiple criteria. A dynamic approach to hydrologic model identification, calibration and testing can be realized by applying clustering algorithms (e.g., Self-Organizing Map, Fuzzy C-means algorithm) to hydrological data. These algorithms are able to identify clusters in the data that represent periods of hydrological similarity. In this way, dynamic catchment system behavior can be simplified within the clusters that are identified. Although clustering requires a number of subjective choices, new insights into the hydrological functioning of a catchment can be obtained. Finally, separate multi-criteria calibration and evaluation is performed for each of the clusters. Such a model evaluation procedure proves to be reliable and gives much-needed feedback on exactly where certain model structures fail. Several clustering algorithms were tested on two data sets of meso-scale and large-scale catchments. The results show that the clustering algorithms define categories that reflect hydrological process understanding: dry/wet seasons, rising/falling hydrograph limbs, precipitation-driven/non-driven periods, etc. The results of various clustering algorithms are compared and validated using expert knowledge. Calibration results on a conceptual hydrological model show that the common practice of single-criteria calibration over the complete time series fails to perform
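
    A minimal sketch of identifying periods of hydrological similarity before calibration, here with k-means from scikit-learn on synthetic precipitation and flow features; the feature set, synthetic data, and number of clusters are assumptions for illustration only.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    precip = rng.gamma(0.8, 5.0, 365)                                        # synthetic daily precipitation
    flow = np.convolve(precip, np.exp(-np.arange(10) / 3.0), mode="same")    # crude synthetic catchment response
    features = np.column_stack([precip, flow, np.gradient(flow)])            # driven/non-driven, rising/falling limbs

    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
    for c in range(3):
        print(f"cluster {c}: {np.sum(labels == c)} days, mean flow {flow[labels == c].mean():.1f}")
    ```

    Each cluster would then receive its own multi-criteria calibration and evaluation, as the abstract describes.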

  1. Identifiability analysis in conceptual sewer modelling.

    PubMed

    Kleidorfer, M; Leonhardt, G; Rauch, W

    2012-01-01

    For sufficient calibration of an environmental model, not only parameter sensitivity but also parameter identifiability is an important issue. Identifiability analysis makes it possible to examine whether changes in one parameter can be compensated by appropriate changes in the others within a given uncertainty range. Parameter identifiability is conditional on the information content of the calibration data and consequently on a particular measurement layout (i.e. types of measurements, number and location of measurement sites, temporal resolution of measurements, etc.). Hence the influence of the number and location of measurement sites on the number of identifiable parameters can be investigated. In the present study, identifiability analysis is applied to a conceptual model of a combined sewer system aiming to predict combined sewer overflow emissions. Different measurement layouts are tested, and it can be shown that only 13 of the most sensitive catchment areas (represented by the model parameter 'effective impervious area') can be identified when overflow measurements of the 20 highest overflows and the runoff to the wastewater treatment plant are used for calibration. The main advantage of this method is its very low computational cost, as the number of required model runs equals the total number of model parameters. Hence, this method is a valuable tool when analysing large models with long runtimes and many parameters. PMID:22864432

  2. A conceptual model of intentional comfort touch.

    PubMed

    Connor, Ann; Howett, Maeve

    2009-06-01

    This article discusses the application and integration of intentional comfort touch as a holistic nursing practice. A review of the literature on touch and its related concepts is included. Although nurses use touch frequently in patient encounters, it is not always used intentionally or deliberately to enhance care. The article compares and contrasts intentional comfort touch with nonintentional or procedural touch. The use of intentional comfort touch in innovative clinical settings with diverse and at-risk populations is described. Based on clinical experiences and the current literature, a conceptual model of intentional comfort touch is proposed. The application of touch is discussed as is the meaning and importance of intentional touch for students, faculty, and patients. PMID:19443699

  3. Stormwater infiltration trenches: a conceptual modelling approach.

    PubMed

    Freni, Gabriele; Mannina, Giorgio; Viviani, Gaspare

    2009-01-01

    In recent years, limitations of traditional urban drainage schemes have been pointed out and new approaches are being developed that introduce more natural methods for retaining and/or disposing of stormwater. These mitigation measures are generally called Best Management Practices or Sustainable Urban Drainage Systems, and they include practices such as infiltration and storage tanks intended to reduce the peak flow and retain part of the polluting components. The introduction of such practices in urban drainage systems entails an upgrade of existing modelling frameworks in order to evaluate their efficiency in mitigating the impact of urban drainage systems on receiving water bodies. While storage tank modelling approaches are quite well documented in the literature, gaps remain for infiltration facilities, mainly because of the complexity of the physical processes involved. In this study, a simplified conceptual modelling approach for the simulation of infiltration trenches is presented. The model enables assessment of the performance of infiltration trenches. The main goal is to develop a model that can be employed to assess the mitigation efficiency of infiltration trenches in an integrated urban drainage context. Particular care was given to the simulation of infiltration structures, considering the performance reduction due to clogging phenomena. The proposed model was compared with other simplified modelling approaches and with a physically based model adopted as a benchmark, and it performed better than the other approaches for both unclogged facilities and the effect of clogging. On the basis of a long-term simulation with six years of rain data, the performance and effectiveness of an infiltration trench measure are assessed. The study confirmed the important role played by the clogging phenomenon in such infiltration structures. PMID:19587416

  4. A Conceptual Modeling Approach for OLAP Personalization

    NASA Astrophysics Data System (ADS)

    Garrigós, Irene; Pardillo, Jesús; Mazón, Jose-Norberto; Trujillo, Juan

    Data warehouses rely on multidimensional models in order to provide decision makers with appropriate structures to intuitively analyze data with OLAP technologies. However, data warehouses may be potentially large, and multidimensional structures become increasingly complex to understand at a glance. Even if a departmental data warehouse (also known as a data mart) is used, these structures would still be too complex. As a consequence, acquiring the required information is more costly than expected and decision makers using OLAP tools may get frustrated. In this context, current approaches for data warehouse design are focused on deriving a unique OLAP schema for all analysts from their previously stated information requirements, which is not enough to lighten the complexity of the decision making process. To overcome this drawback, we argue for personalizing multidimensional models for OLAP technologies according to the continuously changing user characteristics, context, requirements and behaviour. In this paper, we present a novel approach to personalizing OLAP systems at the conceptual level based on the underlying multidimensional model of the data warehouse, a user model and a set of personalization rules. The great advantage of our approach is that a personalized OLAP schema is provided for each decision maker, which contributes to better satisfying their specific analysis needs. Finally, we show the applicability of our approach through a sample scenario based on our CASE tool for data warehouse development.
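
    A minimal sketch of rule-based personalization of a multidimensional schema at the conceptual level, assuming a user model with an interest profile; the schema, user attributes, and rule are invented for this example, not the paper's metamodel.

    ```python
    SCHEMA = {"fact": "Sales", "dimensions": ["Time", "Product", "Store", "Promotion", "Customer"]}
    USER_MODEL = {"role": "store_manager", "interests": {"Time", "Product", "Store"}}

    PERSONALIZATION_RULES = [
        # rule: keep only the dimensions the decision maker has declared an interest in
        lambda schema, user: {**schema,
                              "dimensions": [d for d in schema["dimensions"] if d in user["interests"]]},
    ]

    def personalize(schema, user):
        for rule in PERSONALIZATION_RULES:
            schema = rule(schema, user)
        return schema

    print(personalize(SCHEMA, USER_MODEL))  # a lighter OLAP schema tailored to one decision maker
    ```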

  5. A conceptual, distributed snow redistribution model

    NASA Astrophysics Data System (ADS)

    Frey, S.; Holzmann, H.

    2015-11-01

    When conceptual hydrological models that use a temperature index approach for snowmelt are applied to high alpine areas, accumulation of snow over several years can often be observed. Among the reasons why these "snow towers" do not exist in nature are vertical and lateral transport processes. While snow transport models have been developed for grid cell sizes of tens to hundreds of square metres and have been applied in several catchments, no model exists for coarser cell sizes of 1 km2, a common resolution for meso- and large-scale hydrologic modelling (hundreds to thousands of square kilometres). In this paper we present an approach that uses only gravity, snow density as a proxy for the age of the snow cover, and land-use information to redistribute snow in alpine basins. The results are based on the hydrological modelling of the Austrian Inn Basin in Tyrol, Austria, more specifically the Ötztaler Ache catchment, but the findings hold for other tributaries of the river Inn. This transport model is implemented in the distributed rainfall-runoff model COSERO (Continuous Semi-distributed Runoff). The results of both model concepts, with and without consideration of lateral snow redistribution, are compared against observed discharge and snow-covered areas derived from MODIS satellite images. By means of the snow redistribution concept, snow accumulation over several years can be prevented, and the snow depletion curve compared with MODIS (Moderate Resolution Imaging Spectroradiometer) data could be improved as well. Over a 7-year period the standard model would lead to snow accumulation of approximately 2900 mm SWE (snow water equivalent) in high-elevation regions, whereas the updated version of the model shows no accumulation and also predicts discharge more accurately, leading to a Kling-Gupta efficiency of 0.93 instead of 0.9. A further improvement can be shown in the comparison of MODIS snow cover data and the calculated depletion curve, where
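
    A minimal sketch of gravity-driven lateral redistribution on a coarse grid, assuming snow denser (older) than a threshold is partly moved to the lowest neighbouring cell; the threshold, moved fraction, and single-neighbour rule are simplifications invented for illustration, not COSERO's actual scheme.

    ```python
    import numpy as np

    def redistribute(swe, elevation, density, density_threshold=450.0, moved_fraction=0.5):
        """Move part of the old, dense snow from each cell to its lowest neighbour, if that neighbour is lower."""
        swe = swe.copy()
        rows, cols = swe.shape
        for r in range(rows):
            for c in range(cols):
                if density[r, c] < density_threshold:
                    continue
                nbrs = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= r + dr < rows and 0 <= c + dc < cols]
                lowest = min(nbrs, key=lambda rc: elevation[rc])
                if elevation[lowest] < elevation[r, c]:
                    moved = moved_fraction * swe[r, c]
                    swe[r, c] -= moved
                    swe[lowest] += moved
        return swe

    elev = np.array([[3000.0, 2800.0], [2600.0, 2400.0]])
    swe = np.array([[2900.0, 500.0], [300.0, 100.0]])   # mm SWE, with a "snow tower" in the highest cell
    dens = np.array([[500.0, 400.0], [350.0, 300.0]])   # kg/m3 as a proxy for snow age
    print(redistribute(swe, elev, dens))
    ```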

  6. Conceptualizing Telehealth in Nursing Practice: Advancing a Conceptual Model to Fill a Virtual Gap.

    PubMed

    Nagel, Daniel A; Penner, Jamie L

    2016-03-01

    Increasingly, nurses use various telehealth technologies to deliver health care services; however, there has been a lag in research and the generation of empirical knowledge to support nursing practice in this expanding field. One challenge to generating knowledge is a gap in development of a comprehensive conceptual model or theoretical framework to illustrate relationships of concepts and phenomena inherent to adoption of a broad range of telehealth technologies to holistic nursing practice. A review of the literature revealed eight published conceptual models, theoretical frameworks, or similar entities applicable to nursing practice. Many of these models focus exclusively on use of telephones and four were generated from qualitative studies, but none comprehensively reflects the complexities of bridging nursing process and elements of nursing practice into use of telehealth. The purpose of this article is to present a review of existing conceptual models and frameworks, discuss predominant themes and features of these models, and present a comprehensive conceptual model for telehealth nursing practice synthesized from this literature for consideration and further development. This conceptual model illustrates characteristics of, and relationships between, dimensions of telehealth practice to guide research and knowledge development in provision of holistic person-centered care delivery to individuals by nurses through telehealth technologies. PMID:25858897

  7. Hydrological Modeling Reproducibility Through Data Management and Adaptors for Model Interoperability

    NASA Astrophysics Data System (ADS)

    Turner, M. A.

    2015-12-01

    Because of a lack of centralized planning and no widely-adopted standards among hydrological modeling research groups, research communities, and the data management teams meant to support research, there is chaos when it comes to data formats, spatio-temporal resolutions, ontologies, and data availability. All this makes true scientific reproducibility and collaborative integrated modeling impossible without some glue to piece it all together. Our Virtual Watershed Integrated Modeling System provides the tools and modeling framework hydrologists need to accelerate and fortify new scientific investigations by tracking provenance and providing adaptors for integrated, collaborative hydrologic modeling and data management. Under global warming trends where water resources are under increasing stress, reproducible hydrological modeling will be increasingly important to improve transparency and understanding of the scientific facts revealed through modeling. The Virtual Watershed Data Engine is capable of ingesting a wide variety of heterogeneous model inputs, outputs, model configurations, and metadata. We will demonstrate one example, starting from real-time raw weather station data packaged with station metadata. Our integrated modeling system will then create gridded input data via geostatistical methods along with error and uncertainty estimates. These gridded data are then used as input to hydrological models, all of which are available as web services wherever feasible. Models may be integrated in a data-centric way where the outputs too are tracked and used as inputs to "downstream" models. This work is part of an ongoing collaborative Tri-state (New Mexico, Nevada, Idaho) NSF EPSCoR Project, WC-WAVE, comprised of researchers from multiple universities in each of the three states. The tools produced and presented here have been developed collaboratively alongside watershed scientists to address specific modeling problems with an eye on the bigger picture of
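
    One step described above is turning raw weather station data into gridded model inputs. The sketch below uses simple inverse-distance weighting as a stand-in for the geostatistical gridding the abstract refers to; it does not produce the error and uncertainty estimates mentioned, and the station coordinates and values are invented.

```python
import numpy as np

def idw_grid(station_xy, station_values, grid_x, grid_y, power=2.0):
    """Inverse-distance-weighted interpolation of point observations onto a grid.
    A simple stand-in for the geostatistical gridding step described above."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    grid = np.zeros_like(gx, dtype=float)
    for i in range(gx.shape[0]):
        for j in range(gx.shape[1]):
            d = np.hypot(station_xy[:, 0] - gx[i, j], station_xy[:, 1] - gy[i, j])
            if np.any(d < 1e-9):                 # grid node coincides with a station
                grid[i, j] = station_values[np.argmin(d)]
            else:
                w = 1.0 / d ** power
                grid[i, j] = np.sum(w * station_values) / np.sum(w)
    return grid

stations = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])   # station coordinates (km)
temps = np.array([1.5, 3.0, -0.5])                            # observed air temperature (deg C)
print(idw_grid(stations, temps, np.linspace(0, 10, 3), np.linspace(0, 8, 3)).round(2))
```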

  8. Conceptual model for heart failure disease management.

    PubMed

    Andrikopoulou, Efstathia; Abbate, Kariann; Whellan, David J

    2014-03-01

    The objective of this review is to propose a conceptual model for heart failure (HF) disease management (HFDM) and to define the components of an efficient HFDM plan in reference to this model. Articles that evaluated 1 or more of the following aspects of HFDM were reviewed: (1) outpatient clinic follow-up; (2) self-care interventions to enhance patient skills; and (3) remote evaluation of worsening HF either using structured telephone support (STS) or by monitoring device data (telemonitoring). The success of programs in reducing readmissions and mortality was mixed. Outpatient follow-up programs generally resulted in improved outcomes, including decreased readmissions. Based on 1 meta-analysis, specialty clinics improved outcomes and nonspecialty clinics did not. Results from self-care programs were inconsistent and might have been affected by patient cognitive status, educational level, and intervention intensity. Telemonitoring, despite initially promising meta-analyses demonstrating a decrease in the number and duration of HF-related readmissions and all-cause mortality rates at follow-up, has not been shown in randomized trials to consistently reduce readmissions or mortality. However, evidence from device monitoring trials in particular might have been influenced by technology and design issues that might be rectified in future trials. Results from the literature suggest that the ideal HFDM plan would include outpatient follow-up at an HF specialty clinic and continuous education to improve patient self-care. The end result of this plan would be better understanding on the part of the patient and an improved ability to recognize and respond to signs of decompensation. PMID:24565255

  9. A step-by-step methodology for enterprise interoperability projects

    NASA Astrophysics Data System (ADS)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling; Architecture and Platform; and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  10. Conceptual and Numerical Models for UZ Flow and Transport

    SciTech Connect

    H. Liu

    2000-03-03

    The purpose of this Analysis/Model Report (AMR) is to document the conceptual and numerical models used for modeling of unsaturated zone (UZ) fluid (water and air) flow and solute transport processes. This is in accordance with "AMR Development Plan for U0030 Conceptual and Numerical Models for Unsaturated Zone (UZ) Flow and Transport Processes, Rev 00". The conceptual and numerical modeling approaches described in this AMR are used for models of UZ flow and transport in fractured, unsaturated rock under ambient and thermal conditions, which are documented in separate AMRs. This AMR supports the UZ Flow and Transport Process Model Report (PMR), the Near Field Environment PMR, and the following models: Calibrated Properties Model; UZ Flow Models and Submodels; Mountain-Scale Coupled Processes Model; Thermal-Hydrologic-Chemical (THC) Seepage Model; Drift Scale Test (DST) THC Model; Seepage Model for Performance Assessment (PA); and UZ Radionuclide Transport Models.

  11. A Logical Model of Conceptual Integrity in Data Integration

    PubMed Central

    Flater, David

    2003-01-01

    Conceptual integrity is required for the result of data integration to be cohesive and sensible. Compromised conceptual integrity results in “semantic faults,” which are commonly blamed for latent integration bugs. A logical model of conceptual integrity in data integration and a simple example application are presented. Unlike constructive models that attempt to prevent semantic faults, this model allows both correct and incorrect integrations to be described. Imperfect legacy systems can therefore be modeled, allowing a more formal analysis of their flaws and the possible remedies.

  12. Model of Conceptual Change for INQPRO: A Bayesian Network Approach

    ERIC Educational Resources Information Center

    Ting, Choo-Yee; Sam, Yok-Cheng; Wong, Chee-Onn

    2013-01-01

    Constructing a computational model of conceptual change for a computer-based scientific inquiry learning environment is difficult due to two challenges: (i) externalizing the variables of conceptual change and its related variables is difficult, and (ii) defining the causal dependencies among the variables is not trivial. Such difficulty…

  13. Using Conceptual Change Theories to Model Position Concepts in Astronomy

    ERIC Educational Resources Information Center

    Yang, Chih-Chiang; Hung, Jeng-Fung

    2012-01-01

    The roles of conceptual change and model building in science education are very important and have a profound and wide effect on teaching science. This study examines the change in children's position concepts after instruction, based on different conceptual change theories. Three classes were chosen and divided into three groups, including a…

  14. A conceptual graphs modeling of UMLS components.

    PubMed

    Joubert, M; Miton, F; Fieschi, M; Robert, J J

    1995-01-01

    The Unified Medical Language System (UMLS) of the U.S. National Library of Medicine is a complex collection of terms, concepts, and relationships derived from standard classifications. Potential applications would benefit from a high-level representation of its components. This paper proposes a conceptual representation of both the Metathesaurus and the Semantic Network of the UMLS based on conceptual graphs. It shows that adding a dictionary of concepts to the UMLS knowledge base makes it possible to exploit the knowledge base pertinently. This dictionary defines the core concepts more precisely and adds constraints on their use. Constraints are dedicated to guiding "intelligent" browsing of the UMLS knowledge sources. PMID:8591348

  15. Challenges in Requirements Engineering: A Research Agenda for Conceptual Modeling

    NASA Astrophysics Data System (ADS)

    March, Salvatore T.; Allen, Gove N.

    Domains for which information systems are developed deal primarily with social constructions—conceptual objects and attributes created by human intentions and for human purposes. Information systems play an active role in these domains. They document the creation of new conceptual objects, record and ascribe values to their attributes, initiate actions within the domain, track activities performed, and infer conclusions based on the application of rules that govern how the domain is affected when socially-defined and identified causal events occur. Emerging applications of information technologies evaluate such business rules, learn from experience, and adapt to changes in the domain. Conceptual modeling grammars aimed at representing their system requirements must include conceptual objects, socially-defined events, and the rules pertaining to them. We identify challenges to conceptual modeling research and pose an ontology of the artificial as a step toward meeting them.

  16. Improving component interoperability and reusability with the java connection framework (JCF): overview and application to the ages-w environmental model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Environmental modeling framework (EMF) design goals are multi-dimensional and often include many aspects of general software framework development. Many functional capabilities offered by current EMFs are closely related to interoperability and reuse aspects. For example, an EMF needs to support dev...

  17. Analysis of Hydrogeologic Conceptual Model and Parameter Uncertainty

    SciTech Connect

    Meyer, Philip D.; Nicholson, Thomas J.; Mishra, Srikanta

    2003-06-24

    A systematic methodology for assessing hydrogeologic conceptual model, parameter, and scenario uncertainties is being developed to support technical reviews of environmental assessments related to decommissioning of nuclear facilities. The first major task being undertaken is to produce a coupled parameter and conceptual model uncertainty assessment methodology. This task is based on previous studies that have primarily dealt individually with these two types of uncertainties. Conceptual model uncertainty analysis is based on the existence of alternative conceptual models that are generated using a set of clearly stated guidelines targeted at the needs of NRC staff. Parameter uncertainty analysis makes use of generic site characterization data as well as site-specific characterization and monitoring data to evaluate parameter uncertainty in each of the alternative conceptual models. Propagation of parameter uncertainty will be carried out through implementation of a general stochastic model of groundwater flow and transport in the saturated and unsaturated zones. Evaluation of prediction uncertainty will make use of Bayesian model averaging and visualization of model results. The goal of this study is to develop a practical tool to quantify uncertainties in the conceptual model and parameters identified in performance assessments.
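
    The prediction-uncertainty step above relies on Bayesian model averaging over the alternative conceptual models. A minimal sketch of how BMA combines model-specific predictions into a mixture mean and variance is given below; the posterior model probabilities and per-model predictions are purely illustrative, not values from the methodology itself.

```python
import numpy as np

# Hypothetical posterior probabilities for three alternative conceptual models
# and each model's predicted quantity of interest (values are illustrative only).
posterior_prob = np.array([0.5, 0.3, 0.2])      # must sum to 1
predicted_mean = np.array([0.12, 0.20, 0.35])    # E[y | model]
predicted_var = np.array([0.01, 0.02, 0.05])     # Var[y | model]

# Bayesian model averaging: mixture mean and variance over the model set
# (law of total variance across the conceptual models).
bma_mean = np.sum(posterior_prob * predicted_mean)
bma_var = np.sum(posterior_prob * (predicted_var + predicted_mean ** 2)) - bma_mean ** 2

print(f"BMA mean = {bma_mean:.3f}, BMA variance = {bma_var:.4f}")
```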

  18. Overtraining and recovery. A conceptual model.

    PubMed

    Kenttä, G; Hassmén, P

    1998-07-01

    importance of active measures to improve the recovery process. Furthermore, directing attention to psychophysiological cues serves the same purpose as in RPE, i.e. increasing self-awareness. This article reviews and conceptualises the whole overtraining process. In doing so, it (i) aims to differentiate between the types of stress affecting an athlete's performance; (ii) identifies factors influencing an athlete's ability to adapt to physical training; and (iii) structures the recovery process. The TQR method to facilitate monitoring of the recovery process is then suggested, along with a conceptual model that incorporates all of the important parameters for performance gain (adaptation) and loss (maladaptation). PMID:9739537

  19. Future of unmanned systems interoperability

    NASA Astrophysics Data System (ADS)

    Ackley, John J.; Wade, Robert L.; Gehring, Daniel G.

    2006-05-01

    There are many challenges in the area of interoperability of unmanned systems: increasing levels of autonomy, teaming and collaboration, long endurance missions, integration with civilian and military spaces. Several currently available methods and technologies may aid in meeting these and other challenges: consensus standards development, formal methods, model-based engineering, knowledge and ontology representation, agent-based systems, and plan language research. We believe the future of unmanned systems interoperability depends on the integration of these methods and technologies into a domain-independent plan language for unmanned systems.

  20. An Integrative-Interactive Conceptual Model for Curriculum Development.

    ERIC Educational Resources Information Center

    Al-Ibrahim, Abdul Rahman H.

    1982-01-01

    The Integrative-Interactive Conceptual Model for Curriculum Development calls for curriculum reform and innovation to be cybernetic so that all aspects of curriculum planning get adequate attention. (CJ)

  1. CONCEPTUAL MODEL DEVELOPMENT AND INFORMATION MANAGEMENT FRAMEWORK FOR DIAGNOSTICS RESEARCH

    EPA Science Inventory

    Conceptual model development will focus on the effects of habitat alteration, nutrients,suspended and bedded sediments, and toxic chemicals on appropriate endpoints (individuals, populations, communities, ecosystems) across spatial scales (habitats, water body, watershed, region)...

  2. Turning Interoperability Operational with GST

    NASA Astrophysics Data System (ADS)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), cf. http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular provision of IT technology to exchange spatially and possibly also temporally indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several Web Services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially
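
    For readers unfamiliar with the OGC services listed above, a WFS GetFeature request is simply an HTTP call with standardized key-value parameters. The sketch below issues such a request in Python; the endpoint URL and feature type name are placeholders, and JSON output is a common but server-dependent option.

```python
import requests

# Hypothetical WFS endpoint and feature type; only the OGC-standard KVP
# parameters (service, version, request, typeNames, count, outputFormat) are real.
WFS_URL = "https://example.org/geoserver/wfs"

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "geology:boreholes",
    "count": 10,
    "outputFormat": "application/json",   # GML is the default; JSON depends on the server
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()
for feature in response.json().get("features", []):
    print(feature["id"], feature["properties"])
```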

  3. Towards a Model of Technology Adoption: A Conceptual Model Proposition

    NASA Astrophysics Data System (ADS)

    Costello, Pat; Moreton, Rob

    A conceptual model for Information Communication Technology (ICT) adoption by Small Medium Enterprises (SMEs) is proposed. The research uses several ICT adoption models as its basis, with theoretical underpinning provided by the Diffusion of Innovation theory and the Technology Acceptance Model (TAM). Taking an exploratory research approach, the model was investigated amongst 200 SMEs whose core business is ICT. Evidence from this study demonstrates that these SMEs face the same issues as all other industry sectors. This work points out weaknesses in SME environments regarding ICT adoption and suggests what they may need to do to increase the success rate of any proposed adoption. The methodology for development of the framework is described and recommendations made for improved Government-led ICT adoption initiatives. Application of the general methodology has resulted in new opportunities to embed the ethos and culture surrounding the issues into the framework of new projects developed as a result of Government intervention. A conceptual model is proposed that may lead to a deeper understanding of the issues under consideration.

  4. Conceptual Models and the Future of Special Education

    ERIC Educational Resources Information Center

    Kauffman, James M.

    2007-01-01

    A medical model has advantages over a legal model in thinking about special education, especially in responding supportively to difference, meeting individual needs, and practicing prevention. The legal conceptual model now dominates thinking about special education, but a medical model promises a brighter future for special education and for…

  5. Climate model uncertainty versus conceptual geological uncertainty in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.

    2015-09-01

    Projections of climate change impact are associated with a cascade of uncertainties including CO2 emission scenarios, climate models, downscaling and impact models. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context-dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.
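
    With the 6 geological models and 11 climate projections described above, the relative importance of the two uncertainty sources can be illustrated by partitioning the ensemble variance of a projected change, averaging over one factor at a time. The sketch below uses random numbers in place of the real projections and ignores the interaction term that a full two-way analysis of variance would include.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 6 x 11 ensemble: projected change in some hydrological variable
# for each geological model (rows) and climate projection (columns).
change = rng.normal(loc=-0.10, scale=0.05, size=(6, 11))

total_var = change.var()
geology_var = change.mean(axis=1).var()   # spread between geological models
climate_var = change.mean(axis=0).var()   # spread between climate projections

print(f"share from geological conceptualization: {geology_var / total_var:.2f}")
print(f"share from climate models: {climate_var / total_var:.2f}")
```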

  6. Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.

    2015-04-01

    Projections of climate change impact are associated with a cascade of uncertainties including CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context-dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.

  7. CONCEPTUAL FRAMEWORK FOR REGRESSION MODELING OF GROUND-WATER FLOW.

    USGS Publications Warehouse

    Cooley, Richard L.

    1985-01-01

    The author examines the uses of ground-water flow models and which classes of use require treatment of stochastic components. He then compares traditional and stochastic procedures for modeling actual (as distinguished from hypothetical) systems. Finally, he examines the conceptual basis and characteristics of the regression approach to modeling ground-water flow.

  8. Developing Models of Communicative Competence: Conceptual, Statistical, and Methodological Considerations.

    ERIC Educational Resources Information Center

    Cziko, Gary A.

    The development of an empirically based model of communicative competence is discussed in terms of conceptual, statistical, and methodological considerations. A distinction is made between descriptive and working models of communicative competence. Working models attempt to show how components of communicative competence are interrelated…

  9. A Conceptual Model of Career Development to Enhance Academic Motivation

    ERIC Educational Resources Information Center

    Collins, Nancy Creighton

    2010-01-01

    The purpose of this study was to develop, refine, and validate a conceptual model of career development to enhance the academic motivation of community college students. To achieve this end, a straw model was built from the theoretical and empirical research literature. The model was then refined and validated through three rounds of a Delphi…

  10. A Conceptual Model To Assist Educational Leaders Manage Change.

    ERIC Educational Resources Information Center

    Cochren, John R.

    This paper presents a conceptual model to help school leaders manage change effectively. The model was developed from a literature review of theory development and model construction. Specifically, the paper identifies the major components that inhibit organizational change, and synthesizes the most salient features of these components through a…

  11. Implementations Are Not Conceptualizations: Revising the Verb Learning Model.

    ERIC Educational Resources Information Center

    MacWhinney, Brian; Leinbach, Jared

    A model of the child's learning of the past tense forms of English verbs is discussed. This connectionist model takes as input a present-tense verb and provides as output a past tense form. A new simulation is applied to 13 problems raised by critics of the model, presented as fundamental flaws in the conceptualizations underlying connectionism.…

  12. Using a generalised identity reference model with archetypes to support interoperability of demographics information in electronic health record systems.

    PubMed

    Xu Chen; Berry, Damon; Stephens, Gaye

    2015-01-01

    Computerised identity management is in general encountered as a low-level mechanism that enables users in a particular system or region to securely access resources. In the Electronic Health Record (EHR), the identifying information of both the healthcare professionals who access the EHR and the patients whose EHR is accessed is subject to change. Demographics services have been developed to manage federated patient and healthcare professional identities and to support challenging healthcare-specific use cases in the presence of diverse and sometimes conflicting demographic identities. Demographics services are not the only use for identities in healthcare. Nevertheless, contemporary EHR specifications limit the types of entities that can be the actor or subject of a record to health professionals and patients, thus limiting the use of two-level models in other healthcare information systems. Demographics are ubiquitous in healthcare, so for a general identity model to be usable, it should be capable of managing demographic information. In this paper, we introduce a generalised identity reference model (GIRM) based on key characteristics of five surveyed demographic models. We evaluate the GIRM by using it to express the EN13606 demographics model in an extensible way at the metadata level and show how two-level modelling can support the exchange of instances of demographic identities. This use of the GIRM to express demographics information shows its applicability for standards-compliant two-level modelling alongside heterogeneous demographics models. We advocate this approach to facilitate the interoperability of identities between two-level model-based EHR systems and show the validity and the extensibility of using GIRM for the expression of other health-related identities. PMID:26737863

  13. A conceptual model for determining career choice of CHROME alumna based on farmer's conceptual models

    NASA Astrophysics Data System (ADS)

    Moore, Lisa Simmons

    This qualitative program evaluation examines the career decision-making processes and career choices of nine, African American women who participated in the Cooperating Hampton Roads Organization for Minorities in Engineering (CHROME) and who graduated from urban, rural or suburban high schools in the year 2000. The CHROME program is a nonprofit, pre-college intervention program that encourages underrepresented minority and female students to enter science, technically related, engineering, and math (STEM) career fields. The study describes career choices and decisions made by each participant over a five-year period since high school graduation. Data was collected through an Annual Report, Post High School Questionnaires, Environmental Support Questionnaires, Career Choice Questionnaires, Senior Reports, and standardized open-ended interviews. Data was analyzed using a model based on Helen C. Farmer's Conceptual Models, John Ogbu's Caste Theory and Feminist Theory. The CHROME program, based on its stated goals and tenets, was also analyzed against study findings. Findings indicated that participants received very low levels of support from counselors and teachers to pursue STEM careers and high levels of support from parents and family, the CHROME program and financial backing. Findings of this study also indicated that the majority of CHROME alumna persisted in STEM careers. The most successful participants, in terms of undergraduate degree completion and occupational prestige, were the African American women who remained single, experienced no critical incidents, came from a middle class to upper middle class socioeconomic background, and did not have children.

  14. Interoperability and information discovery

    USGS Publications Warehouse

    Christian, E.

    2001-01-01

    In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.

  15. Identifying students' mental models of sound propagation: The role of conceptual blending in understanding conceptual change

    NASA Astrophysics Data System (ADS)

    Hrepic, Zdeslav; Zollman, Dean A.; Rebello, N. Sanjay

    2010-07-01

    We investigated introductory physics students’ mental models of sound propagation. We used a phenomenographic method to analyze the data in the study. In addition to the scientifically accepted Wave model, students used the “Entity” model to describe the propagation of sound. In this latter model sound is a self-standing entity, different from the medium through which it propagates. All other observed alternative models contain elements of both Entity and Wave models, but at the same time are distinct from each of the constituent models. We called these models “hybrid” or “blend” models. We discuss how students use these models in various contexts before and after instruction and how our findings contribute to the understanding of conceptual change. Implications of our findings for teaching are summarized.

  16. Guide for developing conceptual models for ecological risk assessments

    SciTech Connect

    Suter, G.W., II

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, routes, media, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent so that connections between sources of exposure and receptors can be extended across operable units (OU). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs.

  17. Conceptual model for transferring information between small watersheds

    USGS Publications Warehouse

    Cleaves, E.T.

    2003-01-01

    Stream and watershed management and restoration can be greatly facilitated through use of physiographic landform classification to organize and communicate natural resource, hazard, and environmental information at a broad scale (1:250,000) as illustrated by the Piedmont and Coastal Plain Provinces in Maryland, or at a small scale (1:24,000) as illustrated using divisions and zones combined with a conceptual model. The conceptual model brings together geology, surficial processes, landforms and land use change information at the small watershed scale and facilitates transfer of information from one small watershed to another with similar geology and landforms. Stream flow, sediment erosion, and water quality illustrate the use of the model.

  18. Supporting user-defined granularities in a spatiotemporal conceptual model

    USGS Publications Warehouse

    Khatri, V.; Ram, S.; Snodgrass, R.T.; O'Brien, G. M.

    2002-01-01

    Granularities are integral to spatial and temporal data. A large number of applications require storage of facts along with their temporal and spatial context, which needs to be expressed in terms of appropriate granularities. For many real-world applications, a single granularity in the database is insufficient. In order to support any type of spatial or temporal reasoning, the semantics related to granularities needs to be embedded in the database. Specifying granularities related to facts is an important part of conceptual database design because under-specifying the granularity can restrict an application, affect the relative ordering of events and impact the topological relationships. Closely related to granularities is indeterminacy, i.e., an occurrence time or location associated with a fact that is not known exactly. In this paper, we present an ontology for spatial granularities that is a natural analog of temporal granularities. We propose an upward-compatible, annotation-based spatiotemporal conceptual model that can comprehensively capture the semantics related to spatial and temporal granularities, and indeterminacy without requiring new spatiotemporal constructs. We specify the formal semantics of this spatiotemporal conceptual model via translation to a conventional conceptual model. To underscore the practical focus of our approach, we describe an on-going case study. We apply our approach to a hydrogeologic application at the United States Geologic Survey and demonstrate that our proposed granularity-based spatiotemporal conceptual model is straightforward to use and is comprehensive.

  19. A Conceptual Model of Multiple Dimensions of Identity.

    ERIC Educational Resources Information Center

    Jones, Susan R.; McEwen, Marylu K.

    2000-01-01

    Presents a conceptual model of multiple dimensions of identity, which depicts a core sense of self or one's personal identity. Intersecting circles surrounding the core identity represent significant identity dimensions and contextual influences. The model evolved from a grounded theory study of a group of 10 women college students ranging in age…

  20. Conceptual model for assessment of inhalation exposure: defining modifying factors.

    PubMed

    Tielemans, Erik; Schneider, Thomas; Goede, Henk; Tischer, Martin; Warren, Nick; Kromhout, Hans; Van Tongeren, Martie; Van Hemmen, Joop; Cherrie, John W

    2008-10-01

    The present paper proposes a source-receptor model to schematically describe inhalation exposure to help understand the complex processes leading to inhalation of hazardous substances. The model considers a stepwise transfer of a contaminant from the source to the receptor. The conceptual model is constructed using three components, i.e. (i) the source, (ii) various transmission compartments and (iii) the receptor, and describes the contaminant's emission and its pattern of transport. Based on this conceptual model, a list of nine mutually independent principal modifying factors (MFs) is proposed: activity emission potential, substance emission potential, localized control, separation, segregation, dilution, worker behavior, surface contamination and respiratory protection. These MFs describe the exposure process at a high level of abstraction so that the model can be generically applicable. A list of exposure determinants underlying each of these principal MFs is proposed to describe the exposure process at a more detailed level. The presented conceptual model is developed in conjunction with an activity taxonomy as described in a separate paper. The proposed conceptual model and MFs should be seen as 'building blocks' for development of higher tier exposure models. PMID:18787181
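
    Higher-tier tools built on source-receptor models of this kind often combine the modifying factors multiplicatively with a baseline emission term. The sketch below illustrates that idea using the nine principal MFs named above; the numeric values and the purely multiplicative form are assumptions for illustration, not the calibrated model described in the paper.

```python
# Illustrative only: a toy multiplicative combination of modifying factors (MFs)
# between a source and a receptor. Factor names follow the abstract; the numeric
# values and the multiplicative form are assumptions.
modifying_factors = {
    "substance_emission_potential": 0.8,   # dustiness / volatility of the substance
    "activity_emission_potential": 0.6,    # energy of the handling activity
    "localized_control": 0.5,              # e.g. local exhaust ventilation
    "separation": 1.0,                     # no enclosure between source and worker
    "segregation": 1.0,
    "dilution": 0.7,                       # general room ventilation
    "worker_behaviour": 0.9,
    "surface_contamination": 1.0,
    "respiratory_protection": 0.5,         # effect of wearing a respirator
}

baseline_concentration = 10.0              # hypothetical unmitigated level (mg/m3)

exposure = baseline_concentration
for factor, value in modifying_factors.items():
    exposure *= value
print(f"estimated inhalation exposure: {exposure:.2f} mg/m3")
```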

  1. Revisiting "Discrepancy Analysis in Continuing Medical Education: A Conceptual Model"

    ERIC Educational Resources Information Center

    Fox, Robert D.

    2011-01-01

    Based upon a review and analysis of selected literature, the author presents a conceptual model of discrepancy analysis evaluation for planning, implementing, and assessing the impact of continuing medical education (CME). The model is described in terms of its value as a means of diagnosing errors in the development and implementation of CME. The…

  2. What Is FRBR? A Conceptual Model for the Bibliographic Universe

    ERIC Educational Resources Information Center

    Tillett, Barbara

    2005-01-01

    From 1992 to 1995 the IFLA Study Group on Functional Requirements for Bibliographic Records (FRBR) developed an entity relationship model as a generalised view of the bibliographic universe, intended to be independent of any cataloguing code or implementation. The FRBR report itself includes a description of the conceptual model (the entities,…

  3. A Conceptual Critique of Sternberg's WICS Model

    ERIC Educational Resources Information Center

    Koro-Ljungberg, Mirka

    2003-01-01

    Robert Sternberg's model of giftedness brings together interesting and very important elements of giftedness by synthesizing wisdom, intelligence, and creativity in novel ways. While his model is based on extensive research and utilizes a variety of sources and expertise, the epistemological consistency of Sternberg's model is, in the author's…

  4. From models to performance assessment: the conceptualization problem.

    PubMed

    Bredehoeft, John D

    2003-01-01

    Today, models are ubiquitous tools for ground water analyses. The intent of this paper is to explore philosophically the role of the conceptual model in analysis. Selection of the appropriate conceptual model is an a priori decision by the analyst. Calibration is an integral part of the modeling process. Unfortunately a wrong or incomplete conceptual model can often be adequately calibrated; good calibration of a model does not ensure a correct conceptual model. Petroleum engineers have another term for calibration; they refer to it as history matching. A caveat to the idea of history matching is that we can make a prediction with some confidence equal to the period of the history match. In other words, if we have matched a 10-year history, we can predict for 10 years with reasonable confidence; beyond 10 years the confidence in the prediction diminishes rapidly. The same rule of thumb applies to ground water model analyses. Nuclear waste disposal poses a difficult problem because the time horizon, 1000 years or longer, is well beyond the possibility of the history match (or period of calibration) in the traditional analysis. Nonetheless, numerical models appear to be the tool of choice for analyzing the safety of waste facilities. Models have a well-recognized inherent uncertainty. Performance assessment, the technique for assessing the safety of nuclear waste facilities, involves an ensemble of cascading models. Performance assessment with its ensemble of models multiplies the inherent uncertainty of the single model. The closer we can approach the idea of a long history with which to match the models, even models of nuclear waste facilities, the more confidence we will have in the analysis (and the models, including performance assessment). This thesis argues for prolonged periods of observation (perhaps as long as 300 to 1000 years) before a nuclear waste facility is finally closed. PMID:13678111

  5. Environmental Model Interoperability Enabled by Open Geospatial Standards - Results of a Feasibility Study (Invited)

    NASA Astrophysics Data System (ADS)

    Benedict, K. K.; Yang, C.; Huang, Q.

    2010-12-01

    The availability of high-speed research networks such as the US National Lambda Rail and the GÉANT network, scalable on-demand commodity computing resources provided by public and private "cloud" computing systems, and increasing demand for rapid access to the products of environmental models for both research and public policy development contribute to a growing need for the evaluation and development of environmental modeling systems that distribute processing, storage, and data delivery capabilities between network connected systems. In an effort to address the feasibility of developing a standards-based distributed modeling system in which model execution systems are physically separate from data storage and delivery systems, the research project presented in this paper developed a distributed dust forecasting system in which two nested atmospheric dust models are executed at George Mason University (GMU, in Fairfax, VA) while data and model output processing services are hosted at the University of New Mexico (UNM, in Albuquerque, NM). Exchange of model initialization and boundary condition parameters between the servers at UNM and the model execution systems at GMU is accomplished through Open Geospatial Consortium (OGC) Web Coverage Services (WCS) and Web Feature Services (WFS) while model outputs are pushed from GMU systems back to UNM using a REST web service interface. In addition to OGC and non-OGC web services for exchange between UNM and GMU, the servers at UNM also provide access to the input meteorological model products, intermediate and final dust model outputs, and other products derived from model outputs through OGC WCS, WFS, and OGC Web Map Services (WMS). The performance of the nested versus non-nested models is assessed in this research, with the results of the performance analysis providing the core content of the produced feasibility study. System integration diagram illustrating the storage and service platforms hosted at the Earth Data
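
    The abstract states that model outputs are pushed from GMU back to UNM through a REST web service interface. The sketch below shows what such a push could look like in Python; the endpoint URL and payload fields are hypothetical, since the actual API is not specified here.

```python
import requests

INGEST_URL = "https://example.edu/vwp/api/model-runs"   # hypothetical ingest endpoint

# Hypothetical registration payload describing one dust-model output file.
payload = {
    "model": "dust-nested",
    "run_id": "2010-07-04T00Z",
    "variables": ["PM10_column_load"],
    "crs": "EPSG:4326",
    "data_url": "https://example.edu/outputs/dust_forecast_20100704.nc",
}

response = requests.post(INGEST_URL, json=payload, timeout=60)
response.raise_for_status()
print("registered as:", response.json().get("uri"))
```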

  6. Designing Public Library Websites for Teens: A Conceptual Model

    ERIC Educational Resources Information Center

    Naughton, Robin Amanda

    2012-01-01

    The main goal of this research study was to develop a conceptual model for the design of public library websites for teens (TLWs) that would enable designers and librarians to create library websites that better suit teens' information needs and practices. It bridges a gap in the research literature between user interface design in…

  7. A Multiperspectival Conceptual Model of Transformative Meaning Making

    ERIC Educational Resources Information Center

    Freed, Maxine

    2009-01-01

    Meaning making is central to transformative learning, but little work has explored how meaning is constructed in the process. Moreover, no meaning-making theory adequately captures its characteristics and operations during radical transformation. The purpose of this dissertation was to formulate and specify a multiperspectival conceptual model of…

  8. A Conceptual Model of the World of Work.

    ERIC Educational Resources Information Center

    VanRooy, William H.

    The conceptual model described in this paper resulted from the need to organize a body of knowledge related to the world of work which would enable curriculum developers to prepare accurate, realistic instructional materials. The world of work is described by applying Malinowski's scientific study of the structural components of culture. It is…

  9. Conceptualizations of Creativity: Comparing Theories and Models of Giftedness

    ERIC Educational Resources Information Center

    Miller, Angie L.

    2012-01-01

    This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…

  10. Sources of Sex Discrimination in Educational Systems: A Conceptual Model

    ERIC Educational Resources Information Center

    Kutner, Nancy G.; Brogan, Donna

    1976-01-01

    A conceptual model is presented relating numerous variables contributing to sexism in American education. Discrimination is viewed as intervening between two sets of interrelated independent variables and the dependent variable of sex inequalities in educational attainment. Sex-role orientation changes are the key to significant change in the…

  11. A Conceptual Model for Effective Distance Learning in Higher Education

    ERIC Educational Resources Information Center

    Farajollahi, Mehran; Zare, Hosein; Hormozi, Mahmood; Sarmadi, Mohammad Reza; Zarifsanaee, Nahid

    2010-01-01

    The present research aims at presenting a conceptual model for effective distance learning in higher education. Findings of this research show that an understanding of the technological capabilities and learning theories, especially constructive theory, independent learning theory, and communicative and interaction theory, in distance learning is…

  12. A new conceptual model of convection

    SciTech Connect

    Walcek, C.

    1995-09-01

    Classical cumulus parameterizations assume that cumulus clouds are entraining plumes of hot air rising through the atmosphere. However, ample evidence shows that clouds cannot be simulated using this approach. Dr. Walcek suggests that cumulus clouds can be reasonably simulated by assuming that buoyant plumes detrain mass as they rise through the atmosphere. Walcek successfully simulates measurements of tropical convection using this detraining model of cumulus convection. Comparisons with measurements suggest that buoyant plumes encounter resistance to upward movement as they pass through dry layers in the atmosphere. This probably results from turbulent mixing and evaporation of cloud water, which generates negatively buoyant mixtures that detrain from the upward-moving plume. This mass flux model of detraining plumes is considerably simpler than existing mass flux models, yet reproduces many of the measured effects associated with convective activity.
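
    The core assumption of the detraining-plume concept is that the upward mass flux decreases as mass detrains, i.e. dM/dz = -delta(z)*M. The sketch below integrates that relation upward from cloud base with an invented detrainment profile that is larger across an assumed dry layer; it is an illustration of the concept, not Walcek's parameterization.

```python
import numpy as np

def detraining_mass_flux(z_levels, delta):
    """Integrate dM/dz = -delta(z) * M upward from cloud base (M = 1 at base).
    delta is a fractional detrainment rate per metre at each level."""
    mass_flux = np.empty(len(z_levels))
    mass_flux[0] = 1.0                                  # normalised cloud-base flux
    for k in range(1, len(z_levels)):
        dz = z_levels[k] - z_levels[k - 1]
        mass_flux[k] = mass_flux[k - 1] * np.exp(-delta[k - 1] * dz)
    return mass_flux

z = np.arange(1000.0, 11000.0, 1000.0)                  # heights above cloud base (m)
# Larger detrainment where the environment is assumed dry (illustrative values).
delta = np.where((z >= 4000) & (z <= 6000), 5e-4, 1e-4)
print(detraining_mass_flux(z, delta).round(3))
```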

  13. Designing Information Interoperability

    SciTech Connect

    Gorman, Bryan L.; Shankar, Mallikarjun; Resseguie, David R.

    2009-01-01

    Examples of incompatible systems are offered with a discussion of the relationship between incompatibility and innovation. Engineering practices and the role of standards are reviewed as a means of resolving issues of incompatibility, with particular attention to the issue of innovation. Loosely-coupled systems are described as a means of achieving and sustaining both interoperability and innovation in heterogeneous environments. A virtual unifying layer, in terms of a standard, a best practice, and a methodology, is proposed as a modality for designing information interoperability for enterprise applications. The Uniform Resource Identifier (URI), microformats, and Joshua Porter's AOF Method are described and presented as solutions for designing interoperable information sharing web sites. The Special Operations Force Information Access (SOFIA), a mock design, is presented as an example of information interoperability.

  14. WC WAVE - Integrating Diverse Hydrological-Modeling Data and Services Into an Interoperable Geospatial Infrastructure

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Baros, S.; Barrett, H.; Savickas, J.; Erickson, J.

    2015-12-01

    WC WAVE (Western Consortium for Watershed Analysis, Visualization and Exploration) is a collaborative research project between the states of Idaho, Nevada, and New Mexico that is funded under the National Science Foundation's Experimental Program to Stimulate Competitive Research (EPSCoR). The goal of the project is to understand and document the effects of climate change on interactions between precipitation, vegetation growth, soil moisture and other landscape properties. These interactions are modeled within a framework we refer to as a virtual watershed (VW), a computer infrastructure that simulates watershed dynamics by linking scientific modeling, visualization, and data management components into a coherent whole. Developed and hosted at the Earth Data Analysis Center, University of New Mexico, the virtual watershed has a number of core functions which include: a) streamlined access to data required for model initialization and boundary conditions; b) the development of analytic scenarios through interactive visualization of available data and the storage of model configuration options; c) coupling of hydrological models through the rapid assimilation of model outputs into the data management system for access and use by sequent models. The WC-WAVE virtual watershed accomplishes these functions by provision of large-scale vector and raster data discovery, subsetting, and delivery via Open Geospatial Consortium (OGC) and REST web service standards. Central to the virtual watershed is the design and use of an innovative array of metadata elements that permits the stepwise coupling of diverse hydrological models (e.g. ISNOBAL, PRMS, CASiMiR) and input data to rapidly assess variation in outcomes under different climatic conditions. We present details on the architecture and functionality of the virtual watershed, results from three western U.S. watersheds, and discuss the realized benefits to watershed science of employing this integrated solution.

  15. Conceptualizing Evolving Models of Educational Development

    ERIC Educational Resources Information Center

    Fraser, Kym; Gosling, David; Sorcinelli, Mary Deane

    2010-01-01

    Educational development, which the authors use to refer to the field of professional and strategic development associated with university and college learning and teaching, can be described in many ways by referring to its different aspects. In this article the authors endeavor to categorize many of the models that have been used to describe…

  16. CONCEPTUAL DEVELOPMENT OF A TOXIC SCREENING MODEL

    EPA Science Inventory

    This report presents the application of the Routing and Graphical Display system developed by EPA to show how computer based modeling and simulation using the Reach File can be used to assess the types and concentrations of contaminants that could be found at any point in a river...

  17. A conceptual model of morphogenesis and regeneration

    PubMed Central

    Tosenberger, A.; Bessonov, N.; Levin, M.; Reinberg, N.; Volpert, V.; Morozova, N.

    2016-01-01

    This paper is devoted to computer modelling of the development and regeneration of multicellular biological structures. Some species (e.g., planaria and salamanders) are able to regenerate parts of their body after amputation damage, but the global rules governing cooperative cell behaviour during morphogenesis are not known. Here, we consider a simplified model organism, which consists of tissues formed around special cells that can be interpreted as stem cells. We assume that stem cells communicate with each other by a set of signals, and that the values of these signals depend on the distance between cells. Thus the signal distribution characterizes the locations of the stem cells. If the signal distribution is changed, then the difference between the initial and the current signal distribution affects the behaviour of stem cells; for example, after amputation of a part of the tissue, the signal distribution changes, which stimulates stem cells to migrate to new locations appropriate for regeneration of the proper pattern. Moreover, as stem cells divide and form tissues around them, they control the form and the size of regenerating tissues. This two-level organization of the model organism, with global regulation of stem cells and local regulation of tissues, allows its reproducible development and regeneration. PMID:25822060
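
    The model's key ingredient is a signal distribution in which the value each stem cell perceives depends on its distance to the other stem cells, and a mismatch with a stored target distribution drives regeneration. The sketch below computes such a distribution with an assumed exponential-decay kernel; the kernel form, decay length and cell positions are illustrative, not the authors' exact rules.

```python
import numpy as np

def signal_distribution(cell_positions, decay_length=1.0):
    """Signal perceived by each stem cell: sum of exponentially decaying
    contributions from all other stem cells (kernel choice is an assumption)."""
    cells = np.asarray(cell_positions, float)
    diff = cells[:, None, :] - cells[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    kernel = np.exp(-dist / decay_length)
    np.fill_diagonal(kernel, 0.0)                 # a cell does not signal itself
    return kernel.sum(axis=1)

target = signal_distribution([[0, 0], [2, 0], [1, 2]])    # intact organism
after_cut = signal_distribution([[0, 0], [2, 0]])         # one region amputated
# The mismatch between target and current signals is what drives migration and
# regeneration in the conceptual model (here simply printed for comparison).
print(target.round(3), after_cut.round(3))
```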

  18. Solving system integration and interoperability problems using a model reference systems engineering framework

    NASA Astrophysics Data System (ADS)

    Makhlouf, Mahmoud A.

    2001-09-01

    This paper presents a model-reference systems engineering framework, which is applied to a number of ESC projects. This framework provides an architecture-driven systems engineering process supported by a tool kit. This kit is built incrementally using an integrated set of commercial and government-developed tools. These tools include project management, systems engineering, military worth-analysis and enterprise collaboration tools. Products developed using these tools enable the specification and visualization of an executable model of the integrated system architecture as it evolves from a low-fidelity concept into a high-fidelity system model. This enables end users of system products, system designers, and decision-makers to perform what-if analyses on system design alternatives before making costly final system acquisition decisions.

  19. Cooperative modelling and design on the computing grid: data, flux and knowledge interoperability.

    PubMed

    Laganà, Antonio; Rossi, Elda; Evangelisti, Stefano

    2013-10-01

    The fast interconnections of the presently available distributed platforms allow scientists to target highly complex problems by chaining software developed and maintained by experts in the relevant fields. A pillar of such cooperative endeavor in molecular and materials science and technologies is the so-called grid empowered molecular simulator, which combines the expertise of molecular science theorists (electronic structure and nuclei dynamics) and experimentalists in order to build and validate ab initio models. This line has prompted an unprecedented level of data format standardization procedures, the bridging of high throughput and high performance platforms, and the assemblage of ad hoc designed virtual experiments. In addition, this approach has prompted the design and development of tools for evaluating the quality of the cooperative effort produced by the members of a given research community, as well as for rewarding such effort through a credit economy. PMID:23620227

  20. A Conceptual Model of the Information Requirements of Nursing Organizations

    PubMed Central

    Miller, Emmy

    1989-01-01

    Three related issues play a role in the identification of the information requirements of nursing organizations. These issues are the current state of computer systems in health care organizations, the lack of a well-defined data set for nursing, and the absence of models representing data and information relevant to clinical and administrative nursing practice. This paper will examine current methods of data collection, processing, and storage in clinical and administrative nursing practice for the purpose of identifying the information requirements of nursing organizations. To satisfy these information requirements, database technology can be used; however, a model for database design is needed that reflects the conceptual framework of nursing and the professional concerns of nurses. A conceptual model of the types of data necessary to produce the desired information will be presented and the relationships among data will be delineated.

  1. Conceptual Commitments of the LIDA Model of Cognition

    NASA Astrophysics Data System (ADS)

    Franklin, Stan; Strain, Steve; McCall, Ryan; Baars, Bernard

    2013-06-01

    Significant debate on fundamental issues remains in the subfields of cognitive science, including perception, memory, attention, action selection, learning, and others. Psychology, neuroscience, and artificial intelligence each contribute alternative and sometimes conflicting perspectives on the supervening problem of artificial general intelligence (AGI). Current efforts toward a broad-based, systems-level model of minds cannot await theoretical convergence in each of the relevant subfields. Such work therefore requires the formulation of tentative hypotheses, based on current knowledge, that serve to connect cognitive functions into a theoretical framework for the study of the mind. We term such hypotheses "conceptual commitments" and describe the hypotheses underlying one such model, the Learning Intelligent Distribution Agent (LIDA) Model. Our intention is to initiate a discussion among AGI researchers about which conceptual commitments are essential, or particularly useful, toward creating AGI agents.

  2. Factors Associated With Adoption of Health Information Technology: A Conceptual Model Based on a Systematic Review

    PubMed Central

    DeShazo, Jonathan; Kim, Forest; Fulton, Lawrence

    2014-01-01

    in articles, editorials, books, as well as quantitative and qualitative studies (n=83). As of 2009, only 16.30% (815/4999) of nonfederal, acute-care hospitals had adopted a fully interoperable EHR. From the 83 articles reviewed in this study, 16/83 (19%) identified internal organizational factors and 9/83 (11%) identified external environmental factors associated with adoption of the EHR, EMR, or CPOE. The conceptual model for EHR adoption associates each variable with the work that identified it. Conclusions Commonalities exist in the literature for internal organizational and external environmental factors associated with the adoption of the EHR and/or CPOE. The conceptual model for EHR adoption associates internal and external factors, specific to the health care industry, associated with adoption of the EHR. It becomes apparent that these factors have some level of association, but the association is not consistently calculated individually or in combination. To better understand effective adoption strategies, empirical studies should be performed from this conceptual model to quantify the positive or negative effect of each factor. PMID:25599673

  3. Conceptual Model for Selenium Cycling in the Great Salt Lake

    NASA Astrophysics Data System (ADS)

    Johnson, W. P.; Conover, M. R.; Wurtsbaugh, W. A.; Adams, J.

    2006-12-01

    The conceptual model for selenium cycling in the Great Salt Lake was developed to guide investigations in support of determining an open-water selenium standard for the Great Salt Lake. The motivation to determine this particular selenium standard derives from public concern over a plan to allow disposal of reverse osmosis (RO) concentrate in the GSL, which would contain elevated concentrations of major and trace elements, including selenium. The development of an open-water standard for selenium requires a working knowledge of the biological significance of existing selenium concentrations in the Great Salt Lake, as well as a working understanding of the likely changes of these concentrations over time given existing and proposed loads to the system. This "working knowledge" is being represented in a conceptual model that accounts for selenium in various "stocks" in the system (e.g., water, sediment, biota) and the "flow" of selenium between stocks (e.g., precipitation and settling, volatilization, bioconcentration). It illustrates the critical pathway of selenium in the Great Salt Lake from water, to microorganisms, to brine shrimp and brine flies, to birds, and to their eggs. It also addresses the complexity of the GSL system: a) it is spatially diverse, comprising four distinct bays and two layers, with major differences in salinity among their waters; b) it is temporally dynamic, due to seasonal and inter-annual variations in runoff. The conceptual model is presently descriptive, but will serve as the basis for a semi-quantitative model that will be fed by data accumulated during subsequent investigations.
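
    As a rough illustration of the "stocks" and "flows" structure described above, the sketch below steps a three-box selenium budget forward in time with first-order transfer coefficients; all rate constants and loads are invented for illustration and are not measured Great Salt Lake values:

```python
# Minimal stock-and-flow sketch (illustrative rate constants, not measured GSL values):
# selenium mass in three stocks, with first-order flows for settling, volatilization and
# bioconcentration plus an external load, stepped forward with explicit Euler.
stocks = {"water": 100.0, "sediment": 50.0, "biota": 5.0}    # e.g. kg Se, illustrative
rates = {"settling": 0.02, "volatilization": 0.005, "uptake": 0.01, "loss_biota": 0.05}
load = 1.0                                                    # external Se load to water per step
dt = 1.0

for step in range(365):
    settle = rates["settling"] * stocks["water"]
    volat = rates["volatilization"] * stocks["water"]
    uptake = rates["uptake"] * stocks["water"]
    back = rates["loss_biota"] * stocks["biota"]              # biota mortality returns Se to sediment
    stocks["water"] += (load - settle - volat - uptake) * dt
    stocks["sediment"] += (settle + back) * dt
    stocks["biota"] += (uptake - back) * dt

print(stocks)
```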

  4. Conceptual classification model for Sustainable Flood Retention Basins.

    PubMed

    Scholz, Miklas; Sadowski, Adam J

    2009-01-01

    The aim of this paper is to recommend a rapid conceptual classification model for Sustainable Flood Retention Basins (SFRB) used to control runoff in a temperate climate. An SFRB is an aesthetically pleasing retention basin predominantly used for flood protection adhering to sustainable drainage and best management practices. The classification model was developed on the basis of a database of 141 SFRB using the River Rhine catchment in Baden (part of Baden-Württemberg, Germany) as a case study. It is based on an agglomerative cluster analysis and is intended to be used by engineers and scientists to adequately classify the following different types of SFRB: Hydraulic Flood Retention Basin, Traditional Flood Retention Basin, Sustainable Flood Retention Wetland, Aesthetic Flood Retention Wetland, Integrated Flood Retention Wetland and Natural Flood Retention Wetland. The selection of classification variables was supported by a principal component analysis. The identification of SFRB in the data set was based on a Ward cluster analysis of 34 weighted classification variables. Scoring tables were defined to enable the assignment of the six SFRB definitions to retention basins in the data set. The efficiency of these tables was based on a scoring system which gave the conceptual model for the example case study sites an overall efficiency of approximately 60% (as opposed to 17% by chance). This conceptual classification model should be utilized to improve communication by providing definitions for SFRB types. The classification definitions are likely to be applicable for other regions with both temperate oceanic and temperate continental climates. PMID:18280029
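
    A sketch of the general workflow described above (weighted classification variables, an agglomerative Ward cluster analysis, and a cut into six classes) is shown below on synthetic data; the weights, the data and the cluster interpretation are placeholders, not the study's actual variables or scoring tables:

```python
# Sketch of the general workflow only (synthetic data, illustrative weights), not the
# authors' implementation: standardize and weight the classification variables, run an
# agglomerative cluster analysis with Ward's criterion, and cut the tree into six types.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.random((141, 34))            # 141 basins x 34 classification variables (synthetic)
weights = np.ones(34)                # the study weights its variables; all-ones here
Xw = (X - X.mean(axis=0)) / X.std(axis=0) * weights

Z = linkage(Xw, method="ward")       # agglomerative clustering, Ward linkage
labels = fcluster(Z, t=6, criterion="maxclust")   # six SFRB types
print(np.bincount(labels)[1:])       # number of basins per cluster
```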

  5. eLac - Conceptual Model for Flood Management

    NASA Astrophysics Data System (ADS)

    Rata, Marius; Florentin Draghia, Aurelian; Drobot, Radu; Matreata, Marius; Corbus, Ciprian

    2015-04-01

    This article reviews the conceptual model of the decision support system (DSS) for flood management activities introduced within the scope of the e-LAC project. Following the general system architecture, which places an emphasis on the water management decision processes, hydrologic and hydraulic models are introduced and discussed according to their specific DSS integration potential. Three directions are discussed in dedicated sections corresponding to the main modules defined in the conceptual model: the Water Basin Management Module (which mainly implements the management decision flow, but also manages data exchange between the hydrologic and hydraulic modeling modules and allows real-time visualization of hydrological data), the Hydrologic Modeling Module (which manages all the modeling functionalities of rainfall-runoff processes, providing continuous hydrologic forecasts with a variable time step depending on the actual basin situation) and the Hydraulic Modeling Module (which computes flood wave routing, taking as upstream boundary conditions the discharge hydrographs generated by the catchment's upper area, river tributaries and inter-basins, and as downstream conditions the rating curves, water level hydrographs or water surface slope). The GIS concepts are contextually reviewed based on their use as a geospatial database for water management modeling, for integration with hydrologic time series, for hydraulic modeling (from both software and management perspectives), and for expert knowledge or mathematical modeling results (knowledge database, rules).

  6. Toward technical interoperability in telemedicine.

    PubMed

    Craft, Richard L

    2005-06-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how "technical interoperability" compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal. PMID:16035933

  7. Towards technical interoperability in telemedicine.

    SciTech Connect

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  8. Scientific and conceptual flaws of coercive treatment models in addiction.

    PubMed

    Uusitalo, Susanne; van der Eijk, Yvette

    2016-01-01

    In conceptual debates on addiction, neurobiological research has been used to support the idea that addicted drug users lack control over their addiction-related actions. In some interpretations, this has led to coercive treatment models, in which the purpose is to 'restore' control. However, neurobiological studies that go beyond what is typically presented in conceptual debates paint a different story. In particular, they indicate that though addiction has neurobiological manifestations that make the addictive behaviour difficult to control, it is possible for individuals to reverse these manifestations through their own efforts. Thus, addicted individuals should not be considered incapable of making choices voluntarily simply on the basis that addiction has neurobiological manifestations, and coercive treatment models of addiction should be reconsidered in this respect. PMID:26463621

  9. CONCEPTUAL MODELS AND METHODS TO GUIDE DIAGNOSTIC RESEARCH INTO CAUSES OF IMPAIRMENT TO AQUATIC ECOSYSTEMS

    EPA Science Inventory

    Methods and conceptual models to guide the development of tools for diagnosing the causes of biological impairment within aquatic ecosystems of the United States are described in this report. The conceptual models developed here address nutrients, suspended and bedded sediments (...

  10. Conceptual Models in Health Informatics Research: A Literature Review and Suggestions for Development

    PubMed Central

    2016-01-01

    Background Contributing to health informatics research means using conceptual models that are integrative and explain the research in terms of the two broad domains of health science and information science. However, it can be hard for novice health informatics researchers to find exemplars and guidelines in working with integrative conceptual models. Objectives The aim of this paper is to support the use of integrative conceptual models in research on information and communication technologies in the health sector, and to encourage discussion of these conceptual models in scholarly forums. Methods A two-part method was used to summarize and structure ideas about how to work effectively with conceptual models in health informatics research that included (1) a selective review and summary of the literature of conceptual models; and (2) the construction of a step-by-step approach to developing a conceptual model. Results The seven-step methodology for developing conceptual models in health informatics research explained in this paper involves (1) acknowledging the limitations of health science and information science conceptual models; (2) giving a rationale for one’s choice of integrative conceptual model; (3) explicating a conceptual model verbally and graphically; (4) seeking feedback about the conceptual model from stakeholders in both the health science and information science domains; (5) aligning a conceptual model with an appropriate research plan; (6) adapting a conceptual model in response to new knowledge over time; and (7) disseminating conceptual models in scholarly and scientific forums. Conclusions Making explicit the conceptual model that underpins a health informatics research project can contribute to increasing the number of well-formed and strongly grounded health informatics research projects. This explication has distinct benefits for researchers in training, research teams, and researchers and practitioners in information, health, and other

  11. Conceptual Model of Quantities, Units, Dimensions, and Values

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; DeKoenig, Hans-Peter; Burkhart, Roger; Espinoza, Huascar

    2011-01-01

    JPL collaborated with experts from industry and other organizations to develop a conceptual model of quantities, units, dimensions, and values based on the current work of the ISO 80000 committee revising the International System of Units & Quantities based on the International Vocabulary of Metrology (VIM). By providing support for ISO 80000 in SysML via the International Vocabulary of Metrology (VIM), this conceptual model provides, for the first time, a standards-based approach for addressing issues of unit coherence and dimensional analysis in the practice of systems engineering with SysML-based tools. This conceptual model provides support for two kinds of analyses specified in the International Vocabulary of Metrology (VIM): coherence of units as well as of systems of units, and dimension analysis of systems of quantities. To provide a solid and stable foundation, the model for defining quantities, units, dimensions, and values in SysML is explicitly based on the concepts defined in VIM. At the same time, the model library is designed in such a way that extensions to the ISQ (International System of Quantities) and SI Units (Système International d'Unités) can be represented, as well as any alternative systems of quantities and units. The model library can be used to support SysML user models in various ways. A simple approach is to define and document libraries of reusable systems of units and quantities for reuse across multiple projects, and to link units and quantity kinds from these libraries to Unit and QuantityKind stereotypes defined in SysML user models.
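
    A toy illustration of the kind of dimensional analysis such a model library enables is sketched below: each quantity kind is represented as a vector of exponents over the ISQ base dimensions, and an equation is checked for coherence by adding or subtracting exponents. This is only a conceptual sketch in plain Python, not the SysML/VIM model library itself:

```python
# Toy illustration (not the SysML/QUDV library): represent each quantity kind as a vector
# of exponents over the ISQ base dimensions and check that an equation is dimensionally
# coherent by combining exponents for products and quotients.
BASE = ("L", "M", "T", "I", "Theta", "N", "J")   # length, mass, time, current, temperature, ...

def dim(**exps):
    return tuple(exps.get(b, 0) for b in BASE)

LENGTH = dim(L=1)
TIME = dim(T=1)
VELOCITY = dim(L=1, T=-1)
FORCE = dim(L=1, M=1, T=-2)

def multiply(a, b):
    return tuple(x + y for x, y in zip(a, b))

def divide(a, b):
    return tuple(x - y for x, y in zip(a, b))

# v = s / t: the dimension of length/time must equal the dimension of velocity.
assert divide(LENGTH, TIME) == VELOCITY
# F = m * a: mass times (length / time^2) must equal the dimension of force.
ACCELERATION = divide(VELOCITY, TIME)
assert multiply(dim(M=1), ACCELERATION) == FORCE
print("equations are dimensionally coherent")
```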

  12. Life cycle cost modeling of conceptual space vehicles

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles

    1993-01-01

    This paper documents progress to date by the University of Dayton on the development of a life cycle cost model for use during the conceptual design of new launch vehicles and spacecraft. This research is being conducted under NASA Research Grant NAG-1-1327. This research effort changes the focus from that of the first two years in which a reliability and maintainability model was developed to the initial development of a life cycle cost model. Cost categories are initially patterned after NASA's three axis work breakdown structure consisting of a configuration axis (vehicle), a function axis, and a cost axis. The focus will be on operations and maintenance costs and other recurring costs. Secondary tasks performed concurrent with the development of the life cycle costing model include continual support and upgrade of the R&M model. The primary result of the completed research will be a methodology and a computer implementation of the methodology to provide for timely cost analysis in support of the conceptual design activities. The major objectives of this research are: to obtain and to develop improved methods for estimating manpower, spares, software and hardware costs, facilities costs, and other cost categories as identified by NASA personnel; to construct a life cycle cost model of a space transportation system for budget exercises and performance-cost trade-off analysis during the conceptual and development stages; to continue to support modifications and enhancements to the R&M model; and to continue to assist in the development of a simulation model to provide an integrated view of the operations and support of the proposed system.

  13. Lemnos Interoperable Security Program

    SciTech Connect

    Stewart, John; Halbgewachs, Ron; Chavez, Adrian; Smith, Rhett; Teumim, David

    2012-01-31

    The manner in which control systems are being designed and operated in the energy sector is undergoing some of the most significant changes in history due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPSec, a widely used Internet protocol for defining Virtual Private Networks, or 'tunnels', to communicate securely through untrusted public and private networks. The IPSec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. These single vendor solutions may inadvertently lock
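
    The interoperability barrier described above can be pictured with a toy sketch: each device advertises the parameter combinations it supports, and a tunnel can only be negotiated if an identical combination exists on both sides. The parameter names and option sets below are illustrative only, not a real IKE/IPSec implementation or any vendor's configuration:

```python
# Toy sketch only (illustrative parameter names, not a real IKE/IPSec implementation):
# each vendor device advertises the proposals it supports; a tunnel can only be
# established if an identical combination exists on both sides.
from itertools import product

vendor_a = {
    "encryption": {"aes128", "aes256"},
    "integrity": {"sha1", "sha256"},
    "dh_group": {"14", "19"},
}
vendor_b = {
    "encryption": {"aes256", "3des"},
    "integrity": {"sha256"},
    "dh_group": {"14"},
}

common = {k: vendor_a[k] & vendor_b[k] for k in vendor_a}
if all(common.values()):
    proposals = list(product(*common.values()))
    print("interoperable proposals:", proposals)
else:
    print("no common proposal; manual re-configuration needed:", common)
```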

  14. Integrating O/S models during conceptual design, part 1

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1994-01-01

    The University of Dayton is pleased to submit this report to the National Aeronautics and Space Administration (NASA), Langley Research Center, which integrates a set of models for determining operational capabilities and support requirements during the conceptual design of proposed space systems. This research provides for the integration of the reliability and maintainability (R&M) model, both new and existing simulation models, and existing operations and support (O&S) costing equations into a complete analysis methodology. Details concerning the R&M model and the O&S costing model may be found in previous reports accomplished under this grant (NASA Research Grant NAG1-1327). In the process of developing this comprehensive analysis approach, significant enhancements were made to the R&M model, updates to the O&S costing model were accomplished, and a new simulation model was developed. This is the first part of a three-part technical report.

  15. Misrepresentation and amendment of soil moisture in conceptual hydrological modelling

    NASA Astrophysics Data System (ADS)

    Zhuo, Lu; Han, Dawei

    2016-04-01

    Although many conceptual models are very effective in simulating river runoff, their soil moisture schemes are generally not realistic (i.e., they get the right answers for the wrong reasons). This study reveals two significant misrepresentations in those models through a case study using the Xinanjiang model, which is representative of many well-known conceptual hydrological models. The first is the setting of the upper limit of its soil moisture at the field capacity, due to the 'holding excess runoff' concept (i.e., runoff begins on repletion of its storage to the field capacity). The second is the neglect of capillary rise in soil water movement. A new scheme is therefore proposed to overcome those two issues. The amended model is as effective as its original form in flow modelling, but represents more logically realistic soil water processes. The purpose of the study is to enable the hydrological model to get the right answers for the right reasons. Therefore, the new model structure has a better capability in potentially assimilating soil moisture observations to enhance its real-time flood forecasting accuracy. The new scheme is evaluated in the Pontiac catchment of the USA through a comparison with satellite-observed soil moisture. The correlation between the XAJ-simulated and the observed soil moisture is enhanced significantly from 0.64 to 0.70. In addition, a new soil moisture term called SMDS (Soil Moisture Deficit to Saturation) is proposed to complement the conventional SMD (Soil Moisture Deficit).
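
    The two schemes contrasted above can be illustrated with a toy bucket model: the original-style scheme caps storage at field capacity and ignores capillary rise, while the amended-style scheme allows storage up to saturation and adds a capillary-rise flux, which is what makes the SMDS term meaningful. All parameter values below are illustrative, and the code is not the Xinanjiang implementation:

```python
# Toy bucket sketch (illustrative parameters, not the Xinanjiang code): contrast capping
# soil storage at field capacity with an amended scheme that lets storage rise towards
# saturation and adds a capillary-rise flux from below.
FIELD_CAPACITY, SATURATION = 200.0, 300.0    # mm, illustrative

def step(storage, rain, evap, cap_rise=0.0, upper_limit=FIELD_CAPACITY):
    storage = storage + rain - evap + cap_rise
    runoff = max(0.0, storage - upper_limit)  # "holding excess" runoff above the limit
    storage = min(storage, upper_limit)
    smd = max(0.0, FIELD_CAPACITY - storage)  # conventional Soil Moisture Deficit
    smds = max(0.0, SATURATION - storage)     # Soil Moisture Deficit to Saturation
    return storage, runoff, smd, smds

# Original-style scheme: no capillary rise, storage capped at field capacity.
print(step(180.0, rain=40.0, evap=3.0))
# Amended-style scheme: capillary rise included, storage allowed up to saturation.
print(step(180.0, rain=40.0, evap=3.0, cap_rise=2.0, upper_limit=SATURATION))
```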

  16. Design Oriented Structural Modeling for Airplane Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Livne, Eli

    1999-01-01

    The main goal for research conducted with the support of this grant was to develop design-oriented structural optimization methods for the conceptual design of airplanes. Traditionally in conceptual design, airframe weight is estimated based on statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology on the airplanes in those weight databases. If any new structural technology is to be pursued or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases any structural weight estimation must be based on rigorous "physics based" structural analysis and optimization of the airframes under consideration. Work under this grant progressed to explore airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design-oriented finite element technology and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle / fuselage pieces. Since responses to changes in geometry are essential in conceptual design of airplanes, as is the capability to optimize the shape itself, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and sensitivity analysis with respect to shape design variables. Towards the end of the grant period a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code ACS= was delivered to NASA Ames.

  17. Operations and support cost modeling of conceptual space vehicles

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles

    1994-01-01

    The University of Dayton is pleased to submit this annual report to the National Aeronautics and Space Administration (NASA) Langley Research Center, which documents the development of an operations and support (O&S) cost model as part of a larger life cycle cost (LCC) structure. It is intended for use during the conceptual design of new launch vehicles and spacecraft. This research is being conducted under NASA Research Grant NAG-1-1327. This research effort changes the focus from that of the first two years, in which a reliability and maintainability model was developed, to the initial development of an operations and support life cycle cost model. Cost categories were initially patterned after NASA's three-axis work breakdown structure consisting of a configuration axis (vehicle), a function axis, and a cost axis. A revised cost element structure (CES), which is currently under study by NASA, was used to establish the basic cost elements used in the model. While the focus of the effort was on operations and maintenance costs and other recurring costs, the computerized model allowed for other cost categories such as RDT&E and production costs to be addressed. Secondary tasks performed concurrent with the development of the costing model included support and upgrades to the reliability and maintainability (R&M) model. The primary result of the current research has been a methodology and a computer implementation of the methodology to provide for timely operations and support cost analysis during the conceptual design activities.

  18. Conceptual Modeling in the Time of the Revolution: Part II

    NASA Astrophysics Data System (ADS)

    Mylopoulos, John

    Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role both in Computer Science research and practice in diverse areas, such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards in modeling languages (e.g., UML), and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on-going research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.

  19. Conceptual Model of the Klamath Falls, Oregon Geothermal Area

    SciTech Connect

    Prucha, R.H.; Benson, S.M.; Witherspoon, P.A.

    1987-01-20

    Over the last 50 years significant amounts of data have been obtained from the Klamath Falls geothermal resource. To date, the complexity of the system has stymied researchers, leading to the development of only very generalized hydrogeologic and geothermal models of the area. Recently, the large quantity of available temperature data have been re-evaluated, revealing new information on subsurface heat flow and locations of faults in the system. These inferences are supported by borehole, geochemical, geophysical, and hydrologic data. Based on re-evaluation of all available data, a detailed conceptual model for the Klamath Falls geothermal resource is proposed. 1 tab., 8 figs., 21 refs.

  20. A Conceptual Model of the Pasadena Housing System

    NASA Technical Reports Server (NTRS)

    Hirshberg, Alan S.; Barber, Thomas A.

    1971-01-01

    During the last 5 years, there have been several attempts at applying systems analysis to complex urban problems. This paper describes one such attempt by a multidisciplinary team of students, engineers, professors, and community representatives. The Project organization is discussed and the interaction of the different disciplines (the process) described. The two fundamental analysis questions posed by the Project were: "Why do houses deteriorate?" and "Why do people move?" The analysis of these questions led to the development of a conceptual system model of housing in Pasadena. The major elements of this model are described, and several conclusions drawn from it are presented.

  1. Conceptual model of the Klamath Falls, Oregon geothermal area

    SciTech Connect

    Prucha, R.H.; Benson, S.M.; Witherspoon, P.A.

    1987-01-01

    Over the last 50 years significant amounts of data have been obtained from the Klamath Falls geothermal resource. To date, the complexity of the system has stymied researchers, leading to the development of only very generalized hydrogeologic and geothermal models of the area. Recently, the large quantity of available temperature data have been re-evaluated, revealing new information on subsurface heat flow and locations of faults in the system. These inferences are supported by borehole, geochemical, geophysical, and hydrologic data. Based on re-evaluation of all available data, a detailed conceptual model for the Klamath Falls geothermal resource is proposed.

  2. Clinical data interoperability based on archetype transformation.

    PubMed

    Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2011-10-01

    The semantic interoperability between health information systems is a major challenge to improve the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies in order to satisfy the needs of a particular scenario. Most of such solutions cannot be easily adapted to new scenarios, thus more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare records standards based on the dual model architecture and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of Model-driven Engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation. PMID:21645637

  3. Multimorbidity: conceptual basis, epidemiological models and measurement challenges.

    PubMed

    Fernández-Niño, Julián Alfredo; Bustos-Vázquez, Eduardo

    2016-01-01

    The growing number of patients with complex clinical profiles related to chronic diseases has contributed to the increasingly widespread use of the term 'multimorbidity'. A suitable measurement of this condition is essential to epidemiological studies, considering that it represents a challenge for the clinical management of patients as well as for health systems and epidemiological investigations. In this context, the present essay reviews the conceptual proposals behind the measurement of multimorbidity, including the epidemiological and methodological challenges it involves. We discuss classical definitions of comorbidity, how they differ from the concept of multimorbidity, and their roles in epidemiological studies. The various conceptual models that contribute to the operational definitions and strategies to measure this variable are also presented. The discussion enabled us to identify a significant gap between the modern conceptual development of multimorbidity and the operational definitions. This gap exists despite the theoretical developments that have occurred in the classical concept of comorbidity to arrive at the modern and multidimensional conception of multimorbidity. Measurement strategies, however, have not kept pace with this advance. Therefore, new methodological proposals need to be developed in order to obtain information regarding the actual impact on individuals' health and its implications for public health. PMID:27622480

  4. Toward interoperable bioscience data

    PubMed Central

    Sansone, Susanna-Assunta; Rocca-Serra, Philippe; Field, Dawn; Maguire, Eamonn; Taylor, Chris; Hofmann, Oliver; Fang, Hong; Neumann, Steffen; Tong, Weida; Amaral-Zettler, Linda; Begley, Kimberly; Booth, Tim; Bougueleret, Lydie; Burns, Gully; Chapman, Brad; Clark, Tim; Coleman, Lee-Ann; Copeland, Jay; Das, Sudeshna; de Daruvar, Antoine; de Matos, Paula; Dix, Ian; Edmunds, Scott; Evelo, Chris T; Forster, Mark J; Gaudet, Pascale; Gilbert, Jack; Goble, Carole; Griffin, Julian L; Jacob, Daniel; Kleinjans, Jos; Harland, Lee; Haug, Kenneth; Hermjakob, Henning; Ho Sui, Shannan J; Laederach, Alain; Liang, Shaoguang; Marshall, Stephen; McGrath, Annette; Merrill, Emily; Reilly, Dorothy; Roux, Magali; Shamu, Caroline E; Shang, Catherine A; Steinbeck, Christoph; Trefethen, Anne; Williams-Jones, Bryn; Wolstencroft, Katherine; Xenarios, Ioannis; Hide, Winston

    2012-01-01

    To make full use of research data, the bioscience community needs to adopt technologies and reward mechanisms that support interoperability and promote the growth of an open ‘data commoning’ culture. Here we describe the prerequisites for data commoning and present an established and growing ecosystem of solutions using the shared ‘Investigation-Study-Assay’ framework to support that vision. PMID:22281772

  5. Conceptual Change Texts in Chemistry Teaching: A Study on the Particle Model of Matter

    ERIC Educational Resources Information Center

    Beerenwinkel, Anne; Parchmann, Ilka; Grasel, Cornelia

    2011-01-01

    This study explores the effect of a conceptual change text on students' awareness of common misconceptions on the particle model of matter. The conceptual change text was designed based on principles of text comprehensibility, of conceptual change instruction and of instructional approaches how to introduce the particle model. It was evaluated in…

  6. Alternate conceptual model of ground water flow at Yucca Mountain

    SciTech Connect

    1993-12-31

    Attempts to predict the performance of a high-level nuclear waste repository in the United States have led to the development of alternative conceptual models of the ground water flow field in which the repository will be located. This step has come about because of the large uncertainties involved in predicting the movement of water and radionuclides through an unsaturated fractured rock. Further, one of the standards to which we are comparing performance is probabilistic, so we are forced to try to conceive of all credible scenarios by which ground water may intersect the repository horizon and perhaps transport radionuclides to a given compliance boundary. To simplify this task, the DOE set about identifying alternative conceptual models of ground water flow which are consistent with existing data. Modeling these concepts necessitates the use of simplifying assumptions. Among the modeling assumptions commonly utilized by analysts of the Yucca Mountain site are those of uniformly distributed, small volumes of recharge and matrix or porous media flow. Most scientists would agree that recharge at Yucca Mountain does not occur in this ideal and simplified fashion, yet modeling endeavors continue to commonly utilize this approach. In this paper, we examine the potential effects of focused recharge on the flow field at Yucca Mountain in concert with a fractured matrix and non-equilibrium view of ground water flow.

  7. Use of Numerical Groundwater Modeling to Evaluate Uncertainty in Conceptual Models of Recharge and Hydrostratigraphy

    SciTech Connect

    Pohlmann, Karl; Ye, Ming; Pohll, Greg; Chapman, Jenny

    2007-01-19

    Numerical groundwater models are based on conceptualizations of hydrogeologic systems that are by necessity developed from limited information and therefore are simplifications of real conditions. Each aspect (e.g. recharge, hydrostratigraphy, boundary conditions) of the groundwater model is often based on a single conceptual model that is considered to be the best representation given the available data. However, the very nature of their construction means that each conceptual model is inherently uncertain and the available information may be insufficient to refute plausible alternatives, thereby raising the possibility that the flow model is underestimating overall uncertainty. In this study we use the Death Valley Regional Flow System model developed by the U.S. Geological Survey as a framework to predict regional groundwater flow southward into Yucca Flat on the Nevada Test Site. An important aspect of our work is to evaluate the uncertainty associated with multiple conceptual models of groundwater recharge and subsurface hydrostratigraphy and quantify the impacts of this uncertainty on model predictions. In our study, conceptual model uncertainty arises from two sources: (1) alternative interpretations of the hydrostratigraphy in the northern portion of Yucca Flat where, owing to sparse data, the hydrogeologic system can be conceptualized in different ways, and (2) uncertainty in groundwater recharge in the region as evidenced by the existence of several independent approaches for estimating this aspect of the hydrologic system. The composite prediction of groundwater flow is derived from the regional model that formally incorporates the uncertainty in these alternative input models using the maximum likelihood Bayesian model averaging method. An assessment of the joint predictive uncertainty of the input conceptual models is also produced. During this process, predictions of the alternative models are weighted by model probability, which is the degree of
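
    The model-averaging step can be sketched in a few lines: an information criterion for each alternative model is converted into posterior model weights (assuming equal prior probabilities), and predictions are combined as a weighted mean with a between-model variance term. The numbers below are synthetic, and the sketch is not the maximum likelihood Bayesian model averaging code used in the study:

```python
# Sketch of the general model-averaging idea (synthetic numbers), not the MLBMA code used
# in the study: turn an information criterion (e.g. KIC/BIC) into posterior model weights,
# then combine the alternative models' predictions and quantify between-model spread.
import numpy as np

ic = np.array([210.3, 212.1, 215.7])           # information criterion for 3 alternative models
predictions = np.array([4.2, 3.6, 5.1])         # each model's prediction of the same quantity

delta = ic - ic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()                        # posterior model probabilities (equal priors)

mean = np.sum(weights * predictions)
between_model_var = np.sum(weights * (predictions - mean) ** 2)
print(weights, mean, between_model_var)
```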

  8. Our evolving conceptual model of the coastal eutrophication problem

    USGS Publications Warehouse

    Cloern, James E.

    2001-01-01

    A primary focus of coastal science during the past 3 decades has been the question: How does anthropogenic nutrient enrichment cause change in the structure or function of nearshore coastal ecosystems? This theme of environmental science is recent, so our conceptual model of the coastal eutrophication problem continues to change rapidly. In this review, I suggest that the early (Phase I) conceptual model was strongly influenced by limnologists, who began intense study of lake eutrophication by the 1960s. The Phase I model emphasized changing nutrient input as a signal, and responses to that signal as increased phytoplankton biomass and primary production, decomposition of phytoplankton-derived organic matter, and enhanced depletion of oxygen from bottom waters. Coastal research in recent decades has identified key differences in the responses of lakes and coastal-estuarine ecosystems to nutrient enrichment. The contemporary (Phase II) conceptual model reflects those differences and includes explicit recognition of (1) system-specific attributes that act as a filter to modulate the responses to enrichment (leading to large differences among estuarine-coastal systems in their sensitivity to nutrient enrichment); and (2) a complex suite of direct and indirect responses including linked changes in: water transparency, distribution of vascular plants and biomass of macroalgae, sediment biogeochemistry and nutrient cycling, nutrient ratios and their regulation of phytoplankton community composition, frequency of toxic/harmful algal blooms, habitat quality for metazoans, reproduction/growth/survival of pelagic and benthic invertebrates, and subtle changes such as shifts in the seasonality of ecosystem functions. Each aspect of the Phase II model is illustrated here with examples from coastal ecosystems around the world. In the last section of this review I present one vision of the next (Phase III) stage in the evolution of our conceptual model, organized around 5

  9. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    SciTech Connect

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.

    2004-03-01

    The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there is minimal site-specific data the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four

  10. Development and Validation of a Mass Casualty Conceptual Model

    PubMed Central

    Culley, Joan M.; Effken, Judith A.

    2012-01-01

    Purpose To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. Design The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Methods Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Findings Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Conclusions Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. Clinical Relevance This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions. PMID:20487188
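
    The three stopping criteria quoted above can be computed directly from the panel's ratings. The sketch below uses synthetic 7-point ratings and makes two interpretive assumptions that the abstract leaves open: distributional change between rounds is measured as total variation distance, and "agreement" is taken as the share of ratings of 6 or 7:

```python
# Sketch of the consensus criteria named in the abstract, on synthetic 7-point ratings:
# interquartile range of no more than 1, less than 15% change in the response
# distribution between rounds, and at least 70% agreement.
import numpy as np

round1 = np.array([6, 7, 6, 5, 7, 6, 6, 7, 5, 6, 7, 6, 6, 7, 6, 6, 6, 7])
round2 = np.array([6, 7, 6, 6, 7, 6, 6, 7, 6, 6, 7, 6, 6, 7, 6, 6, 6, 7])

iqr = np.percentile(round2, 75) - np.percentile(round2, 25)

def distribution(x):
    return np.bincount(x, minlength=8)[1:] / len(x)      # share of each rating 1..7

stability = 0.5 * np.abs(distribution(round1) - distribution(round2)).sum()
agreement = np.mean(round2 >= 6)                          # share rating the item 6 or 7

print(iqr <= 1, stability < 0.15, agreement >= 0.70)
```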

  11. Equivalent plate modeling for conceptual design of aircraft wing structures

    NASA Technical Reports Server (NTRS)

    Giles, Gary L.

    1995-01-01

    This paper describes an analysis method that generates conceptual-level design data for aircraft wing structures. A key requirement is that this data must be produced in a timely manner so that it can be used effectively by multidisciplinary synthesis codes for performing systems studies. Such a capability is being developed by enhancing an equivalent plate structural analysis computer code to provide a more comprehensive, robust and user-friendly analysis tool. The paper focuses on recent enhancements to the Equivalent Laminated Plate Solution (ELAPS) analysis code that significantly expand the modeling capability and improve the accuracy of results. Modeling additions include the use of out-of-plane plate segments for representing winglets and advanced wing concepts such as C-wings, along with a new capability for modeling the internal rib and spar structure. The accuracy of calculated results is improved by including transverse shear effects in the formulation and by using multiple sets of assumed displacement functions in the analysis. Typical results are presented to demonstrate these new features. Example configurations include a C-wing transport aircraft, a representative fighter wing and a blended-wing-body transport. These applications are intended to demonstrate and quantify the benefits of using equivalent plate modeling of wing structures during conceptual design.

  12. Conceptual Models as Hypotheses in Monitoring Urban Landscapes

    NASA Astrophysics Data System (ADS)

    Lookingbill, Todd R.; Gardner, Robert H.; Townsend, Philip A.; Carter, Shawn L.

    2007-08-01

    Many problems and challenges of ecosystem management currently are driven by the rapid pace and spatial extent of landscape change. Parks and reserves within areas of high human population density are especially challenged to meet the recreational needs of local populations and to preserve valued environmental resources. The complex problem of managing multiple objectives and multiple resources requires an enormous quantity of information, and conceptual models have been proposed as tools for organizing and interpreting this information. Academics generally prefer a bottom-up approach to model construction that emphasizes ecologic theory and process, whereas managers often use a top-down approach that takes advantage of existing information to address more pragmatic objectives. The authors propose a formal process for developing, applying, and testing conceptual models to be used in landscape monitoring that reconciles these seemingly opposing perspectives. The four-step process embraces the role of hypothesis testing in the development of models and evaluation of their utility. An example application of the process to a network of national parks in and around Washington, DC illustrates the ability of the approach to systematically identify monitoring data that would both advance ecologic theory and inform management decisions.

  13. Conceptual models as hypotheses in monitoring urban landscapes.

    PubMed

    Lookingbill, Todd R; Gardner, Robert H; Townsend, Philip A; Carter, Shawn L

    2007-08-01

    Many problems and challenges of ecosystem management currently are driven by the rapid pace and spatial extent of landscape change. Parks and reserves within areas of high human population density are especially challenged to meet the recreational needs of local populations and to preserve valued environmental resources. The complex problem of managing multiple objectives and multiple resources requires an enormous quantity of information, and conceptual models have been proposed as tools for organizing and interpreting this information. Academics generally prefer a bottom-up approach to model construction that emphasizes ecologic theory and process, whereas managers often use a top-down approach that takes advantage of existing information to address more pragmatic objectives. The authors propose a formal process for developing, applying, and testing conceptual models to be used in landscape monitoring that reconciles these seemingly opposing perspectives. The four-step process embraces the role of hypothesis testing in the development of models and evaluation of their utility. An example application of the process to a network of national parks in and around Washington, DC illustrates the ability of the approach to systematically identify monitoring data that would both advance ecologic theory and inform management decisions. PMID:17562105

  14. Benefits of Linked Data for Interoperability during Crisis Management

    NASA Astrophysics Data System (ADS)

    Roller, R.; Roes, J.; Verbree, E.

    2015-08-01

    Floods represent a permanent risk to the Netherlands in general and to her power supply in particular. Data sharing is essential within this crisis scenario, as a power cut affects a great variety of interdependent sectors. Currently used data sharing systems have been shown to hamper interoperability between stakeholders, since they lack flexibility and there is no consensus on term definitions and interpretations. The study presented in this paper addresses these challenges by proposing a new data sharing solution based on Linked Data, a method of interlinking data points in a structured way on the web. A conceptual model for two data sharing parties in a flood-caused power cut crisis management scenario was developed, to which relevant data were linked. The analysis revealed that the presented data sharing solution burdens its users with extra costs in the short run, but saves resources in the long run by overcoming interoperability problems of the legacy systems. The more stakeholders adopt Linked Data, the stronger its benefits for data sharing will become.
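
    A minimal sketch of the Linked Data approach is given below using the rdflib library: two stakeholders publish triples against a shared vocabulary, and a consumer queries across both datasets with SPARQL. The example.org vocabulary, resource names and flood depth are hypothetical, invented only to show the mechanism:

```python
# Minimal sketch (made-up example.org vocabulary, hypothetical resources): two stakeholders
# publish their data as RDF triples against a shared vocabulary, so a consumer can follow
# links across both datasets instead of reconciling incompatible exports.
from rdflib import Graph, Namespace, Literal

EX = Namespace("http://example.org/crisis/")
g = Graph()

# Grid operator's data: a substation and its state.
g.add((EX.Substation42, EX.status, Literal("offline")))
g.add((EX.Substation42, EX.locatedIn, EX.DistrictNorth))

# Water authority's data: the same district, described with the shared vocabulary.
g.add((EX.DistrictNorth, EX.floodDepthMetres, Literal(0.8)))

# A crisis manager can now query across both sources in one go.
q = """PREFIX ex: <http://example.org/crisis/>
       SELECT ?sub ?depth WHERE {
         ?sub ex:status "offline" ; ex:locatedIn ?d .
         ?d ex:floodDepthMetres ?depth .
       }"""
for row in g.query(q):
    print(row.sub, row.depth)
```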

  15. Conceptual Modeling for the Unified Medical Language System

    PubMed Central

    Barr, Charles E.; Komorowski, Henryk Jan; Pattison-Gordon, Edward; Greenes, Robert A.

    1988-01-01

    The Unified Medical Language System was proposed by the National Library of Medicine to facilitate the exchange and utilization of information from multiple sources. We are using semantic networks as the knowledge representation scheme in a prototype system to explore how to accomplish these goals. Conceptual modeling helps define a complete and consistent set of objects and relationships to include in the semantic net. Both top-down and bottom-up approaches were found useful in the seven step process of building the semantic network. Theoretical and practical issues are discussed as well as future extensions to the current prototype.

  16. Conceptual Modeling of mRNA Decay Provokes New Hypotheses

    PubMed Central

    Somekh, Judith; Haimovich, Gal; Guterman, Adi; Dori, Dov; Choder, Mordechai

    2014-01-01

    Biologists are required to integrate large amounts of data to construct a working model of the system under investigation. This model is often informal and stored mentally or textually, making it prone to contain undetected inconsistencies, inaccuracies, or even contradictions, not much less than a representation in free natural language. Using Object-Process Methodology (OPM), a formal yet visual and humanly accessible conceptual modeling language, we have created an executable working model of the mRNA decay process in Saccharomyces cerevisiae, as well as the import of its components to the nucleus following mRNA decay. We show how our model, which incorporates knowledge from 43 articles, can reproduce outcomes that match the experimental findings, evaluate hypotheses, and predict new possible outcomes. Moreover, we were able to analyze the effects of perturbations of the mRNA decay model related to gene and interaction deletions, and predict the nuclear import of certain decay factors, which we then verified experimentally. In particular, we verified experimentally the hypothesis that Rpb4p, Lsm1p, and Pan2p remain bound to the RNA 3′-untranslated region during the entire process of the 5′ to 3′ degradation of the RNA open reading frame. The model has also highlighted erroneous hypotheses that indeed were not in line with the experimental outcomes. Beyond the scientific value of these specific findings, this work demonstrates the value of the conceptual model as an in silico vehicle for hypothesis generation and testing, which can reinforce, and often even replace, risky, costlier wet lab experiments. PMID:25255440

  17. Conceptual Understanding of Climate Change with a Simple Climate Model

    NASA Astrophysics Data System (ADS)

    Dommenget, Dietmar; Floeter, Janine

    2010-05-01

    Future climate change projections are essentially based on coupled general circulation model (CGCM) simulations, which give a distinct global warming pattern with arctic winter amplification, an equilibrium land-sea warming contrast and an inter-hemispheric warming gradient. While these simulations are the most important tool of the Intergovernmental Panel on Climate Change (IPCC) predictions, a conceptual understanding of these predicted structures of climate change and the causes of their uncertainties is very difficult to reach if based only on these highly complex CGCM simulations. In the study presented here we introduce a very simple, globally resolved energy balance (GREB) model, which is capable of simulating the main characteristics of global warming. The model provides a bridge between strongly simplified energy balance models and the fully coupled, four-dimensional, complex CGCMs. It provides a fast tool for the conceptual understanding and development of hypotheses for climate change studies and teaching. It is based on the surface energy balance, with very simple representations of solar and thermal radiation, the atmospheric hydrological cycle, sensible turbulent heat flux, transport by the mean atmospheric circulation and heat exchange with the deeper ocean. It can be run on any PC and computes 200-year climate scenarios within minutes. The simple model's climate sensitivity and the spatial structure of the warming pattern are within the uncertainties of the IPCC model simulations. It is capable of simulating the arctic winter amplification, the equilibrium land-sea warming contrast and the inter-hemispheric warming gradient, with good agreement to the IPCC models in amplitude and structure.
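
    The GREB model itself is globally resolved, but the role of the surface energy balance can be conveyed with an even simpler, zero-dimensional sketch: absorbed solar radiation balances outgoing long-wave radiation, and an added radiative forcing shifts the equilibrium temperature. The parameter values below (effective emissivity, heat capacity, forcing) are illustrative textbook-style numbers, not GREB parameters:

```python
# Zero-dimensional sketch only (the GREB model is globally resolved; this is the simplest
# possible energy balance): absorbed solar radiation balances outgoing long-wave radiation,
# and a constant extra forcing F (e.g. from CO2) warms the equilibrium temperature.
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m-2 K-4
S0 = 1361.0          # solar constant, W m-2
ALBEDO = 0.30
EPSILON = 0.62       # effective emissivity standing in for the greenhouse effect
C = 4.0e8            # heat capacity of the surface layer, J m-2 K-1 (illustrative)
DT = 86400.0         # one-day time step, s

def integrate(forcing, years=200, t=288.0):
    """Step the global-mean temperature forward and return the end-of-run value."""
    for _ in range(int(years * 365)):
        net = S0 * (1 - ALBEDO) / 4 - EPSILON * SIGMA * t ** 4 + forcing
        t += net * DT / C
    return t

print(integrate(0.0), integrate(3.7))   # ~3.7 W m-2 is roughly the forcing for doubled CO2
```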

  18. Assessment of Alternative Conceptual Models Using Reactive Transport Modeling with Monitoring Data

    NASA Astrophysics Data System (ADS)

    Dai, Z.; Price, V.; Heffner, D.; Hodges, R.; Temples, T.; Nicholson, T.

    2005-12-01

    Monitoring data proved very useful in evaluating alternative conceptual models, simulating contaminant transport behavior, and reducing uncertainty. A graded approach using three alternative conceptual site models was formulated to simulate a field case of tetrachloroethene (PCE) transport and biodegradation. These models ranged from simple to complex in their representation of subsurface heterogeneities. The simplest model was a single-layer homogeneous aquifer that employed an analytical reactive transport code, BIOCHLOR (Aziz et al., 1999). Due to over-simplification of the aquifer structure, this simulation could not reproduce the monitoring data. The second model consisted of a multi-layer conceptual model, in combination with numerical modules, MODFLOW and RT3D within GMS, to simulate flow and reactive transport. Although the simulation results from the second model were comparatively better than those from the simple model, they still did not adequately reproduce the monitoring well concentrations because the geological structures were still inadequately defined. Finally, a more realistic conceptual model was formulated that incorporated heterogeneities and geologic structures identified from well logs and seismic survey data using the Petra and PetraSeis software. This conceptual model included both a major channel and a younger channel that were detected in the PCE source area. In this model, these channels control the local ground-water flow direction and provide a preferential chemical transport pathway. Simulation results using this conceptual site model proved compatible with the monitoring concentration data. This study demonstrates that the bias and uncertainty from inadequate conceptual models are much larger than those introduced from an inadequate choice of model parameter values (Neuman and Wierenga, 2003; Meyer et al., 2004; Ye et al., 2004). This case study integrated conceptual and numerical models, based on interpreted local hydrogeologic and

  19. An analogue conceptual rainfall-runoff model for educational purposes

    NASA Astrophysics Data System (ADS)

    Herrnegger, Mathew; Riedl, Michael; Schulz, Karsten

    2016-04-01

    Conceptual rainfall-runoff models, in which runoff processes are modelled with a series of connected linear and non-linear reservoirs, remain widely applied tools in science and practice. Additionally, the concept is appreciated in teaching due to its relative simplicity in explaining and exploring hydrological processes of catchments. However, when a series of reservoirs is used, the model becomes highly parameterized and complex, and the traceability of the results becomes more difficult to explain to an audience not accustomed to numerical modelling. Since the simulations are normally performed by code that is not visible to the audience, the results are also not easily comprehensible. This contribution therefore presents a liquid analogue model, in which a conceptual rainfall-runoff model is reproduced by a physical model. It consists of different acrylic glass containers representing different storage components within a catchment, e.g. soil water or groundwater storage. The containers are equipped and connected with pipes, in which water movement represents different flow processes, e.g. surface runoff, percolation or base flow. Water from a storage container is pumped to the upper part of the model and represents effective rainfall input. The water then flows by gravity through the different pipes and storages. Valves are used for controlling the flows within the analogue model, comparable to the parameterization procedure in numerical models. Additionally, an inexpensive microcontroller-based board and sensors are used to measure storage water levels, with online visualization of the states as time series data, building a bridge between the analogue and digital world. The ability to physically witness the different flows and water levels in the storages makes the analogue model attractive to the audience. Hands-on experiments can be performed with students, in which different scenarios or catchment types can be simulated, not only with the analogue but
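
    As a digital counterpart to the physical containers and valves, the sketch below routes an effective rainfall pulse through two connected linear reservoirs (a soil and a groundwater store). The storage coefficients are assumed illustrative values, playing the role of the valve settings in the analogue model.

```python
# Minimal digital counterpart of the analogue model: two connected linear
# reservoirs (soil and groundwater storage). Parameters are illustrative
# assumptions, analogous to valve settings in the physical model.
def simulate(rainfall, k_soil=0.3, k_perc=0.1, k_gw=0.05):
    """Route effective rainfall [mm/step] through soil and groundwater stores."""
    soil, gw = 0.0, 0.0
    runoff = []
    for p in rainfall:
        soil += p
        q_fast = k_soil * soil        # fast runoff from the soil store
        perc = k_perc * soil          # percolation to the groundwater store
        soil -= q_fast + perc
        gw += perc
        q_base = k_gw * gw            # base flow from the groundwater store
        gw -= q_base
        runoff.append(q_fast + q_base)
    return runoff

if __name__ == "__main__":
    event = [0, 5, 20, 10, 2, 0, 0, 0, 0, 0]      # a simple rainfall pulse [mm]
    for step, q in enumerate(simulate(event)):
        print(f"step {step}: runoff = {q:.2f} mm")
```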

  20. Updated Conceptual Model for the 300 Area Uranium Groundwater Plume

    SciTech Connect

    Zachara, John M.; Freshley, Mark D.; Last, George V.; Peterson, Robert E.; Bjornstad, Bruce N.

    2012-11-01

    The 300 Area uranium groundwater plume in the 300-FF-5 Operable Unit is residual from past discharge of nuclear fuel fabrication wastes to a number of liquid (and solid) disposal sites. The source zones in the disposal sites were remediated by excavation and backfilled to grade, but sorbed uranium remains in deeper, unexcavated vadose zone sediments. In spite of source term removal, the groundwater plume has shown remarkable persistence, with concentrations exceeding the drinking water standard over an area of approximately 1 km2. The plume resides within a coupled vadose zone, groundwater, river zone system of immense complexity and scale. Interactions between geologic structure, the hydrologic system driven by the Columbia River, groundwater-river exchange points, and the geochemistry of uranium contribute to persistence of the plume. The U.S. Department of Energy (DOE) recently completed a Remedial Investigation/Feasibility Study (RI/FS) to document characterization of the 300 Area uranium plume and plan for beginning to implement proposed remedial actions. As part of the RI/FS document, a conceptual model was developed that integrates knowledge of the hydrogeologic and geochemical properties of the 300 Area and controlling processes to yield an understanding of how the system behaves and the variables that control it. Recent results from the Hanford Integrated Field Research Challenge site and the Subsurface Biogeochemistry Scientific Focus Area Project funded by the DOE Office of Science were used to update the conceptual model and provide an assessment of key factors controlling plume persistence.

  1. Development of a conceptual model of cancer caregiver health literacy.

    PubMed

    Yuen, E Y N; Dodson, S; Batterham, R W; Knight, T; Chirgwin, J; Livingston, P M

    2016-03-01

    Caregivers play a vital role in caring for people diagnosed with cancer. However, little is understood about caregivers' capacity to find, understand, appraise and use information to improve health outcomes. The study aimed to develop a conceptual model that describes the elements of cancer caregiver health literacy. Six concept mapping workshops were conducted with 13 caregivers, 13 people with cancer and 11 healthcare providers/policymakers. An iterative, mixed methods approach was used to analyse and synthesise workshop data and to generate the conceptual model. Six major themes and 17 subthemes were identified from 279 statements generated by participants during concept mapping workshops. Major themes included: access to information, understanding of information, relationship with healthcare providers, relationship with the care recipient, managing challenges of caregiving and support systems. The study extends conceptualisations of health literacy by identifying factors specific to caregiving within the cancer context. The findings demonstrate that caregiver health literacy is multidimensional, includes a broad range of individual and interpersonal elements, and is influenced by broader healthcare system and community factors. These results provide guidance for the development of: caregiver health literacy measurement tools; strategies for improving health service delivery; and interventions to improve caregiver health literacy. PMID:25630765

  2. A conceptual model of people's vulnerability to floods

    NASA Astrophysics Data System (ADS)

    Milanesi, Luca; Pilotti, Marco; Ranzi, Roberto

    2015-01-01

    Hydraulic risk maps provide the baseline for land use and emergency planning. Accordingly, they should convey clear information on the potential physical implications of the different hazards to the stakeholders. This paper presents a vulnerability criterion focused on human stability in a flow, specifically devised for rapidly evolving floods where life, rather than economic values, might be threatened. The human body is conceptualized as a set of cylinders and its stability against slipping and toppling is assessed by force and moment equilibrium. Moreover, a depth threshold to account for drowning is assumed. In order to widen its scope of application, the model takes into account the destabilizing effects of local slope (so far disregarded in the literature) and of fluid density. The resulting vulnerability classification can be naturally subdivided into three levels (low, medium, and high), bounded by two stability curves for children and adults, respectively. In comparison with the most advanced conceptual approaches in the literature, the proposed model is weakly parameterized and the computed thresholds fit the available experimental data sets better. A code that implements the proposed algorithm is provided.
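
    The slipping and toppling criteria amount to comparing the hydrodynamic drag on the submerged part of the body with the friction force at the feet and with the restoring moment of the effective weight. The sketch below is a loose illustration of that force and moment balance, not the published parameterization: the body dimensions, drag coefficient, friction coefficient and body density are all assumed values.

```python
# Hedged sketch of a human-stability check in flowing water: drag on the
# submerged "cylinder" versus friction (slipping) and versus the restoring
# moment about the downstream foot (toppling). All dimensions and coefficients
# are illustrative assumptions, not the calibrated values of the published model.
G = 9.81            # gravity [m/s2]
RHO = 1000.0        # fluid density [kg/m3]
CD = 1.1            # drag coefficient for a cylinder (assumed)

def is_stable(depth, velocity, mass=75.0, height=1.75, width=0.4,
              foot_length=0.25, mu=0.4):
    """Return True if both slipping and toppling criteria are satisfied."""
    d = min(depth, height)
    body_volume = mass / 1062.0                # mean body density ~1062 kg/m3 (assumed)
    submerged_vol = body_volume * d / height   # crude proportional submergence
    weight = mass * G
    buoyancy = RHO * G * submerged_vol
    drag = 0.5 * RHO * CD * width * d * velocity**2

    no_slip = drag <= mu * (weight - buoyancy)                              # force balance
    no_topple = drag * d / 2.0 <= (weight - buoyancy) * foot_length / 2.0   # moment balance
    return no_slip and no_topple

if __name__ == "__main__":
    for h, v in [(0.3, 1.0), (0.7, 1.5), (1.0, 2.5)]:
        print(f"depth {h} m, velocity {v} m/s -> stable: {is_stable(h, v)}")
```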

  3. Comparison of two conceptual models of flow using the TSA

    SciTech Connect

    Wilson, M.L.

    1992-01-01

    As part of the performance-assessment task for the potential repository site at Yucca Mountain, Nevada, Sandia National Laboratories is developing a set of programs called the Total-System Analyzer (TSA). The TSA is one of the tools being used in the current effort to provide a systematic preliminary estimate of the total-system performance of the Yucca Mountain site. The purposes of this paper are twofold: (1) to describe capabilities that have been added to the TSA in the last year; and (2) to present a comparison of two conceptual models of unsaturated-zone flow and transport, in terms of the performance measure specified by the Environmental Protection Agency (EPA) in 40 CFR Part 191. The conceptual-model comparison is intended to demonstrate the new TSA capabilities and at the same time shed some light on the performance implications of fracture flow at Yucca Mountain. Unsaturated fracture flow is not yet well understood, and it is of great importance in determining the performance of Yucca Mountain.

  4. Scientific Digital Libraries, Interoperability, and Ontologies

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.

  5. National Flood Interoperability Experiment

    NASA Astrophysics Data System (ADS)

    Maidment, D. R.

    2014-12-01

    The National Flood Interoperability Experiment is led by the academic community in collaboration with the National Weather Service through the new National Water Center recently opened on the Tuscaloosa campus of the University of Alabama. The experiment will also involve the partners in IWRSS (Integrated Water Resources Science and Services), which include the USGS, the Corps of Engineers and FEMA. The experiment will address the following questions: (1) How can near-real-time hydrologic forecasting at high spatial resolution, covering the nation, be carried out using the NHDPlus or a next-generation geofabric (e.g. hillslope, watershed scales)? (2) How can this lead to improved emergency response and community resilience? (3) How can an improved interoperability framework support the first two goals and lead to sustained innovation in the research-to-operations process? The experiment will run from September 2014 through August 2015, in two phases. The mobilization phase from September 2014 until May 2015 will assemble the components of the interoperability framework. A Summer Institute to integrate the components will be held from June to August 2015 at the National Water Center, involving faculty and students from the University of Alabama and other institutions coordinated by CUAHSI. It is intended that the insight that arises from this experiment will help lay the foundation for a new national-scale, high spatial resolution, near-real-time hydrologic simulation system for the United States.

  6. Importance of incorporating agriculture in conceptual rainfall-runoff models

    NASA Astrophysics Data System (ADS)

    de Boer-Euser, Tanja; Hrachowitz, Markus; Winsemius, Hessel; Savenije, Hubert

    2016-04-01

    Incorporating spatially variable information is a frequently discussed option to increase the performance of (semi-)distributed conceptual rainfall-runoff models. One of the methods to do this is to use this spatially variable information to delineate Hydrological Response Units (HRUs) within a catchment. In large parts of Europe the original forested land cover has been replaced by an agricultural land cover. This change in land cover probably affects the dominant runoff processes in the area, for example by increasing the Hortonian overland flow component, especially on the flatter and higher elevated parts of the catchment. A change in runoff processes implies a change in HRUs as well. A previous version of our model distinguished wetlands (areas close to the stream) from the remainder of the catchment. However, this configuration was not able to reproduce all fast runoff processes, in both summer and winter. Therefore, this study tests whether the reproduction of fast runoff processes can be improved by incorporating an HRU which explicitly accounts for the effect of agriculture. A case study is carried out in the Ourthe catchment in Belgium. For this case study the relevance of different process conceptualisations is tested stepwise. Among the conceptualisations are Hortonian overland flow in summer and winter, reduced infiltration capacity due to a partly frozen soil and the relative effect of rainfall and snowmelt on this frozen soil. The results show that the named processes can make a large difference on an event basis, especially the Hortonian overland flow in summer and the combination of rainfall and snowmelt on (partly) frozen soil in winter. However, the differences diminish when the modelled period of several years is evaluated based on standard metrics like the Nash-Sutcliffe Efficiency. These results emphasise on the one hand the importance of incorporating the effects of agriculture in conceptual models and on the other hand the importance of more event

  7. A conceptual model for short-term inpatient group psychotherapy.

    PubMed

    Kibel, H D

    1981-01-01

    The author reviews the history of the literature on inpatient group psychotherapy. Key observations of early workers--the central role of the group leader, the experiential benefits of the group, and the relationship to the milieu--have not resulted in wide application of this form of therapy because of limitations of previous conceptual models. The model presented draws on concepts of general systems and object relations theory. General systems theory explains how the small therapy group symbolically reflects the dynamic process of the psychiatric unit. Object relations theory provides a unique understanding of the central regulatory function of the therapist and the beneficial effects of the group. The author provides clinical illustrations of these points. PMID:7446787

  8. Trade-offs underlying maternal breastfeeding decisions: a conceptual model.

    PubMed

    Tully, Kristin P; Ball, Helen L

    2013-01-01

    This paper presents a new conceptual model that generates predictions about breastfeeding decisions and identifies interactions that affect outcomes. We offer a contextual approach to infant feeding that models multi-directional influences by expanding on the evolutionary parent-offspring conflict and situation-specific breastfeeding theories. The main hypothesis generated from our framework suggests that simultaneously addressing breastfeeding costs and benefits, in relation to how they are interpreted by mothers, will be most effective. Our approach focuses on contributors to the attitudes and commitment underlying breastfeeding outcomes, beginning in the prenatal period. We conclude that some maternal-offspring conflict is inherent in the dynamic infant feeding relationship. Guidance that anticipates and addresses family trade-offs over time can be incorporated into breastfeeding support for families. PMID:22188564

  9. Secrets in Primary Care: A Qualitative Exploration and Conceptual Model

    PubMed Central

    Biderman, Aya; Mitki, Revital; Borkan, Jeffrey M.

    2007-01-01

    Objective Secrets and issues of confidentiality are critical concerns in doctor–patient communication and fundamental aspects of every medical encounter. Nevertheless, the nature, content, prevalence, impact, and consequences of secrets in medicine have largely been unexplored. This study investigates the role of secrets in primary care. It describes the intuitive strategies used by primary care physicians to cope with secrets, provides a categorization system, and suggests a conceptual model. Design Focus groups of primary care physicians were the principal data collection method employed. Transcripts from 8 focus groups were analyzed using an “immersion–crystallization” framework involving cycles of concentrated textual review of data. Insights from this iterative process and from the literature were employed in the construction of contextual types, content categories, processes, and models. Participants Sixty-one family physicians and general practitioners in Israel with a wide variety of seniority, ethnic, religious, and immigration backgrounds. Setting Locations in the north, south, and center of Israel. Results Analysis revealed insights about definitions, prevalence, process, and content of secrets in primary care. The main content findings centered on categories of secrets such as propensity to secrecy, toxicity of secrets, and the special nature of secrets in family medicine. The main process findings regarded the life cycle of secrets and doctors’ coping strategies. Based on our findings and a review of the literature, a conceptual model of secrets in primary care is proposed. Conclusions The importance and impact of secrets make them a significant part of daily medical practice. Further research is needed to enhance physicians’ effective and ethical handling of secrets and secrecy in practice. PMID:17487521

  10. Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond

    NASA Astrophysics Data System (ADS)

    Tomas, Robert; Lutz, Michael

    2013-04-01

    The well-known heterogeneity and fragmentation of data models, formats and controlled vocabularies of environmental data limit potential data users from utilising the wealth of environmental information available today across Europe. The main aim of INSPIRE is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation have been identified during the development of the specifications: Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes. To achieve cross-domain harmonisation, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model) and managed in a common UML repository. Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multi-lingual content. Since a mapping to a harmonised vocabulary can be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonised terms, utilising either the extensibility options or additional terminological attributes. Encoding. Currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard

  11. Climate stability and sensitivity in some simple conceptual models

    NASA Astrophysics Data System (ADS)

    Bates, J. Ray

    2012-02-01

    A theoretical investigation of climate stability and sensitivity is carried out using three simple linearized models based on the top-of-the-atmosphere energy budget. The simplest is the zero-dimensional model (ZDM) commonly used as a conceptual basis for climate sensitivity and feedback studies. The others are two-zone models with tropics and extratropics of equal area; in the first of these (Model A), the dynamical heat transport (DHT) between the zones is implicit, in the second (Model B) it is explicitly parameterized. It is found that the stability and sensitivity properties of the ZDM and Model A are very similar, both depending only on the global-mean radiative response coefficient and the global-mean forcing. The corresponding properties of Model B are more complex, depending asymmetrically on the separate tropical and extratropical values of these quantities, as well as on the DHT coefficient. Adopting Model B as a benchmark, conditions are found under which the validity of the ZDM and Model A as climate sensitivity models holds. It is shown that parameter ranges of physical interest exist for which such validity may not hold. The 2 × CO2 sensitivities of the simple models are studied and compared. Possible implications of the results for sensitivities derived from GCMs and palaeoclimate data are suggested. Sensitivities for more general scenarios that include negative forcing in the tropics (due to aerosols, inadvertent or geoengineered) are also studied. Some unexpected outcomes are found in this case. These include the possibility of a negative global-mean temperature response to a positive global-mean forcing, and vice versa.
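
    The distinction between the ZDM and a two-zone model with explicit dynamical heat transport can be made concrete with a small equilibrium calculation. The two-zone equations in the sketch are a schematic stand-in rather than the paper's exact Model B, and all coefficients are assumed illustrative values.

```python
# Equilibrium warming in a zero-dimensional model (ZDM) and in a hedged two-zone
# sketch with an explicit dynamical heat transport (DHT) coupling. The two-zone
# equations below are a schematic stand-in, not Bates's exact Model B, and all
# coefficients are assumed illustrative values.
import numpy as np

F = 3.7            # global-mean 2xCO2 forcing [W m-2]
LAM = 1.2          # global-mean radiative response coefficient [W m-2 K-1] (assumed)
print(f"ZDM sensitivity: dT = F / lambda = {F / LAM:.2f} K")

# Two equal-area zones: tropics (1) and extratropics (2).
lam1, lam2 = 1.6, 0.8      # zonal radiative response coefficients (assumed)
gamma = 1.0                # DHT coupling coefficient [W m-2 K-1] (assumed)
F1, F2 = 3.7, 3.7          # zonal forcings [W m-2]

# Perturbation energy balance per zone:
#   F1 = lam1*dT1 + gamma*(dT1 - dT2)
#   F2 = lam2*dT2 - gamma*(dT1 - dT2)
A = np.array([[lam1 + gamma, -gamma],
              [-gamma, lam2 + gamma]])
dT = np.linalg.solve(A, np.array([F1, F2]))
print(f"two-zone warming: tropics {dT[0]:.2f} K, extratropics {dT[1]:.2f} K, "
      f"global mean {dT.mean():.2f} K")
```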

  12. Conceptual model and map of financial exploitation of older adults.

    PubMed

    Conrad, Kendon J; Iris, Madelyn; Ridings, John W; Fairman, Kimberly P; Rosen, Abby; Wilber, Kathleen H

    2011-10-01

    This article describes the processes and outcomes of three-dimensional concept mapping to conceptualize financial exploitation of older adults. Statements were generated from a literature review and by local and national panels consisting of 16 experts in the field of financial exploitation. These statements were sorted and rated using Concept Systems software, which grouped the statements into clusters and depicted them as a map. Statements were grouped into six clusters, and ranked by the experts as follows in descending severity: (a) theft and scams, (b) financial victimization, (c) financial entitlement, (d) coercion, (e) signs of possible financial exploitation, and (f) money management difficulties. The hierarchical model can be used to identify elder financial exploitation and differentiate it from related but distinct areas of victimization. The severity hierarchy may be used to develop measures that will enable more precise screening for triage of clients into appropriate interventions. PMID:21978290

  13. Rule based design of conceptual models for formative evaluation

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.; Chang, Kai; Hale, Joseph P.; Bester, Terri; Rix, Thomas; Wang, Yaowen

    1994-01-01

    A Human-Computer Interface (HCI) Prototyping Environment with embedded evaluation capability has been investigated. This environment will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. This environment, which allows for rapid prototyping and evaluation of graphical interfaces, includes the following four components: (1) a HCI development tool; (2) a low fidelity simulator development tool; (3) a dynamic, interactive interface between the HCI and the simulator; and (4) an embedded evaluator that evaluates the adequacy of a HCI based on a user's performance. The embedded evaluation tool collects data while the user is interacting with the system and evaluates the adequacy of an interface based on a user's performance. This paper describes the design of conceptual models for the embedded evaluation system using a rule-based approach.

  15. A methodology for the development of software agent based interoperable telemedicine systems: a tele-electrocardiography perspective.

    PubMed

    Ganguly, P; Ray, P

    2000-01-01

    Telemedicine involves the integration of information, human-machine, and healthcare technologies. Because different modalities of patient care require applications running on heterogeneous computing environments, software interoperability is a major issue in telemedicine. Software agent technology provides a range of promising techniques to solve this problem. This article discusses the development of a methodology for the design of interoperable telemedicine systems (illustrated with a tele-electrocardiography application). Software interoperability between different applications can be modeled at different levels of abstraction such as physical interoperability, data-type interoperability, specification-level interoperability, and semantic interoperability. Software agents address the issue of software interoperability at the semantic level. A popular object-oriented modeling language, the Unified Modeling Language (UML), has been used for this development. This research has demonstrated the feasibility of the development of agent-based interoperable telemedicine systems. More research is needed before widespread deployment of such systems can take place. PMID:10957742

  16. Recurrence time distributions of large earthquakes in conceptual model studies

    NASA Astrophysics Data System (ADS)

    Zoeller, G.; Hainzl, S.

    2007-12-01

    The recurrence time distribution of large earthquakes in seismically active regions is a crucial ingredient for seismic hazard assessment. However, due to sparse observational data and a lack of knowledge on the precise mechanisms controlling seismicity, this distribution is unknown. In many practical applications of seismic hazard assessment, the Brownian passage time (BPT) distribution (or a different distribution) is fitted to a small number of observed recurrence times. Here, we study various aspects of recurrence time distributions in conceptual models of individual faults and fault networks. First, the dependence of the recurrence time distribution on fault interaction is investigated by means of a network of Brownian relaxation oscillators. Second, the Brownian relaxation oscillator is modified towards a model for large earthquakes, also taking into account the statistics of intermediate events in a more appropriate way. This model simulates seismicity in a fault zone consisting of a major fault and some surrounding smaller faults with Gutenberg-Richter type seismicity. This model can be used for more realistic and robust estimations of the real recurrence time distribution in seismic hazard assessment.
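
    The BPT distribution referred to above is the inverse Gaussian distribution, so the fitting step the abstract describes can be sketched with standard statistical tooling. The example below uses synthetic recurrence times and scipy.stats.invgauss; the mean recurrence interval, sample size and elapsed time are assumptions chosen only for illustration.

```python
# The Brownian passage time (BPT) distribution is the inverse Gaussian
# distribution, available in SciPy as scipy.stats.invgauss. A hedged sketch of
# the common hazard-assessment step: fit a BPT distribution to a small sample of
# recurrence times (synthetic here, since real catalogues are sparse).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic "observed" recurrence times [years]: mean = mu * scale = 150 yr (assumed).
true = stats.invgauss(mu=0.3, scale=500.0)
observed = true.rvs(size=12, random_state=rng)

# Fit with the location fixed at zero, as is usual for recurrence times.
mu_hat, loc_hat, scale_hat = stats.invgauss.fit(observed, floc=0)
fitted = stats.invgauss(mu=mu_hat, scale=scale_hat)

print(f"sample mean recurrence: {observed.mean():.1f} yr")
print(f"fitted mean recurrence: {fitted.mean():.1f} yr")

# Conditional probability of an event in the next 30 years, given 100 years
# of quiescence -- the kind of quantity hazard assessments need.
t_elapsed, dt = 100.0, 30.0
p = (fitted.cdf(t_elapsed + dt) - fitted.cdf(t_elapsed)) / fitted.sf(t_elapsed)
print(f"P(event within {dt:.0f} yr | quiet for {t_elapsed:.0f} yr) = {p:.2f}")
```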

  17. Karstification beneath dam sites: From conceptual models to realistic scenarios

    NASA Astrophysics Data System (ADS)

    Hiller, Thomas; Kaufmann, Georg; Romanov, Douchko

    2010-05-01

    Dam sites located above soluble rock such as limestone or gypsum can leak within relatively short times (tens of years) when compared to the natural time scale of karstification (10,000-100,000 years). The reason for this leakage is the high hydraulic gradient imposed by the reservoir, which drives aggressive water through the fracture and fissure system of the bedrock; this water dissolves the rock and increases permeability fairly fast. Thus, on the one hand, water losses through enlarged fractures can become a problem for the reservoir. On the other hand, the void space itself can be a risk for the dam structure above. This may have unpredictable ecological and economic consequences. We present a three-dimensional conceptual model study of karstification in dam-site areas on limestone bedrock. We compare our three-dimensional model to a standard two-dimensional dam-site model to verify the results of our code. We further carry out a sensitivity analysis on the physical and chemical parameters driving the karstification to derive an empirical formulation of the breakthrough time T_B. In a next step we implement a statistical fracture network and topography to approach a more realistic scenario. Finally we show the results of a three-dimensional model based on a real dam site.

  18. GIS-based Conceptual Database Model for Planetary Geoscientific Mapping

    NASA Astrophysics Data System (ADS)

    van Gasselt, Stephan; Nass, Andrea; Neukum, Gerhard

    2010-05-01

    We report here on the conceptual design of a geodatabase model as part of a larger GIS-based system composed of several applications, templates and a database backend, which supports combined geological and geomorphological mapping of planetary surfaces and simplifies the process of maintaining data and map products. Performing stand-alone or systematic geological and/or geomorphological mapping of planetary surfaces supported by modern GIS environments involves several tasks that must be performed before the actual mapping process can be carried out. Such tasks deal with setting up a working environment: querying and defining raster data from a variety of planetary missions to be used and processed, importing auxiliary data, defining projection parameters for one or more map layers and for each raster/vector dataset, importing processed data, and defining a variety of vector shape geometries and attributes for mapping in terms of geometry type, representation symbology and attribute domains in a consistent way. In order to allow consistent mapping approaches and subsequent homogenisation, a mapper makes use of pre-defined model schemas (templates) and definitions that allow mapping representations and styles to be imported, as well as a backbone geo-database, so that work can start immediately on the provided infrastructure. The conceptual geo-database design developed so far involves the design of the main object and data layers and consists of objects, object types, their relationships and, additionally, the formulation of integrity conditions at a level which is in principle independent of the exact implementation and its environment. Furthermore, the data layer containing attribute domains has been implemented. The conceptual design has been crafted using ESRI's ArcGIS File Geodatabase environment, but it can be exported to any other GDBMS. The overall layout consists of several main elements or entity groups composed of relations

  19. Conceptual models of the evolution of transgressive dune field systems

    NASA Astrophysics Data System (ADS)

    Hesp, Patrick A.

    2013-10-01

    This paper examines the evolutionary paths of some transgressive dune fields that have formed on different coasts of the world, and presents some initial conceptual models of system dynamics for transgressive dune sheets and dune fields. Various evolutionary pathways are conceptualized based on a visual examination of dune fields from around the world. On coasts with high sediment supply, dune sheets and dune fields tend to accumulate as large scale barrier systems with little colonization of vegetation in hyper-arid to arid climate regimes, and as multiple, active discrete phases of dune field and deflation plain couplets in temperate to tropical environments. Active dune fields tend to be singular entities on coasts with low to moderate sediment supply. Landscape complexity and vegetation richness and diversity increase as dune fields evolve from simple active sheets and dunes to single and multiple deflation plains and basins, precipitation ridges, nebkha fields and a host of other dune types associated with vegetation (e.g. trailing ridges, slacks, remnant knobs, gegenwalle ridges and dune track ridges, 'tree islands' and 'bush pockets'). Three principal scenarios of transgressive dune sheet and dune field development are discussed, including dune sheets or dune fields evolving directly from the backshore, development following foredune and/or dune field erosion, and development from the breakdown or merging of parabolic dunes. Various stages of evolution are outlined for each scenario. Knowledge of evolutionary patterns and stages in coastal dune fields is very limited and caution is urged in attempts to reverse, change and/or modify dune fields to 'restore' some perceived loss of ecosystem or dune functioning.

  1. Providing semantic interoperability between clinical care and clinical research domains.

    PubMed

    Laleci, Gokce Banu; Yuksel, Mustafa; Dogac, Asuman

    2013-03-01

    Improving the efficiency with which clinical research studies are conducted can lead to faster medication innovation and decreased time to market for new drugs. To increase this efficiency, the parties involved in a regulated clinical research study, namely the sponsor, the clinical investigator and the regulatory body, each with their own software applications, need to exchange data seamlessly. However, currently, the clinical research and the clinical care domains are quite disconnected because each uses different standards and terminology systems. In this article, we describe an initial implementation of the Semantic Framework developed within the scope of the SALUS project to achieve interoperability between the clinical research and the clinical care domains. In our Semantic Framework, the core ontology developed for semantic mediation is based on the shared conceptual model of both of these domains provided by the BRIDG initiative. The core ontology is then aligned with the extracted semantic models of the existing clinical care and research standards as well as with the ontological representations of the terminology systems to create a model of meaning for enabling semantic mediation. Although SALUS is a research and development effort rather than a product, the current SALUS knowledge base contains around 4.7 million triples representing the BRIDG DAM, the HL7 CDA model, CDISC standards and several terminology ontologies. In order to keep the reasoning process within acceptable limits without sacrificing the quality of mediation, we took an engineering approach by developing a number of heuristic mechanisms. The results indicate that it is possible to build a robust and scalable semantic framework with a solid theoretical foundation for achieving interoperability between the clinical research and clinical care domains. PMID:23008263
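
    The mediation step described above, aligning local code systems with a shared core ontology so that queries can bridge the care and research domains, can be sketched with standard RDF tooling. The namespaces, codes and property names below are invented placeholders, not the actual SALUS, BRIDG or HL7 artifacts.

```python
# Hedged sketch of the semantic-mediation idea: local terminology codes from the
# care and research domains are both mapped to a shared core concept, so a single
# query over the core ontology reaches data encoded with either code system.
# Namespaces and codes below are illustrative, not the actual SALUS/BRIDG URIs.
from rdflib import Graph, Namespace
from rdflib.namespace import SKOS

CORE = Namespace("http://example.org/core#")          # stand-in for the core ontology
CARE = Namespace("http://example.org/care#")          # stand-in for a clinical-care code system
RESEARCH = Namespace("http://example.org/research#")  # stand-in for a research code system

g = Graph()
# Terminology alignment: both local codes are exact matches of one core concept.
g.add((CARE.I10, SKOS.exactMatch, CORE.Hypertension))
g.add((RESEARCH.C3117, SKOS.exactMatch, CORE.Hypertension))
# Patient data arriving from the care domain, encoded with the care-domain code.
g.add((CARE.patient42, CORE.hasCondition, CARE.I10))

# Mediation query: find patients with the core concept, following the alignment.
q = """
SELECT ?patient WHERE {
    ?patient <http://example.org/core#hasCondition> ?code .
    ?code <http://www.w3.org/2004/02/skos/core#exactMatch> <http://example.org/core#Hypertension> .
}
"""
for row in g.query(q):
    print("matched patient:", row.patient)
```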

  2. A Conceptual Model For Effluent-Dependent Riverine Environments

    NASA Astrophysics Data System (ADS)

    Murphy, M. T.; Meyerhoff, R. D.; Osterkamp, W. R.; Smith, E. L.; Hawkins, R. H.

    2001-12-01

    The Arid West Water Quality Research Project (WQRP) is a multi-year, EPA-funded scientific endeavor directed by the Pima County Wastewater Management Department in southern Arizona and focussed upon several interconnected ecological questions. These questions are crucial to water quality management in the arid and semi-arid western US. A key component has been the ecological, hydrological and geomorphological investigation of habitat created by the discharge of treated effluent into ephemeral streams. Such environments are fundamentally different from the dry streams or rivers they displace; however, they are clearly not the perennial streams they superficially resemble. Under Arizona State regulations, such streams can bear the use designation of "Effluent Dependent Waters," or EDWs. Before this investigation, a hydrological/ecological conceptual model for these unique ecosystems had not been published. We have constructed one for general review that is designed to direct future work in the WQRP. The project investigated ten representative, yet contrasting, EDW sites distributed throughout arid areas of the western US, to gather both historical and reconnaissance-level field data, including in-stream and riparian habitat and morphometric fluvial data. In most cases, the cross-sectional area of the prior channel is oversized relative to the discharge of the introduced effluent. Where bed control is absent, the channels are incised downstream of the discharge point, further suggesting a disequilibrium between the channel and the regulated effluent flow. Several of the studied stream systems primarily convey storm water and are aggradational, exhibiting braided or anastomosing channels, high-energy bedforms, and spatially dynamic interfluves. Active channels are formed in response to individual storm events and can be highly dynamic in both location and cross-sectional morphology. This poses a geomorphological challenge in the selection of a discharge point. We

  3. [Conceptual model of teasing and bullying in adolescents].

    PubMed

    Lien, Angela Shin-Yu; Dai, Yu-Tzu; Lee, Ya-Ling

    2013-08-01

    Teasing and bullying incident levels have increased markedly in recent years according to international news reports. School and community-level action to stop and prevent bullying is a key focus of government education policy worldwide. Teasing is a common facet of social interaction among youth and is related to bullying behavior. Although teasing and bullying are significant concerns, references for relevant concept analysis are lacking in the nursing field. To facilitate early screening to identify high-risk bullies and to help victims effectively stop bullying events, concept analysis is needed to clarify and distinguish between the two concepts of teasing and bullying. The aim of this study is to integrate relevant published literature to determine the reasons for and relationships between teasing and bullying. We chose obesity as an example to construct a teasing and bullying conceptual model for adolescents and used this model to explore the related factors and health impacts of obesity. We found that both teaser intent and recipient perceptions correlated with bullying behavior. Duration and severity may induce teasing to become bullying. Because weight-based teasing is common among adolescents, we chose obesity as an example issue to demonstrate our adolescent teasing and bullying concept model. We then integrated the antecedent and consequential factors of teasing and bullying for obese adolescents. Weight-control strategies can stop school bullying if early interventions are performed in high-risk populations. PMID:23922094

  4. Towards methodical modelling: Differences between the structure and output dynamics of multiple conceptual models

    NASA Astrophysics Data System (ADS)

    Knoben, Wouter; Woods, Ross; Freer, Jim

    2016-04-01

    Conceptual hydrologic models represent a catchment's spatial and temporal dynamics through a particular arrangement of stores, fluxes and transformation functions, depending on the modeller's choices and intended use. They have the advantages of being computationally efficient, being relatively easy to reconfigure and having relatively low input data demands. This makes them well suited for large-scale and large-sample hydrology, where appropriately representing the dominant hydrologic functions of a catchment is a main concern. Given these requirements, the number of parameters in the model cannot be too high, to avoid equifinality and identifiability issues. This limits the number and the level of complexity of the dominant hydrologic processes the model can represent. Specific purposes and places thus require a specific model, and this has led to an abundance of conceptual hydrologic models. No structured overview of these models exists and there is no clear method to select appropriate model structures for different catchments. This study is a first step towards creating an overview of the elements that make up conceptual models, which may later assist a modeller in finding an appropriate model structure for a given catchment. To this end, this study brings together over 30 past and present conceptual models. The reviewed model structures are simply different configurations of three basic model elements (stores, fluxes and transformation functions), depending on the hydrologic processes the models are intended to represent. Differences also exist in the inner workings of the stores, fluxes and transformations, i.e. the mathematical formulations that describe each model element's intended behaviour. We investigate the hypothesis that different model structures can produce similar behavioural simulations. This can clarify the overview of model elements by grouping elements which are similar, which can improve model structure selection.
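
    To make the idea of structures as configurations of elements concrete, the sketch below builds a single store and swaps its transformation function between a linear and a nonlinear outflow law. It is a minimal illustration under assumed parameter values, not any of the 30+ reviewed model structures.

```python
# Hedged illustration of "model structures as configurations of elements": one
# store whose outflow (transformation function) can be swapped, e.g. a linear
# reservoir versus a nonlinear one. Parameter values are assumptions.
def linear_outflow(storage, k=0.2):
    return k * storage

def nonlinear_outflow(storage, k=0.05, exponent=1.5):
    return k * storage**exponent

def run_store(precip, outflow_fn):
    """Route a precipitation series [mm/step] through one store."""
    storage, flows = 0.0, []
    for p in precip:
        storage += p
        q = min(outflow_fn(storage), storage)   # cannot release more than is stored
        storage -= q
        flows.append(q)
    return flows

if __name__ == "__main__":
    event = [0, 10, 25, 5, 0, 0, 0, 0]
    for name, fn in [("linear", linear_outflow), ("nonlinear", nonlinear_outflow)]:
        print(name, [round(v, 1) for v in run_store(event, fn)])
```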

  5. Student Conceptual Level and Models of Teaching: Theoretical and Empirical Coordination of Two Models.

    ERIC Educational Resources Information Center

    Hunt, David E.; And Others

    The Conceptual Level (CL) matching model describes the differential reaction of students varying in CL to educational environments varying in degree of structure. Models of teaching describe environments systematically varying in structure and therefore provide a specific basis for coordinated investigation of differential effects. The effects of…

  6. Computational Plume Modeling of Conceptual ARES Vehicle Stage Tests

    NASA Technical Reports Server (NTRS)

    Allgood, Daniel C.; Ahuja, Vineet

    2007-01-01

    The plume-induced environment of a conceptual ARES V vehicle stage test at the NASA Stennis Space Center (NASA-SSC) was modeled using computational fluid dynamics (CFD). A full-scale multi-element grid was generated for the NASA-SSC B-2 test stand with the ARES V stage located in a proposed off-center forward position. The plume produced by the ARES V main power plant (a cluster of five RS-68 LOX/LH2 engines) was simulated using a multi-element flow solver, CRUNCH. The primary objective of this work was to obtain a fundamental understanding of the ARES V plume and its impingement characteristics on the B-2 flame deflector. The location, size and shape of the impingement region were quantified along with the uncooled deflector wall pressures, temperatures and incident heating rates. Issues with the proposed tests were identified and several of these were addressed using the CFD methodology. The final results of this modeling effort will provide useful data and boundary conditions in upcoming engineering studies directed towards determining the facility modifications required for ensuring safe and reliable stage testing in support of the Constellation Program.

  7. The Marx Models as Conceptual Models in School Effectiveness Research.

    ERIC Educational Resources Information Center

    Giesbers, J. H. G. I.; Sleegers, P.

    1994-01-01

    Discusses the educational and organizational theory of Ernst Marx, the educational-organizational models he developed, and their use in Dutch secondary schools, particularly to enhance policymaking capacity. Marx's theory embraces several basic ideas: the capacity to individualize instruction, to offer a broad education, to enhance operational…

  8. Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors

    NASA Astrophysics Data System (ADS)

    Carrera, J.; Pool, M.

    2014-12-01

    Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult to assess source of uncertainty in long-term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with an application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on
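
    The contrast drawn here between a single "most likely" model and the spread across alternative conceptual models can be illustrated with a toy multi-model exercise. All weights and predictions in the sketch are invented; the point is only that the between-model spread is information a single calibrated model cannot provide.

```python
# Hedged toy illustration of why the "most likely" model can understate
# prediction uncertainty: three alternative conceptual models with comparable
# calibration fits but diverging predictions. Numbers are invented for
# illustration only.
import numpy as np

# (posterior-like weight from calibration fit, predicted drawdown [m])
models = {"zoned geology": (0.40, 3.1),
          "geostatistical": (0.35, 3.3),
          "channel network": (0.25, 5.8)}

weights = np.array([w for w, _ in models.values()])
preds = np.array([p for _, p in models.values()])

best = preds[np.argmax(weights)]
mean = np.sum(weights * preds)
spread = np.sqrt(np.sum(weights * (preds - mean) ** 2))

print(f"most likely model predicts {best:.1f} m")
print(f"multi-model mean {mean:.1f} m +/- {spread:.1f} m (between-model spread)")
```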

  9. The Value of Conceptual Models in Coping with Complexity and Interdisciplinarity in Environmental Sciences Education

    ERIC Educational Resources Information Center

    Fortuin, Karen P. J.; van Koppen, C. S. A.; Leemans, Rik

    2011-01-01

    Conceptual models are useful for facing the challenges of environmental sciences curriculum and course developers and students. These challenges are inherent to the interdisciplinary and problem-oriented character of environmental sciences curricula. In this article, we review the merits of conceptual models in facing these challenges. These…

  10. Semantic Description of Educational Adaptive Hypermedia Based on a Conceptual Model

    ERIC Educational Resources Information Center

    Papasalouros, Andreas; Retalis, Symeon; Papaspyrou, Nikolaos

    2004-01-01

    The role of conceptual modeling in Educational Adaptive Hypermedia Applications (EAHA) is especially important. A conceptual model of an educational application depicts the instructional solution that is implemented, containing information about concepts that must be acquired by learners, tasks in which learners must be involved and resources…

  11. A Conceptual Model of Pain Assessment for Noncommunicative Persons with Dementia

    ERIC Educational Resources Information Center

    Snow, A. Lynn; O'Malley, Kimberly J.; Cody, Marisue; Kunik, Mark E.; Ashton, Carol M.; Beck, Cornelia; Bruera, Eduardo; Novy, Diane

    2004-01-01

    Purpose: Our objectives are to present a conceptual model of the pain assessment process in persons with dementia and discuss methods for validating our model within this population. Design and Methods: This conceptual work is based on an integrative review and current pain theory, pain assessment research in demented and nondemented populations,…

  12. Teacher Emotion Research: Introducing a Conceptual Model to Guide Future Research

    ERIC Educational Resources Information Center

    Fried, Leanne; Mansfield, Caroline; Dobozy, Eva

    2015-01-01

    This article reports on the development of a conceptual model of teacher emotion through a review of teacher emotion research published between 2003 and 2013. By examining 82 publications regarding teacher emotion, the main aim of the review was to identify how teacher emotion was conceptualised in the literature and develop a conceptual model to…

  13. An Interoperability Testing Study: Automotive Inventory Visibility and Interoperability

    SciTech Connect

    Ivezic, Nenad; Kulvatunyou, Boonserm; Frechette, Simon; Jones, Albert

    2004-01-01

    This paper describes a collaborative effort between the NIST and Korean Business-to-Business Interoperability Test Beds to support a global, automotive-industry interoperability project. The purpose of the collaboration is to develop a methodology for validation of interoperable data-content standards implemented across inventory visibility tools within an internationally adopted testing framework. In this paper we describe methods (1) to help the vendors consistently implement prescribed message standards and (2) to assess compliance of those implementations with respect to the prescribed data content standards. We also illustrate these methods in support of an initial proof of concept for an international IV&I scenario.
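
    Compliance assessment of vendor implementations against a prescribed data-content standard is commonly automated as schema validation of the exchanged messages. The sketch below shows that step with an invented minimal schema and message; it is not the actual IV&I message standard or test-bed tooling.

```python
# Hedged sketch of one compliance-testing step: validate a vendor's message
# against the prescribed content schema. The schema and message below are
# invented minimal examples, not the actual IV&I message standard.
from lxml import etree

SCHEMA_XSD = b"""<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="InventoryReport">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="PartNumber" type="xs:string"/>
        <xs:element name="Quantity" type="xs:nonNegativeInteger"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

MESSAGE = b"""<InventoryReport>
  <PartNumber>A-1234</PartNumber>
  <Quantity>-5</Quantity>
</InventoryReport>"""

schema = etree.XMLSchema(etree.fromstring(SCHEMA_XSD))
doc = etree.fromstring(MESSAGE)
if schema.validate(doc):
    print("message conforms to the content standard")
else:
    for err in schema.error_log:
        print("non-conformance:", err.message)
```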

  14. Development of a conceptual model for vertical flow wetland metabolism.

    PubMed

    Giraldo, E; Zárate, E

    2001-01-01

    Four parallel vertical constructed wetlands, with a total area of 556 m2, are used to treat domestic wastewater coming from a community of 550 inhabitants. The system includes pre-treatment with an anaerobic filter and post-treatment with chlorine, before discharging the effluent to the ocean. Four native species of macrophytes were planted: Paspalum penisetum, Typha sp, Conocarpus erectus and Scirpus lacustris. In situ measurements of gas content were performed for each bed during an operation cycle. After a feeding discharge, an unaltered sample of sand from each bed was taken, and a respirometric test was implemented to measure the metabolic activity in terms of oxygen consumption kinetics, CO2 production and organic matter degradation. The results were used to develop a conceptual model of the microbial metabolism underlying organic matter removal from wastewater. Sorption in the bed is the main mechanism for organic matter removal from the wastewater, with subsequent biological oxidation during the resting period. The degradation rate for dissolved organic matter is found to depend on its concentration and on the oxygen content in the gaseous phase. During the days of major activity, the oxygen content was not fully recovered when a new discharge occurred, and anaerobic activity was found within the bed. PMID:11804107
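
    The reported dependence of the degradation rate on both organic matter concentration and gas-phase oxygen is the kind of behaviour usually written as a dual Monod-type rate law. The sketch below is one such formulation under assumed rate constants, half-saturation values and re-aeration; it is an illustration, not the calibrated model of these beds.

```python
# Hedged sketch of a dual Monod-type rate law consistent with the reported
# dependence of degradation on both substrate and oxygen. All rate constants
# and half-saturation values are assumptions for illustration.
import numpy as np
from scipy.integrate import solve_ivp

K_MAX = 2.0      # maximum degradation rate [mg/(L d)] (assumed)
K_S = 20.0       # half-saturation for organic matter [mg/L] (assumed)
K_O2 = 2.0       # half-saturation for oxygen [% v/v in gas phase] (assumed)
O2_SUPPLY = 0.8  # re-aeration rate of the bed gas phase [%/d] (assumed)
O2_MAX = 21.0    # atmospheric oxygen content [%]
Y_O2 = 0.35      # oxygen consumed per unit organic matter degraded (assumed)

def rates(t, y):
    c, o2 = y
    r = K_MAX * c / (K_S + c) * o2 / (K_O2 + o2)   # dual Monod degradation rate
    dc = -r
    do2 = -Y_O2 * r + O2_SUPPLY * (O2_MAX - o2) / O2_MAX
    return [dc, do2]

sol = solve_ivp(rates, (0, 7), [120.0, 21.0], t_eval=np.linspace(0, 7, 8))
for t, c, o2 in zip(sol.t, sol.y[0], sol.y[1]):
    print(f"day {t:.0f}: organic matter {c:6.1f} mg/L, gas-phase O2 {o2:4.1f} %")
```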

  15. Conceptual hydrogeological model of a coastal hydrosystem in the mediterranean

    NASA Astrophysics Data System (ADS)

    Mitropapas, Anastasios; Pouliaris, Christos; Apostolopoulos, Georgios; Vasileiou, Eleni; Schüth, Christoph; Vienken, Thomas; Dietrich, Peter; Kallioras, Andreas

    2016-04-01

    Groundwater resources management in the Mediterranean basin is an issue of paramount importance that becomes a necessity in the case of coastal hydrosystems. Coastal aquifers are considered very sensitive ecosystems that are subject to several stresses of natural or anthropogenic origin. The coastal hydrosystem of Lavrion can be used as a reference site that incorporates multi-disciplinary environmental problems typical for the Circum-Mediterranean. This study presents the synthesis of a wide range of field activities within the area of Lavrion, including the monitoring of water resources within all hydrologic zones (surface, unsaturated and saturated) and geophysical (invasive and non-invasive) surveys. Different monitoring approaches - targeting the collection of hydrochemical, geophysical, geological and hydrological data - were applied, which proved to provide a sound characterization of the groundwater flow within the coastal karstic system in connection with the surrounding water bodies of the study area. The above are used as input parameters during the development of the conceptual model of the coastal hydrosystem of Lavrion. Key-words: Coastal hydrosystems, Mediterranean basin, seawater intrusion

  16. Burden of Illness in Hereditary Angioedema: A Conceptual Model.

    PubMed

    Bygum, Anette; Aygören-Pürsün, Emel; Beusterien, Kathleen; Hautamaki, Emily; Sisic, Zlatko; Wait, Suzanne; Boysen, Henrik B; Caballero, Teresa

    2015-07-01

    The objective of the Hereditary Angioedema Burden of Illness Study in Europe was to assess the real-world experience of hereditary angioedema (HAE) from the patient perspective. Based on open-ended qualitative interviews with 30 patients from Spain, Germany and Denmark, 5 key themes emerged characterizing the impact of HAE on health-related quality of life (HRQoL): (i) unnecessary treatments and procedures, (ii) symptom triggers, (iii) attack impacts, (iv) caregiver impacts, and (v) long-term impacts. Patients for example experience unnecessary medical procedures due to diagnostic delays; anxiety and fear about attacks, and passing HAE to children; reduced work/school productivity; and limited career/educational achievement. Patient caregivers also experience worry and work/activity interruption during the attacks. In conclusion, a conceptual model was developed illustrating the hypothesized relationships among the wide-ranging short- and long-term HRQoL impacts of HAE. These findings can be used to highlight important issues in clinical management, raise awareness of the patients' experience among policymakers and help guide measurement of HRQoL outcomes in future studies in HAE. PMID:25394853

  17. Initial Conceptualization and Application of the Alaska Thermokarst Model

    NASA Astrophysics Data System (ADS)

    Bolton, W. R.; Lara, M. J.; Genet, H.; Romanovsky, V. E.; McGuire, A. D.

    2015-12-01

    Thermokarst topography forms whenever ice-rich permafrost thaws and the ground subsides due to the volume loss when ground ice transitions to water. The Alaska Thermokarst Model (ATM) is a large-scale, state-and-transition model designed to simulate transitions between landscape units affected by thermokarst disturbance. The ATM uses a frame-based methodology to track transitions and the proportions of cohorts within a 1-km2 grid cell. In the arctic tundra environment, the ATM tracks thermokarst-related transitions among wetland tundra, graminoid tundra, shrub tundra, and thermokarst lakes. In the boreal forest environment, the ATM tracks transitions among forested permafrost plateau, thermokarst lakes, collapse scar fens and bogs. The transition from one cohort to another due to thermokarst processes can take place if thaw reaches ice-rich ground layers, either due to a pulse disturbance (e.g. a large precipitation event or fire) or due to gradual active layer deepening that eventually results in penetration of the protective layer. The protective layer buffers the ice-rich soils from the land surface and is critical in determining how susceptible an area is to thermokarst degradation. The rate of terrain transition in our model is determined by a set of rules based upon the ice content of the soil, the drainage efficiency (or the ability of the landscape to store or transport water), the cumulative probability of thermokarst initiation, the distance from rivers, lake dynamics (increasing, decreasing, or stable), and other factors. Tundra types are allowed to transition from one type to another (for example, wetland tundra to graminoid tundra) under favorable climatic conditions. In this study, we present our conceptualization and initial simulation results from the arctic (the Barrow Peninsula) and boreal (the Tanana Flats) regions of Alaska.
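
    In a frame-based state-and-transition model, each grid cell carries fractional areas of its cohorts, and rule-triggered transitions move area between them each time step. The sketch below is a loose illustration of that bookkeeping; the cohorts, threshold and transition rates are invented and are not the ATM's actual rules.

```python
# Hedged sketch of a frame-based state-and-transition update for one 1-km2 grid
# cell: fractional cohort areas are shifted along allowed transitions when a
# rule (here, thaw penetrating an assumed protective layer) is triggered.
# Cohorts, rates and thresholds are illustrative, not the ATM's actual rules.
TRANSITIONS = {("graminoid_tundra", "wetland_tundra"): 0.02,   # fraction of source per year
               ("wetland_tundra", "thermokarst_lake"): 0.01}

def step(fractions, active_layer, protective_layer=0.4):
    """Advance cohort fractions by one year for a single grid cell."""
    new = dict(fractions)
    thaw_triggered = active_layer > protective_layer   # rule: protective layer breached
    if thaw_triggered:
        for (src, dst), rate in TRANSITIONS.items():
            moved = rate * fractions[src]
            new[src] -= moved
            new[dst] += moved
    return new

if __name__ == "__main__":
    cell = {"wetland_tundra": 0.3, "graminoid_tundra": 0.5,
            "shrub_tundra": 0.15, "thermokarst_lake": 0.05}
    for year, ald in enumerate([0.35, 0.42, 0.45], start=1):   # active layer depth [m]
        cell = step(cell, ald)
        print(f"year {year}: " + ", ".join(f"{k} {v:.3f}" for k, v in cell.items()))
    assert abs(sum(cell.values()) - 1.0) < 1e-9   # area is conserved
```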

  18. A Conceptual Model of the Formation of Filament Barbs

    NASA Astrophysics Data System (ADS)

    Martin, S. F.

    1997-05-01

    Barbs are the structures along the sides of a filament that connect its horizontal axis to the chromosphere. The barbs, previously called 'legs', can be considered as magnetic field conduits along which mass is continuously guided and transported to and from the chromosphere. In the model presented, the barbs represent a secondary stage in filament formation which follows an initial stage in which a nearly horizontal axial magnetic field is first formed along a filament channel. Barb formation is most effectively and readily illustrated where the filament channel is broad and well developed, such as among the decaying network remnants of active regions. In these circumstances, the filament channel is a region of relatively low magnetic flux density compared to adjacent areas further from the polarity inversion. H-alpha filtergrams show that the axial parts of the filament are low and nearly contiguous with the chromosphere. The low height of the axial field, and the relative absence of concentrations of network magnetic field, are favorable conditions for magnetic reconnection between the axial field of the filament and new ephemeral regions and intranetwork magnetic fields beneath the filament. These reconnections lead to the formation of the barbs, joining parts of the newly emerged fields to the axial field of the filament. Barb formation and motions seen in H-alpha filtergrams provide the evidence for this initial part of the conceptual model. The remaining part of the model is a demonstration of why only right-bearing barbs are seen on dextral filaments and left-bearing barbs on sinistral filaments; this is due to the sinistral or dextral magnetic configuration of the filament channel, which does not permit the survival of barbs of the non-observed chirality, as will be illustrated.

  19. Modelling public risk evaluation of natural hazards: a conceptual approach

    NASA Astrophysics Data System (ADS)

    Plattner, Th.

    2005-04-01

    In recent years, the handling of natural hazards in Switzerland has shifted away from a hazard-oriented towards a risk-based approach. Decreasing societal acceptance of risk, accompanied by increasing marginal costs of protective measures and decreasing financial resources, creates an optimization problem. Therefore, the new focus lies on the mitigation of the hazard's risk in accordance with economic, ecological and social considerations. This modern approach requires not only the technological, engineering or scientific aspects of defining the hazard or computing the risk to be considered, but also the public's concerns about the acceptance of these risks. These aspects of a modern risk approach enable a comprehensive assessment of the (risk) situation and, thus, sound risk management decisions. In Switzerland, however, the competent authorities suffer from a lack of decision criteria, as they do not know what risk level the public is willing to accept. Consequently, the authorities need to know what society thinks about risks. A formalized model that allows at least a crude simulation of public risk evaluation could therefore be a useful tool to support effective and efficient risk mitigation measures. This paper presents a conceptual approach to such an evaluation model using perception-affecting factors (PAF), evaluation criteria (EC) and several factors related not to the risk itself but to the evaluating person. Finally, the decision about the acceptance Acc_i of a certain risk i is made by comparing the perceived risk R_i,perc with the acceptable risk R_i,acc.
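
    The acceptance decision described in the last sentence can be written compactly as below; the combining function f of the perception-affecting factors, evaluation criteria and person-related factors is not specified in the abstract and is shown only schematically.

    ```latex
    \mathrm{Acc}_i =
    \begin{cases}
      1 & \text{if } R_{i,\mathrm{perc}} \le R_{i,\mathrm{acc}} \\
      0 & \text{otherwise}
    \end{cases}
    \qquad \text{with} \qquad
    R_{i,\mathrm{perc}} = f(\mathrm{PAF},\ \mathrm{EC},\ \text{person-related factors})
    ```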

  20. Using the conceptual site model approach to characterize groundwater quality

    SciTech Connect

    Shephard, E.; Glucksberg, N.; Walter, N.

    2007-07-01

    To understand groundwater quality, the first step is to develop a conceptual site model (CSM) that describes the site history, describes the geology and hydrogeology of the site, identifies potential release areas or sources, and evaluates the fate and transport of site-related compounds. After the physical site setting is understood and potential release areas are identified, appropriate and representative groundwater monitoring wells may be used to evaluate groundwater quality at a site and provide a network to assess impacts from potential future releases. To develop the CSM, the first step is to understand the different requirements of each of the regulatory stakeholders. Each regulatory agency may have different approaches to site characterization and closure (i.e., different groundwater and soil remediation criteria). For example, the United States Environmental Protection Agency (EPA) and state governments have published guidance documents that prescribe the required steps and information needed to develop a CSM. The Nuclear Regulatory Commission (NRC) has a prescriptive model for the Historical Site Assessment under the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM), and NUREG 1757 contains requirements for developing a conceptual site model. Federal and state agencies may also have different closure criteria for potential contaminants of concern. Understanding these differences before starting a groundwater monitoring program is important because the minimum detectable activity (MDA), lower limit of detection (LLD), and sample quantitation limit (SQL) must be low enough that data may be evaluated under each of the programs. After a Historical Site Assessment is completed, a work plan is developed and executed not only to collect physical data describing the geology and hydrogeology, but also to characterize the soil, groundwater, sediment, and surface water quality of each potentially impacted area. Although the primary

  1. Conceptual Model of Water Resources in the Kabul Basin, Afghanistan

    USGS Publications Warehouse

    Mack, Thomas J.; Akbari, M. Amin; Ashoor, M. Hanif; Chornack, Michael P.; Coplen, Tyler B.; Emerson, Douglas G.; Hubbard, Bernard E.; Litke, David W.; Michel, Robert L.; Plummer, L. Niel; Rezai, M. Taher; Senay, Gabriel B.; Verdin, James P.; Verstraeten, Ingrid M.

    2010-01-01

    The United States (U.S.) Geological Survey has been working with the Afghanistan Geological Survey and the Afghanistan Ministry of Energy and Water on water-resources investigations in the Kabul Basin under an agreement supported by the United States Agency for International Development. This collaborative investigation compiled, to the extent possible in a war-stricken country, a varied hydrogeologic data set and developed limited data-collection networks to assist with the management of water resources in the Kabul Basin. This report presents the results of a multidisciplinary water-resources assessment conducted between 2005 and 2007 to address questions of future water availability for a growing population and of the potential effects of climate change. Most hydrologic and climatic data-collection activities in Afghanistan were interrupted in the early 1980s as a consequence of war and civil strife and did not resume until 2003 or later. Because of the gap of more than 20 years in the record of hydrologic and climatic observations, this investigation has made considerable use of remotely sensed data and, where available, historical records to investigate the water resources of the Kabul Basin. Specifically, this investigation integrated recently acquired remotely sensed data and satellite imagery, including glacier and climatic data; recent climate-change analyses; recent geologic investigations; analysis of streamflow data; groundwater-level analysis; surface-water- and groundwater-quality data, including data on chemical and isotopic environmental tracers; and estimates of public-supply and agricultural water uses. The data and analyses were integrated by using a simplified groundwater-flow model to test the conceptual model of the hydrologic system and to assess current (2007) and future (2057) water availability. Recharge in the basin is spatially and temporally variable and generally occurs near streams and irrigated areas in the late winter and early

  2. Improving Conceptual Models Using AEM Data and Probability Distributions

    NASA Astrophysics Data System (ADS)

    Davis, A. C.; Munday, T. J.; Christensen, N. B.

    2012-12-01

    main diagonal is used. Complications arise when we ask more specific questions, such as "What is the probability that the resistivity of layer 2 <= x, given that layer 1 <= y?" The probability then becomes conditional, the calculation includes covariance terms, the integration is taken over many dimensions, and the cross-correlation of parameters becomes important. To illustrate, we examine the inversion results of a Tempest AEM survey over the Uley Basin aquifers in the Eyre Peninsula, South Australia. Key aquifers include the unconfined Bridgewater Formation, which overlies the Uley and Wanilla Formations containing Tertiary clays and Tertiary sandstone. These formations overlie weathered basement, which defines the lower bound of the Uley Basin aquifer systems. By correlating the conductivity of the sub-surface formation types, we pose questions such as: "What is the probability-depth of the Bridgewater Formation in the Uley South Basin?", "What is the thickness of the Uley Formation?" and "What is the most probable depth to basement?" We use these questions to generate improved conceptual hydrogeological models of the Uley Basin in order to develop better estimates of aquifer extent and the available groundwater resource.
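
    A small numerical illustration of the conditional question quoted above, assuming, purely for illustration, that the posterior over the log-resistivities of layers 1 and 2 is bivariate Gaussian with a known covariance; in the survey itself the posterior comes from the AEM inversion, and the means, covariance and thresholds below are invented.

    ```python
    import numpy as np
    from scipy import stats

    # Assumed bivariate Gaussian posterior over log10-resistivity of layers 1 and 2.
    mean = np.array([1.2, 2.0])            # [layer 1, layer 2] (illustrative values)
    cov = np.array([[0.04, 0.03],
                    [0.03, 0.09]])         # off-diagonal term: cross-correlation

    y, x = 1.3, 2.1                        # thresholds for layer 1 and layer 2

    # Joint probability P(layer1 <= y, layer2 <= x) from the multivariate CDF.
    joint = stats.multivariate_normal(mean, cov).cdf([y, x])
    # Marginal probability P(layer1 <= y) from the main-diagonal variance alone.
    marginal = stats.norm(mean[0], np.sqrt(cov[0, 0])).cdf(y)

    # Conditional probability P(layer2 <= x | layer1 <= y).
    print(joint / marginal)
    ```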

  3. A Scoping Review: Conceptualizations and Pedagogical Models of Learning in Nursing Simulation

    ERIC Educational Resources Information Center

    Poikela, Paula; Teräs, Marianne

    2015-01-01

    Simulations have been implemented globally in nursing education for years with diverse conceptual foundations. The aim of this scoping review is to examine the literature regarding the conceptualizations of learning and pedagogical models in nursing simulations. A scoping review of peer-reviewed articles published between 2000 and 2013 was…

  4. Implementing Clickers to Assist Learning in Science Lectures: The Clicker-Assisted Conceptual Change Model

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Liu, Tzu-Chien; Chu, Ching-Chi

    2011-01-01

    The purposes of this study were twofold. The first aim was to design and develop a clicker-based instructional model known as "Clicker-Assisted Conceptual Change" (CACC), based on the cognitive conflict approach for conceptual change, to help students to learn scientific concepts. The second aim was to determine the beneficial effects of CACC on…

  5. Navigating Tensions between Conceptual and Metaconceptual Goals in the Use of Models

    ERIC Educational Resources Information Center

    Delgado, Cesar

    2015-01-01

    Science education involves learning about phenomena at three levels: concrete (facts and generalizations), conceptual (concepts and theories), and metaconceptual (epistemology) (Snir et al. in "J Sci Educ Technol" 2(2):373-388, 1993). Models are key components in science, can help build conceptual understanding, and may also build…

  6. River Basin Standards Interoperability Pilot

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

    There are many water-related datasets and tools in Europe that can be applied to river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them are using the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps: - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice). - Flood modelling using a WPS 2.0, with WaterML 2.0 data and weather forecast models as input. - Evaluation of the applicability of Sensor Notification Services in water emergencies. - Open distribution of the input and output data as OGC web services (WaterML / WCS / WFS) and with visualization utilities (WMS). The architecture
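
    A minimal sketch of the first step in the list above: retrieving river-gauge observations as OGC WaterML 2.0 through an SOS 2.0 KVP GetObservation request. The endpoint URL, offering and observed property are placeholders, not identifiers of the actual services used in the Scheldt experiment.

    ```python
    import requests

    SOS_ENDPOINT = "https://example.org/sos/service"   # placeholder SOS 2.0 endpoint

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "river_gauge_offering",            # placeholder identifiers
        "observedProperty": "water_level",
        "temporalFilter": "om:phenomenonTime,2015-01-01T00:00:00Z/2015-01-07T00:00:00Z",
        "responseFormat": "http://www.opengis.net/waterml/2.0",
    }

    response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
    response.raise_for_status()
    waterml_document = response.text   # WaterML 2.0 observation collection (XML)
    print(waterml_document[:200])
    ```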

  7. Distributed hydrological models: comparison between TOPKAPI, a physically based model and TETIS, a conceptually based model

    NASA Astrophysics Data System (ADS)

    Ortiz, E.; Guna, V.

    2009-04-01

    The present work aims to carry out a comparison between two distributed hydrological models, the TOPKAPI (Ciarapica and Todini, 1998; Todini and Ciarapica, 2001) and TETIS (Vélez, J. J.; Vélez J. I. and Francés, F, 2002) models, obtaining the hydrological solutions computed for the same storm events. The first model is physically based and the second one is conceptually based. The analysis was performed on the 21.4 km2 Goodwin Creek watershed, located in Panola County, Mississippi. This watershed, extensively monitored by the Agricultural Research Service (ARS) National Sediment Laboratory (NSL), was chosen because it offers a complete database compiling precipitation (16 rain gauges), runoff (6 discharge stations) and GIS data. Three storm events were chosen to evaluate the performance of the two models: the first was used to calibrate the models, and the other two to validate them. Both models produced a satisfactory hydrological response in both calibration and validation events. While the TOPKAPI model required no real calibration, owing to its good performance with modal parameter values derived from watershed characteristics, the TETIS model required a prior automatic calibration. This calibration was carried out using the data provided by the observed hydrograph, in order to adjust the model's 9 correction factors. Keywords: TETIS, TOPKAPI, distributed models, hydrological response, ungauged basins.
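
    The abstract does not state which measure was used to judge the "satisfactory hydrological response"; a common choice when comparing simulated and observed event hydrographs is the Nash-Sutcliffe efficiency, sketched below with invented hydrograph ordinates rather than Goodwin Creek data.

    ```python
    import numpy as np

    def nash_sutcliffe(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 is no better than the observed mean."""
        observed = np.asarray(observed, dtype=float)
        simulated = np.asarray(simulated, dtype=float)
        return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

    # Illustrative hydrograph ordinates (m3/s), not Goodwin Creek measurements.
    obs = [0.2, 1.5, 4.8, 3.1, 1.2, 0.6]
    sim = [0.3, 1.2, 4.5, 3.4, 1.0, 0.5]
    print(nash_sutcliffe(obs, sim))
    ```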

  8. Design of an UML conceptual model and implementation of a GIS with metadata information for a seismic hazard assessment cooperative project.

    NASA Astrophysics Data System (ADS)

    Torres, Y.; Escalante, M. P.

    2009-04-01

    This work illustrates the advantages of using a Geographic Information System in a cooperative project with researchers from different countries, such as the RESIS II project (financed by the Norwegian Government and managed by CEPREDENAC) for seismic hazard assessment of Central America. As the input data come in different formats, cover distinct geographical areas and are subject to different interpretations, data inconsistencies may appear and their management becomes complicated. To homogenize the data and integrate them in a GIS, a conceptual model must first be developed. This is accomplished in two phases: requirements analysis and conceptualization. The Unified Modeling Language (UML) is used to compose the conceptual model of the GIS. UML complies with ISO 19100 norms and allows the designer to define the model architecture and interoperability. The GIS provides a framework for combining large volumes of geographic data with a uniform geographic reference and without duplications. All this information carries its own metadata following the ISO 19115 standard. In this work, the integration in the same environment of active fault and subduction slab geometries, combined with epicentre locations, has facilitated the definition of seismogenetic regions. This greatly eases teamwork among national specialists from different countries. The GIS capacity for making queries (by location and by attributes) and geostatistical analyses is used to interpolate discrete data resulting from seismic hazard calculations and to create continuous maps, as well as to check and validate partial results of the study. GIS-based products, such as complete, homogenised databases and thematic cartography of the region, are distributed to all researchers, facilitating cross-national communication, project execution and results dissemination.

  9. Identifying Students' Mental Models of Sound Propagation: The Role of Conceptual Blending in Understanding Conceptual Change

    ERIC Educational Resources Information Center

    Hrepic, Zdeslav; Zollman, Dean A.; Rebello, N. Sanjay

    2010-01-01

    We investigated introductory physics students' mental models of sound propagation. We used a phenomenographic method to analyze the data in the study. In addition to the scientifically accepted Wave model, students used the "Entity" model to describe the propagation of sound. In this latter model sound is a self-standing entity, different from the…

  10. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines, and to their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940
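
    A toy sketch of the general idea that system behavior sits in an explicit control unit rather than being scattered across model components: a control object receives every event and applies a dispatching policy. The single-server queue, event types and fixed service time are invented for illustration and are not part of the published framework.

    ```python
    import heapq

    class ControlUnit:
        """Encapsulates system behavior: a first-come-first-served dispatching policy."""
        def __init__(self):
            self.waiting = []          # jobs waiting for the single server
            self.server_busy = False

        def dispatch(self, sim, time, event, job):
            if event == "arrival":
                self.waiting.append(job)
            elif event == "departure":
                self.server_busy = False
            if self.waiting and not self.server_busy:
                nxt = self.waiting.pop(0)
                self.server_busy = True
                sim.schedule(time + 2.0, "departure", nxt)   # assumed fixed service time

    class Simulation:
        """Minimal event-queue engine; behavioral decisions are delegated to the control unit."""
        def __init__(self, control):
            self.events, self.control = [], control
        def schedule(self, time, event, job):
            heapq.heappush(self.events, (time, event, job))
        def run(self):
            while self.events:
                time, event, job = heapq.heappop(self.events)
                print(f"t={time:4.1f}  {event:9s}  {job}")
                self.control.dispatch(self, time, event, job)

    sim = Simulation(ControlUnit())
    for t, j in [(0.0, "job-1"), (1.0, "job-2"), (1.5, "job-3")]:
        sim.schedule(t, "arrival", j)
    sim.run()
    ```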

  11. Mapping the Territory: A Conceptual Model of Scholastic Journalism.

    ERIC Educational Resources Information Center

    Arnold, Mary

    Intended to provide a comprehensive conceptual framework to serve as a scaffold for past, present, and future research on "scholastic journalism" (journalism in the secondary school), a topical content analysis of the Association for Education in Journalism and Mass Communication (AEJMC) Secondary Education Division research, teaching, and issues…

  12. Adapting Conceptual Models for Cross-Cultural Applications

    ERIC Educational Resources Information Center

    Perleth, Christoph; Heller, Kurt A.

    2007-01-01

    It is of major importance to use psychological tests and questionnaires that are carefully constructed so that their reliability and validity can be determined in different (sub)cultures (Campbell & Tirri, 2004). However, a necessary prerequisite for this is the development of solid conceptual constructs. Otherwise, the researcher runs into the…

  13. Using Multilevel Modeling in Language Assessment Research: A Conceptual Introduction

    ERIC Educational Resources Information Center

    Barkaoui, Khaled

    2013-01-01

    This article critiques traditional single-level statistical approaches (e.g., multiple regression analysis) to examining relationships between language test scores and variables in the assessment setting. It highlights the conceptual, methodological, and statistical problems associated with these techniques in dealing with multilevel or nested…

  14. Assessment of Model Parameters Interdependency of a Conceptual Rainfall-Runoff Model

    NASA Astrophysics Data System (ADS)

    Das, T.; Bárdossy, A.; Zehe, E.

    2006-12-01

    Conceptual rainfall-runoff models are widely used tools in hydrology. In contrast to more complex physically based distributed models, the required input data are readily available for most applications worldwide. In addition to their modest data requirements, conceptual models are usually simple and relatively easy to apply. However, for partly or fully conceptual models, some parameters cannot be considered physically measured quantities and thus have to be estimated from the available data and information. Within the range of input data, it is often not possible to find one unique parameter set, i.e. a number of parameter sets can lead to similar model results (known as equifinality). Nevertheless, the parameter sets that lead to equally good model performance may have interesting internal structures. The issue of equifinality and the internal structure of such parameter sets were investigated in this paper using two examples. The first example uses a simple two-parameter sediment transport model for a river. A large number of parameter pairs was generated randomly. The results indicated that both parameters can be taken from a wide interval of possible values and still yield satisfactory model performance. However, a well-structured set is obtained if one investigates the parameters as pairs. The second example uses the parameters of the modified HBV conceptual rainfall-runoff model. One hundred independent calibration runs for the HBV model were carried out using an automatic calibration procedure based on a simulated optimization algorithm, each run using a different, randomly selected initial seed value for the calibration algorithm. No explicit dependence between the parameters was assumed. The results demonstrated that the parameters of rainfall-runoff models often cannot be identified as individual values. A large set of possible parameters can lead to a similar model
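
    A toy illustration of the sampling idea described above: draw many random parameter pairs for a simple two-parameter model, keep the "behavioural" pairs whose performance exceeds a threshold, and inspect their joint structure. The toy exponential model, synthetic data and threshold are assumptions for illustration, not the sediment-transport or HBV setups of the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic observations from a toy two-parameter model y = a * exp(-b * t) plus noise.
    t = np.linspace(0.0, 5.0, 50)
    y_obs = 2.0 * np.exp(-0.8 * t) + rng.normal(0.0, 0.05, t.size)

    def efficiency(a, b):
        """Nash-Sutcliffe-type efficiency of the toy model against the synthetic data."""
        y_sim = a * np.exp(-b * t)
        return 1.0 - np.sum((y_obs - y_sim) ** 2) / np.sum((y_obs - y_obs.mean()) ** 2)

    # Monte Carlo sampling of parameter pairs over wide, independent ranges.
    a_samples = rng.uniform(0.5, 4.0, 5000)
    b_samples = rng.uniform(0.1, 2.0, 5000)
    scores = np.array([efficiency(a, b) for a, b in zip(a_samples, b_samples)])

    # "Behavioural" pairs: each parameter spans a wide interval, yet the pairs are structured.
    good = scores > 0.9
    print("behavioural pairs:", int(good.sum()))
    print("a range:", a_samples[good].min(), a_samples[good].max())
    print("b range:", b_samples[good].min(), b_samples[good].max())
    print("corr(a, b):", np.corrcoef(a_samples[good], b_samples[good])[0, 1])
    ```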

  15. Experimental Evidence of the Superiority of the Prevalence Model of Conceptual Change over the Classical Models and Repetition

    ERIC Educational Resources Information Center

    Potvin, Patrice; Sauriol, Érik; Riopel, Martin

    2015-01-01

    This quasi-experimental study investigated the effects on 558 grades five and six students of three different teaching conditions: the "classical" model of conceptual change (for which cognitive conflict is considered as a precondition to the transformation of knowledge), the "prevalence" model of conceptual change (in which…

  16. Conceptual geologic model and native state model of the Roosevelt Hot Springs hydrothermal system

    SciTech Connect

    Faulder, D.D.

    1991-01-01

    A conceptual geologic model of the Roosevelt Hot Springs hydrothermal system was developed by a review of the available literature. The hydrothermal system consists of a meteoric recharge area in the Mineral Mountains, fluid circulation paths to depth, a heat source, and an outflow plume. A conceptual model based on the available data can be simulated in the native state using parameters that fall within observed ranges. The model temperatures, recharge rates, and fluid travel times are sensitive to the permeability in the Mineral Mountains. The simulation results suggest the presence of a magma chamber at depth as the likely heat source. A two-dimensional study of the hydrothermal system can be used to establish boundary conditions for further study of the geothermal reservoir.

  17. Internet-Based Solutions for Manufacturing Enterprise Systems Interoperability - A Standards Perspective

    SciTech Connect

    Ivezic, Nenad; Kulvatunyou, Boonserm; Jones, Albert

    2004-10-01

    This chapter reviews efforts of selected standards consortia to develop Internet-based approaches for interoperable manufacturing enterprise information systems. The focus of the chapter is on the efforts to capture common meaning of data exchanged among interoperable information systems inside and outside a manufacturing enterprise. We start this chapter by giving a general overview of the key concepts in standards approaches to enable interoperable manufacturing enterprise systems. These approaches are compared on the basis of several characteristics found in standards frameworks such as horizontal or vertical focus of the standard, the standard message content definitions, the standard process definitions, and dependence on specific standard messaging solutions. After this initial overview, we establish one basis for reasoning about interoperable information systems by recognizing key manufacturing enterprise objects managed and exchanged both inside and outside the enterprise. Such conceptual objects are coarse in granularity and are meant to drive semantic definitions of data interchanges by providing a shared context for data dictionaries detailing the semantics of these objects and interactions or processes involved in data exchange. In the case of intra-enterprise interoperability, we recognize enterprise information processing activities, responsibilities, and those high-level conceptual objects exchanged in interactions among systems to fulfill the assigned responsibilities. Here, we show a mapping of one content standard onto the identified conceptual objects. In the case of inter-enterprise interoperability, we recognize key business processes areas and enumerate high-level conceptual objects that need to be exchanged among supply chain or trading partners. Here, we also show example mappings of representative content standards onto the identified conceptual objects. We complete this chapter by providing an account of some advanced work to enhance

  18. Best Practices for Preparing Interoperable Geospatial Data

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.

    2010-12-01

    Geospatial data is critically important for a wide range of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, etc. Geospatial data is created by different organizations using different methods, from remote sensing observations, field surveys, model simulations, etc., and is stored in various formats. Geospatial data is therefore diverse and heterogeneous, which creates a major barrier to sharing and using it, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the Open Geospatial Consortium (OGC) catalog services specification defines a standard way to discover geospatial information, while OGC Web Coverage Services (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But having standard mechanisms for data discovery and access alone is not enough: the geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address this geospatial data interoperability issue by standardizing the data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) which provides interoperable
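
    One way the interoperable access mentioned above (OPeNDAP) is typically exercised is to open a remote dataset lazily and pull only a spatial and temporal subset. The URL, variable name and coordinate ranges below are placeholders, not an actual ORNL DAAC or MAST-DC endpoint.

    ```python
    import xarray as xr

    # Placeholder OPeNDAP URL (substitute a real endpoint; requires the netCDF4 library).
    OPENDAP_URL = "https://example.org/opendap/carbon_flux.nc"

    ds = xr.open_dataset(OPENDAP_URL)   # reads metadata only; data are streamed on demand

    # Subset by time and bounding box; variable and coordinate names are illustrative.
    subset = ds["gpp"].sel(time="2005-07", lat=slice(30, 50), lon=slice(-100, -80))
    print(subset)
    ```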

  19. How to conceptualize catalytic cycles? The energetic span model.

    PubMed

    Kozuch, Sebastian; Shaik, Sason

    2011-02-15

    efficiency of a catalyst. Additionally, the TDI and TDTS are not necessarily the highest and lowest states, nor do they have to be adjoined as a single step. As such, we can conclude that a change in the conceptualization of catalytic cycles is in order: in catalysis, there are no rate-determining steps, but rather rate-determining states. We also include a study on the effect of reactant and product concentrations. In the energetic span approximation, only the reactants or products that are located between the TDI and TDTS accelerate or inhibit the reaction. In this manner, the energetic span model creates a direct link between experimental quantities and theoretical results. The versatility of the energetic span model is demonstrated with several catalytic cycles of organometallic reactions. PMID:21067215
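
    For reference, the standard energetic span expressions underlying the discussion above (the usual form of the Kozuch-Shaik model, not quoted verbatim in the abstract): the turnover frequency (TOF) is set by the energy difference between the TOF-determining transition state (TDTS) and the TOF-determining intermediate (TDI), corrected by the reaction driving force when the TDTS precedes the TDI in the cycle.

    ```latex
    \mathrm{TOF} \approx \frac{k_{B}T}{h}\, e^{-\delta E / RT},
    \qquad
    \delta E =
    \begin{cases}
      T_{\mathrm{TDTS}} - I_{\mathrm{TDI}} & \text{if the TDTS appears after the TDI,} \\
      T_{\mathrm{TDTS}} - I_{\mathrm{TDI}} + \Delta G_{r} & \text{if the TDTS appears before the TDI.}
    \end{cases}
    ```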

  20. Validation of the Continuum of Care Conceptual Model for Athletic Therapy

    PubMed Central

    Lafave, Mark R.; Butterwick, Dale; Eubank, Breda

    2015-01-01

    Utilization of conceptual models in field-based emergency care currently borrows from existing standards of the medical and paramedical professions. The purpose of this study was to develop and validate a comprehensive conceptual model that could account for injuries ranging from nonurgent to catastrophic events, including events that do not follow traditional medical or prehospital care protocols. The conceptual model should represent the continuum of care from the time of initial injury to an athlete's return to participation in their sport. Finally, the conceptual model should accommodate both novices and experts in the athletic therapy (AT) profession. This paper chronicles the content validation steps of the Continuum of Care Conceptual Model for Athletic Therapy (CCCM-AT). The stages of model development were domain and item generation, content expert validation using a three-stage modified Ebel procedure, and pilot testing. Only the final stage of the modified Ebel procedure reached the a priori 80% consensus threshold on three domains of interest: (1) heading descriptors; (2) the order of the model; (3) the conceptual model as a whole. Future research is required to test the use of the CCCM-AT in order to understand its efficacy in teaching and practice within the AT discipline. PMID:26464897

  1. On the effect of scaling conceptual model complexity on stochastic response for water quality modeling.

    PubMed

    Parker, G T

    2011-01-01

    This paper extends previous work comparing the response of water quality models under uncertainty. A new model, the River Water Quality Model no. 1 (RWQM1), is compared with two commonly used water quality models from the previous work. Additionally, the effect of conceptual model scaling within a single modelling framework, as allowed by RWQM1, is explored under uncertainty. Model predictions are examined against real-world data for the Potomac River, with Generalized Likelihood Uncertainty Estimation used to assess model response surfaces under uncertainty. Generally, it was found that there are tangible model characteristics closely tied to model complexity, and thresholds for these characteristics were discussed. The work has yielded not only an illustrative example but also a conceptually scalable water quality modelling tool, alongside defined metrics to assess when scaling is required under uncertainty. The resulting framework holds substantial and unique promise for a new generation of modelling tools capable of addressing classically intractable problems. PMID:21252443

  2. The influence of conceptual model structure on model performance: a comparative study for 237 French catchments

    NASA Astrophysics Data System (ADS)

    van Esse, W. R.; Perrin, C.; Booij, M. J.; Augustijn, D. C. M.; Fenicia, F.; Lobligeois, F.

    2013-04-01

    In hydrological studies models with a fixed structure are commonly used. For various reasons, these models do not always perform well. As an alternative, a flexible modelling approach could be followed, where the identification of the model structure is part of the model set-up procedure. In this study, the performance of twelve different conceptual model structures from the SUPERFLEX framework with varying complexity and the fixed model structure of GR4H were compared on a large set of 237 French catchments. The results showed that in general the flexible approach performs better than the fixed approach. However, the flexible approach has a higher chance of inconsistent results when implemented on two different periods. The same holds for more complex model structures. When for practical reasons a fixed model structure is preferred, this study shows that models with parallel reservoirs and a power function to describe the reservoir outflow perform best. In general, conceptual hydrological models perform better on large or wet catchments than on small or dry catchments. The model structures performed poorly when there was a climatic difference between the calibration and validation period, for catchments with flashy flows or disturbances in low flow measurements.

  3. Implications of Simulation Conceptual Model Development for Simulation Management and Uncertainty Assessment

    NASA Technical Reports Server (NTRS)

    Pace, Dale K.

    2000-01-01

    A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.

  4. A Study of Child Variance, Volume 1: Conceptual Models; Conceptual Project in Emotional Disturbance.

    ERIC Educational Resources Information Center

    Rhodes, William C.; Tracy, Michael L.

    Presented are 11 papers discussing the following six models of emotional disturbance in children: biophysical, behavioral, psychodynamic, sociological, and ecological models, and counter theory. Emotional disturbance is defined as a distinctive human state having multiple manifestations and involving disability, deviance, and alienation. All the…

  5. A Conceptual Model of Relationships among Constructivist Learning Environment Perceptions, Epistemological Beliefs, and Learning Approaches

    ERIC Educational Resources Information Center

    Ozkal, Kudret; Tekkaya, Ceren; Cakiroglu, Jale; Sungur, Semra

    2009-01-01

    This study proposed a conceptual model of relationships among constructivist learning environment perception variables (Personal Relevance, Uncertainty, Critical Voice, Shared Control, and Student Negotiation), scientific epistemological belief variables (fixed and tentative), and learning approach. It was proposed that learning environment…

  6. A CONCEPTUAL MODEL FOR MULTI-SCALAR ASSESSMENTS OF ESTUARINE ECOLOGICAL INTEGRITY

    EPA Science Inventory

    A conceptual model was developed that relates an estuarine system's anthropogenic inputs to its ecological integrity. Ecological integrity is operationally defined as an emergent property of an ecosystem that exists when the structural components are complete and the functional ...

  7. The influence of conceptual model structure on model performance: a comparative study for 237 French catchments

    NASA Astrophysics Data System (ADS)

    van Esse, W. R.; Perrin, C.; Booij, M. J.; Augustijn, D. C. M.; Fenicia, F.; Kavetski, D.; Lobligeois, F.

    2013-10-01

    Models with a fixed structure are widely used in hydrological studies and operational applications. For various reasons, these models do not always perform well. As an alternative, flexible modelling approaches allow the identification and refinement of the model structure as part of the modelling process. In this study, twelve different conceptual model structures from the SUPERFLEX framework are compared with the fixed model structure GR4H, using a large set of 237 French catchments and discharge-based performance metrics. The results show that, in general, the flexible approach performs better than the fixed approach. However, the flexible approach has a higher chance of inconsistent results when calibrated on two different periods. When analysing the subset of 116 catchments where the two approaches produce consistent performance over multiple time periods, their average performance relative to each other is almost equivalent. From the point of view of developing a well-performing fixed model structure, the findings favour models with parallel reservoirs and a power function to describe the reservoir outflow. In general, conceptual hydrological models perform better on larger and/or wetter catchments than on smaller and/or drier catchments. The model structures performed poorly when there were large climatic differences between the calibration and validation periods, in catchments with flashy flows, and in catchments with unexplained variations in low flow measurements.

  8. A Conceptual Hydrogeologic Model of the Vicinity of DUSEL Homestake

    NASA Astrophysics Data System (ADS)

    Murdoch, L. C.; Germanovich, L. N.; Boutt, D. F.; Kieft, T. L.; Wang, H. F.; Onstott, T. C.

    2009-12-01

    The Deep Underground Science and Engineering Laboratory (DUSEL) is a research facility planned to occupy the workings of the former Homestake gold mine in the northern Black Hills, South Dakota. The hydrogeology was of minor importance to locating and recovering gold ore, so it was overlooked during mining and is relatively unknown. This knowledge gap hinders planning of the Deep EcoHydrology Experiment at DUSEL and motivated the work described here. The conceptual hydrogeologic model is characterized by permeability that is assumed to be anisotropic and controlled by regional foliation, which strikes approximately N20W and dips steeply to the NE. Permeability is on the order of 0.1 mD in fresh rock, but increases to roughly 100 mD at shallow depths. The permeability distribution is assumed to result from unloading of the foliated rock, and a simple model of stress-dependence explains the permeability distribution and suggests that the more permeable zone is on the order of ~100 m thick. A stream hydrograph from Whitetail Creek (station 06436156) was analyzed to estimate recharge flux and the result indicates an average value of approximately 5 x 10^-9 m/s. A numerical model of the vicinity of the mine was developed by representing the mine workings as a dual-porosity inclusion embedded in a single-porosity, anisotropic material. The extent of the dual-porosity medium was advanced downward based on the mining records and the hydraulic head within the material representing the mine workings was adjusted to represent filling and draining of the workings. The results suggest that the groundwater is characterized by a shallow flow system of distributed recharge that mostly discharges to nearby streams. The mine itself acts like a large sink that moves downward and to the southeast during mining, and then is controlled by variations in pumping rate once the mine reaches its greatest depth. The deep flow system consists of (i) a zone of relatively rapid flow from the

  9. Understanding Co-development of Conceptual and Epistemic Understanding through Modeling Practices with Mobile Internet

    NASA Astrophysics Data System (ADS)

    Ryu, Suna; Han, Yuhwha; Paik, Seoung-Hey

    2015-04-01

    The present study explores how engaging in modeling practice, along with argumentation, leverages students' epistemic and conceptual understanding in an afterschool science/math class of 16 tenth graders. The study also explores how students used mobile Internet phones (smart phones) productively to support modeling practices. As the modeling practices became more challenging, student discussion occurred more often, from what to model to providing explanations for the phenomenon. Students came to argue about evidence that supported their model and how the model could explain target and related phenomena. This finding adds to the literature that modeling practice can help students improve conceptual understanding of subject knowledge as well as epistemic understanding.

  10. From the Conceptual Change Model to the Productive Ecological Koinos Model: Learning that transcends

    NASA Astrophysics Data System (ADS)

    Gelpi-Rodriguez, Phaedra

    This investigation presents the analysis of a model of teaching science called the Conceptual Change Model. This model stimulates students to identify their own and alternate science concepts, and to confront these concepts with dynamic situations that will incite a conceptual change and promote their ability to master and understand the conceptual systems that serve as foundations for scientific knowledge. During previous research by this investigator on the Conceptual Change Model, a proposal for a new teaching model emerged, which she called the Productive Ecological Koinos Model. This model incorporates, among other things, the teacher's reflection and inner thoughts about the concepts taught and the learning experiences achieved in concurrence with students. Using action research, an exploration and analysis was done that focused upon how students and teachers modified their perspective of science while testing the Productive Ecological Koinos Model during the teaching-learning processes that took place in a microbiology course. The action research design allows the researcher to analyze these points from the experiential perspective, while also allowing the researcher to participate in the study. The study employed qualitative research techniques such as reflective diaries, personal profiles of participants, document analysis, audio tape recordings and transcriptions. All of these techniques are accepted within action research (Elliot, 1991). The Wolcott Model was the data analysis method used in the research. The description, analysis and interpretation carried out allowed for the examination of the various components of the Productive Ecological Koinos Model with students and teachers as to the scientific terms virus and contagion, and their experiences during the learning process within and outside the classroom. From the analysis of the Model, a modification emerged that places emphasis on conscious introspection about the learning process. This new