Sample records for model documentation

  1. Model documentation report: Transportation sector model of the National Energy Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-03-01

    This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. This document serves three purposes. First, it is a reference document providing a detailed description of TRAN for model analysts, users, and the public. Second, this report meets the legal requirements of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, 57(b)(1)). Third, it permits continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements.

  2. Let Documents Talk to Each Other: A Computer Model for Connection of Short Documents.

    ERIC Educational Resources Information Center

    Chen, Z.

    1993-01-01

    Discusses the integration of scientific texts through the connection of documents and describes a computer model that can connect short documents. Information retrieval and artificial intelligence are discussed; a prototype system of the model is explained; and the model is compared to other computer models. (17 references) (LRW)

  3. 40 CFR 52.1490 - Original identification of plan.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... measures. (ii) A modeling analysis indicating 1982 attainment. (iii) Documentation of the modeling analysis... agencies, (ii) Additional supporting documentation for the 1982 attainment modeling analysis which included... factors for the model. (iii) A revised 1982 attainment modeling analysis and supporting documentation...

  4. 40 CFR 52.1490 - Original identification of plan.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... measures. (ii) A modeling analysis indicating 1982 attainment. (iii) Documentation of the modeling analysis... agencies, (ii) Additional supporting documentation for the 1982 attainment modeling analysis which included... factors for the model. (iii) A revised 1982 attainment modeling analysis and supporting documentation...

  5. 40 CFR 52.1490 - Original identification of plan.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... measures. (ii) A modeling analysis indicating 1982 attainment. (iii) Documentation of the modeling analysis... agencies, (ii) Additional supporting documentation for the 1982 attainment modeling analysis which included... factors for the model. (iii) A revised 1982 attainment modeling analysis and supporting documentation...

  6. Information Retrieval: A Sequential Learning Process.

    ERIC Educational Resources Information Center

    Bookstein, Abraham

    1983-01-01

    Presents decision-theoretic models which intrinsically include retrieval of multiple documents whereby system responds to request by presenting documents to patron in sequence, gathering feedback, and using information to modify future retrievals. Document independence model, set retrieval model, sequential retrieval model, learning model,…
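
    The decision-theoretic sequential idea can be illustrated with a toy loop: present the highest-scoring unseen document, collect a relevance judgment, and update term weights before the next presentation. The sketch below is a minimal illustration under an assumed Beta-Bernoulli term-weighting scheme, not Bookstein's models; all document and term names are invented.

    ```python
    # Illustrative sketch: sequential retrieval with relevance feedback using an
    # assumed Beta-Bernoulli weighting of query terms (not Bookstein's models).
    docs = {
        "d1": {"retrieval", "feedback", "model"},
        "d2": {"indexing", "storage"},
        "d3": {"retrieval", "learning", "model"},
        "d4": {"statistics", "learning"},
    }
    query = {"retrieval", "model", "learning"}
    simulated_relevant = {"d1", "d3"}            # stands in for real patron feedback

    alpha = {t: 1.0 for t in query}              # Beta(1, 1) prior on P(term | relevant)
    beta = {t: 1.0 for t in query}

    def score(terms):
        # Sum of estimated P(term | relevant) over query terms present in the document.
        return sum(alpha[t] / (alpha[t] + beta[t]) for t in query if t in terms)

    presented = []
    for _ in range(len(docs)):
        best = max((d for d in docs if d not in presented), key=lambda d: score(docs[d]))
        presented.append(best)
        relevant = best in simulated_relevant    # the patron's judgment
        for t in query & docs[best]:             # update term counts from the judgment
            alpha[t] += 1.0 if relevant else 0.0
            beta[t] += 0.0 if relevant else 1.0

    print("presentation order:", presented)
    ```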

  7. Managing the life cycle of electronic clinical documents.

    PubMed

    Payne, Thomas H; Graham, Gail

    2006-01-01

    To develop a model of the life cycle of clinical documents from inception to use in a person's medical record, including workflow requirements from clinical practice, local policy, and regulation. We propose a model for the life cycle of clinical documents as a framework for research on documentation within electronic medical record (EMR) systems. Our proposed model includes three axes: the stages of the document, the roles of those involved with the document, and the actions those involved may take on the document at each stage. The model includes the rules to describe who (in what role) can perform what actions on the document, and at what stages they can perform them. Rules are derived from needs of clinicians, and requirements of hospital bylaws and regulators. Our model encompasses current practices for paper medical records and workflow in some EMR systems. Commercial EMR systems include methods for implementing document workflow rules. Workflow rules that are part of this model mirror functionality in the Department of Veterans Affairs (VA) EMR system where the Authorization/ Subscription Utility permits document life cycle rules to be written in English-like fashion. Creating a model of the life cycle of clinical documents serves as a framework for discussion of document workflow, how rules governing workflow can be implemented in EMR systems, and future research of electronic documentation.

  8. Creating, documenting and sharing network models.

    PubMed

    Crook, Sharon M; Bednar, James A; Berger, Sandra; Cannon, Robert; Davison, Andrew P; Djurfeldt, Mikael; Eppler, Jochen; Kriener, Birgit; Furber, Steve; Graham, Bruce; Plesser, Hans E; Schwabe, Lars; Smith, Leslie; Steuber, Volker; van Albada, Sacha

    2012-01-01

    As computational neuroscience matures, many simulation environments are available that are useful for neuronal network modeling. However, methods for successfully documenting models for publication and for exchanging models and model components among these projects are still under development. Here we briefly review existing software and applications for network model creation, documentation and exchange. Then we discuss a few of the larger issues facing the field of computational neuroscience regarding network modeling and suggest solutions to some of these problems, concentrating in particular on standardized network model terminology, notation, and descriptions and explicit documentation of model scaling. We hope this will enable and encourage computational neuroscientists to share their models more systematically in the future.

  9. A nursing-specific model of EPR documentation: organizational and professional requirements.

    PubMed

    von Krogh, Gunn; Nåden, Dagfinn

    2008-01-01

    To present the Norwegian documentation KPO model (quality assurance, problem solving, and caring). To present the requirements and multiple electronic patient record (EPR) functions the model is designed to address. The model's professional substance, a conceptual framework for nursing practice, is developed by examining, reorganizing, and completing existing frameworks. The model's methodology, an information management system, is developed using an expert group. Both model elements were clinically tested over a period of 1 year. The model is designed for nursing documentation in step with statutory, organizational, and professional requirements. Complete documentation is arranged for by incorporating the Nursing Minimum Data Set. A systematic and comprehensive documentation is arranged for by establishing categories as provided in the model's framework domains. Consistent documentation is arranged for by incorporating NANDA-I Nursing Diagnoses, Nursing Intervention Classification, and Nursing Outcome Classification. The model can be used as a tool in cooperation with vendors to ensure the interests of the nursing profession are met when developing EPR solutions in healthcare. The model can provide clinicians with a framework for documentation in step with legal and organizational requirements and at the same time retain the ability to record all aspects of clinical nursing.

  10. Dynamic reduction of dimensions of a document vector in a document search and retrieval system

    DOEpatents

    Jiao, Yu; Potok, Thomas E.

    2011-05-03

    The method and system of the invention involve processing each new document (20) coming into the system into a document vector (16), and creating a document vector with reduced dimensionality (17) for comparison with the data model (15) without recomputing the data model (15). These operations are carried out by a first computer (11) while a second computer (12) updates the data model (18), which can be comprised of an initial large group of documents (19) and is premised on computing an initial data model (13, 14, 15) to provide a reference point for determining document vectors from documents processed from the data stream (20).
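
    A rough sketch of the general idea, assuming the data model is a truncated-SVD basis computed once over the initial document group: a new document vector is reduced by projection onto that fixed basis, so the model itself is never recomputed on the first computer. The matrices, dimensions, and function names below are illustrative, not taken from the patent.

    ```python
    # Sketch: reduce a new document vector by projecting onto a fixed truncated-SVD
    # basis, so the data model is not recomputed for each incoming document.
    import numpy as np

    rng = np.random.default_rng(0)
    term_doc = rng.poisson(0.3, size=(5000, 200)).astype(float)   # initial document group

    # "Second computer": compute the data model once (rank-k basis of the term space).
    k = 50
    U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
    basis = U[:, :k]                                              # 5000 x k, held fixed

    # "First computer": reduce each incoming document against the stored basis.
    def reduce_document(doc_vector):
        """Project a full-length term vector onto the k-dimensional basis."""
        return basis.T @ doc_vector                               # shape (k,)

    new_doc = rng.poisson(0.3, size=5000).astype(float)
    print(reduce_document(new_doc).shape)                         # (50,)
    ```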

  11. Atmospheric Dispersal and Deposition of Tephra From a Potential Volcanic Eruption at Yucca Mountain, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G. Keating; W. Statham

    2004-02-12

    The purpose of this model report is to provide documentation of the conceptual and mathematical model (ASHPLUME) for atmospheric dispersal and subsequent deposition of ash on the land surface from a potential volcanic eruption at Yucca Mountain, Nevada. This report also documents the ash (tephra) redistribution conceptual model. The ASHPLUME conceptual model accounts for incorporation and entrainment of waste fuel particles associated with a hypothetical volcanic eruption through the Yucca Mountain repository and downwind transport of contaminated tephra. The ASHPLUME mathematical model describes the conceptual model in mathematical terms to allow for prediction of radioactive waste/ash deposition on the ground surface given that the hypothetical eruptive event occurs. This model report also describes the conceptual model for tephra redistribution from a basaltic cinder cone. Sensitivity analyses and model validation activities for the ash dispersal and redistribution models are also presented. Analyses documented in this model report will improve and clarify the previous documentation of the ASHPLUME mathematical model and its application to the Total System Performance Assessment (TSPA) for the License Application (TSPA-LA) igneous scenarios. This model report also documents the redistribution model product outputs based on analyses to support the conceptual model.

  12. TRAC Searchable Research Library

    DTIC Science & Technology

    2016-05-01

    A network-accessible document repository for technical documents and similar document artifacts. We used a model-based approach using the Vector... demonstration and model refinement. Subject terms: Knowledge Management, Document Repository, Digital Library, Vector Directional Data Model...

  13. World energy projection system: Model documentation

    NASA Astrophysics Data System (ADS)

    1992-06-01

    The World Energy Projection System (WEPS) is an accounting framework that incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption divided by gross domestic product) and about the rate of incremental energy requirements met by hydropower, geothermal, coal, and natural gas to produce projections of world energy consumption published annually by the Energy Information Administration (EIA) in the International Energy Outlook (IEO). Two independently documented models presented in Figure 1, the Oil Market Simulation (OMS) model and the World Integrated Nuclear Evaluation System (WINES), provide projections of oil and nuclear power consumption published in the IEO. Output from a third independently documented model, the International Coal Trade Model (ICTM), is not published in the IEO but is used in WEPS as a supply check on projections of world coal consumption produced by WEPS and published in the IEO. A WEPS model of natural gas production documented in this report provides the same type of implicit supply check on the WEPS projections of world natural gas consumption published in the IEO. Two additional models are included in Figure 1, the OPEC Capacity model and the Non-OPEC Oil Production model. These WEPS models provide inputs to the OMS model and are documented in this report.
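
    As a toy illustration of the accounting identity the abstract describes (projected consumption derived from projected economic activity and an assumed energy intensity), with invented numbers:

    ```python
    # Toy accounting sketch: energy intensity (consumption divided by GDP) applied
    # to a GDP projection to yield an energy-consumption projection. All numbers
    # are invented for illustration.
    base_consumption = 500.0                      # base-year consumption, invented units
    gdp_index = {2025: 110.0, 2030: 125.0}        # projected GDP, base year = 100
    intensity = {2025: 0.95, 2030: 0.90}          # assumed intensity relative to base year

    for year, gdp in gdp_index.items():
        projected = base_consumption * (gdp / 100.0) * intensity[year]
        print(year, round(projected, 1))
    ```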

  14. Efforts to integrate CMIP metadata and standards into NOAA-GFDL's climate model workflow

    NASA Astrophysics Data System (ADS)

    Blanton, C.; Lee, M.; Mason, E. E.; Radhakrishnan, A.

    2017-12-01

    Modeling centers participating in CMIP6 run model simulations, publish requested model output (conforming to community data standards), and document models and simulations using ES-DOC. GFDL developed workflow software implementing some best practices to meet these metadata and documentation requirements. The CMIP6 Data Request defines the variables that should be archived for each experiment and specifies their spatial and temporal structure. We used the Data Request's dreqPy python library to write GFDL model configuration files as an alternative to hand-crafted tables. There was also a largely successful effort to standardize variable names within the model to reduce the additional overhead of translating "GFDL to CMOR" variables at a later stage in the pipeline. The ES-DOC ecosystem provides tools and standards to create, publish, and view various types of community-defined CIM documents, most notably model and simulation documents. Although ES-DOC will automatically create simulation documents during publishing by harvesting NetCDF global attributes, the information must be collected, stored, and placed in the NetCDF files by the workflow. We propose to develop a GUI to collect the simulation document precursors. In addition, a new MIP for CMIP6, CPMIP (a comparison of the computational performance of climate models), is documented using machine and performance CIM documents. We used ES-DOC's pyesdoc python library to automatically create these machine and performance documents. We hope that these and similar efforts will become permanent features of the GFDL workflow to facilitate future participation in CMIP-like activities.

  15. In-Drift Microbial Communities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Jolley

    2000-11-09

    As directed by written work direction (CRWMS M and O 1999f), Performance Assessment (PA) developed a model for microbial communities in the engineered barrier system (EBS) as documented here. The purpose of this model is to assist Performance Assessment and its Engineered Barrier Performance Section in modeling the geochemical environment within a potential repository drift for TSPA-SR/LA, thus allowing PA to provide a more detailed and complete near-field geochemical model and to answer the key technical issues (KTI) raised in the NRC Issue Resolution Status Report (IRSR) for the Evolution of the Near Field Environment (NFE) Revision 2 (NRC 1999). This model and its predecessor (the in-drift microbial communities model as documented in Chapter 4 of the TSPA-VA Technical Basis Document, CRWMS M and O 1998a) were developed to respond to the applicable KTIs. Additionally, because of the previous development of the in-drift microbial communities model as documented in Chapter 4 of the TSPA-VA Technical Basis Document (CRWMS M and O 1998a), the M and O was effectively able to resolve a previous KTI concern regarding the effects of microbial processes on seepage and flow (NRC 1998). This document supersedes the in-drift microbial communities model as documented in Chapter 4 of the TSPA-VA Technical Basis Document (CRWMS M and O 1998a). This document provides the conceptual framework of the revised in-drift microbial communities model to be used in subsequent performance assessment (PA) analyses.

  16. Interface for the documentation and compilation of a library of computer models in physiology.

    PubMed Central

    Summers, R. L.; Montani, J. P.

    1994-01-01

    A software interface for the documentation and compilation of a library of computer models in physiology was developed. The interface is an interactive program built within a word processing template in order to provide ease and flexibility of documentation. A model editor within the interface directs the model builder as to standardized requirements for incorporating models into the library and provides the user with an index to the levels of documentation. The interface and accompanying library are intended to facilitate model development, preservation and distribution and will be available for public use. PMID:7950046

  17. Embedding Term Similarity and Inverse Document Frequency into a Logical Model of Information Retrieval.

    ERIC Educational Resources Information Center

    Losada, David E.; Barreiro, Alvaro

    2003-01-01

    Proposes an approach to incorporate term similarity and inverse document frequency into a logical model of information retrieval. Highlights include document representation and matching; incorporating term similarity into the measure of distance; new algorithms for implementation; inverse document frequency; and logical versus classical models of…
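
    Two of the ingredients named in the abstract, inverse document frequency and term similarity, can be sketched concretely; the similarity table and scoring rule below are assumptions for illustration, not Losada and Barreiro's logical model.

    ```python
    # Sketch: IDF weighting plus a term-similarity relaxation of exact matching.
    # The similarity table and the scoring rule are illustrative assumptions.
    import math

    docs = [
        {"information", "retrieval", "model"},
        {"document", "indexing"},
        {"logical", "retrieval", "similarity"},
    ]
    N = len(docs)

    def idf(term):
        # Smoothed inverse document frequency over the toy collection.
        df = sum(term in d for d in docs)
        return math.log((N + 1) / (df + 1)) + 1.0

    # Pairwise term similarity in [0, 1]; 1.0 means the terms are interchangeable.
    similarity = {("retrieval", "search"): 0.8, ("model", "framework"): 0.6}

    def sim(a, b):
        if a == b:
            return 1.0
        return similarity.get((a, b), similarity.get((b, a), 0.0))

    def score(query, doc):
        # Each query term contributes its IDF scaled by its best match in the document.
        return sum(idf(q) * max((sim(q, t) for t in doc), default=0.0) for q in query)

    print(score({"search", "model"}, docs[0]))
    ```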

  18. Documentation of the GLAS fourth order general circulation model. Volume 1: Model documentation

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, J.; Pfaendtner, J.; Takacs, L.; Takano, K.

    1983-01-01

    Volume 1 of a three-volume technical memorandum documenting the GLAS Fourth Order General Circulation Model is presented. Volume 1 contains the model documentation, a description of the stratospheric/tropospheric extension, a user's guide, climatological boundary data, and some climate simulation studies.

  19. Research on Generating Method of Embedded Software Test Document Based on Dynamic Model

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper presents a dynamic model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test document. The method allows dynamic test requirements to be captured in dynamic models, so that dynamic test requirement traceability can be generated easily. It automatically produces standardized test requirements and test documentation, addresses inconsistency and incompleteness in document-related content, and improves efficiency.

  20. Residential Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Model Documentation - Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code.

  1. EIA model documentation: Petroleum Market Model of the National Energy Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-12-30

    The purpose of this report is to define the objectives of the Petroleum Market Model (PMM), describe its basic approach, and provide detail on how it works. This report is intended as a reference document for model analysts, users, and the public. Documentation of the model is in accordance with EIA's legal obligation to provide adequate documentation in support of its models (Public Law 94-385, section 57.b.2). The PMM models petroleum refining activities, the marketing of products, and the production of natural gas liquids and domestic methanol, and projects petroleum product prices and sources of supply for meeting demand. In addition, the PMM estimates domestic refinery capacity expansion and fuel consumption.

  2. Hypersonic Vehicle Propulsion System Simplified Model Development

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Raitano, Paul; Le, Dzu K.; Ouzts, Peter

    2007-01-01

    This document addresses the modeling task plan for the hypersonic GN&C GRC team members. The overall propulsion system modeling task plan is a multi-step process and the task plan identified in this document addresses the first steps (short term modeling goals). The procedures and tools produced from this effort will be useful for creating simplified dynamic models applicable to a hypersonic vehicle propulsion system. The document continues with the GRC short term modeling goal. Next, a general description of the desired simplified model is presented along with simulations that are available to varying degrees. The simulations may be available in electronic form (FORTRAN, CFD, MatLab,...) or in paper form in published documents. Finally, roadmaps outlining possible avenues towards realizing simplified model are presented.

  3. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  4. Transportation Sector Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model.

  5. Using phrases and document metadata to improve topic modeling of clinical reports.

    PubMed

    Speier, William; Ong, Michael K; Arnold, Corey W

    2016-06-01

    Probabilistic topic models provide an unsupervised method for analyzing unstructured text and have the potential to be integrated into clinical automatic summarization systems. Clinical documents are accompanied by metadata in a patient's medical history and frequently contain multiword concepts that can be valuable for accurately interpreting the included text. While existing methods have attempted to address these problems individually, we present a unified model for free-text clinical documents that integrates contextual patient- and document-level data, and discovers multi-word concepts. In the proposed model, phrases are represented by chained n-grams and a Dirichlet hyper-parameter is weighted by both document-level and patient-level context. This method and three other Latent Dirichlet allocation models were fit to a large collection of clinical reports. Examples of resulting topics demonstrate the results of the new model, and the quality of the representations is evaluated using empirical log likelihood. The proposed model was able to create informative prior probabilities based on patient and document information, and captured phrases that represented various clinical concepts. The representation using the proposed model had a significantly higher empirical log likelihood than the compared methods. Integrating document metadata and capturing phrases in clinical text greatly improves the topic representation of clinical documents. The resulting clinically informative topics may effectively serve as the basis for an automatic summarization system for clinical reports.
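
    The authors' model chains n-grams and weights a Dirichlet hyper-parameter by patient- and document-level context; as a rough point of comparison only, the sketch below runs off-the-shelf LDA (scikit-learn) over unigram and bigram counts so that multiword concepts can surface, and does not reimplement the metadata weighting.

    ```python
    # Approximate sketch: LDA over unigram+bigram counts as a stand-in for the
    # phrase-aware clinical topic model described above (not the authors' method).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    reports = [
        "chest pain ruled out acute coronary syndrome",
        "acute coronary syndrome treated with aspirin",
        "left knee pain improved with physical therapy",
        "physical therapy recommended for chronic knee pain",
    ]

    # ngram_range=(1, 2) lets multiword concepts such as "coronary syndrome" surface.
    vectorizer = CountVectorizer(ngram_range=(1, 2), stop_words="english")
    counts = vectorizer.fit_transform(reports)

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    doc_topic = lda.fit_transform(counts)

    terms = vectorizer.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[-5:][::-1]]
        print(f"topic {k}:", ", ".join(top))
    ```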

  6. DRI Model of the U.S. Economy -- Model Documentation

    EIA Publications

    1993-01-01

    Provides documentation on Data Resources, Inc., DRI Model of the U.S. Economy and the DRI Personal Computer Input/Output Model. It also describes the theoretical basis, structure and functions of both DRI models; and contains brief descriptions of the models and their equations.

  7. International Natural Gas Model 2011, Model Documentation Report

    EIA Publications

    2013-01-01

    This report documents the objectives, analytical approach and development of the International Natural Gas Model (INGM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  8. Model Documentation of Base Case Data | Regional Energy Deployment System

    Science.gov Websites

    Documentation of base case data for the Regional Energy Deployment System (ReEDS) model. The base case was developed simply as a point of departure for other analyses. The base case derives many of its inputs from the Energy Information Administration's (EIA's) Annual Energy Outlook.

  9. Documentation of the Douglas-fir tussock moth outbreak-population model.

    Treesearch

    J.J. Colbert; W. Scott Overton; Curtis. White

    1979-01-01

    Documentation of three model versions: the Douglas-fir tussock moth population-branch model on (1) daily temporal resolution, (2) instar temporal resolution, and (3) the Douglas-fir tussock moth stand-outbreak model; the hierarchical framework and the conceptual paradigm used are described. The coupling of the model with a normal-stand model is discussed. The modeling...

  10. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    NASA Technical Reports Server (NTRS)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to easily use. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST), that exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from a MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is both consistent with the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views improves the value of the system as a whole, as data becomes information

  11. A Re-Unification of Two Competing Models for Document Retrieval.

    ERIC Educational Resources Information Center

    Bodoff, David

    1999-01-01

    Examines query-oriented versus document-oriented information retrieval and feedback learning. Highlights include a reunification of the two approaches for probabilistic document retrieval and for vector space model (VSM) retrieval; learning in VSM and in probabilistic models; multi-dimensional scaling; and ongoing field studies. (LRW)

  12. The Computable Catchment: An executable document for model-data software sharing, reproducibility and interactive visualization

    NASA Astrophysics Data System (ADS)

    Gil, Y.; Duffy, C.

    2015-12-01

    This paper proposes the concept of a "Computable Catchment" which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a pdf, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future landuse or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a-priori or initial data is derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.

  13. INEEL AIR MODELING PROTOCOL ext

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. S. Staley; M. L. Abbott; P. D. Ritter

    2004-12-01

    Various laws stemming from the Clean Air Act of 1970 and the Clean Air Act amendments of 1990 require air emissions modeling. Modeling is used to ensure that air emissions from new projects and from modifications to existing facilities do not exceed certain standards. For radionuclides, any new airborne release must be modeled to show that downwind receptors do not receive exposures exceeding the dose limits and to determine the requirements for emissions monitoring. For criteria and toxic pollutants, emissions usually must first exceed threshold values before modeling of downwind concentrations is required. This document was prepared to provide guidance for performing environmental compliance-driven air modeling of emissions from Idaho National Engineering and Environmental Laboratory facilities. This document assumes that the user has experience in air modeling and dose and risk assessment. It is not intended to be a "cookbook," nor should all recommendations herein be construed as requirements. However, there are certain procedures that are required by law, and these are pointed out. It is also important to understand that air emissions modeling is a constantly evolving process. This document should, therefore, be reviewed periodically and revised as needed. The document is divided into two parts. Part A is the protocol for radiological assessments, and Part B is for nonradiological assessments. This document is an update of and supersedes document INEEL/INT-98-00236, Rev. 0, INEEL Air Modeling Protocol. This updated document incorporates changes in some of the rules, procedures, and air modeling codes that have occurred since the protocol was first published in 1998.

  14. Predicting Document Retrieval System Performance: An Expected Precision Measure.

    ERIC Educational Resources Information Center

    Losee, Robert M., Jr.

    1987-01-01

    Describes an expected precision (EP) measure designed to predict document retrieval performance. Highlights include decision theoretic models; precision and recall as measures of system performance; EP graphs; relevance feedback; and computing the retrieval status value of a document for two models, the Binary Independent Model and the Two Poisson…
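
    One plausible reading of an expected precision measure, assuming each document carries an estimated probability of relevance from the retrieval model, is the mean of those probabilities over the top-n ranked documents. The definition and numbers below are illustrative, not necessarily Losee's formulation.

    ```python
    # Sketch: expected precision of the top-n documents when each document has an
    # estimated probability of relevance. Illustrative definition only.
    def expected_precision(relevance_probs, n):
        ranked = sorted(relevance_probs, reverse=True)[:n]
        return sum(ranked) / len(ranked)

    probs = [0.9, 0.7, 0.65, 0.3, 0.2, 0.05]   # a model's relevance estimates, invented
    for n in (1, 3, 5):
        print(n, round(expected_precision(probs, n), 3))
    ```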

  15. Technical report series on global modeling and data assimilation. Volume 1: Documentation of the Goddard Earth Observing System (GEOS) General Circulation Model, version 1

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); Takacs, Lawrence L.; Molod, Andrea; Wang, Tina

    1994-01-01

    This technical report documents Version 1 of the Goddard Earth Observing System (GEOS) General Circulation Model (GCM). The GEOS-1 GCM is being used by NASA's Data Assimilation Office (DAO) to produce multiyear data sets for climate research. This report provides a documentation of the model components used in the GEOS-1 GCM, a complete description of model diagnostics available, and a User's Guide to facilitate GEOS-1 GCM experiments.

  16. Documentation for the Waste Reduction Model (WARM)

    EPA Pesticide Factsheets

    This page describes the WARM documentation files and provides links to all documentation files associated with EPA’s Waste Reduction Model (WARM). The page includes a brief summary of the chapters documenting the greenhouse gas emission and energy factors.

  17. SBKF Modeling and Analysis Plan: Buckling Analysis of Compression-Loaded Orthogrid and Isogrid Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Hilburger, Mark W.

    2013-01-01

    This document outlines a Modeling and Analysis Plan (MAP) to be followed by the SBKF analysts. It includes instructions on modeling and analysis formulation and execution, model verification and validation, identifying sources of error and uncertainty, and documentation. The goal of this MAP is to provide a standardized procedure that ensures uniformity and quality of the results produced by the project and corresponding documentation.

  18. Model documentation: Renewable Fuels Module of the National Energy Modeling System

    NASA Astrophysics Data System (ADS)

    1994-04-01

    This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it related to the production of the 1994 Annual Energy Outlook (AEO94) forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. This documentation report serves two purposes. First, it is a reference document for model analysts, model users, and the public interested in the construction and application of the RFM. Second, it meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its models. The RFM consists of six analytical submodules that represent each of the major renewable energy resources -- wood, municipal solid waste (MSW), solar energy, wind energy, geothermal energy, and alcohol fuels. Of these six, four are documented in the following chapters: municipal solid waste, wind, solar and biofuels. Geothermal and wood are not currently working components of NEMS. The purpose of the RFM is to define the technological and cost characteristics of renewable energy technologies, and to pass these characteristics to other NEMS modules for the determination of mid-term forecasted renewable energy demand.

  19. Topology of Document Retrieval Systems.

    ERIC Educational Resources Information Center

    Everett, Daniel M.; Cater, Steven C.

    1992-01-01

    Explains the use of a topological structure to examine the closeness between documents in retrieval systems and analyzes the topological structure of a vector-space model, a fuzzy-set model, an extended Boolean model, a probabilistic model, and a TIRS (Topological Information Retrieval System) model. Proofs for the results are appended. (17…

  20. MPS Solidification Model. Volume 2: Operating guide and software documentation for the unsteady model

    NASA Technical Reports Server (NTRS)

    Maples, A. L.

    1981-01-01

    The operation of solidification Model 2 is described and documentation of the software associated with the model is provided. Model 2 calculates the macrosegregation in a rectangular ingot of a binary alloy as a result of unsteady horizontal axisymmetric bidirectional solidification. The solidification program allows interactive modification of calculation parameters as well as selection of graphical and tabular output. In batch mode, parameter values are input in card image form and output consists of printed tables of solidification functions. The operational aspects of Model 2 that differ substantially from Model 1 are described. The global flow diagrams and data structures of Model 2 are included. The primary program documentation is the code itself.

  1. Documenting Climate Models and Simulations: the ES-DOC Ecosystem in Support of CMIP

    NASA Astrophysics Data System (ADS)

    Pascoe, C. L.; Guilyardi, E.

    2017-12-01

    The results of climate models are of increasing and widespread importance. No longer is climate model output of sole interest to climate scientists and researchers in the climate change impacts and adaptation fields. Now non-specialists such as government officials, policy-makers, and the general public all have an increasing need to access climate model output and understand its implications. For this host of users, accurate and complete metadata (i.e., information about how and why the data were produced) is required to document the climate modeling results. Here we describe the ES-DOC community-governed project to collect and make available documentation of climate models and their simulations for the internationally coordinated modeling activity CMIP6 (Coupled Model Intercomparison Project, Phase 6). An overview of the underlying standards, key properties and features, the evolution from CMIP5, the underlying tools and workflows, as well as what modelling groups should expect and how they should engage with the documentation of their contribution to CMIP6, is also presented.

  2. Oil and Gas Supply Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Defines the objectives of the Oil and Gas Supply Model (OGSM), to describe the model's basic approach, and to provide detail on how the model works. This report is intended as a reference document for model analysts, users, and the public.

  3. Section 3. The SPARROW Surface Water-Quality Model: Theory, Application and User Documentation

    USGS Publications Warehouse

    Schwarz, G.E.; Hoos, A.B.; Alexander, R.B.; Smith, R.A.

    2006-01-01

    SPARROW (SPAtially Referenced Regressions On Watershed attributes) is a watershed modeling technique for relating water-quality measurements made at a network of monitoring stations to attributes of the watersheds containing the stations. The core of the model consists of a nonlinear regression equation describing the non-conservative transport of contaminants from point and diffuse sources on land to rivers and through the stream and river network. The model predicts contaminant flux, concentration, and yield in streams and has been used to evaluate alternative hypotheses about the important contaminant sources and watershed properties that control transport over large spatial scales. This report provides documentation for the SPARROW modeling technique and computer software to guide users in constructing and applying basic SPARROW models. The documentation gives details of the SPARROW software, including the input data and installation requirements, and guidance in the specification, calibration, and application of basic SPARROW models, as well as descriptions of the model output and its interpretation. The documentation is intended for both researchers and water-resource managers with interest in using the results of existing models and developing and applying new SPARROW models. The documentation of the model is presented in two parts. Part 1 provides a theoretical and practical introduction to SPARROW modeling techniques, which includes a discussion of the objectives, conceptual attributes, and model infrastructure of SPARROW. Part 1 also includes background on the commonly used model specifications and the methods for estimating and evaluating parameters, evaluating model fit, and generating water-quality predictions and measures of uncertainty. Part 2 provides a user's guide to SPARROW, which includes a discussion of the software architecture and details of the model input requirements and output files, graphs, and maps. The text documentation and computer software are available on the Web at http://usgs.er.gov/sparrow/sparrow-mod/.
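
    The nonlinear source-attenuation regression at the core of such a model can be gestured at with a toy fit: per-source coefficients and an in-stream decay rate estimated from synthetic monitored loads. The functional form, data, and coefficients below are invented and are not the published SPARROW specification.

    ```python
    # Toy nonlinear regression in the spirit of a source/attenuation water-quality
    # model: load = (b1*point + b2*diffuse) * exp(-decay * travel_time). All data
    # and coefficients are invented for illustration.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(1)
    n = 80
    point = rng.uniform(0, 10, n)        # point-source inputs
    diffuse = rng.uniform(0, 50, n)      # diffuse (land-use) inputs
    travel = rng.uniform(0, 5, n)        # in-stream travel time

    def model(X, b1, b2, decay):
        p, d, t = X
        return (b1 * p + b2 * d) * np.exp(-decay * t)

    true_load = model((point, diffuse, travel), 2.0, 0.4, 0.15)
    observed = true_load * rng.lognormal(0.0, 0.1, n)      # multiplicative error

    params, _ = curve_fit(model, (point, diffuse, travel), observed, p0=[1.0, 1.0, 0.1])
    print("estimated (b1, b2, decay):", np.round(params, 3))
    ```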

  4. Regional Energy Deployment System (ReEDS) | Energy Analysis | NREL

    Science.gov Websites

    The Regional Energy Deployment System (ReEDS) model helps the U.S. Department of Energy analyze future capacity expansion of renewable energy. Related resources include a video overview of the ReEDS model, the ReEDS Model Documentation: Version 2016, and the ReEDS Map with Numbered Regions.

  5. Industrial Demand Module - NEMS Documentation

    EIA Publications

    2014-01-01

    Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Industrial Demand Module. The report catalogues and describes model assumptions, computational methodology, parameter estimation techniques, and model source code.

  6. Semantic Document Model to Enhance Data and Knowledge Interoperability

    NASA Astrophysics Data System (ADS)

    Nešić, Saša

    To enable document data and knowledge to be efficiently shared and reused across application, enterprise, and community boundaries, desktop documents should be completely open and queryable resources, whose data and knowledge are represented in a form understandable to both humans and machines. At the same time, these are the requirements that desktop documents need to satisfy in order to contribute to the visions of the Semantic Web. With the aim of achieving this goal, we have developed the Semantic Document Model (SDM), which turns desktop documents into Semantic Documents as uniquely identified and semantically annotated composite resources, that can be instantiated into human-readable (HR) and machine-processable (MP) forms. In this paper, we present the SDM along with an RDF and ontology-based solution for the MP document instance. Moreover, on top of the proposed model, we have built the Semantic Document Management System (SDMS), which provides a set of services that exploit the model. As an application example that takes advantage of SDMS services, we have extended MS Office with a set of tools that enables users to transform MS Office documents (e.g., MS Word and MS PowerPoint) into Semantic Documents, and to search local and distant semantic document repositories for document content units (CUs) over Semantic Web protocols.
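
    The machine-processable side of such a model can be sketched with RDF triples that identify a document and annotate one of its content units; the namespace and property names below are invented for illustration and are not the SDM ontology.

    ```python
    # Sketch: representing a document and a content unit as RDF resources with
    # rdflib. The namespace and property names are invented, not the SDM ontology.
    from rdflib import Graph, Namespace, URIRef, Literal
    from rdflib.namespace import RDF, DCTERMS

    EX = Namespace("http://example.org/sdm#")
    g = Graph()

    doc = URIRef("http://example.org/docs/report-42")
    cu = URIRef("http://example.org/docs/report-42#section-1")

    g.add((doc, RDF.type, EX.SemanticDocument))
    g.add((doc, DCTERMS.title, Literal("Quarterly design report")))
    g.add((doc, EX.hasContentUnit, cu))
    g.add((cu, RDF.type, EX.ContentUnit))
    g.add((cu, EX.annotatedWith, EX.RequirementsConcept))

    # Query the graph for all content units of the document.
    for unit in g.objects(doc, EX.hasContentUnit):
        print("content unit:", unit)
    ```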

  7. Documenting Ground-Water Modeling at Sites Contaminated with Radioactive Substances

    EPA Pesticide Factsheets

    This report is the product of the Interagency Environmental Pathway Modeling Working Group. This report demonstrates how to document model applications in a consistent manner and is intended to assist technical staff.

  8. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  9. Methods of Sparse Modeling and Dimensionality Reduction to Deal with Big Data

    DTIC Science & Technology

    2015-04-01

    supervised learning (c). Our framework consists of two separate phases: (a) first find an initial space in an unsupervised manner; then (b) utilize label... 1) a model that can learn thousands of topics from a large set of documents and infer the topic mixture of each document, 2) a supervised dimension reduction... (i) a method of supervised...

  10. OWL references in ORM conceptual modelling

    NASA Astrophysics Data System (ADS)

    Matula, Jiri; Belunek, Roman; Hunka, Frantisek

    2017-07-01

    Object Role Modelling methodology is the fact-based type of conceptual modelling. The aim of the paper is to emphasize a close connection to OWL documents and its possible mutual cooperation. The definition of entities or domain values is an indispensable part of the conceptual schema design procedure defined by the ORM methodology. Many of these entities are already defined in OWL documents. Therefore, it is not necessary to declare entities again, whereas it is possible to utilize references from OWL documents during modelling of information systems.

  11. Peer Review of EPA's Draft BMDS Document: Exponential ...

    EPA Pesticide Factsheets

    BMDS is one of the Agency's premier tools for risk assessment; therefore, the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews of the BMDS applications and its models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.

  12. Model documentation Natural Gas Transmission and Distribution Model of the National Energy Modeling System. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-02-26

    The Natural Gas Transmission and Distribution Model (NGTDM) of the National Energy Modeling System is developed and maintained by the Energy Information Administration (EIA), Office of Integrated Analysis and Forecasting. This report documents the archived version of the NGTDM that was used to produce the natural gas forecasts presented in the Annual Energy Outlook 1996 (DOE/EIA-0383(96)). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic approach, and provides detail on the methodology employed. Previously this report represented Volume I of a two-volume set. Volume II reported on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues.

  13. World Energy Projection System Plus Model Documentation: Coal Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Coal Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  14. World Energy Projection System Plus Model Documentation: Transportation Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) International Transportation model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  15. World Energy Projection System Plus Model Documentation: Residential Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Residential Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  16. World Energy Projection System Plus Model Documentation: Refinery Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Refinery Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  17. World Energy Projection System Plus Model Documentation: Main Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Main Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  18. World Energy Projection System Plus Model Documentation: Electricity Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Electricity Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  19. Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.

    PubMed

    El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher

    2018-01-01

    Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.

  20. ATMOSPHERIC DISPERSAL AND DEPOSITION OF TEPHRA FROM A POTENTIAL VOLCANIC ERUPTION AT YUCCA MOUNTAIN, NEVADA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Harrington

    2004-10-25

    The purpose of this model report is to provide documentation of the conceptual and mathematical model (Ashplume) for atmospheric dispersal and subsequent deposition of ash on the land surface from a potential volcanic eruption at Yucca Mountain, Nevada. This report also documents the ash (tephra) redistribution conceptual model. These aspects of volcanism-related dose calculation are described in the context of the entire igneous disruptive events conceptual model in "Characterize Framework for Igneous Activity" (BSC 2004 [DIRS 169989], Section 6.1.1). The Ashplume conceptual model accounts for incorporation and entrainment of waste fuel particles associated with a hypothetical volcanic eruption through the Yucca Mountain repository and downwind transport of contaminated tephra. The Ashplume mathematical model describes the conceptual model in mathematical terms to allow for prediction of radioactive waste/ash deposition on the ground surface given that the hypothetical eruptive event occurs. This model report also describes the conceptual model for tephra redistribution from a basaltic cinder cone. Sensitivity analyses and model validation activities for the ash dispersal and redistribution models are also presented. Analyses documented in this model report update the previous documentation of the Ashplume mathematical model and its application to the Total System Performance Assessment (TSPA) for the License Application (TSPA-LA) igneous scenarios. This model report also documents the redistribution model product outputs based on analyses to support the conceptual model. In this report, "Ashplume" is used when referring to the atmospheric dispersal model and "ASHPLUME" is used when referencing the code of that model. Two analysis and model reports provide direct inputs to this model report, namely "Characterize Eruptive Processes at Yucca Mountain, Nevada" and "Number of Waste Packages Hit by Igneous Intrusion". This model report provides direct inputs to the TSPA, which uses the ASHPLUME software described and used in this model report. Thus, ASHPLUME software inputs are inputs to this model report for ASHPLUME runs in this model report. However, ASHPLUME software inputs are outputs of this model report for ASHPLUME runs by TSPA.

  1. Representing Information in Patient Reports Using Natural Language Processing and the Extensible Markup Language

    PubMed Central

    Friedman, Carol; Hripcsak, George; Shagina, Lyuda; Liu, Hongfang

    1999-01-01

    Objective: To design a document model that provides reliable and efficient access to clinical information in patient reports for a broad range of clinical applications, and to implement an automated method using natural language processing that maps textual reports to a form consistent with the model. Methods: A document model that encodes structured clinical information in patient reports while retaining the original contents was designed using the extensible markup language (XML), and a document type definition (DTD) was created. An existing natural language processor (NLP) was modified to generate output consistent with the model. Two hundred reports were processed using the modified NLP system, and the XML output that was generated was validated using an XML validating parser. Results: The modified NLP system successfully processed all 200 reports. The output of one report was invalid, and 199 reports were valid XML forms consistent with the DTD. Conclusions: Natural language processing can be used to automatically create an enriched document that contains a structured component whose elements are linked to portions of the original textual report. This integrated document model provides a representation where documents containing specific information can be accurately and efficiently retrieved by querying the structured components. If manual review of the documents is desired, the salient information in the original reports can also be identified and highlighted. Using an XML model of tagging provides an additional benefit in that software tools that manipulate XML documents are readily available. PMID:9925230
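
    The validation step the abstract describes, checking generated XML against a DTD, can be sketched with a generic validating parser; the element names and DTD below are invented, and lxml stands in for whatever parser the authors used.

    ```python
    # Sketch: validate a structured-report XML fragment against a DTD with lxml.
    # The DTD and element names are invented; lxml stands in for the validating
    # parser mentioned in the abstract.
    from io import StringIO
    from lxml import etree

    dtd = etree.DTD(StringIO("""
    <!ELEMENT report (finding+)>
    <!ELEMENT finding (#PCDATA)>
    <!ATTLIST finding code CDATA #REQUIRED>
    """))

    xml = """<report>
      <finding code="C0032285">pneumonia in the right lower lobe</finding>
    </report>"""

    root = etree.fromstring(xml)
    is_valid = dtd.validate(root)
    print("valid:", is_valid)
    if not is_valid:
        print(dtd.error_log.filter_from_errors())
    ```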

  2. Risk Quantification of Systems Engineering Documents Improves Probability of DOD Project Success

    DTIC Science & Technology

    2009-09-01

    comprehensive risk model for DoD milestone review documentation as well as recommended changes to the Capability Maturity Model Integration (CMMI) Project Planning and Risk Management process areas. The intent is to... Subject terms: Milestone Documentation, Project Planning, Rational Frame, Political Frame, CMMI Project Planning Process Area, CMMI Risk Management Process Area.

  3. World Energy Projection System Plus Model Documentation: Greenhouse Gases Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Greenhouse Gases Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  4. World Energy Projection System Plus Model Documentation: Natural Gas Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Natural Gas Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  5. World Energy Projection System Plus Model Documentation: District Heat Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) District Heat Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  6. World Energy Projection System Plus Model Documentation: Industrial Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Industrial Model (WIM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  7. EIA model documentation: Petroleum market model of the national energy modeling system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-12-28

    The purpose of this report is to define the objectives of the Petroleum Market Model (PMM), describe its basic approach, and provide detail on how it works. This report is intended as a reference document for model analysts, users, and the public. Documentation of the model is in accordance with EIA's legal obligation to provide adequate documentation in support of its models. The PMM models petroleum refining activities, the marketing of petroleum products to consumption regions, the production of natural gas liquids in gas processing plants, and domestic methanol production. The PMM projects petroleum product prices and sources of supply for meeting petroleum product demand. The sources of supply include crude oil, both domestic and imported; other inputs including alcohols and ethers; natural gas plant liquids production; petroleum product imports; and refinery processing gain. In addition, the PMM estimates domestic refinery capacity expansion and fuel consumption. Product prices are estimated at the Census division level and much of the refining activity information is at the Petroleum Administration for Defense (PAD) District level.

  8. Model documentation Renewable Fuels Module of the National Energy Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-01-01

    This report documents the objectives, analytical approach and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1996 Annual Energy Outlook forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described.

  9. Model-based document categorization employing semantic pattern analysis and local structure clustering

    NASA Astrophysics Data System (ADS)

    Fume, Kosei; Ishitani, Yasuto

    2008-01-01

    We propose a document categorization method based on a document model that can be defined externally for each task and that categorizes Web content or business documents into a target category in accordance with the similarity of the model. The main feature of the proposed method consists of two aspects of semantics extraction from an input document. The semantics of terms are extracted by the semantic pattern analysis and implicit meanings of document substructure are specified by a bottom-up text clustering technique focusing on the similarity of text line attributes. We have constructed a system based on the proposed method for trial purposes. The experimental results show that the system achieves more than 80% classification accuracy in categorizing Web content and business documents into 15 or 70 categories.
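
    As a rough illustration of the categorize-by-model-similarity idea only (not the authors' semantic pattern analysis or text-line clustering), the sketch below scores an input document against externally defined category models using TF-IDF cosine similarity; the category names and keyword lists are invented for the example.

        # Minimal sketch: assign a document to the externally defined category model
        # it is most similar to. Categories and keywords are invented placeholders.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        category_models = {
            "invoice": "invoice total amount due payment billing customer",
            "manual":  "installation configure settings troubleshooting steps",
            "press":   "announcement company launch product market release",
        }

        document = "Please remit the amount due listed on the attached billing statement."

        vectorizer = TfidfVectorizer()
        matrix = vectorizer.fit_transform(list(category_models.values()) + [document])

        scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
        best = max(zip(category_models, scores), key=lambda pair: pair[1])
        print("predicted category:", best[0], "similarity:", round(float(best[1]), 3))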

  10. The centrality of meta-programming in the ES-DOC eco-system

    NASA Astrophysics Data System (ADS)

    Greenslade, Mark

    2017-04-01

    The Earth System Documentation (ES-DOC) project is an international effort aiming to deliver a robust earth system model inter-comparison project documentation infrastructure. Such infrastructure both simplifies & standardizes the process of documenting (in detail) projects, experiments, models, forcings & simulations. In support of CMIP6, ES-DOC has upgraded its eco-system of tools, web-services & web-sites. The upgrade consolidates the existing infrastructure (built for CMIP5) and extends it with the introduction of new capabilities. The strategic focus of the upgrade is to improve the documentation experience and to broaden the range of scientific use-cases that the archived documentation may help deliver. Whether it is highlighting dataset errors, exploring experimental protocols, comparing forcings across ensemble runs, understanding MIP objectives, reviewing citations, exploring component properties of configured models, or visualising inter-model relationships, scientists involved in CMIP6 will find the ES-DOC infrastructure helpful. This presentation underlines the centrality of meta-programming within the ES-DOC eco-system. We will demonstrate how agility is greatly enhanced by taking a meta-programming approach to representing data models and controlled vocabularies. Such an approach nicely decouples representations from encodings. Meta-models will be presented along with the associated tooling chain that forward engineers artefacts as diverse as class hierarchies, IPython notebooks, mindmaps, configuration files, OWL & SKOS documents, spreadsheets, etc.

  11. Model Based Document and Report Generation for Systems Engineering

    NASA Technical Reports Server (NTRS)

    Delp, Christopher; Lam, Doris; Fosse, Elyse; Lee, Cin-Young

    2013-01-01

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  12. Model based document and report generation for systems engineering

    NASA Astrophysics Data System (ADS)

    Delp, C.; Lam, D.; Fosse, E.; Lee, Cin-Young

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  13. Model documentation: Electricity Market Module, Electricity Fuel Dispatch Submodule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report documents the objectives, analytical approach and development of the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  14. Comparison of statistical models for writer verification

    NASA Astrophysics Data System (ADS)

    Srihari, Sargur; Ball, Gregory R.

    2009-01-01

    A novel statistical model for determining whether a pair of documents, a known and a questioned, were written by the same individual is proposed. The goal of this formulation is to learn the specific uniqueness of style in a particular author's writing, given the known document. Since there are often insufficient samples to extrapolate a generalized model of a writer's handwriting based solely on the document, we instead generalize over the differences between the author and a large population of known different writers. This is in contrast to an earlier model in which probability distributions were specified a priori, without learning. We show the performance of the proposed model and compare it with the older, non-learning model, demonstrating significant improvement.
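
    A toy sketch of the generalize-over-differences idea (not the authors' statistical model) is given below: the distance between feature vectors of the known and questioned documents is compared against distances from the known document to a population of different writers; the feature vectors are random stand-ins.

        # Toy sketch of writer verification by comparing the known-vs-questioned distance
        # to distances from a population of different writers. Features are random stand-ins.
        import numpy as np

        rng = np.random.default_rng(3)
        known = rng.normal(0.0, 1.0, 32)                 # handwriting feature vector (placeholder)
        questioned = known + rng.normal(0.0, 0.2, 32)    # same writer in this simulation
        population = rng.normal(0.0, 1.0, (500, 32))     # known different writers

        def distance(a, b):
            return float(np.linalg.norm(a - b))

        d_q = distance(known, questioned)
        d_pop = np.array([distance(known, other) for other in population])

        # Fraction of different-writer distances smaller than the questioned distance:
        # a small value suggests the questioned document is unusually close to the known one.
        print("questioned distance:", round(d_q, 2),
              "empirical p(different writer as close):", round(float((d_pop <= d_q).mean()), 3))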

  15. Airport Performance Model : Volume 2 - User's Manual and Program Documentation

    DOT National Transportation Integrated Search

    1978-10-01

    Volume II contains a User's manual and program documentation for the Airport Performance Model. This computer-based model is written in FORTRAN IV for the DEC-10. The user's manual describes the user inputs to the interactive program and gives sample...

  16. Peridynamics with LAMMPS : a user guide.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehoucq, Richard B.; Silling, Stewart Andrew; Plimpton, Steven James

    2008-01-01

    Peridynamics is a nonlocal formulation of continuum mechanics. The discrete peridynamic model has the same computational structure as a molecular dynamics model. This document details the implementation of a discrete peridynamic model within the LAMMPS molecular dynamics code. This document provides a brief overview of the peridynamic model of a continuum, then discusses how the peridynamic model is discretized, and overviews the LAMMPS implementation. A nontrivial example problem is also included.

  17. Automation for System Safety Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  18. Earth System Documentation (ES-DOC) Preparation for CMIP6

    NASA Astrophysics Data System (ADS)

    Denvil, S.; Murphy, S.; Greenslade, M. A.; Lawrence, B.; Guilyardi, E.; Pascoe, C.; Treshanksy, A.; Elkington, M.; Hibling, E.; Hassell, D.

    2015-12-01

    During the course of 2015 the Earth System Documentation (ES-DOC) project began its preparations for CMIP6 (Coupled Model Inter-comparison Project 6) by further extending the ES-DOC tooling ecosystem in support of Earth System Model (ESM) documentation creation, search, viewing, and comparison. The ES-DOC online questionnaire, the ES-DOC desktop notebook, and the ES-DOC python toolkit will serve as multiple complementary pathways to generating CMIP6 documentation. It is envisaged that institutes will leverage these tools at different points of the CMIP6 lifecycle. Institutes will be particularly interested to know that the documentation burden will be either streamlined or completely automated. As all the tools are tightly integrated with the ES-DOC web-service, institutes can be confident that the latency between documentation creation and publishing will be reduced to a minimum. Published documents will be viewable with the online ES-DOC Viewer (accessible via citable URLs). Model inter-comparison scenarios will be supported using the ES-DOC online Comparator tool. The Comparator is being extended to (1) support comparison of both model descriptions and simulation runs, and (2) greatly streamline the effort involved in compiling official tables. The entire ES-DOC ecosystem is open source and built upon open standards such as the Common Information Model (CIM) (versions 1 and 2).

  19. Test Setup For Model Landing Investigation of a Winged Space Vehicle

    NASA Image and Video Library

    1960-07-20

    Test setup for model landing investigation of a winged space vehicle. Image used in NASA Document TN-D-1496. Image 1960-L-04633.01 is Figure 9a for NASA Document L-2064: photograph of the model on the launcher and landing on the runway.

  20. Commercial Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  1. Analysis of time series for postal shipments in Regional VII East Java Indonesia

    NASA Astrophysics Data System (ADS)

    Kusrini, DE; Ulama, B. S. S.; Aridinanti, L.

    2018-03-01

    The change in the number of goods delivered through PT. Pos Regional VII East Java Indonesia indicates that the trend of increasing and decreasing delivery of documents and non-documents is strongly influenced by conditions outside of PT. Pos Regional VII East Java Indonesia, so that predicting the number of document and non-document shipments requires a model that can accommodate this. Based on the monthly time series plot, data fluctuations occur from 2013-2016, so modeling is done using ARIMA or seasonal ARIMA and the best model is selected based on the smallest AIC value. The results of the analysis of the number of shipments for each product sent through the Sub-Regional Postal Office VII East Java indicate that 5 of the 26 post offices enter the territory. The largest number of shipments occurs for the PPB (Paket Pos Biasa, regular package shipment/non-document) and SKH (Surat Kilat Khusus, special express mail/document) products. The time series models generated are largely random walk models, meaning that the number of shipments in the future is influenced by random effects that are difficult to predict. Some are AR and MA models, except for express shipment products with the Malang post office destination, which have a seasonal ARIMA model at lags 6 and 12. This means that the number of items in the following month is affected by the number of items in the previous 6 months.
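
    The model-selection step described above (fitting ARIMA and seasonal ARIMA candidates and keeping the one with the smallest AIC) can be sketched with statsmodels; the synthetic monthly series and the candidate orders below are assumptions for illustration, not the PT. Pos data.

        # Illustrative sketch of AIC-based selection among (seasonal) ARIMA candidates.
        # The series and candidate orders are placeholders, not the PT. Pos shipment data.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(0)
        index = pd.date_range("2013-01", periods=48, freq="MS")       # monthly, 2013-2016
        series = pd.Series(100 + np.cumsum(rng.normal(0, 5, 48)), index=index)

        candidates = [
            ((0, 1, 0), (0, 0, 0, 0)),   # random walk
            ((1, 1, 0), (0, 0, 0, 0)),   # AR term after differencing
            ((0, 1, 1), (0, 0, 0, 0)),   # MA term after differencing
            ((0, 1, 1), (0, 1, 1, 12)),  # seasonal ARIMA, period 12
        ]

        results = []
        for order, seasonal_order in candidates:
            fit = SARIMAX(series, order=order, seasonal_order=seasonal_order).fit(disp=False)
            results.append((fit.aic, order, seasonal_order))

        aic, order, seasonal_order = min(results)
        print("best model by AIC:", order, seasonal_order, "AIC =", round(aic, 1))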

  2. Restoring warped document images through 3D shape modeling.

    PubMed

    Tan, Chew Lim; Zhang, Li; Zhang, Zheng; Xia, Tao

    2006-02-01

    Scanning a document page from a thick bound volume often results in two kinds of distortions in the scanned image, i.e., shade along the "spine" of the book and warping in the shade area. In this paper, we propose an efficient restoration method based on the discovery of the 3D shape of a book surface from the shading information in a scanned document image. From a technical point of view, this shape from shading (SFS) problem in real-world environments is characterized by 1) a proximal and moving light source, 2) Lambertian reflection, 3) nonuniform albedo distribution, and 4) document skew. Taking all these factors into account, we first build practical models (consisting of a 3D geometric model and a 3D optical model) for the practical scanning conditions to reconstruct the 3D shape of the book surface. We next restore the scanned document image using this shape based on deshading and dewarping models. Finally, we evaluate the restoration results by comparing our estimated surface shape with the real shape as well as the OCR performance on original and restored document images. The results show that the geometric and photometric distortions are mostly removed and the OCR results are improved markedly.

  3. International Space Station Human Behavior and Performance Competency Model: Volume II

    NASA Technical Reports Server (NTRS)

    Schmidt, Lacey

    2008-01-01

    This document further defines the behavioral markers identified in the document "Human Behavior and Performance Competency Model" Vol. I. The Human Behavior and Performance (HBP) competencies were recommended as requirements to participate in international long duration missions, and form the basis for determining the HBP training curriculum for long duration crewmembers. This document provides details, examples, knowledge areas, and affective skills to support the use of the HBP competencies in training and evaluation. This document lists examples and details specific to HBP competencies required of astronauts/cosmonauts who participate in ISS expedition and other international long-duration missions. Please note that this model does not encompass all competencies required. While technical competencies are critical for crewmembers, they are beyond the scope of this document. Additionally, the competencies in this model (and subsequent objectives) are not intended to limit the internal activities or training programs of any international partner.

  4. Computer program for Stirling engine performance calculations

    NASA Technical Reports Server (NTRS)

    Tew, R. C., Jr.

    1983-01-01

    The thermodynamic characteristics of the Stirling engine were analyzed and modeled on a computer to support its development as a possible alternative to the automobile spark ignition engine. The computer model is documented. The documentation includes a user's manual, symbols list, a test case, comparison of model predictions with test results, and a description of the analytical equations used in the model.

  5. User Modeling in Adaptive Hypermedia Educational Systems

    ERIC Educational Resources Information Center

    Martins, Antonio Constantino; Faria, Luiz; Vaz de Carvalho, Carlos; Carrapatoso, Eurico

    2008-01-01

    This document is a survey of the research area of User Modeling (UM) for the specific field of Adaptive Learning. The aims of this document are: to define what a User Model is; to present existing and well known User Models; to analyze the existing standards related to UM; and to compare existing systems. In the scientific area of User Modeling…

  6. Supervised Gamma Process Poisson Factorization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dylan Zachary

    This thesis develops the supervised gamma process Poisson factorization (S-GPPF) framework, a novel supervised topic model for joint modeling of count matrices and document labels. S-GPPF is fully generative and nonparametric: document labels and count matrices are modeled under a unified probabilistic framework and the number of latent topics is controlled automatically via a gamma process prior. The framework provides for multi-class classification of documents using a generative max-margin classifier. Several recent data augmentation techniques are leveraged to provide for exact inference using a Gibbs sampling scheme. The first portion of this thesis reviews supervised topic modeling and several key mathematical devices used in the formulation of S-GPPF. The thesis then introduces the S-GPPF generative model and derives the conditional posterior distributions of the latent variables for posterior inference via Gibbs sampling. The S-GPPF is shown to exhibit state-of-the-art performance for joint topic modeling and document classification on a dataset of conference abstracts, beating out competing supervised topic models. The unique properties of S-GPPF along with its competitive performance make it a novel contribution to supervised topic modeling.
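
    As a hedged illustration of the gamma-Poisson building block underlying such models (omitting the supervision, the gamma-process prior, and the Gibbs sampler), the sketch below draws gamma-distributed topic and document weights and generates a word-count matrix from their Poisson-rate product; all sizes and hyperparameters are invented.

        # Generative sketch of (unsupervised) gamma-Poisson matrix factorization.
        # This is not the S-GPPF inference code; it only illustrates the count model
        # X[d, v] ~ Poisson(sum_k theta[d, k] * beta[k, v]) with gamma-distributed factors.
        import numpy as np

        rng = np.random.default_rng(1)
        num_docs, vocab, num_topics = 200, 500, 10

        theta = rng.gamma(shape=0.5, scale=1.0, size=(num_docs, num_topics))   # doc-topic weights
        beta = rng.gamma(shape=0.1, scale=1.0, size=(num_topics, vocab))       # topic-word rates

        rates = theta @ beta
        counts = rng.poisson(rates)        # simulated document-term count matrix

        print("count matrix shape:", counts.shape, "sparsity:",
              round(float((counts == 0).mean()), 3))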

  7. Asymmetrical booster ascent guidance and control system design study. Volume 2: SSFS math models - Ascent. [space shuttle development

    NASA Technical Reports Server (NTRS)

    Williams, F. E.; Lemon, R. S.

    1974-01-01

    The engineering equations and mathematical models developed for use in the space shuttle functional simulator (SSFS) are presented, and include extensive revisions and additions to earlier documentation. Definitions of coordinate systems used by the SSFS models and coordinate tranformations are given, along with documentation of the flexible body mathematical models. The models were incorporated in the SSFS and are in the checkout stage.

  8. An Object-Based Requirements Modeling Method.

    ERIC Educational Resources Information Center

    Cordes, David W.; Carver, Doris L.

    1992-01-01

    Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…

  9. Modeling Guru: Knowledge Base for NASA Modelers

    NASA Astrophysics Data System (ADS)

    Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.

    2009-05-01

    Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread, and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" comprises documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the same document, even allowing the author to select those who may edit and approve the document. To maintain knowledge integrity, all documents are moderated before they are visible to the public. Modeling Guru, running on Clearspace by Jive Software, has been an active resource to the NASA modeling and HEC communities for more than a year and currently has more than 100 active users. SIVO will soon install live instant messaging support, as well as a user-customizable homepage with social-networking features. In addition, SIVO plans to implement a large dataset/file storage capability so that users can quickly and easily exchange datasets and files with one another. Continued active community participation combined with periodic software updates and improved features will ensure that Modeling Guru remains a vibrant, effective, easy-to-use tool for the NASA scientific community.

  10. "DEP'ART." Un Modele de Prevision des Departs des Enseignants. Documents Demographie Scolaire 9-16 ("DEP'ART." A Model for Predicting Teacher Attrition. Scholastic Demographic Document 9-16).

    ERIC Educational Resources Information Center

    Meublat, Guy

    This document forms part of a research project initiated by the Ministry of Education in Quebec and designed to forecast teacher demand over the next 15 years. It analyzes the problem of identifying potential teacher dropouts by means of a statistical model which provides simulations of various hypotheses and which can be easily revised by the…

  11. Realtime Knowledge Management (RKM): From an International Space Station (ISS) Point of View

    NASA Technical Reports Server (NTRS)

    Robinson, Peter I.; McDermott, William; Alena, Richard L.

    2004-01-01

    We are developing automated methods to provide realtime access to spacecraft domain knowledge relevant to a spacecraft's current operational state. The method is based upon analyzing state-transition signatures in the telemetry stream. A key insight is that documentation relevant to a specific failure mode or operational state is related to the structure and function of spacecraft systems. This means that diagnostic dependency and state models can provide a roadmap for effective documentation navigation and presentation. Diagnostic models consume the telemetry and derive a high-level state description of the spacecraft. Each potential spacecraft state description is matched against the predictions of models that were developed from information found in the pages and sections in the relevant International Space Station (ISS) documentation and reference materials. By annotating each model fragment with the domain knowledge sources from which it was derived, we can develop a system that automatically selects those documents representing the domain knowledge encapsulated by the models that compute the current spacecraft state. In this manner, when the spacecraft state changes, the relevant documentation context and presentation will also change.

  12. E-nursing documentation as a tool for quality assurance.

    PubMed

    Rajkovic, Vladislav; Sustersic, Olga; Rajkovic, Uros

    2006-01-01

    The article presents the results of a project on the reengineering of nursing documentation. Documentation in nursing is an efficient tool for ensuring quality health care and consequently quality patient treatment along the whole clinical path. We have taken into account the nursing process and patient treatment based on Henderson's theoretical model of nursing, which consists of 14 basic living activities. The model of the new documentation enables tracing, transparency, selectivity, monitoring, and analyses. All these factors lead to improvements in the health system as well as to improved safety of patients and members of nursing teams. The documentation was developed for three health care segments: the secondary and tertiary level, dispensaries, and community health care. The new quality introduced to the documentation process by information and communication technology is presented by a database model and a software prototype for managing documentation.

  13. A model for enhancing Internet medical document retrieval with "medical core metadata".

    PubMed

    Malet, G; Munoz, F; Appleyard, R; Hersh, W

    1999-01-01

    Finding documents on the World Wide Web relevant to a specific medical information need can be difficult. The goal of this work is to define a set of document content description tags, or metadata encodings, that can be used to promote disciplined search access to Internet medical documents. The authors based their approach on a proposed metadata standard, the Dublin Core Metadata Element Set, which has recently been submitted to the Internet Engineering Task Force. Their model also incorporates the National Library of Medicine's Medical Subject Headings (MeSH) vocabulary and MEDLINE-type content descriptions. The model defines a medical core metadata set that can be used to describe the metadata for a wide variety of Internet documents. The authors propose that their medical core metadata set be used to assign metadata to medical documents to facilitate document retrieval by Internet search engines.

  14. A Model for Enhancing Internet Medical Document Retrieval with “Medical Core Metadata”

    PubMed Central

    Malet, Gary; Munoz, Felix; Appleyard, Richard; Hersh, William

    1999-01-01

    Objective: Finding documents on the World Wide Web relevant to a specific medical information need can be difficult. The goal of this work is to define a set of document content description tags, or metadata encodings, that can be used to promote disciplined search access to Internet medical documents. Design: The authors based their approach on a proposed metadata standard, the Dublin Core Metadata Element Set, which has recently been submitted to the Internet Engineering Task Force. Their model also incorporates the National Library of Medicine's Medical Subject Headings (MeSH) vocabulary and Medline-type content descriptions. Results: The model defines a medical core metadata set that can be used to describe the metadata for a wide variety of Internet documents. Conclusions: The authors propose that their medical core metadata set be used to assign metadata to medical documents to facilitate document retrieval by Internet search engines. PMID:10094069
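
    A minimal sketch of how such a medical core metadata set might be emitted as Dublin Core-style meta tags with MeSH subject headings is shown below; the example document, its MeSH terms, and the helper function are hypothetical and are not drawn from the authors' element set.

        # Hypothetical sketch: render Dublin Core-style metadata (with MeSH subjects)
        # for a medical web document as HTML <meta> tags. Field values are invented.
        from html import escape

        record = {
            "DC.title":      "Management of Type 2 Diabetes in Primary Care",
            "DC.creator":    "Example Clinic Guidelines Committee",
            "DC.date":       "1999-01-01",
            "DC.type":       "practice guideline",
            "DC.identifier": "https://example.org/guidelines/t2dm",
            # MeSH headings carried in DC.subject with an explicit scheme attribute
            "DC.subject":    ["Diabetes Mellitus, Type 2", "Primary Health Care"],
        }

        def to_meta_tags(metadata):
            tags = []
            for name, value in metadata.items():
                values = value if isinstance(value, list) else [value]
                for v in values:
                    scheme = ' scheme="MeSH"' if name == "DC.subject" else ""
                    tags.append(f'<meta name="{name}"{scheme} content="{escape(v)}">')
            return "\n".join(tags)

        print(to_meta_tags(record))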

  15. An Action-Based Fine-Grained Access Control Mechanism for Structured Documents and Its Application

    PubMed Central

    Su, Mang; Li, Fenghua; Tang, Zhi; Yu, Yinyan; Zhou, Bo

    2014-01-01

    This paper presents an action-based fine-grained access control mechanism for structured documents. Firstly, we define a describing model for structured documents and analyze the application scenarios. The describing model could support the permission management on chapters, pages, sections, words, and pictures of structured documents. Secondly, based on the action-based access control (ABAC) model, we propose a fine-grained control protocol for structured documents by introducing temporal state and environmental state. The protocol covering different stages from document creation, to permission specification and usage control are given by using the Z-notation. Finally, we give the implementation of our mechanism and make the comparisons between the existing methods and our mechanism. The result shows that our mechanism could provide the better solution of fine-grained access control for structured documents in complicated networks. Moreover, it is more flexible and practical. PMID:25136651
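
    The flavor of an action-based, fine-grained check that also consults temporal and environmental state can be sketched as below; the permission structure, roles, and state checks are invented for illustration and do not reproduce the Z-notation protocol in the paper.

        # Illustrative sketch of an action-based, fine-grained access check on parts of a
        # structured document. Roles, actions, and state checks are invented placeholders.
        from dataclasses import dataclass
        from datetime import datetime, time

        @dataclass(frozen=True)
        class Permission:
            role: str          # who
            action: str        # e.g. "read", "annotate", "print"
            element: str       # document part: "chapter:3", "page:12", "picture:7"

        POLICY = {
            Permission("editor", "annotate", "chapter:3"),
            Permission("reader", "read", "chapter:3"),
        }

        def is_permitted(role, action, element, now, network_zone):
            # Temporal state: only allow access during working hours (assumption).
            if not time(8, 0) <= now.time() <= time(18, 0):
                return False
            # Environmental state: restrict to the internal network (assumption).
            if network_zone != "intranet":
                return False
            return Permission(role, action, element) in POLICY

        print(is_permitted("reader", "read", "chapter:3",
                           datetime(2014, 6, 2, 10, 30), "intranet"))   # True
        print(is_permitted("reader", "print", "chapter:3",
                           datetime(2014, 6, 2, 10, 30), "intranet"))   # False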

  16. An action-based fine-grained access control mechanism for structured documents and its application.

    PubMed

    Su, Mang; Li, Fenghua; Tang, Zhi; Yu, Yinyan; Zhou, Bo

    2014-01-01

    This paper presents an action-based fine-grained access control mechanism for structured documents. Firstly, we define a describing model for structured documents and analyze the application scenarios. The describing model could support the permission management on chapters, pages, sections, words, and pictures of structured documents. Secondly, based on the action-based access control (ABAC) model, we propose a fine-grained control protocol for structured documents by introducing temporal state and environmental state. The protocol covering different stages from document creation, to permission specification and usage control are given by using the Z-notation. Finally, we give the implementation of our mechanism and make the comparisons between the existing methods and our mechanism. The result shows that our mechanism could provide the better solution of fine-grained access control for structured documents in complicated networks. Moreover, it is more flexible and practical.

  17. Three Dimensional Modeling via Photographs for Documentation of a Village Bath

    NASA Astrophysics Data System (ADS)

    Balta, H. B.; Hamamcioglu-Turan, M.; Ocali, O.

    2013-07-01

    The aim of this study is supporting the conceptual discussions of architectural restoration with three dimensional modeling of monuments based on photogrammetric survey. In this study, a 16th century village bath in Ulamış, Seferihisar, and Izmir is modeled for documentation. Ulamış is one of the historical villages within which Turkish population first settled in the region of Seferihisar - Urla. The methodology was tested on an antique monument; a bath with a cubical form. Within the limits of this study, only the exterior of the bath was modeled. The presentation scale for the bath was determined as 1 / 50, considering the necessities of designing structural interventions and architectural ones within the scope of a restoration project. The three dimensional model produced is a realistic document presenting the present situation of the ruin. Traditional plan, elevation and perspective drawings may be produced from the model, in addition to the realistic textured renderings and wireframe representations. The model developed in this study provides opportunity for presenting photorealistic details of historical morphologies in scale. Compared to conventional drawings, the renders based on the 3d models provide an opportunity for conceiving architectural details such as color, material and texture. From these documents, relatively more detailed restitution hypothesis can be developed and intervention decisions can be taken. Finally, the principles derived from the case study can be used for 3d documentation of historical structures with irregular surfaces.

  18. Risk prediction for chronic kidney disease progression using heterogeneous electronic health record data and time series analysis.

    PubMed

    Perotte, Adler; Ranganath, Rajesh; Hirsch, Jamie S; Blei, David; Elhadad, Noémie

    2015-07-01

    As adoption of electronic health records continues to increase, there is an opportunity to incorporate clinical documentation as well as laboratory values and demographics into risk prediction modeling. The authors develop a risk prediction model for chronic kidney disease (CKD) progression from stage III to stage IV that includes longitudinal data and features drawn from clinical documentation. The study cohort consisted of 2908 primary-care clinic patients who had at least three visits prior to January 1, 2013 and developed CKD stage III during their documented history. Development and validation cohorts were randomly selected from this cohort, and the study datasets included longitudinal inpatient and outpatient data from these populations. Time series analysis (Kalman filter) and survival analysis (Cox proportional hazards) were combined to produce a range of risk models. These models were evaluated using concordance, a discriminatory statistic. A risk model incorporating longitudinal data on clinical documentation and laboratory test results (concordance 0.849) predicts progression from stage III CKD to stage IV CKD more accurately when compared to a similar model without laboratory test results (concordance 0.733, P < .001), a model that only considers the most recent laboratory test results (concordance 0.819, P < .031), and a model based on estimated glomerular filtration rate (concordance 0.779, P < .001). A risk prediction model that takes longitudinal laboratory test results and clinical documentation into consideration can predict CKD progression from stage III to stage IV more accurately than three models that do not take all of these variables into consideration.
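
    The survival-analysis half of such a pipeline (the Cox proportional hazards step; the Kalman-filter smoothing of longitudinal labs is omitted) can be sketched with the lifelines package as below; the column names and the synthetic data frame are assumptions, not the study cohort.

        # Sketch of the Cox proportional hazards step only (longitudinal Kalman-filter
        # features omitted). The data frame below is synthetic, not the CKD cohort.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(7)
        n = 500
        df = pd.DataFrame({
            "egfr_slope": rng.normal(-1.0, 0.5, n),      # hypothetical smoothed lab trend
            "note_topic": rng.normal(0.0, 1.0, n),       # hypothetical documentation feature
            "age":        rng.integers(40, 85, n),
            "months_to_event_or_censor": rng.exponential(36, n),
            "progressed_to_stage4":      rng.integers(0, 2, n),
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="months_to_event_or_censor",
                event_col="progressed_to_stage4")
        print("concordance on training data:", round(cph.concordance_index_, 3))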

  19. Robust Reading: Identification and Tracing of Ambiguous Names

    DTIC Science & Technology

    2004-01-01

    document (InDoc, 15% of the pairs) or not (InterDoc). Example 5.1: "Sherman Williams" is mentioned along with the baseball team "Dallas Cowboys" in 8 out of ... 300 documents, while "Jeff Williams" is mentioned along with "LA Dodgers" in two documents. In all models but Model III, "Jeff Williams" is judged to ... former. Only Model III, due to the co-occurring dependency between "Jeff Williams" and "Dodgers", identifies it as corresponding to an entity

  20. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  1. Forecast of long term coal supply and mining conditions: Model documentation and results

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A coal industry model was developed to support the Jet Propulsion Laboratory in its investigation of advanced underground coal extraction systems. The model documentation includes the programming for the coal mining cost models and an accompanying users' manual, and a guide to reading model output. The methodology used in assembling the transportation, demand, and coal reserve components of the model are also described. Results presented for 1986 and 2000, include projections of coal production patterns and marginal prices, differentiated by coal sulfur content.

  2. Aircraft/Air Traffic Management Functional Analysis Model. Version 2.0; User's Guide

    NASA Technical Reports Server (NTRS)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a guide for using the model in analysis. Those interested in making enhancements or modifications to the model should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Technical Description.
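
    As a generic illustration of the discrete-event style of such a model (not FAM 2.0 itself), the sketch below uses the simpy package to push randomly arriving aircraft through a single-runway resource and reports the average queueing delay; all rates and resource counts are invented.

        # Generic discrete-event sketch (not FAM 2.0): aircraft arrive at random and
        # queue for a single runway; the run reports the mean queueing delay.
        import random
        import simpy

        RANDOM_SEED, SIM_MINUTES = 42, 480
        ARRIVAL_MEAN, SERVICE_MEAN = 4.0, 3.0          # minutes (assumed rates)
        delays = []

        def aircraft(env, runway):
            arrival = env.now
            with runway.request() as slot:
                yield slot                             # wait for the runway
                delays.append(env.now - arrival)
                yield env.timeout(random.expovariate(1.0 / SERVICE_MEAN))

        def generator(env, runway):
            while True:
                yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
                env.process(aircraft(env, runway))

        random.seed(RANDOM_SEED)
        env = simpy.Environment()
        runway = simpy.Resource(env, capacity=1)
        env.process(generator(env, runway))
        env.run(until=SIM_MINUTES)
        print("aircraft served:", len(delays), "mean delay (min):",
              round(sum(delays) / len(delays), 2))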

  3. Solid waste projection model: Model version 1. 0 technical reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkins, M.L.; Crow, V.L.; Buska, D.E.

    1990-11-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software utilized in developing Version 1.0 of the modeling unit of SWPM. This document is intended for use by experienced software engineers and supports programming, code maintenance, and model enhancement. Those interested in using SWPM should refer to the SWPM Model User's Guide. This document is available from either the PNL project manager (D. L. Stiles, 509-376-4154) or the WHC program monitor (B. C. Anderson, 509-373-2796). 8 figs.

  4. Documenting Climate Models and Their Simulations

    DOE PAGES

    Guilyardi, Eric; Balaji, V.; Lawrence, Bryan; ...

    2013-05-01

    The results of climate models are of increasing and widespread importance. No longer is climate model output of sole interest to climate scientists and researchers in the climate change impacts and adaptation fields. Now nonspecialists such as government officials, policy makers, and the general public all have an increasing need to access climate model output and understand its implications. For this host of users, accurate and complete metadata (i.e., information about how and why the data were produced) is required to document the climate modeling results. We describe a pilot community initiative to collect and make available documentation of climate models and their simulations. In an initial application, a metadata repository is being established to provide information of this kind for a major internationally coordinated modeling activity known as CMIP5 (Coupled Model Intercomparison Project, Phase 5). We expect that, for a wide range of stakeholders, this and similar community-managed metadata repositories will spur development of analysis tools that facilitate discovery and exploitation of Earth system simulations.

  5. Training of Existing Workers: Issues, Incentives and Models. Support Document

    ERIC Educational Resources Information Center

    Mawer, Giselle; Jackson, Elaine

    2005-01-01

    This document was produced by the authors based on their research for the report, "Training of Existing Workers: Issues, Incentives and Models," (ED495138) and is an added resource for further information. This support document is divided into the following sections: (1) The Retail Industry--A Snapshot; (2) Case Studies--Hardware, Retail…

  6. The Earth System Documentation (ES-DOC) project

    NASA Astrophysics Data System (ADS)

    Murphy, S.; Greenslade, M. A.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high quality tools and services in support of Earth system documentation creation, analysis and dissemination. It is nurturing a sustainable, standards-based documentation ecosystem that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation eco-system. Within this context ES-DOC leverages the emerging Common Information Model (CIM) metadata standard, which has supported the following projects: the Coupled Model Inter-comparison Project Phase 5 (CMIP5); the Dynamical Core Model Inter-comparison Project (DCMIP-2012); and the National Climate Predictions and Projections Platforms (NCPP) Quantitative Evaluation of Downscaling Workshop (QED-2013). This presentation will introduce the project to a wider audience and will demonstrate the current production-level capabilities of the eco-system: an ESM documentation Viewer embeddable into any website; an ESM Questionnaire configurable on a project-by-project basis; an ESM comparison tool reusable across projects; an ESM visualization tool reusable across projects; a search engine for speedily accessing published documentation; and libraries for streamlining document creation, validation and publishing pipelines.

  7. Model Child Care Health Policies. Fourth Edition.

    ERIC Educational Resources Information Center

    Aronson, Susan S.

    Drawn from a review of policies at over 100 child care programs nationwide, this document compiles model health policies intended for adaptation and selective use by out-of-home child care facilities. Following an introduction, the document presents model policy forms with blanks for adding individualized information for the following areas: (1)…

  8. Dynamic Gate Product and Artifact Generation from System Models

    NASA Technical Reports Server (NTRS)

    Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris

    2011-01-01

    Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("Power-Point Engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "Power-Point Engineering" from model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one-another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and production of paper-based documents or their "office-productivity" file equivalents.
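
    The idea of generating review documents as views over a single integrated model (here grossly simplified relative to the SysML/Viewpoint machinery described above) can be sketched with a small model dictionary and a jinja2 template; the model content and templates are invented.

        # Grossly simplified sketch of "documents as views of a model": one in-memory
        # model, two generated textual views. The model content and templates are invented.
        from jinja2 import Template

        model = {
            "system": "Sample Orbiter",
            "components": [
                {"name": "Power Subsystem",   "mass_kg": 42.0, "requirements": ["PWR-001", "PWR-007"]},
                {"name": "Thermal Subsystem", "mass_kg": 17.5, "requirements": ["THM-003"]},
            ],
        }

        design_view = Template(
            "Design Description: {{ m.system }}\n"
            "{% for c in m.components %}- {{ c.name }} ({{ c.mass_kg }} kg)\n{% endfor %}"
        )
        traceability_view = Template(
            "Requirement Allocation: {{ m.system }}\n"
            "{% for c in m.components %}{% for r in c.requirements %}"
            "{{ r }} -> {{ c.name }}\n"
            "{% endfor %}{% endfor %}"
        )

        # Both documents are regenerated from the same model, so they cannot drift apart.
        print(design_view.render(m=model))
        print(traceability_view.render(m=model))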

  9. 2014 Version 7.0 Technical Support Document (TSD)

    EPA Pesticide Factsheets

    The 2014 Version 7 document describes the processing of emission inventories into inputs for the Community Multiscale Air Quality model for use in the 2014 National Air Toxics Assessment initial modeling.

  10. Generation IV benchmarking of TRISO fuel performance models under accident conditions: Modeling input data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collin, Blaise P.

    2014-09-01

    This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: the modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release; the modeling of the AGR-1 and HFR-EU1bis safety testing experiments; and the comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, hereafter named NCC (Numerical Calculation Case), is derived from ''Case 5'' of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. ''Case 5'' of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to ''effects of the numerical calculation method rather than the physical model'' [IAEA 2012]. The NCC is therefore intended to check if these numerical effects subsist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other. The participants should read this document thoroughly to make sure all the data needed for their calculations is provided in the document. Missing data will be added to a revision of the document if necessary.

  11. The Earth System Documentation (ES-DOC) Project

    NASA Astrophysics Data System (ADS)

    Greenslade, Mark; Murphy, Sylvia; Treshansky, Allyn; DeLuca, Cecilia; Guilyardi, Eric; Denvil, Sebastien

    2014-05-01

    Earth System Documentation (ES-DOC) is an international project supplying tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable standards based documentation eco-system that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software and places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation eco-system. Within this context ES-DOC leverages emerging documentation standards and supports the following projects: Coupled Model Inter-comparison Project Phase 5 (CMIP5); Dynamical Core Model Inter-comparison Project (DCMIP); National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This presentation will introduce the project to a wider audience and demonstrate the range of tools and services currently available for use. It will also demonstrate how international collaborative efforts are essential to the success of ES-DOC.

  12. Fundamental Travel Demand Model Example

    NASA Technical Reports Server (NTRS)

    Hanssen, Joel

    2010-01-01

    Instances of transportation models are abundant and detailed "how to" instruction is available in the form of transportation software help documentation. The purpose of this paper is to look at the fundamental inputs required to build a transportation model by developing an example passenger travel demand model. The example model reduces the scale to a manageable size for the purpose of illustrating the data collection and analysis required before the first step of the model begins. This aspect of the model development would not reasonably be discussed in software help documentation (it is assumed the model developer comes prepared). Recommendations are derived from the example passenger travel demand model to suggest future work regarding the data collection and analysis required for a freight travel demand model.
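
    One of the fundamental modeling steps such an example walks toward, trip distribution, is often illustrated with a gravity model; the sketch below balances a tiny production/attraction table with an inverse-distance-squared impedance, using invented zone data.

        # Minimal gravity-model trip distribution sketch with invented zone data.
        # T[i, j] ~ productions[i] * attractions[j] * f(cost[i, j]), row-balanced.
        import numpy as np

        productions = np.array([800.0, 500.0, 300.0])          # trips produced per zone
        attractions = np.array([600.0, 700.0, 300.0])          # trips attracted per zone
        cost = np.array([[2.0, 5.0, 9.0],                      # zone-to-zone travel times
                         [5.0, 3.0, 6.0],
                         [9.0, 6.0, 2.0]])

        friction = cost ** -2.0                                 # simple impedance function
        weights = attractions * friction                        # attraction weighted by impedance
        trips = productions[:, None] * weights / weights.sum(axis=1, keepdims=True)

        print(np.round(trips, 1))
        print("row sums match productions:", np.allclose(trips.sum(axis=1), productions))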

  13. Directory of Energy Information Administration Model Abstracts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-07-16

    This directory partially fulfills the requirements of Section 8c of the documentation order, which states in part that: The Office of Statistical Standards will annually publish an EIA document based on the collected abstracts and the appendices. This report contains brief statements about each model's title, acronym, purpose, and status, followed by more detailed information on characteristics, uses, and requirements. Sources for additional information are identified. All models active through March 1985 are included. The main body of this directory is an alphabetical list of all active EIA models. Appendix A identifies major EIA modeling systems and the models within these systems, and Appendix B identifies active EIA models by type (basic, auxiliary, and developing). EIA also leases models developed by proprietary software vendors. Documentation for these proprietary models is the responsibility of the companies from which they are leased. EIA has recently leased models from Chase Econometrics, Inc., Data Resources, Inc. (DRI), the Oak Ridge National Laboratory (ORNL), and Wharton Econometric Forecasting Associates (WEFA). Leased models are not abstracted here. The directory is intended for the use of energy and energy-policy analysts in the public and private sectors.

  14. Document page structure learning for fixed-layout e-books using conditional random fields

    NASA Astrophysics Data System (ADS)

    Tao, Xin; Tang, Zhi; Xu, Canhui

    2013-12-01

    In this paper, a model is proposed to learn the logical structure of fixed-layout document pages by combining a support vector machine (SVM) and conditional random fields (CRF). Features related to each logical label and their dependencies are extracted from various original Portable Document Format (PDF) attributes. Both local evidence and contextual dependencies are integrated in the proposed model so as to achieve better logical labeling performance. With the merits of the SVM as a local discriminative classifier and the CRF modeling contextual correlations of adjacent fragments, the model is capable of resolving ambiguities in semantic labels. The experimental results show that CRF-based models with both tree and chain graph structures outperform the SVM model, with an increase of macro-averaged F1 of about 10%.
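
    A rough sketch of the chain-CRF labeling step (using the third-party sklearn-crfsuite package rather than the authors' SVM+CRF combination) is shown below; the page fragments, features, and labels are invented.

        # Rough chain-CRF sketch for logical labeling of page fragments.
        # Uses sklearn-crfsuite; fragments, features, and labels are invented examples.
        import sklearn_crfsuite

        def features(fragment):
            # Local evidence derived from (invented) PDF attributes of the fragment.
            return {
                "font_size": fragment["font_size"],
                "bold": fragment["bold"],
                "starts_with_digit": fragment["text"][0].isdigit(),
            }

        pages = [
            [{"text": "1 Introduction", "font_size": 18, "bold": True},
             {"text": "This chapter describes...", "font_size": 10, "bold": False}],
            [{"text": "2 Methods", "font_size": 18, "bold": True},
             {"text": "We first collected...", "font_size": 10, "bold": False}],
        ]
        labels = [["heading", "body"], ["heading", "body"]]

        X = [[features(frag) for frag in page] for page in pages]
        crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
        crf.fit(X, labels)
        print(crf.predict(X))   # contextual dependencies between adjacent fragments are modeled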

  15. Model documentation report: Residential sector demand module of the national energy modeling system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code. This reference document provides a detailed description for energy analysts, other users, and the public. The NEMS Residential Sector Demand Module is currently used for mid-term forecasting purposes and energy policy analysis over the forecast horizon of 1993 through 2020. The model generates forecasts of energy demand for the residential sector by service, fuel, and Census Division. Policy impacts resulting from new technologies, market incentives, and regulatory changes can be estimated using the module. 26 refs., 6 figs., 5 tabs.

  16. Volume I: fluidized-bed code documentation, for the period February 28, 1983-March 18, 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piperopoulou, H.; Finson, M.; Bloomfield, D.

    1983-03-01

    This documentation supersedes the previous documentation of the Fluidized-Bed Gasifier code. Volume I documents a simulation program of a Fluidized-Bed Gasifier (FBG), and Volume II documents a systems model of the FBG. The FBG simulation program is an updated version of the PSI/FLUBED code, which is capable of modeling slugging beds and variable bed diameter. In its present form the code is set up to model a Westinghouse commercial scale gasifier. The fluidized bed gasifier model combines the classical bubbling bed description for the transport and mixing processes with PSI-generated models for coal chemistry. At the distributor plate, the bubble composition is that of the inlet gas and the initial bubble size is set by the details of the distributor plate. Bubbles grow by coalescence as they rise. The bubble composition and temperature change with height due to transport to and from the cloud as well as homogeneous reactions within the bubble. The cloud composition also varies with height due to cloud/bubble exchange, cloud/emulsion exchange, and heterogeneous coal char reactions. The emulsion phase is considered to be well mixed.

  17. Model-Based Systems

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    2007-01-01

    Engineers who design systems using text specification documents focus their work on the completed system to meet performance, time, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system, and more difficult still as several systems are combined into higher-level systems, are maintained over decades, and evolve technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in time and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data, interrelating it to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. Because the consistency and integrity of the model are assured, the consistency and integrity of the various specification documents are assured as well. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed, and how those needs are being addressed by international standards writing teams.

  18. Documentation of the GLAS fourth order general circulation model. Volume 2: Scalar code

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, D.; Pfaendtner, J.; Takacs, L.; Takano, K.

    1983-01-01

    Volume 2 of a three-volume technical memorandum contains detailed documentation of the GLAS fourth-order general circulation model. It covers the CYBER 205 scalar and vector codes of the model, a list of variables, and cross references. A variable name dictionary for the scalar code and the code listings are also provided.

  19. Difficult to Document: The History of Physics and Allied Fields in Industrial and Government Labs

    ERIC Educational Resources Information Center

    Anderson, R. Joseph

    2005-01-01

    Approximately thirty years ago archivists began formulating new models to guide archival collecting, creating a literature that continues to grow. In the mid-1980s, the introduction of the documentation strategy collection model put new emphasis on cooperation between repositories and among stakeholders. The model initially focused on the history…

  20. Models of the Behavior of People Searching the Internet: A Petri Net Approach.

    ERIC Educational Resources Information Center

    Kantor, Paul B.; Nordlie, Ragnar

    1999-01-01

    Illustrates how various key abstractions of information finding, such as document relevance, a desired number of relevant documents, discouragement, exhaustion, and satisfaction can be modeled using the Petri Net framework. Shows that this model leads naturally to a new approach to collection of user data, and to analysis of transaction logs.…
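
    The sketch below shows, in rough terms, how a Petri net can encode a searcher's state: places hold tokens, transitions fire when their input places are marked, and the run ends in either a "satisfied" or a "gave up" marking. It is only a toy illustration of the formalism, not Kantor and Nordlie's model; all places, transitions, and probabilities are invented.

```python
import random

# Minimal Petri-net sketch (illustrative only). Places hold tokens; a transition fires
# when all of its input places hold enough tokens. Tokens loosely stand for
# "willingness to continue" and "relevant documents found so far".

places = {"searching": 1, "relevant_found": 0, "patience": 5, "satisfied": 0, "gave_up": 0}

transitions = {
    # name: (tokens consumed, tokens produced)
    "view_relevant":   ({"searching": 1},                       {"searching": 1, "relevant_found": 1}),
    "view_irrelevant": ({"searching": 1, "patience": 1},        {"searching": 1}),
    "satisfy":         ({"searching": 1, "relevant_found": 3},  {"satisfied": 1}),
    "exhaust":         ({"searching": 1},                       {"gave_up": 1}),  # only when patience == 0
}

def enabled(name):
    inputs, _ = transitions[name]
    if name == "exhaust":  # plays the role of an inhibitor arc on "patience"
        return places["searching"] >= 1 and places["patience"] == 0
    return all(places[p] >= n for p, n in inputs.items())

def fire(name):
    inputs, outputs = transitions[name]
    for p, n in inputs.items():
        places[p] -= n
    for p, n in outputs.items():
        places[p] += n

random.seed(0)
while places["satisfied"] == 0 and places["gave_up"] == 0:
    if enabled("satisfy"):
        fire("satisfy")
    elif enabled("exhaust"):
        fire("exhaust")
    else:
        # each viewed document is relevant with some probability
        fire("view_relevant" if random.random() < 0.4 else "view_irrelevant")

print(places)  # final marking: either satisfied or gave up
```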

  1. 3D documentation and visualization of external injury findings by integration of simple photography in CT/MRI data sets (IprojeCT).

    PubMed

    Campana, Lorenzo; Breitbeck, Robert; Bauer-Kreuz, Regula; Buck, Ursula

    2016-05-01

    This study evaluated the feasibility of documenting patterned injury using three dimensions and true colour photography without complex 3D surface documentation methods. This method is based on a generated 3D surface model using radiologic slice images (CT) while the colour information is derived from photographs taken with commercially available cameras. The external patterned injuries were documented in 16 cases using digital photography as well as highly precise photogrammetry-supported 3D structured light scanning. The internal findings of these deceased were recorded using CT and MRI. For registration of the internal with the external data, two different types of radiographic markers were used and compared. The 3D surface model generated from CT slice images was linked with the photographs, and thereby digital true-colour 3D models of the patterned injuries could be created (Image projection onto CT/IprojeCT). In addition, these external models were merged with the models of the somatic interior. We demonstrated that 3D documentation and visualization of external injury findings by integration of digital photography in CT/MRI data sets is suitable for the 3D documentation of individual patterned injuries to a body. Nevertheless, this documentation method is not a substitution for photogrammetry and surface scanning, especially when the entire bodily surface is to be recorded in three dimensions including all external findings, and when precise data is required for comparing highly detailed injury features with the injury-inflicting tool.

  2. Toward Scientific Numerical Modeling

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2007-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and verifying that numerical models are translated into code correctly, however, are necessary first steps toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. To address these two shortcomings, two proposals are offered: (1) an unobtrusive mechanism to document input parameter uncertainties in situ and (2) an adaptation of the Scientific Method to numerical model development and deployment. Because these two steps require changes in the computational simulation community to bear fruit, they are presented in terms of the Beckhard-Harris-Gleicher change model.
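
    One simple way to picture "in situ" documentation of input parameter uncertainties is to attach the uncertainty and its provenance to the parameter object itself, so it travels with the value wherever the model uses it. The sketch below is an assumption-laden illustration of that idea, not the mechanism proposed in the paper; all parameter names, values, and sources are hypothetical.

```python
from dataclasses import dataclass

# Illustrative sketch only: document input-parameter uncertainty where the parameter
# is defined, rather than in a separate report. Not the paper's proposed mechanism.

@dataclass(frozen=True)
class Param:
    value: float
    uncertainty: float           # e.g. a one-standard-deviation estimate
    units: str = ""
    source: str = "unspecified"  # provenance of both the value and its uncertainty

    def __str__(self):
        return f"{self.value} ± {self.uncertainty} {self.units} ({self.source})"

# A model input block that documents itself (names and numbers are hypothetical):
INPUTS = {
    "wall_temperature": Param(300.0, 15.0, "K", "assumed facility estimate"),
    "inflow_mach":      Param(6.0, 0.05, "", "assumed tunnel calibration"),
}

for name, p in INPUTS.items():
    print(f"{name}: {p}")
```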

  3. A Conceptual Model for Multidimensional Analysis of Documents

    NASA Astrophysics Data System (ADS)

    Ravat, Franck; Teste, Olivier; Tournier, Ronan; Zurlfluh, Gilles

    Data warehousing and OLAP are mainly used for the analysis of transactional data. Nowadays, with the evolution of the Internet and the development of semi-structured data exchange formats (such as XML), it is possible to consider entire fragments of data, such as documents, as analysis sources. As a consequence, an adapted multidimensional analysis framework needs to be provided. In this paper, we introduce an OLAP multidimensional conceptual model without facts. This model is based on the unique concept of dimensions and is adapted for multidimensional document analysis. We also provide a set of manipulation operations.

  4. Directory of Energy Information Administration model abstracts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-08-11

    This report contains brief statements from the model managers about each model's title, acronym, purpose, and status, followed by more detailed information on characteristics, uses, and requirements. Sources for additional information are identified. All models "active" through March 1987 are included. The main body of this directory is an alphabetical list of all active EIA models. Appendix A identifies major EIA modeling systems and the models within these systems, and Appendix B identifies active EIA models by type (basic, auxiliary, and developing). A basic model is one designated by the EIA Administrator as being sufficiently important to require sustained support and public scrutiny. An auxiliary model is one designated by the EIA Administrator as being used only occasionally in analyses, and therefore requires minimal levels of documentation. A developing model is one designated by the EIA Administrator as being under development and yet of sufficient interest to require a basic level of documentation at a future date. EIA also leases models developed by proprietary software vendors. Documentation for these "proprietary" models is the responsibility of the companies from which they are leased. EIA has recently leased models from Chase Econometrics, Inc., Data Resources, Inc. (DRI), the Oak Ridge National Laboratory (ORNL), and Wharton Econometric Forecasting Associates (WEFA). Leased models are not abstracted here. The directory is intended for the use of energy and energy-policy analysts in the public and private sectors.

  5. Ecological models supporting environmental decision making: a strategy for the future

    USGS Publications Warehouse

    Schmolke, Amelie; Thorbek, Pernille; DeAngelis, Donald L.; Grimm, Volker

    2010-01-01

    Ecological models are important for environmental decision support because they allow the consequences of alternative policies and management scenarios to be explored. However, current modeling practice is unsatisfactory. A literature review shows that the elements of good modeling practice have long been identified but are widely ignored. The reasons for this might include lack of involvement of decision makers, lack of incentives for modelers to follow good practice, and the use of inconsistent terminologies. As a strategy for the future, we propose a standard format for documenting models and their analyses: transparent and comprehensive ecological modeling (TRACE) documentation. This standard format will disclose all parts of the modeling process to scrutiny and make modeling itself more efficient and coherent.

  6. GEN-IV Benchmarking of Triso Fuel Performance Models under accident conditions modeling input data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collin, Blaise Paul

    This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: • The modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release. • The modeling of the AGR-1 and HFR-EU1bis safety testing experiments. • The comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, thereafter named NCC (Numerical Calculation Case), is derived from “Case 5” of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. “Case 5” of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to “effects of the numerical calculation method rather than the physical model” [IAEA 2012]. The NCC is therefore intended to check if these numerical effects subsist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other. The participants should read this document thoroughly to make sure all the data needed for their calculations is provided in the document. Missing data will be added to a revision of the document if necessary. 09/2016: Tables 6 and 8 updated. AGR-2 input data added.

  7. Certificate management entities for a connected vehicle environment : public workshop read-ahead document.

    DOT National Transportation Integrated Search

    2012-04-06

    This document presents an overview of work conducted to date around development and analysis of organizational and operational models for certificate management in the connected vehicle environment. Functions, organizational models, technical backgro...

  8. 2005 v4.3 Technical Support Document

    EPA Pesticide Factsheets

    Emissions Modeling for the Final Mercury and Air Toxics Standards Technical Support Document describes how updated 2005 NEI, version 2 emissions were processed for air quality modeling in support of the final Mercury and Air Toxics Standards (MATS).

  9. Analysis of rocket engine injection combustion processes

    NASA Technical Reports Server (NTRS)

    Salmon, J. W.

    1976-01-01

    A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models and of the application of the models to correlation of well-documented hot-fire engine databases. These programs are the distributed energy release (DER) model for conventional liquid-propellant injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies, while the computer analyses provide quantitative data on predictive accuracy. The program comprises three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.

  10. The dynamic development of the muzzle imprint by contact gunshot: high-speed documentation utilizing the "skin-skull-brain model".

    PubMed

    Thali, M J; Kneubuehl, B P; Dirnhofer, R; Zollinger, U

    2002-07-17

    Many contact gunshots produce a muzzle imprint in the skin of the victim. Different mechanisms have been discussed in the literature as being responsible for the creation of the muzzle imprint. Experimenting on the synthetic, non-biological skin-skull-brain model, our goal was to document and study the creation of the muzzle imprint with the aid of high-speed photography. In our experiments, we were able to document with high-speed photography (at exposure times in the range of nanoseconds) the bulging, the pressing against the muzzle, and the splitting of the artificial skin. Furthermore, it was possible to photographically record the back pattern of synthetic tissue particles, and the soot and gunpowder cavity could be reproduced experimentally. In conclusion, the experiments completed with the skin-skull-brain model, using high-speed photography for documentation, show the promising possibilities of experimental ballistics with body models.

  11. 3D Documentation and BIM Modeling of Cultural Heritage Structures Using UAVs: The Case of the Foinikaria Church

    NASA Astrophysics Data System (ADS)

    Themistocleous, K.; Agapiou, A.; Hadjimitsis, D.

    2016-10-01

    The documentation of architectural cultural heritage sites has traditionally been expensive and labor-intensive. New innovative technologies, such as Unmanned Aerial Vehicles (UAVs), provide an affordable, reliable and straightforward method of capturing cultural heritage sites, thereby providing a more efficient and sustainable approach to documentation of cultural heritage structures. In this study, hundreds of images of the Panagia Chryseleousa church in Foinikaria, Cyprus, were taken using a UAV with an attached high-resolution camera. The images were processed to generate an accurate digital 3D model using Structure from Motion techniques. Building Information Modeling (BIM) was then used to generate drawings of the church. The methodology described in the paper provides an accurate, simple and cost-effective method of documenting cultural heritage sites and generating digital 3D models using novel techniques and innovative methods.

  12. Documenting Models for Interoperability and Reusability (proceedings)

    EPA Science Inventory

    Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration be...

  13. Documenting Models for Interoperability and Reusability

    EPA Science Inventory

    Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration be...

  14. Supporting Air and Space Expeditionary Forces: Analysis of Combat Support Basing Options

    DTIC Science & Technology

    2004-01-01

    Only fragments of the abstract survive in the scanned source: this RAND Project AIR FORCE report (available from www.rand.org) analyzes combat support basing options, and its analysis methodology references a Set Covering model (see Daskin, 1995) and a transportation model.

  15. Methods, media, and systems for detecting attack on a digital processing device

    DOEpatents

    Stolfo, Salvatore J.; Li, Wei-Jen; Keromytis, Angelos D.; Androulaki, Elli

    2014-07-22

    Methods, media, and systems for detecting attack are provided. In some embodiments, the methods include: comparing at least part of a document to a static detection model; determining whether attacking code is included in the document based on the comparison of the document to the static detection model; executing at least part of the document; determining whether attacking code is included in the document based on the execution of the at least part of the document; and if attacking code is determined to be included in the document based on at least one of the comparison of the document to the static detection model and the execution of the at least part of the document, reporting the presence of an attack. In some embodiments, the methods include: selecting a data segment in at least one portion of an electronic document; determining whether the arbitrarily selected data segment can be altered without causing the electronic document to result in an error when processed by a corresponding program; in response to determining that the arbitrarily selected data segment can be altered, arbitrarily altering the data segment in the at least one portion of the electronic document to produce an altered electronic document; and determining whether the corresponding program produces an error state when the altered electronic document is processed by the corresponding program.
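
    To make the second family of methods concrete, the sketch below arbitrarily overwrites a chosen byte segment of a document and checks whether a "corresponding program" still processes it without error; segments that tolerate arbitrary alteration are the ones where foreign content could hide. A JSON parser stands in for the corresponding program purely for the sake of a runnable example; the patent's methods target richer document formats and detection models that are not reproduced here.

```python
import json

# Sketch: arbitrarily alter a selected data segment and check whether the
# "corresponding program" still processes the document without error. A JSON
# parser is used here as a stand-in for that program.

def parses_cleanly(data: bytes) -> bool:
    try:
        json.loads(data.decode("utf-8"))
        return True
    except (UnicodeDecodeError, ValueError):
        return False

def segment_tolerates_alteration(doc: bytes, start: int, length: int) -> bool:
    """Return True if overwriting doc[start:start+length] does not break parsing.
    Segments that tolerate arbitrary alteration are candidate hiding places for
    foreign (possibly attacking) content."""
    altered = bytearray(doc)
    altered[start:start + length] = b"X" * length   # arbitrary overwrite
    return parses_cleanly(bytes(altered))

doc = b'{"title": "report", "body": "quarterly numbers", "checksum": 12345}'
# Altering bytes inside a string value keeps the document well-formed...
print(segment_tolerates_alteration(doc, start=11, length=6))   # True
# ...while altering a structural byte (the opening brace) breaks it.
print(segment_tolerates_alteration(doc, start=0, length=1))    # False
```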

  16. Methods, media, and systems for detecting attack on a digital processing device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stolfo, Salvatore J.; Li, Wei-Jen; Keromytis, Angelos D.

    Methods, media, and systems for detecting attack are provided. In some embodiments, the methods include: comparing at least part of a document to a static detection model; determining whether attacking code is included in the document based on the comparison of the document to the static detection model; executing at least part of the document; determining whether attacking code is included in the document based on the execution of the at least part of the document; and if attacking code is determined to be included in the document based on at least one of the comparison of the document to the static detection model and the execution of the at least part of the document, reporting the presence of an attack. In some embodiments, the methods include: selecting a data segment in at least one portion of an electronic document; determining whether the arbitrarily selected data segment can be altered without causing the electronic document to result in an error when processed by a corresponding program; in response to determining that the arbitrarily selected data segment can be altered, arbitrarily altering the data segment in the at least one portion of the electronic document to produce an altered electronic document; and determining whether the corresponding program produces an error state when the altered electronic document is processed by the corresponding program.

  17. Study of the Performance of Aids to Navigation Systems - Phase 1, An Empirical Model Approach

    DTIC Science & Technology

    1978-07-19

    The scanned abstract is largely illegible. Recoverable fragments indicate that the document is available to the U.S. public through the National Technical Information Service; that its subject keywords include piloting, fix, navigator, pilot, Monte Carlo model, and ship simulator; and that its contents include validation of the entire navigating and steering model and an overview of model capabilities and achieved goals.

  18. TKKMOD: A computer simulation program for an integrated wind diesel system. Version 1.0: Document and user guide

    NASA Astrophysics Data System (ADS)

    Manninen, L. M.

    1993-12-01

    The document describes TKKMOD, a simulation model developed at Helsinki University of Technology for a specific wind-diesel system layout, with special emphasis on the battery submodel and its use in simulation. The model has been included into the European wind-diesel modeling software package WDLTOOLS under the CEC JOULE project 'Engineering Design Tools for Wind-Diesel Systems' (JOUR-0078). WDLTOOLS serves as the user interface and processes the input and output data of different logistic simulation models developed by the project participants. TKKMOD cannot be run without this shell. The report only describes the simulation principles and model specific parameters of TKKMOD and gives model specific user instructions. The input and output data processing performed outside this model is described in the documentation of the shell. The simulation model is utilized for calculation of long-term performance of the reference system configuration for given wind and load conditions. The main results are energy flows, losses in the system components, diesel fuel consumption, and the number of diesel engine starts.

  19. Farm-Level Effects of Soil Conservation and Commodity Policy Alternatives: Model and Data Documentation.

    ERIC Educational Resources Information Center

    Sutton, John D.

    This report documents a profit-maximizing linear programming (LP) model of a farm typical of a major corn-soybean producing area in the Southern Michigan-Northern Indiana Drift Plain. Following an introduction, a complete description of the farm is provided. The next section presents the LP model, which is structured to help analyze after-tax…

  20. Air Quality Modeling Technical Support Document for the Final Cross State Air Pollution Rule Update

    EPA Pesticide Factsheets

    In this technical support document (TSD) we describe the air quality modeling performed to support the final Cross State Air Pollution Rule for the 2008 ozone National Ambient Air Quality Standards (NAAQS).

  1. A practical guide on DTA model applications for regional planning

    DOT National Transportation Integrated Search

    2016-06-07

    This document is intended as a guide for use by Metropolitan Planning Organizations (MPO) and other planning agencies that are interested in applying Dynamic Traffic Assignment (DTA) models for planning applications. The objective of this document is...

  2. Common data model for natural language processing based on two existing standard information models: CDA+GrAF.

    PubMed

    Meystre, Stéphane M; Lee, Sanghoon; Jung, Chai Young; Chevrier, Raphaël D

    2012-08-01

    An increasing need for collaboration and resources sharing in the Natural Language Processing (NLP) research and development community motivates efforts to create and share a common data model and a common terminology for all information annotated and extracted from clinical text. We have combined two existing standards: the HL7 Clinical Document Architecture (CDA), and the ISO Graph Annotation Format (GrAF; in development), to develop such a data model entitled "CDA+GrAF". We experimented with several methods to combine these existing standards, and eventually selected a method wrapping separate CDA and GrAF parts in a common standoff annotation (i.e., separate from the annotated text) XML document. Two use cases, clinical document sections, and the 2010 i2b2/VA NLP Challenge (i.e., problems, tests, and treatments, with their assertions and relations), were used to create examples of such standoff annotation documents, and were successfully validated with the XML schemata provided with both standards. We developed a tool to automatically translate annotation documents from the 2010 i2b2/VA NLP Challenge format to GrAF, and automatically generated 50 annotation documents using this tool, all successfully validated. Finally, we adapted the XSL stylesheet provided with HL7 CDA to allow viewing annotation XML documents in a web browser, and plan to adapt existing tools for translating annotation documents between CDA+GrAF and the UIMA and GATE frameworks. This common data model may ease directly comparing NLP tools and applications, combining their output, transforming and "translating" annotations between different NLP applications, and eventually "plug-and-play" of different modules in NLP applications. Copyright © 2011 Elsevier Inc. All rights reserved.
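
    The key mechanism described, standoff annotation, keeps the clinical text separate from the annotations, which point back into it by character offsets. The sketch below builds a toy standoff XML document in that spirit; the element and attribute names are invented for illustration and are not the CDA or GrAF schemas, nor the CDA+GrAF wrapper defined by the authors.

```python
import xml.etree.ElementTree as ET

# Toy standoff annotation: the clinical text lives elsewhere and the annotation
# document only points at it by character offsets. All element and attribute
# names are invented for this example.

clinical_text = "Patient denies chest pain. Started aspirin 81 mg daily."

root = ET.Element("annotatedDocument", {"textURI": "note-001.txt"})
header_part = ET.SubElement(root, "headerPart")        # stand-in for CDA-style metadata
ET.SubElement(header_part, "documentType").text = "Progress note"

graph_part = ET.SubElement(root, "annotationPart")     # stand-in for GrAF-style graph
problem = ET.SubElement(graph_part, "node", {"id": "n1", "label": "problem"})
ET.SubElement(problem, "region", {"start": "15", "end": "25"})    # "chest pain"
ET.SubElement(problem, "feature", {"name": "assertion", "value": "absent"})

treatment = ET.SubElement(graph_part, "node", {"id": "n2", "label": "treatment"})
ET.SubElement(treatment, "region", {"start": "35", "end": "42"})  # "aspirin"

# Annotations stay separate from the text, so several tools can annotate the same note.
print(ET.tostring(root, encoding="unicode"))
print(clinical_text[15:25], "|", clinical_text[35:42])
```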

  3. E-documentation as a process management tool for nursing care in hospitals.

    PubMed

    Rajkovic, Uros; Sustersic, Olga; Rajkovic, Vladislav

    2009-01-01

    Appropriate documentation plays a key role in process management in nursing care. It includes holistic data management based on the patient's data along the clinical path with regard to nursing care. We developed an e-documentation model that follows the process method of work in nursing care. It assesses the patient's status on the basis of Henderson's theoretical model of 14 basic living activities and is aligned with internationally recognized nursing classifications. E-documentation development requires reengineering of existing documentation and facilitates process reengineering. A prototype e-nursing documentation solution, already undergoing testing at the university medical centres in Ljubljana and Maribor, is described.

  4. Aircraft/Air Traffic Management Functional Analysis Model: Technical Description. 2.0

    NASA Technical Reports Server (NTRS)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a technical description of FAM 2.0 and its computer files to enable the modeler and programmer to make enhancements or modifications to the model. Those interested in a guide for using the model in analysis should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Users Manual.

  5. User's guide to the MESOI diffusion model and to the utility programs UPDATE and LOGRVU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Athey, G.F.; Allwine, K.J.; Ramsdell, J.V.

    MESOI is an interactive, Lagrangian puff trajectory diffusion model. The model is documented separately (Ramsdell and Athey, 1981); this report is intended to provide MESOI users with the information needed to successfully conduct model simulations. The user is also provided with guidance in the use of the data file maintenance and review programs, UPDATE and LOGRVU. Complete examples are given for the operation of all three programs, and an appendix documents UPDATE and LOGRVU.

  6. Integrated corridor management analysis, modeling, and simulation results for the test corridor.

    DOT National Transportation Integrated Search

    2008-06-01

    This report documents the Integrated Corridor Management (ICM) Analysis Modeling and Simulation (AMS) tools and strategies used on a Test Corridor, presents results and lessons-learned, and documents the relative capability of AMS to support benefit-...

  7. Air Quality Modeling Technical Support Document for the 2008 Ozone NAAQS Cross-State Air Pollution Rule Proposal

    EPA Pesticide Factsheets

    In this technical support document (TSD) we describe the air quality modeling performed to support the proposed Cross-State Air Pollution Rule for the 2008 ozone National Ambient Air Quality Standards (NAAQS)

  8. Air Quality Modeling Technical Support Document for the 2015 Ozone NAAQS Preliminary Interstate Transport Assessment

    EPA Pesticide Factsheets

    In this technical support document (TSD) EPA describes the air quality modeling performed to support the 2015 ozone National Ambient Air Quality Standards (NAAQS) preliminary interstate transport assessment Notice of Data Availability (NODA).

  9. Improved simulation of driver behavior : modeling protected and permitted left-turn operations at signalized intersections.

    DOT National Transportation Integrated Search

    2011-04-01

    "This report documents the findings from a research project that is focused on modeling protected and permitted left-turn operations at signalized intersection approaches. The projects primary objective is to document the microscopic characteristi...

  10. SMART (Shop floor Modeling, Analysis and Reporting Tool) Project

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

    1999-01-01

    This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description is found in the full documentation given to the NASA liaison. This documentation is also found on the A.R.I.S.E. Center web site, under a protected directory. Only authorized users can gain access to this site.

  11. Advocating for School Psychologists in Response to the APA's Proposed "Model Act for State Licensure of Psychologists"

    ERIC Educational Resources Information Center

    Skalski, Anastasia Kalamaros

    2009-01-01

    On March 6, 2009, the APA Model Licensure Act Task Force released its second draft of the policy document known as the proposed "Model Act for State Licensure of Psychologists". This policy document serves as guidance to state legislatures for how they should set up their psychology licensing laws. The general expectations promoted in the model…

  12. STS-2: SAIL non-avionics subsystems math model requirements

    NASA Technical Reports Server (NTRS)

    Bennett, W. P.; Herold, R. W.

    1980-01-01

    Simulation of the STS-2 Shuttle nonavionics subsystems in the shuttle avionics integration laboratory (SAIL) is necessary for verification of the integrated shuttle avionics system. The math model (simulation) requirements for each of the nonavionics subsystems that interfaces with the Shuttle avionics system are documented, and a single source document is provided for controlling approved changes (by the SAIL change control panel) to the math models.

  13. Econ's optimal decision model of wheat production and distribution-documentation

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The report documents the computer programs written to implement the ECON optimal decision model. The programs were written in APL, an extremely compact and powerful language particularly well suited to this model, which makes extensive use of matrix manipulations. The algorithms used are presented, and listings of and descriptive information on the APL programs are given. Possible changes in input data are also given.

  14. Directory of Energy Information Administration model abstracts 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-01-01

    This directory contains descriptions about each basic and auxiliary model, including the title, acronym, purpose, and type, followed by more detailed information on characteristics, uses, and requirements. For developing models, limited information is provided. Sources for additional information are identified. Included in this directory are 44 EIA models active as of February 1, 1988; 16 of which operate on personal computers. Models that run on personal computers are identified by "PC" as part of the acronyms. The main body of this directory is an alphabetical listing of all basic and auxiliary EIA models. Appendix A identifies major EIA modeling systems and the models within these systems, and Appendix B identifies EIA models by type (basic or auxiliary). Appendix C lists developing models and contact persons for those models. A basic model is one designated by the EIA Administrator as being sufficiently important to require sustained support and public scrutiny. An auxiliary model is one designated by the EIA Administrator as being used only occasionally in analyses, and therefore requires minimal levels of documentation. A developing model is one designated by the EIA Administrator as being under development and yet of sufficient interest to require a basic level of documentation at a future date. EIA also leases models developed by proprietary software vendors. Documentation for these "proprietary" models is the responsibility of the companies from which they are leased. EIA has recently leased models from Chase Econometrics, Inc., Data Resources, Inc. (DRI), the Oak Ridge National Laboratory (ORNL), and Wharton Econometric Forecasting Associates (WEFA). Leased models are not abstracted here.

  15. NASA geometry data exchange specification for computational fluid dynamics (NASA IGES)

    NASA Technical Reports Server (NTRS)

    Blake, Matthew W.; Kerr, Patricia A.; Thorp, Scott A.; Jou, Jin J.

    1994-01-01

    This document specifies a subset of an existing product data exchange specification that is widely used in industry and government. The existing document is called the Initial Graphics Exchange Specification. This document, a subset of IGES, is intended for engineers analyzing product performance using tools such as computational fluid dynamics (CFD) software. This document specifies how to define mathematically and exchange the geometric model of an object. The geometry is represented utilizing nonuniform rational B-splines (NURBS) curves and surfaces. Only surface models are represented; no solid model representation is included. This specification does not include most of the other types of product information available in IGES (e.g., no material properties or surface finish properties) and does not provide all the specific file format details of IGES. The data exchange protocol specified in this document is fully conforming to the American National Standard (ANSI) IGES 5.2.

  16. Ensemble LUT classification for degraded document enhancement

    NASA Astrophysics Data System (ADS)

    Obafemi-Ajayi, Tayo; Agam, Gady; Frieder, Ophir

    2008-01-01

    The fast evolution of scanning and computing technologies has led to the creation of large collections of scanned paper documents. Examples of such collections include historical collections, legal depositories, medical archives, and business archives. Moreover, in many situations, such as legal litigation and security investigations, scanned collections are being used to facilitate systematic exploration of the data. It is almost always the case that scanned documents suffer from some form of degradation. Large degradations make documents hard to read and substantially deteriorate the performance of automated document processing systems. Enhancement of degraded document images is normally performed assuming global degradation models. When the degradation is large, global degradation models do not perform well. In contrast, we propose to estimate local degradation models and use them in enhancing degraded document images. Using a semi-automated enhancement system, we have labeled a subset of the Frieder diaries collection. This labeled subset was then used to train an ensemble classifier. The component classifiers are based on lookup tables (LUT) in conjunction with the approximated nearest neighbor algorithm. The resulting algorithm is highly efficient. Experimental evaluation results are provided using the Frieder diaries collection.
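
    To give a flavor of lookup-table classification for document cleanup, the sketch below encodes each pixel's 3x3 binarized neighborhood as a 9-bit key and learns, from a noisy/clean image pair, the majority clean label for each key. It is a single plain LUT on synthetic data, not the paper's ensemble with approximated nearest-neighbor lookup, and the toy image is invented.

```python
import numpy as np

# Minimal LUT-classification sketch: each pixel's 3x3 binarized neighborhood becomes
# a 9-bit key, and a table learned from labeled examples maps keys to a cleaned
# foreground/background label. The ensemble and approximate nearest-neighbor parts
# of the paper are not reproduced.

def neighborhood_keys(img_bin):
    """Encode every interior pixel's 3x3 neighborhood as an integer in [0, 512)."""
    h, w = img_bin.shape
    keys = np.zeros((h - 2, w - 2), dtype=np.int32)
    for dy in range(3):
        for dx in range(3):
            keys = keys * 2 + img_bin[dy:dy + h - 2, dx:dx + w - 2]
    return keys

def train_lut(noisy_bin, clean_bin):
    """For each 3x3 pattern observed in the noisy image, store the majority clean label."""
    keys = neighborhood_keys(noisy_bin)
    target = clean_bin[1:-1, 1:-1]
    votes = np.zeros((512, 2), dtype=np.int64)
    np.add.at(votes, (keys.ravel(), target.ravel()), 1)
    return np.argmax(votes, axis=1).astype(np.uint8)

def apply_lut(noisy_bin, lut):
    out = noisy_bin.copy()
    out[1:-1, 1:-1] = lut[neighborhood_keys(noisy_bin)]
    return out

# Toy data: a clean vertical stroke, degraded by salt noise.
rng = np.random.default_rng(0)
clean = np.zeros((50, 50), dtype=np.uint8)
clean[:, 20:23] = 1
noisy = clean | (rng.random(clean.shape) < 0.05).astype(np.uint8)
restored = apply_lut(noisy, train_lut(noisy, clean))
print("noisy pixels wrong:", int((noisy != clean).sum()),
      "| restored pixels wrong:", int((restored != clean).sum()))
```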

  17. Learning Supervised Topic Models for Classification and Regression from Crowds.

    PubMed

    Rodrigues, Filipe; Lourenco, Mariana; Ribeiro, Bernardete; Pereira, Francisco C

    2017-12-01

    The growing need to analyze large collections of documents has led to great developments in topic modeling. Since documents are frequently associated with other related variables, such as labels or ratings, much interest has been placed on supervised topic models. However, the nature of most annotation tasks, prone to ambiguity and noise, often with high volumes of documents, makes learning under a single-annotator assumption unrealistic or impractical for most real-world applications. In this article, we propose two supervised topic models, one for classification and another for regression problems, which account for the heterogeneity and biases among different annotators that are encountered in practice when learning from crowds. We develop an efficient stochastic variational inference algorithm that is able to scale to very large datasets, and we empirically demonstrate the advantages of the proposed models over state-of-the-art approaches.

  18. Modelling of Operative Report Documents for Data Integration into an openEHR-Based Enterprise Data Warehouse.

    PubMed

    Haarbrandt, Birger; Wilschko, Andreas; Marschollek, Michael

    2016-01-01

    In order to integrate operative report documents from two operating room management systems into a data warehouse, we investigated the application of the two-level modelling approach of openEHR to create a shared data model. Based on the systems' analyses, a template consisting of 13 archetypes has been developed. Of these 13 archetypes, 3 have been obtained from the international archetype repository of the openEHR foundation. The remaining 10 archetypes have been newly created. The template was evaluated by an application system expert and through conducting a first test mapping of real-world data from one of the systems. The evaluation showed that by using the two-level modelling approach of openEHR, we succeeded to represent an integrated and shared information model for operative report documents. More research is needed to learn about the limitations of this approach in other data integration scenarios.

  19. Some guidance on preparing validation plans for the DART Full System Models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy

    2009-03-01

    Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.

  20. HYDROLOGIC EVALUATION OF LANDFILL PERFORMANCE (HELP) MODEL - USER'S GUIDE FOR VERSION 3

    EPA Science Inventory

    This report documents the solution methods and process descriptions used in the Version 3 of the HELP model. Program documentation including program options, system and operating requirements, file structures, program structure and variable descriptions are provided in a separat...

  1. MSW Time to Tumor Model and Supporting Documentation

    EPA Science Inventory

    The multistage Weibull (MSW) time-to-tumor model and related documentation were developed principally (but not exclusively) for conducting time-to-tumor analyses to support risk assessments under the IRIS program. These programs and related docum...

  2. Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?

    NASA Technical Reports Server (NTRS)

    Lum, Karen; Hihn, Jairus; Menzies, Tim

    2006-01-01

    While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance problem inherent in cost data and because far more effort multipliers are included than the data support. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, which is a leading cause of cost model brittleness or instability.

  3. Centipod WEC, Advanced Controls, Resultant LCOE

    DOE Data Explorer

    McCall, Alan

    2016-02-15

    Project resultant LCOE model after implementation of MPC controller. Contains AEP, CBS, model documentation, and LCOE content model. This is meant for comparison with this project's baseline LCOE model.

  4. Guide to solar reference spectra and irradiance models

    NASA Astrophysics Data System (ADS)

    Tobiska, W. Kent

    The international standard for determining solar irradiances was published by the International Standards Organization (ISO) in May 2007. The document, ISO 21348 Space Environment (natural and artificial) - Process for determining solar irradiances, describes the process for representing solar irradiances. We report on the next progression of standards work, i.e., the development of a guide that identifies solar reference spectra and irradiance models for use in engineering design or scientific research. This document will be produced as an AIAA Guideline and ISO Technical Report. It will describe the content of the reference spectra and models, uncertainties and limitations, technical basis, data bases from which the reference spectra and models are formed, publication references, and sources of computer code for reference spectra and solar irradiance models, including those which provide spectrally-resolved lines as well as solar indices and proxies and which are generally recognized in the solar sciences. The document is intended to assist aircraft and space vehicle designers and developers, heliophysicists, geophysicists, aeronomers, meteorologists, and climatologists in understanding available models, comparing sources of data, and interpreting engineering and scientific results based on different solar reference spectra and irradiance models.

  5. A System for Automated Extraction of Metadata from Scanned Documents using Layout Recognition and String Pattern Search Models.

    PubMed

    Misra, Dharitri; Chen, Siyuan; Thoma, George R

    2009-01-01

    One of the most expensive aspects of archiving digital documents is the manual acquisition of context-sensitive metadata useful for the subsequent discovery of, and access to, the archived items. For certain types of textual documents, such as journal articles, pamphlets, official government records, etc., where the metadata is contained within the body of the documents, a cost effective method is to identify and extract the metadata in an automated way, applying machine learning and string pattern search techniques. At the U. S. National Library of Medicine (NLM) we have developed an automated metadata extraction (AME) system that employs layout classification and recognition models with a metadata pattern search model for a text corpus with structured or semi-structured information. A combination of Support Vector Machine and Hidden Markov Model is used to create the layout recognition models from a training set of the corpus, following which a rule-based metadata search model is used to extract the embedded metadata by analyzing the string patterns within and surrounding each field in the recognized layouts. In this paper, we describe the design of our AME system, with focus on the metadata search model. We present the extraction results for a historic collection from the Food and Drug Administration, and outline how the system may be adapted for similar collections. Finally, we discuss some ongoing enhancements to our AME system.
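
    The final, rule-based step of such a pipeline can be pictured as simple string-pattern search applied to the OCR text of a zone that the layout model has already recognized. The sketch below is only illustrative: the zone text and the regular expressions are invented for the example and are not NLM's production AME rules.

```python
import re

# Illustrative rule-based metadata search over the OCR text of a recognized layout zone.
# The zone text and patterns below are invented for the example.

ZONE_TEXT = """QUARTERLY REPORT ON DRUG LABELING PRACTICES
Report No. FDA-1987-0412
Issued: March 12, 1987
Bureau of Drugs, Food and Drug Administration"""

PATTERNS = {
    "report_number": re.compile(r"Report\s+No\.\s*([A-Z]+-\d{4}-\d{4})"),
    "issue_date":    re.compile(r"Issued:\s*([A-Z][a-z]+ \d{1,2}, \d{4})"),
    "issuing_body":  re.compile(r"^(Bureau of [^,\n]+)", re.MULTILINE),
}

def extract_metadata(zone_text):
    # Treat the first line of the recognized header zone as the title.
    record = {"title": zone_text.strip().splitlines()[0].title()}
    for field, pattern in PATTERNS.items():
        match = pattern.search(zone_text)
        record[field] = match.group(1) if match else None
    return record

print(extract_metadata(ZONE_TEXT))
```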

  6. Care zoning in a psychiatric intensive care unit: strengthening ongoing clinical risk assessment.

    PubMed

    Mullen, Antony; Drinkwater, Vincent; Lewin, Terry J

    2014-03-01

    To implement and evaluate the care zoning model in an eight-bed psychiatric intensive care unit and, specifically, to examine the model's ability to improve the documentation and communication of clinical risk assessment and management. Care zoning guides nurses in assessing clinical risk and planning care within a mental health context. Concerns about the varying quality of clinical risk assessment prompted a trial of the care zoning model in a psychiatric intensive care unit within a regional mental health facility. The care zoning model assigns patients to one of 3 'zones' according to their clinical risk, encouraging nurses to document and implement targeted interventions required to manage those risks. An implementation trial framework was used for this research to refine, implement and evaluate the impact of the model on nurses' clinical practice within the psychiatric intensive care unit, predominantly as a quality improvement initiative. The model was trialled for three months using a pre- and postimplementation staff survey, a pretrial file audit and a weekly file audit. Informal staff feedback was also sought via surveys and regular staff meetings. This trial demonstrated improvement in the quality of mental state documentation, and clinical risk information was identified more accurately. There was limited improvement in the quality of care planning and the documentation of clinical interventions. Nurses' initial concerns over the introduction of the model shifted into overall acceptance and recognition of the benefits. The results of this trial demonstrate that the care zoning model was able to improve the consistency and quality of risk assessment information documented. Care planning and evaluation of associated outcomes showed less improvement. Care zoning remains a highly applicable model for the psychiatric intensive care unit environment and is a useful tool in guiding nurses to carry out routine patient risk assessments. © 2013 John Wiley & Sons Ltd.

  7. Technical Manual for the SAM Physical Trough Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, M. J.; Gilman, P.

    2011-06-01

    NREL, in conjunction with Sandia National Lab and the U.S. Department of Energy, developed the System Advisor Model (SAM) analysis tool for renewable energy system performance and economic analysis. This paper documents the technical background and engineering formulation for one of the two parabolic trough system models in SAM. The Physical Trough model calculates performance relationships based on physical first principles where possible, allowing the modeler to predict electricity production for a wider range of component geometries than is possible in the Empirical Trough model. This document describes the major parabolic trough plant subsystems in detail, including the solar field, power block, thermal storage, piping, auxiliary heating, and control systems. This model makes use of both existing subsystem performance modeling approaches and new approaches developed specifically for SAM.

  8. A Hybrid Method for Opinion Finding Task (KUNLP at TREC 2008 Blog Track)

    DTIC Science & Technology

    2008-11-01

    retrieve relevant documents. For the Opinion Retrieval subtask, we propose a hybrid model combining a lexicon-based approach and a machine-learning approach for estimating and ranking opinionated documents. For the Polarized Opinion Retrieval subtask, we employ machine learning for predicting polarity and a linear combination technique for ranking polar documents. The hybrid model utilizes both the lexicon-based and machine-learning approaches.

  9. Testing a Nursing-Specific Model of Electronic Patient Record documentation with regard to information completeness, comprehensiveness and consistency.

    PubMed

    von Krogh, Gunn; Nåden, Dagfinn; Aasland, Olaf Gjerløw

    2012-10-01

    To present the results from the test site application of the documentation model KPO (quality assurance, problem solving and caring) designed to impact the quality of nursing information in electronic patient record (EPR). The KPO model was developed by means of consensus group and clinical testing. Four documentation arenas and eight content categories, nursing terminologies and a decision-support system were designed to impact the completeness, comprehensiveness and consistency of nursing information. The testing was performed in a pre-test/post-test time series design, three times at a one-year interval. Content analysis of nursing documentation was accomplished through the identification, interpretation and coding of information units. Data from the pre-test and post-test 2 were subjected to statistical analyses. To estimate the differences, paired t-tests were used. At post-test 2, the information is found to be more complete, comprehensive and consistent than at pre-test. The findings indicate that documentation arenas combining work flow and content categories deduced from theories on nursing practice can influence the quality of nursing information. The KPO model can be used as guide when shifting from paper-based to electronic-based nursing documentation with the aim of obtaining complete, comprehensive and consistent nursing information. © 2012 Blackwell Publishing Ltd.

  10. Degraded document image enhancement

    NASA Astrophysics Data System (ADS)

    Agam, G.; Bal, G.; Frieder, G.; Frieder, O.

    2007-01-01

    Poor quality documents are obtained in various situations such as historical document collections, legal archives, security investigations, and documents found in clandestine locations. Such documents are often scanned for automated analysis, further processing, and archiving. Due to the nature of such documents, degraded document images are often hard to read, have low contrast, and are corrupted by various artifacts. We describe a novel approach for the enhancement of such documents based on probabilistic models which increases the contrast, and thus, readability of such documents under various degradations. The enhancement produced by the proposed approach can be viewed under different viewing conditions if desired. The proposed approach was evaluated qualitatively and compared to standard enhancement techniques on a subset of historical documents obtained from the Yad Vashem Holocaust museum. In addition, quantitative performance was evaluated based on synthetically generated data corrupted under various degradation models. Preliminary results demonstrate the effectiveness of the proposed approach.

  11. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  12. A Compendium Of Traffic Model Validation Documentation And Recommendations - Phase 1 - Tasks A-H

    DOT National Transportation Integrated Search

    1996-12-01

    The intent of this report is to consolidate the documentation delivered to FHWA for the Databases for Assessment of Operation Tests and Traffic Models contract. Some introductory remarks are required to understand the rationale used in the white ...

  13. Text Summarization Model based on Facility Location Problem

    NASA Astrophysics Data System (ADS)

    Takamura, Hiroya; Okumura, Manabu

    We propose a novel multi-document generic summarization model based on the budgeted median problem, which is a facility location problem. The summarization method based on our model is an extractive method, which selects sentences from the given document cluster and generates a summary. Each sentence in the document cluster will be assigned to one of the selected sentences, where the former sentence is supposed to be represented by the latter. Our method selects sentences to generate a summary that yields a good sentence assignment and hence covers the whole content of the document cluster. An advantage of this method is that it can incorporate asymmetric relations between sentences such as textual entailment. Through experiments, we showed that the proposed method yields good summaries on the DUC'04 dataset.
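
    The selection-plus-assignment idea can be sketched as follows: greedily add sentences under a word budget so that every sentence in the cluster is as similar as possible to some selected sentence. The code below uses plain word-overlap similarity and greedy search as stand-ins for the paper's budgeted-median formulation; the toy sentences are invented.

```python
# Facility-location style extractive summarization sketch: pick sentences under a
# length budget so every sentence in the cluster is well "represented" (assigned)
# to some selected sentence. Greedy selection and Jaccard word overlap stand in
# for the paper's budgeted-median formulation and features.

def similarity(a, b):
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(1, len(wa | wb))   # Jaccard overlap

def coverage(selected, sentences):
    # each sentence is represented by its most similar selected sentence
    return sum(max((similarity(s, t) for t in selected), default=0.0) for s in sentences)

def summarize(sentences, budget_words):
    selected, used = [], 0
    while True:
        best, best_gain = None, 0.0
        for s in sentences:
            cost = len(s.split())
            if s in selected or used + cost > budget_words:
                continue
            gain = coverage(selected + [s], sentences) - coverage(selected, sentences)
            if gain > best_gain:
                best, best_gain = s, gain
        if best is None:
            return selected
        selected.append(best)
        used += len(best.split())

cluster = [
    "The city council approved the new transit budget on Tuesday.",
    "Council members voted to approve the transit budget.",
    "The budget funds two new bus lines and longer subway hours.",
    "Riders welcomed the longer subway hours.",
]
print(summarize(cluster, budget_words=20))
```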

  14. Rotorcraft Performance Model (RPM) for use in AEDT.

    DOT National Transportation Integrated Search

    2015-11-01

    This report documents a rotorcraft performance model for use in the FAA's Aviation Environmental Design Tool. The new rotorcraft performance model is physics-based. This new model replaces the existing helicopter trajectory modeling methods in the ...

  15. Peer Review Documents Related to the Evaluation of ...

    EPA Pesticide Factsheets

    BMDS is one of the Agency's premier tools for risk assessment; therefore, the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews and expert summaries of the BMDS application and its models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.

  16. Querying and Ranking XML Documents.

    ERIC Educational Resources Information Center

    Schlieder, Torsten; Meuss, Holger

    2002-01-01

    Discussion of XML, information retrieval, precision, and recall focuses on a retrieval technique that adopts the similarity measure of the vector space model, incorporates the document structure, and supports structured queries. Topics include a query model based on tree matching; structured queries and term-based ranking; and term frequency and…

  17. Doclet To Synthesize UML

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2005-01-01

    The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.

  18. An integrated conceptual framework for evaluating and improving 'understanding' in informed consent.

    PubMed

    Bossert, Sabine; Strech, Daniel

    2017-10-17

    The development of understandable informed consent (IC) documents has proven to be one of the most important challenges in research with humans as well as in healthcare settings. Therefore, evaluating and improving understanding has been of increasing interest for empirical research on IC. However, several conceptual and practical challenges for the development of understandable IC documents remain unresolved. In this paper, we will outline and systematize some of these challenges. On the basis of our own experiences in empirical user testing of IC documents as well as the relevant literature on understanding in IC, we propose an integrated conceptual model for the development of understandable IC documents. The proposed conceptual model integrates different methods for the participatory improvement of written information, including IC, as well as quantitative methods for measuring understanding in IC. In most IC processes, understandable written information is an important prerequisite for valid IC. To improve the quality of IC documents, a conceptual model for participatory procedures of testing, revising, and retesting can be applied. However, the model presented in this paper needs further theoretical and empirical elaboration and clarification of several conceptual and practical challenges.

  19. a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. Conventional measurement techniques, however, require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets based on parametric and procedural modelling techniques, using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open-source software environment, using the example project of a 16th century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  20. PIMMS tools for capturing metadata about simulations

    NASA Astrophysics Data System (ADS)

    Pascoe, Charlotte; Devine, Gerard; Tourte, Gregory; Pascoe, Stephen; Lawrence, Bryan; Barjat, Hannah

    2013-04-01

    PIMMS (Portable Infrastructure for the Metafor Metadata System) provides a method for consistent and comprehensive documentation of modelling activities that enables the sharing of simulation data and model configuration information. The aim of PIMMS is to package the metadata infrastructure developed by Metafor for CMIP5 so that it can be used by climate modelling groups in UK universities. PIMMS tools capture information about simulations from the design of experiments to the implementation of experiments via simulations that run models. PIMMS uses the Metafor methodology, which consists of a Common Information Model (CIM), Controlled Vocabularies (CV) and software tools. PIMMS software tools provide for the creation and consumption of CIM content via a web services infrastructure and portal developed by the ES-DOC community. PIMMS metadata integrates with the ESGF data infrastructure via the mapping of vocabularies onto ESGF facets. There are three paradigms of PIMMS metadata collection: Model Intercomparison Projects (MIPs), where a standard set of questions is asked of all models which perform standard sets of experiments; disciplinary-level metadata collection, where a standard set of questions is asked of all models but experiments are specified by users; and bespoke metadata creation, where users define questions about both models and experiments. Examples will be shown of how PIMMS has been configured to suit each of these three paradigms. In each case PIMMS allows users to provide additional metadata beyond that which is asked for in an initial deployment. The primary target for PIMMS is the UK climate modelling community, where it is common practice to reuse model configurations from other researchers. This culture of collaboration exists in part because climate models are very complex, with many variables that can be modified, so it has become common practice to begin a series of experiments by using another climate model configuration as a starting point. Usually this other configuration is provided by a researcher in the same research group or by a previous collaborator with whom there is an existing scientific relationship. Some efforts have been made at the university department level to create documentation, but there is wide diversity in the scope and purpose of this information. The consistent and comprehensive documentation enabled by PIMMS will enable the wider sharing of climate model data and configuration information. The PIMMS methodology assumes an initial effort to document standard model configurations. Once these descriptions have been created, users need only describe the specific way in which their model configuration differs from the standard. Thus the documentation burden on the user is specific to the experiment they are performing and fits easily into the workflow of doing their science. PIMMS metadata is independent of data and as such is ideally suited for documenting model development. PIMMS provides a framework for sharing information about failed model configurations for which data are not kept, the negative results that do not appear in the scientific literature. PIMMS is a UK project funded by JISC, The University of Reading, The University of Bristol and STFC.

  1. Empirical flow parameters - a tool for hydraulic model validity assessment : [summary].

    DOT National Transportation Integrated Search

    2013-10-01

    Hydraulic modeling assembles models based on generalizations of parameter values from textbooks, professional literature, computer program documentation, and engineering experience. Actual measurements adjacent to the model location are seldom availa...

  2. The CMIP5 Model Documentation Questionnaire: Development of a Metadata Retrieval System for the METAFOR Common Information Model

    NASA Astrophysics Data System (ADS)

    Pascoe, Charlotte; Lawrence, Bryan; Moine, Marie-Pierre; Ford, Rupert; Devine, Gerry

    2010-05-01

    The EU METAFOR Project (http://metaforclimate.eu) has created a web-based model documentation questionnaire to collect metadata from the modelling groups that are running simulations in support of the Coupled Model Intercomparison Project - 5 (CMIP5). The CMIP5 model documentation questionnaire will retrieve information about the details of the models used, how the simulations were carried out, how the simulations conformed to the CMIP5 experiment requirements and details of the hardware used to perform the simulations. The metadata collected by the CMIP5 questionnaire will allow CMIP5 data to be compared in a scientifically meaningful way. This paper describes the life-cycle of the CMIP5 questionnaire development, which starts with relatively unstructured input from domain specialists and ends with formal XML documents that comply with the METAFOR Common Information Model (CIM). Each development step is associated with a specific tool. (1) Mind maps are used to capture information requirements from domain experts and build a controlled vocabulary, (2) a Python parser processes the XML files generated by the mind maps, (3) Django (Python) is used to generate the dynamic structure and content of the web-based questionnaire from the processed XML and the METAFOR CIM, (4) Python parsers ensure that information entered into the CMIP5 questionnaire is output as CIM-compliant XML, (5) CIM-compliant output allows automatic information capture tools to harvest questionnaire content into databases such as the Earth System Grid (ESG) metadata catalogue. This paper will focus on how Django (Python) and XML input files are used to generate the structure and content of the CMIP5 questionnaire. It will also address how the choice of development tools listed above provided a framework that enabled working scientists (who would never ordinarily interact with UML and XML) to be part of the iterative development process and ensure that the CMIP5 model documentation questionnaire reflects what scientists want to know about the models. Keywords: metadata, CMIP5, automatic information capture, tool development
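
    A much-simplified sketch of step (3) above, under stated assumptions: the XML layout, element names and field types below are invented for the example, and plain ElementTree parsing stands in for the questionnaire's Django machinery. The point is only that a controlled-vocabulary XML file can be turned into a form specification that a web framework could render as questionnaire fields.

        # Simplified sketch of generating questionnaire structure from an XML
        # controlled vocabulary.  The XML layout below is invented for the example;
        # the CMIP5 questionnaire derives its structure from mind-map exports and
        # the METAFOR CIM, and renders the result with Django.
        import xml.etree.ElementTree as ET

        VOCAB_XML = """
        <component name="Atmosphere">
          <property name="horizontal_resolution" type="text"/>
          <property name="advection_scheme" type="choice">
            <value>semi-Lagrangian</value>
            <value>finite-volume</value>
          </property>
        </component>
        """

        def form_spec(xml_text):
            root = ET.fromstring(xml_text)
            fields = []
            for prop in root.findall("property"):
                field = {"label": prop.get("name"), "kind": prop.get("type")}
                if field["kind"] == "choice":
                    field["choices"] = [v.text for v in prop.findall("value")]
                fields.append(field)
            return {"component": root.get("name"), "fields": fields}

        print(form_spec(VOCAB_XML))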

  3. The AE-8 trapped electron model environment

    NASA Technical Reports Server (NTRS)

    Vette, James I.

    1991-01-01

    The machine sensible version of the AE-8 electron model environment was completed in December 1983. It has been sent to users on the model environment distribution list and is made available to new users by the National Space Science Data Center (NSSDC). AE-8 is the last in a series of terrestrial trapped radiation models that includes eight proton and eight electron versions. With the exception of AE-8, all these models were documented in formal reports as well as being available in a machine sensible form. The purpose of this report is to complete the documentation, finally, for AE-8 so that users can understand its construction and see the comparison of the model with the new data used, as well as with the AE-4 model.

  4. WASP7 Stream Transport - Model Theory and User's Guide: Supplement to Water Quality Analysis Simulation Program (WASP) User Documentation

    EPA Science Inventory

    The standard WASP7 stream transport model calculates water flow through a branching stream network that may include both free-flowing and ponded segments. This supplemental user manual documents the hydraulic algorithms, including the transport and hydrogeometry equations, the m...

  5. Student Flow Model SFM-IA: System Documentation. Technical Report 41B. Preliminary Edition.

    ERIC Educational Resources Information Center

    Busby, John C.; Johnson, Richard S.

    Technical specifications, operating procedures, and reference information for the National Center for Higher Education Management Systems' (NCHEMS) Student Flow Model (SFM) computer programs are presented. Included are narrative descriptions of the system and its modules, specific program documentation for each of the modules, system flowcharts,…

  6. DEVELOPMENT AND EVALUATION OF NOVEL DOSE-RESPONSE MODELS FOR USE IN MICROBIAL RISK ASSESSMENT

    EPA Science Inventory

    This document contains a description of dose-response modeling methods designed to provide a robust approach under uncertainty for predicting human population risk from exposure to pathogens in drinking water.

    The purpose of this document is to describe a body of literatu...

  7. A Perspective on Marketing Teacher Education.

    ERIC Educational Resources Information Center

    Turner, John E.

    1990-01-01

    This document presents a model for program planning and decision making for teacher education in marketing and discusses teacher education policy in the context of the model. The first section explains the document's background and perspective. The second section places marketing teacher education in the context of a larger marketing education…

  8. Information Retrieval Using UMLS-based Structured Queries

    PubMed Central

    Fagan, Lawrence M.; Berrios, Daniel C.; Chan, Albert; Cucina, Russell; Datta, Anupam; Shah, Maulik; Surendran, Sujith

    2001-01-01

    During the last three years, we have developed and described components of ELBook, a semantically based information-retrieval system [1-4]. Using these components, domain experts can specify a query model, indexers can use the query model to index documents, and end-users can search these documents for instances of indexed queries.

  9. Issue of Building Information Modelling Implementation into the Czech Republic’s Legislation using the Level of Development

    NASA Astrophysics Data System (ADS)

    Prušková, Kristýna; Nývlt, Vladimír

    2017-10-01

    This paper addresses the links between the Level of Development of a particular project in a Building Information Modeling environment and the corresponding stages of project documentation defined in existing Czech legislation. The article draws on the experience of the active working group „WG#03: BIM & Realization“, which is part of the Czech BIM Council, and especially on the document "Draft of unified data structure for Building Information Modeling in the Czech Republic". The findings of this paper define the specific Level of Development of the relevant parameters mentioned in that document, connected to the specific level of information and detail requested by Czech legislation. These findings could be used as a basis for a document called "Level of Development draft assignment to the individual stages of project documentation in the Czech Republic". The Level of Development is the most useful way of visualizing this information, and it leads to the most straightforward implementation of Building Information Modeling in the practice of designing structures and buildings in the Czech Republic. The implementation of Building Information Modeling technology in the design of structures and buildings will lead to enhanced quality of project documentation and, more generally, to more effective cost savings over the whole life cycle of buildings. Moreover, widespread use of BIM technology in the Czech Republic will be very useful in the Facility Management area, especially in the facility management and maintenance of state buildings.

  10. "What is relevant in a text document?": An interpretable machine learning approach

    PubMed Central

    Arras, Leila; Horn, Franziska; Montavon, Grégoire; Müller, Klaus-Robert

    2017-01-01

    Text documents can be described by a number of abstract concepts such as semantic category, writing style, or sentiment. Machine learning (ML) models have been trained to automatically map documents to these abstract concepts, making it possible to annotate very large text collections, more than could be processed by a human in a lifetime. Besides predicting the text’s category very accurately, it is also highly desirable to understand how and why the categorization process takes place. In this paper, we demonstrate that such understanding can be achieved by tracing the classification decision back to individual words using layer-wise relevance propagation (LRP), a recently developed technique for explaining predictions of complex non-linear classifiers. We train two word-based ML models, a convolutional neural network (CNN) and a bag-of-words SVM classifier, on a topic categorization task and adapt the LRP method to decompose the predictions of these models onto words. Resulting scores indicate how much individual words contribute to the overall classification decision. This enables one to distill relevant information from text documents without an explicit semantic information extraction step. We further use the word-wise relevance scores for generating novel vector-based document representations which capture semantic information. Based on these document vectors, we introduce a measure of model explanatory power and show that, although the SVM and CNN models perform similarly in terms of classification accuracy, the latter exhibits a higher level of explainability which makes it more comprehensible for humans and potentially more useful for other applications. PMID:28800619
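
    As an illustrative sketch only (not the authors' LRP implementation): for a linear bag-of-words classifier, the predicted score decomposes exactly into per-word contributions of weight times feature value, which conveys the idea of tracing a classification decision back to individual words. The toy data and model below are assumptions of this sketch; the paper generalizes the decomposition to non-linear CNNs via layer-wise relevance propagation.

        # Sketch: per-word relevance for a linear bag-of-words classifier.
        # For a linear model f(x) = w.x + b the score decomposes exactly into
        # word-level contributions w_i * x_i, the simplest instance of the
        # "trace the decision back to words" idea; LRP extends this to deep
        # non-linear models such as the CNN used in the paper.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression

        docs = ["the spacecraft entered orbit around the planet",
                "the striker scored a late goal in the final",
                "the probe measured the planet's magnetic field",
                "the team won the championship match"]
        labels = [1, 0, 1, 0]   # 1 = space, 0 = sport (toy data)

        vec = CountVectorizer()
        X = vec.fit_transform(docs)
        clf = LogisticRegression().fit(X, labels)

        def word_relevances(text):
            # Contribution of each word present in the text to the class score.
            x = vec.transform([text]).toarray()[0]
            w = clf.coef_[0]
            vocab = vec.get_feature_names_out()
            scores = {vocab[i]: w[i] * x[i] for i in x.nonzero()[0]}
            return sorted(scores.items(), key=lambda kv: -kv[1])

        print(word_relevances("the probe orbit around the planet"))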

  11. Components of life model in practice.

    PubMed

    Mitchell, D; Hicks, M

    1995-10-01

    This paper attempts to cover the stages recognised as being part of the research process in order to investigate the problem related to the lack of individualised patient care documentation and supporting theoretical framework, which is both understood and accepted by the staff within Accident and Emergency (A & E). With the advent of the United Kingdom Central Council's (UKCC) document Standards of Record Keeping (1993), there is now greater need for a model to be implemented and accepted by those working in the department. The Components of Life model was introduced following a literature search, as this seemed to be a potential solution to the problem, since it emphasises the individual practising self-care activities in order to maintain independence. To initiate staff to the Components of Life model, a half study day was organised on the subject of models of care within A & E. Jones was invited to discuss his approach to A & E nursing care. Subsequently, a draft document relating to nursing care was created using the Components of Life model as a framework. The initial draft was followed with a printed document which was put into use for a trial period of 4 weeks, followed by a review. The review collected both positive and negative comments from the staff, the negative proved to be the most constructive as they served to make improvements within the care plan. Perhaps the most important success as a result of completing this project is that of increased staff enthusiasm and motivation--especially in wanting to make the documentation work.

  12. Redundancy-Aware Topic Modeling for Patient Record Notes

    PubMed Central

    Cohen, Raphael; Aviram, Iddo; Elhadad, Michael; Elhadad, Noémie

    2014-01-01

    The clinical notes in a given patient record contain much redundancy, in large part due to clinicians’ documentation habit of copying from previous notes in the record and pasting into a new note. Previous work has shown that this redundancy has a negative impact on the quality of text mining and topic modeling in particular. In this paper we describe a novel variant of Latent Dirichlet Allocation (LDA) topic modeling, Red-LDA, which takes into account the inherent redundancy of patient records when modeling content of clinical notes. To assess the value of Red-LDA, we experiment with three baselines and our novel redundancy-aware topic modeling method: given a large collection of patient records, (i) apply vanilla LDA to all documents in all input records; (ii) identify and remove all redundancy by choosing a single representative document for each record as input to LDA; (iii) identify and remove all redundant paragraphs in each record, leaving partial, non-redundant documents as input to LDA; and (iv) apply Red-LDA to all documents in all input records. Both quantitative evaluation carried out through log-likelihood on held-out data and topic coherence of produced topics and qualitative assessment of topics carried out by physicians show that Red-LDA produces superior models to all three baseline strategies. This research contributes to the emerging field of understanding the characteristics of the electronic health record and how to account for them in the framework of data mining. The code for the two redundancy-elimination baselines and Red-LDA is made publicly available to the community. PMID:24551060
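
    A rough sketch of two of the baseline strategies listed above, namely (i) vanilla LDA on all notes and (ii) LDA after keeping one representative note per record. The toy records are invented, "representative" is approximated here simply as the longest note, and scikit-learn's LDA stands in for the paper's implementation; the Red-LDA model itself modifies the generative process and is not reproduced.

        # Sketch of baselines (i) and (ii) above: vanilla LDA over all notes versus
        # LDA over one representative note per patient record.  The records are toy
        # data and "representative" is approximated by the longest note; the paper's
        # Red-LDA variant instead changes the LDA generative process itself.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        records = {
            "patient_1": ["pt admitted with chest pain troponin elevated",
                          "pt admitted with chest pain troponin elevated started heparin"],
            "patient_2": ["presents with shortness of breath copd exacerbation",
                          "copd exacerbation improving on steroids"],
        }

        def fit_lda(notes, n_topics=2):
            vec = CountVectorizer()
            X = vec.fit_transform(notes)
            lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
            lda.fit(X)
            return lda, vec

        # (i) vanilla LDA on every note in every record
        all_notes = [n for notes in records.values() for n in notes]
        lda_all, _ = fit_lda(all_notes)

        # (ii) remove redundancy by keeping one representative note per record
        representatives = [max(notes, key=len) for notes in records.values()]
        lda_repr, _ = fit_lda(representatives)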

  13. Redundancy-aware topic modeling for patient record notes.

    PubMed

    Cohen, Raphael; Aviram, Iddo; Elhadad, Michael; Elhadad, Noémie

    2014-01-01

    The clinical notes in a given patient record contain much redundancy, in large part due to clinicians' documentation habit of copying from previous notes in the record and pasting into a new note. Previous work has shown that this redundancy has a negative impact on the quality of text mining and topic modeling in particular. In this paper we describe a novel variant of Latent Dirichlet Allocation (LDA) topic modeling, Red-LDA, which takes into account the inherent redundancy of patient records when modeling content of clinical notes. To assess the value of Red-LDA, we experiment with three baselines and our novel redundancy-aware topic modeling method: given a large collection of patient records, (i) apply vanilla LDA to all documents in all input records; (ii) identify and remove all redundancy by choosing a single representative document for each record as input to LDA; (iii) identify and remove all redundant paragraphs in each record, leaving partial, non-redundant documents as input to LDA; and (iv) apply Red-LDA to all documents in all input records. Both quantitative evaluation carried out through log-likelihood on held-out data and topic coherence of produced topics and qualitative assessment of topics carried out by physicians show that Red-LDA produces superior models to all three baseline strategies. This research contributes to the emerging field of understanding the characteristics of the electronic health record and how to account for them in the framework of data mining. The code for the two redundancy-elimination baselines and Red-LDA is made publicly available to the community.

  14. Space Station Freedom (SSF) Data Management System (DMS) performance model data base

    NASA Technical Reports Server (NTRS)

    Stovall, John R.

    1993-01-01

    The purpose of this document was originally to be a working document summarizing Space Station Freedom (SSF) Data Management System (DMS) hardware and software design, configuration, performance and estimated loading data from a myriad of source documents such that the parameters provided could be used to build a dynamic performance model of the DMS. The document is published at this time as a close-out of the DMS performance modeling effort resulting from the Clinton Administration mandated Space Station Redesign. The DMS as documented in this report is no longer a part of the redesigned Space Station. The performance modeling effort was a joint undertaking between the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) Flight Data Systems Division (FDSD) and the NASA Ames Research Center (ARC) Spacecraft Data Systems Research Branch. The scope of this document is limited to the DMS core network through the Man Tended Configuration (MTC) as it existed prior to the 1993 Clinton Administration mandated Space Station Redesign. Data is provided for the Standard Data Processors (SDP's), Multiplexer/Demultiplexers (MDM's) and Mass Storage Units (MSU's). Planned future releases would have added the additional hardware and software descriptions needed to describe the complete DMS. Performance and loading data through the Permanent Manned Configuration (PMC) was to have been included as it became available. No future releases of this document are presently planned pending completion of the present Space Station Redesign activities and task reassessment.

  15. A System for Automated Extraction of Metadata from Scanned Documents using Layout Recognition and String Pattern Search Models

    PubMed Central

    Misra, Dharitri; Chen, Siyuan; Thoma, George R.

    2010-01-01

    One of the most expensive aspects of archiving digital documents is the manual acquisition of context-sensitive metadata useful for the subsequent discovery of, and access to, the archived items. For certain types of textual documents, such as journal articles, pamphlets, official government records, etc., where the metadata is contained within the body of the documents, a cost effective method is to identify and extract the metadata in an automated way, applying machine learning and string pattern search techniques. At the U. S. National Library of Medicine (NLM) we have developed an automated metadata extraction (AME) system that employs layout classification and recognition models with a metadata pattern search model for a text corpus with structured or semi-structured information. A combination of Support Vector Machine and Hidden Markov Model is used to create the layout recognition models from a training set of the corpus, following which a rule-based metadata search model is used to extract the embedded metadata by analyzing the string patterns within and surrounding each field in the recognized layouts. In this paper, we describe the design of our AME system, with focus on the metadata search model. We present the extraction results for a historic collection from the Food and Drug Administration, and outline how the system may be adapted for similar collections. Finally, we discuss some ongoing enhancements to our AME system. PMID:21179386
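
    A hedged illustration of the rule-based string-pattern step described above. The field names, regular expressions and sample zone text are invented for this example; the actual AME system derives its search rules from the corpus and pairs them with trained SVM/HMM layout-recognition models. Once OCR text for a recognized layout zone is available, simple patterns can pull out candidate metadata fields.

        # Sketch of a rule-based metadata pattern search over OCR text from a
        # recognized layout zone.  The field names and regular expressions below
        # are illustrative only; AME derives its rules from the target corpus.
        import re

        PATTERNS = {
            "title":     re.compile(r"^(?P<value>[A-Z][^\n]{10,120})$", re.MULTILINE),
            "date":      re.compile(r"\b(?P<value>(19|20)\d{2}-\d{2}-\d{2})\b"),
            "report_no": re.compile(r"Report\s+No\.?\s*(?P<value>[A-Z0-9-]+)", re.IGNORECASE),
        }

        def extract_metadata(zone_text):
            # Return the first match for each metadata field found in the zone.
            found = {}
            for field, pattern in PATTERNS.items():
                m = pattern.search(zone_text)
                if m:
                    found[field] = m.group("value").strip()
            return found

        sample_zone = "Annual Summary Of Inspections\nReport No. FDA-1972-044\nIssued 1972-06-30"
        print(extract_metadata(sample_zone))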

  16. Burn injury models of care: A review of quality and cultural safety for care of Indigenous children.

    PubMed

    Fraser, Sarah; Grant, Julian; Mackean, Tamara; Hunter, Kate; Holland, Andrew J A; Clapham, Kathleen; Teague, Warwick J; Ivers, Rebecca Q

    2018-05-01

    Safety and quality in the systematic management of burn care is important to ensure optimal outcomes. It is not clear if or how burn injury models of care uphold these qualities, or if they provide a space for culturally safe healthcare for Indigenous peoples, especially for children. This review is a critique of publicly available models of care, analysing their ability to facilitate safe, high-quality burn care for Indigenous children. Models of care were identified and mapped against cultural safety principles in healthcare, and against the National Health and Medical Research Council standard for clinical practice guidelines. An initial search and appraisal of tools was conducted to assess suitability of the tools in providing a mechanism to address quality and cultural safety. From the 53 documents found, 6 were eligible for review. Aspects of cultural safety were addressed in the models, but not explicitly, and were recorded very differently across all models. There was also limited or no cultural consultation documented in the models of care reviewed. Quality in the documents against National Health and Medical Research Council guidelines was evident; however, description or application of quality measures was inconsistent and incomplete. Gaps concerning safety and quality in the documented care pathways for Indigenous peoples who sustain a burn injury and require burn care highlight the need for investigation and reform of current practices. Copyright © 2017 Elsevier Ltd and ISBI. All rights reserved.

  17. View generated database

    NASA Technical Reports Server (NTRS)

    Downward, James G.

    1992-01-01

    This document represents the final report for the View Generated Database (VGD) project, NAS7-1066. It documents the work done on the project up to the point at which all project work was terminated due to lack of project funds. The VGD was to provide the capability to accurately represent any real-world object or scene as a computer model. Such models include both an accurate spatial/geometric representation of surfaces of the object or scene, as well as any surface detail present on the object. Applications of such models are numerous, including acquisition and maintenance of work models for tele-autonomous systems, generation of accurate 3-D geometric/photometric models for various 3-D vision systems, and graphical models for realistic rendering of 3-D scenes via computer graphics.

  18. NCAR global model topography generation software for unstructured grids

    NASA Astrophysics Data System (ADS)

    Lauritzen, P. H.; Bacmeister, J. T.; Callaghan, P. F.; Taylor, M. A.

    2015-06-01

    It is the purpose of this paper to document the NCAR global model topography generation software for unstructured grids. Given a model grid, the software computes the fraction of the grid box covered by land, the gridbox mean elevation, and associated sub-grid scale variances commonly used for gravity wave and turbulent mountain stress parameterizations. The software supports regular latitude-longitude grids as well as unstructured grids; e.g. icosahedral, Voronoi, cubed-sphere and variable resolution grids. As an example application and in the spirit of documenting model development, exploratory simulations illustrating the impacts of topographic smoothing with the NCAR-DOE CESM (Community Earth System Model) CAM5.2-SE (Community Atmosphere Model version 5.2 - Spectral Elements dynamical core) are shown.
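
    A toy sketch of the kind of per-gridbox quantities the paper describes, under assumptions of this example: a grid box is represented simply as a list of high-resolution source-cell samples, and ocean cells contribute zero elevation. The actual NCAR software performs area-weighted remapping onto unstructured target grids plus the smoothing discussed in the paper.

        # Toy sketch: per-gridbox land fraction, mean elevation and sub-grid
        # elevation variability from high-resolution source samples that fall
        # inside the box.  The real software handles unstructured target grids
        # (cubed-sphere, icosahedral, variable resolution) and proper remapping.
        import numpy as np

        def gridbox_stats(elevations, land_mask):
            # elevations: source-cell elevations (m) inside one target grid box;
            # land_mask: 1 where the source cell is land, 0 where it is ocean.
            elevations = np.asarray(elevations, dtype=float)
            land_mask = np.asarray(land_mask, dtype=float)
            land_frac = land_mask.mean()
            # ocean cells contribute zero elevation to the grid-box mean
            mean_elev = np.mean(elevations * land_mask)
            # sub-grid standard deviation about the grid-box mean, of the kind
            # used by gravity-wave / turbulent mountain stress parameterizations
            sgh = np.sqrt(np.mean((elevations * land_mask - mean_elev) ** 2))
            return land_frac, mean_elev, sgh

        print(gridbox_stats([0.0, 350.0, 800.0, 1200.0], [0, 1, 1, 1]))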

  19. Energy Modeling for the Artisan Food Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goel, Supriya

    2013-05-01

    The Artisan Food Center is a 6912 sq.ft food processing plant located in Dayton, Washington. PNNL was contacted by Strecker Engineering to assist with the building’s energy analysis as a part of the project’s U.S. Green Building Council’s Leadership in Energy and Environmental Design (LEED) submittal requirements. The project is aiming for LEED Silver certification, one of the prerequisites to which is a whole building energy model to demonstrate compliance with American Society of Heating Refrigeration and Air Conditioning Engineers (ASHRAE) 90.1 2007 Appendix G, Performance Rating Method. The building incorporates a number of energy efficiency measures as part of its design and the energy analysis aimed at providing Strecker Engineering with the know-how of developing an energy model for the project as well as an estimate of energy savings of the proposed design over the baseline design, which could be used to document points in the LEED documentation. This report documents the ASHRAE 90.1 2007 baseline model design, the proposed model design, the modeling assumptions and procedures as well as the energy savings results in order to inform the Strecker Engineering team on a possible whole building energy model.

  20. A Review of Equation of State Models, Chemical Equilibrium Calculations and CERV Code Requirements for SHS Detonation Modelling

    DTIC Science & Technology

    2009-10-01

    ... parameters for a large number of species. These authors provide many sample calculations with the JCZS database incorporated in CHEETAH 2.0, including ...

  1. Medical Writing Competency Model - Section 1: Functions, Tasks, and Activities.

    PubMed

    Clemow, David B; Wagner, Bertil; Marshallsay, Christopher; Benau, Dan; L'Heureux, Darryl; Brown, David H; Dasgupta, Devjani Ghosh; Girten, Eileen; Hubbard, Frank; Gawrylewski, Helle-Mai; Ebina, Hiroko; Stoltenborg, Janet; York, J P; Green, Kim; Wood, Linda Fossati; Toth, Lisa; Mihm, Michael; Katz, Nancy R; Vasconcelos, Nina-Maria; Sakiyama, Norihisa; Whitsell, Robin; Gopalakrishnan, Shobha; Bairnsfather, Susan; Wanderer, Tatyana; Schindler, Thomas M; Mikyas, Yeshi; Aoyama, Yumiko

    2018-01-01

    This article provides Section 1 of the 2017 Edition 2 Medical Writing Competency Model that describes the core work functions and associated tasks and activities related to professional medical writing within the life sciences industry. The functions in the Model are scientific communication strategy; document preparation, development, and finalization; document project management; document template, standard, format, and style development and maintenance; outsourcing, alliance partner, and client management; knowledge, skill, ability, and behavior development and sharing; and process improvement. The full Model also includes Section 2, which covers the knowledge, skills, abilities, and behaviors needed for medical writers to be effective in their roles; Section 2 is presented in a companion article. Regulatory, publication, and other scientific writing as well as management of writing activities are covered. The Model was developed to aid medical writers and managers within the life sciences industry regarding medical writing hiring, training, expectation and goal setting, performance evaluation, career development, retention, and role value sharing to cross-functional partners.

  2. The Multimedia Environmental Pollutant Assessment System (MEPAS){reg_sign}: Atmospheric pathway formulations. Revision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Droppo, J.G.; Buck, J.W.

    1996-03-01

    The Multimedia Environmental Pollutant Assessment System (MEPAS) is an integrated software implementation of physics-based fate and transport models for health and environmental risk assessments of both radioactive and hazardous pollutants. This atmospheric component report is one of a series of formulation reports that document the MEPAS mathematical models. MEPAS is a multimedia model; pollutant transport is modeled within, through, and between multiple media (air, soil, groundwater, and surface water). The estimated concentrations in the various media are used to compute exposures and impacts to the environment, to maximum individuals, and to populations. The MEPAS atmospheric component for the air media documented in this report includes models for emission from a source to the air, initial plume rise and dispersion, airborne pollutant transport and dispersion, and deposition to soils and crops. The material in this report is documentation for MEPAS Versions 3.0 and 3.1 and the MEPAS version used in the Remedial Action Assessment System (RAAS) Version 1.0.

  3. Universal Rate Model Selector: A Method to Quickly Find the Best-Fit Kinetic Rate Model for an Experimental Rate Profile

    DTIC Science & Technology

    2017-08-01

    ... processes to find a kinetic rate model that provides a high degree of correlation with experimental data. Furthermore, the use of kinetic rate ...

  4. Final Design Documentation for the Wartime Personnel Assessment Model (WARPAM) (Version 1.0)

    DTIC Science & Technology

    1991-03-25

    WARPAM is programmed in FORTRAN 77, except for the CRC model which is ... to enter directly into a specific model and utilize data currently in the system. The modular architecture of WARPAM is depicted in Figure 3.

  5. Technical Support Document for Version 3.6.1 of the COMcheck Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan

    2009-09-29

    This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.

  6. Automatic mathematical modeling for real time simulation program (AI application)

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Purinton, Steve

    1989-01-01

    A methodology is described for automatic mathematical modeling and generating simulation models. The major objective was to create a user friendly environment for engineers to design, maintain, and verify their models; to automatically convert the mathematical models into conventional code for computation; and finally, to document the model automatically.

  7. Milestones of mathematical model for business process management related to cost estimate documentation in petroleum industry

    NASA Astrophysics Data System (ADS)

    Khamidullin, R. I.

    2018-05-01

    The paper is devoted to milestones of an optimal mathematical model for a business process related to cost estimate documentation compiled during the construction and reconstruction of oil and gas facilities. It describes the study and analysis of fundamental issues in the petroleum industry that are caused by economic instability and the deterioration of business strategy. Business process management is presented as business process modelling aimed at improving the studied business process, in particular the main optimization criteria and recommendations for improving the above-mentioned business model.

  8. BOREAS TE-17 Production Efficiency Model Images

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G.; Papagno, Andrea (Editor); Goetz, Scott J.; Goward, Samual N.; Prince, Stephen D.; Czajkowski, Kevin; Dubayah, Ralph O.

    2000-01-01

    A Boreal Ecosystem-Atmosphere Study (BOREAS) version of the Global Production Efficiency Model (http://www.inform.umd.edu/glopem/) was developed by TE-17 (Terrestrial Ecology) to generate maps of gross and net primary production, autotrophic respiration, and light use efficiency for the BOREAS region. This document provides basic information on the model and how the maps were generated. The data generated by the model are stored in binary image-format files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  9. Models of unit operations used for solid-waste processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savage, G.M.; Glaub, J.C.; Diaz, L.F.

    1984-09-01

    This report documents the unit operations models that have been developed for typical refuse-derived-fuel (RDF) processing systems. These models, which represent the mass balances, energy requirements, and economics of the unit operations, are derived, where possible, from basic principles. Empiricism has been invoked where a governing theory has yet to be developed. Field test data and manufacturers' information, where available, supplement the analytical development of the models. A literature review has also been included for the purpose of compiling and discussing in one document the available information pertaining to the modeling of front-end unit operations. Separate analytics have been done for each task.

  10. Documenting AUTOGEN and APGEN Model Files

    NASA Technical Reports Server (NTRS)

    Gladden, Roy E.; Khanampompan, Teerapat; Fisher, Forest W.; DelGuericio, Chris c.

    2008-01-01

    A computer program called "autogen hypertext map generator" satisfies a need for documenting and assisting in visualization of, and navigation through, model files used in the AUTOGEN and APGEN software mentioned in the two immediately preceding articles. This program parses autogen script files, autogen model files, PERL scripts, and apgen activity-definition files and produces a hypertext map of the files to aid in the navigation of the model. This program also provides a facility for adding notes and descriptions, beyond what is in the source model represented by the hypertext map. Further, this program provides access to a summary of the model through variable, function, subroutine, activity and resource declarations as well as providing full access to the source model and source code. The use of the tool enables easy access to the declarations and the ability to traverse routines and calls while analyzing the model.
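
    A small sketch in the spirit of the tool described above, with assumptions of its own: the declaration patterns and input text are invented, and the output is a minimal HTML index rather than the tool's full hypertext map of autogen and apgen model files.

        # Sketch of a hypertext map generator: scan a model/script file for
        # declarations and emit an HTML index that links to them.  The
        # declaration patterns below are illustrative; the real tool parses
        # autogen model files, PERL scripts and apgen activity definitions.
        import html
        import re

        DECL = re.compile(r"^\s*(sub|def|activity|resource)\s+(\w+)")

        def hypertext_map(source_text, title="model map"):
            rows = []
            for lineno, line in enumerate(source_text.splitlines(), 1):
                m = DECL.match(line)
                if m:
                    kind, name = m.group(1), m.group(2)
                    rows.append(f"<li>{kind} <a href='#L{lineno}'>{html.escape(name)}</a>"
                                f" (line {lineno})</li>")
            return (f"<html><body><h1>{html.escape(title)}</h1>"
                    f"<ul>{''.join(rows)}</ul></body></html>")

        print(hypertext_map("def setup():\n    pass\nactivity downlink\nresource power"))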

  11. FOSSIL2 energy policy model documentation: FOSSIL2 documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1980-10-01

    This report discusses the structure, derivations, assumptions, and mathematical formulation of the FOSSIL2 model. Each major facet of the model - supply/demand interactions, industry financing, and production - has been designed to parallel closely the actual cause/effect relationships determining the behavior of the United States energy system. The data base for the FOSSIL2 program is large, as is appropriate for a system dynamics simulation model. When possible, all data were obtained from sources well known to experts in the energy field. Cost and resource estimates are based on DOE data whenever possible. This report presents the FOSSIL2 model at several levels. Volumes II and III of this report list the equations that comprise the FOSSIL2 model, along with variable definitions and a cross-reference list of the model variables. Volume III lists the model equations and a one line definition for equations, in a short, readable format.

  12. System for the Analysis of Global Energy Markets - Vol. I, Model Documentation

    EIA Publications

    2003-01-01

    Documents the objectives and the conceptual and methodological approach used in the development of projections for the International Energy Outlook. The first volume of this report describes the System for the Analysis of Global Energy Markets (SAGE) methodology and provides an in-depth explanation of the equations of the model.

  13. Circulation Control Model Experimental Database for CFD Validation

    NASA Technical Reports Server (NTRS)

    Paschal, Keith B.; Neuhart, Danny H.; Beeler, George B.; Allan, Brian G.

    2012-01-01

    A 2D circulation control wing was tested in the Basic Aerodynamic Research Tunnel at the NASA Langley Research Center. A traditional circulation control wing employs tangential blowing along the span over a trailing-edge Coanda surface for the purpose of lift augmentation. This model has been tested extensively at the Georgia Tech Research Institute for the purpose of performance documentation at various blowing rates. The current study seeks to expand on the previous work by documenting additional flow-field data needed for validation of computational fluid dynamics. Two jet momentum coefficients were tested during this entry: 0.047 and 0.114. Boundary-layer transition was investigated and turbulent boundary layers were established on both the upper and lower surfaces of the model. Chordwise and spanwise pressure measurements were made, and tunnel sidewall pressure footprints were documented. Laser Doppler Velocimetry measurements were made on both the upper and lower surface of the model at two chordwise locations (x/c = 0.8 and 0.9) to document the state of the boundary layers near the spanwise blowing slot.
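
    For reference, the two blowing rates quoted above (0.047 and 0.114) are jet momentum coefficients. Assuming the conventional definition used in circulation control studies (not quoted from this report), the coefficient normalizes the jet momentum flux by the freestream dynamic pressure and a reference area, which for a 2D section is the chord per unit span:

        C_\mu = \frac{\dot{m}_{\mathrm{jet}}\, V_{\mathrm{jet}}}{q_\infty\, S}, \qquad q_\infty = \tfrac{1}{2}\,\rho_\infty U_\infty^2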

  14. Computational technique and performance of Transient Inundation Model for Rivers--2 Dimensional (TRIM2RD) : a depth-averaged two-dimensional flow model

    USGS Publications Warehouse

    Fulford, Janice M.

    2003-01-01

    A numerical computer model, Transient Inundation Model for Rivers -- 2 Dimensional (TrimR2D), that solves the two-dimensional depth-averaged flow equations is documented and discussed. The model uses a semi-implicit, semi-Lagrangian finite-difference method. It is a variant of the Trim model and has been used successfully in estuarine environments such as San Francisco Bay. The abilities of the model are documented for three scenarios: uniform depth flows, laboratory dam-break flows, and large-scale riverine flows. The model can start computations from a 'dry' bed and converge to accurate solutions. Inflows are expressed as source terms, which limits the use of the model to sufficiently long reaches where the flow reaches equilibrium with the channel. The data sets used by the investigation demonstrate that the model accurately propagates flood waves through long river reaches and simulates dam breaks with abrupt water-surface changes.
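
    For orientation, a commonly written form of the two-dimensional depth-averaged flow (shallow-water) equations that such a model solves is sketched below; the friction closure and notation are assumptions for illustration and are not quoted from the TrimR2D documentation. Here \eta is the water-surface elevation, H the total water depth, (u, v) the depth-averaged velocity components, q a source term representing inflows, g gravity and c_f a bed-friction coefficient:

        \frac{\partial \eta}{\partial t} + \frac{\partial (Hu)}{\partial x} + \frac{\partial (Hv)}{\partial y} = q

        \frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} + v\,\frac{\partial u}{\partial y} = -g\,\frac{\partial \eta}{\partial x} - \frac{c_f\, u \sqrt{u^2 + v^2}}{H}

        \frac{\partial v}{\partial t} + u\,\frac{\partial v}{\partial x} + v\,\frac{\partial v}{\partial y} = -g\,\frac{\partial \eta}{\partial y} - \frac{c_f\, v \sqrt{u^2 + v^2}}{H}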

  15. High-Quality 3d Models and Their Use in a Cultural Heritage Conservation Project

    NASA Astrophysics Data System (ADS)

    Tucci, G.; Bonora, V.; Conti, A.; Fiorini, L.

    2017-08-01

    Cultural heritage digitization and 3D modelling processes are mainly based on laser scanning and digital photogrammetry techniques to produce complete, detailed and photorealistic three-dimensional surveys: geometric as well as chromatic aspects, in turn testimony of materials, work techniques, state of preservation, etc., are documented using digitization processes. The paper explores the topic of 3D documentation for conservation purposes; it analyses how geomatics contributes in different steps of a restoration process and it presents an overview of different uses of 3D models for the conservation and enhancement of the cultural heritage. The paper reports on the project to digitize the earthenware frieze of the Ospedale del Ceppo in Pistoia (Italy) for 3D documentation, restoration work support, and digital and physical reconstruction and integration purposes. The intent to design an exhibition area suggests new ways to take advantage of 3D data originally acquired for documentation and scientific purposes.

  16. DEVA: An extensible ontology-based annotation model for visual document collections

    NASA Astrophysics Data System (ADS)

    Jelmini, Carlo; Marchand-Maillet, Stephane

    2003-01-01

    The description of visual documents is a fundamental aspect of any efficient information management system, but the process of manually annotating large collections of documents is tedious and far from being perfect. The need for a generic and extensible annotation model therefore arises. In this paper, we present DEVA, an open, generic and expressive multimedia annotation framework. DEVA is an extension of the Dublin Core specification. The model can represent the semantic content of any visual document. It is described in the ontology language DAML+OIL and can easily be extended with external specialized ontologies, adapting the vocabulary to the given application domain. In parallel, we present the Magritte annotation tool, which is an early prototype that validates the DEVA features. Magritte allows image collections to be annotated manually. It is designed with a modular and extensible architecture, which enables the user to dynamically adapt the user interface to specialized ontologies merged into DEVA.

  17. Relational Learning via Collective Matrix Factorization

    DTIC Science & Technology

    2008-06-01

    well-known example of such a schema is pLSI-pHITS [13], which models document-word counts and document-document citations: E1 = words and E2 = E3 ... relational co-clustering include pLSI, pLSI-pHITS, the symmetric block models of Long et al. [23, 24, 25], and Bregman tensor clustering [5] (which can ... to pLSI-pHITS. In this section we provide an example where the additional flexibility of collective matrix factorization leads to better results ...

  18. 75 FR 29587 - Notice of Availability of Revised Model Proposed No Significant Hazards Consideration...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-26

    ... available in the Agencywide Documents Access and Management System (ADAMS) under Accession Number ML071420428. Publicly available documents related to this notice can be accessed through NRC's Agencywide Documents Access and Management System (ADAMS).

  19. Document Ranking Based upon Markov Chains.

    ERIC Educational Resources Information Center

    Danilowicz, Czeslaw; Balinski, Jaroslaw

    2001-01-01

    Considers how the order of documents in information retrieval responses are determined and introduces a method that uses a probabilistic model of a document set where documents are regarded as states of a Markov chain and where transition probabilities are directly proportional to similarities between documents. (Author/LRW)
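
    A small sketch of the idea summarized above, under assumptions of this example: cosine similarity over TF-IDF vectors stands in for whatever similarity the cited method uses, a tiny smoothing term keeps the chain irreducible, and power iteration recovers the stationary distribution that supplies the ranking.

        # Sketch: rank documents by the stationary distribution of a Markov chain
        # whose transition probabilities are proportional to inter-document
        # similarity (cosine similarity over TF-IDF vectors is assumed here).
        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        docs = ["retrieval of xml documents with structured queries",
                "ranking documents with markov chain models",
                "probabilistic models for information retrieval",
                "growing tomatoes in a home garden"]

        S = cosine_similarity(TfidfVectorizer().fit_transform(docs))
        np.fill_diagonal(S, 0.0)               # no self-transitions
        S += 1e-6                              # keep every row connected
        P = S / S.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

        pi = np.full(len(docs), 1.0 / len(docs))
        for _ in range(200):                   # power iteration -> stationary distribution
            pi = pi @ P
            pi /= pi.sum()

        for rank, i in enumerate(np.argsort(-pi), 1):
            print(rank, round(float(pi[i]), 3), docs[i])

    Documents that are similar to many other documents accumulate probability mass in the stationary distribution and therefore rank higher, while the off-topic document ranks last.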

  20. A model for indexing medical documents combining statistical and symbolic knowledge.

    PubMed

    Avillach, Paul; Joubert, Michel; Fieschi, Marius

    2007-10-11

    To develop and evaluate an information processing method based on terminologies, in order to index medical documents in any given documentary context. We designed a model using both symbolic general knowledge extracted from the Unified Medical Language System (UMLS) and statistical knowledge extracted from a domain of application. Using statistical knowledge allowed us to contextualize the general knowledge for every particular situation. For each document studied, the extracted terms are ranked to highlight the most significant ones. The model was tested on a set of 17,079 French standardized discharge summaries (SDSs). The most important ICD-10 term of each SDS was ranked 1st or 2nd by the method in nearly 90% of the cases. The use of several terminologies leads to more precise indexing. The improvement in the model's performance achieved by using semantic relationships is encouraging.

  1. Simple Spectral Lines Data Model Version 1.0

    NASA Astrophysics Data System (ADS)

    Osuna, Pedro; Salgado, Jesus; Guainazzi, Matteo; Dubernet, Marie-Lise; Roueff, Evelyne; Osuna, Pedro; Salgado, Jesus

    2010-12-01

    This document presents a Data Model to describe Spectral Line Transitions in the context of the Simple Line Access Protocol defined by the IVOA (cf. Ref. [13], IVOA Simple Line Access Protocol). The main objective of the model is to integrate with and support the Simple Line Access Protocol, with which it forms a compact unit. This integration allows seamless access to Spectral Line Transitions available worldwide in the VO context. This model does not provide a complete description of Atomic and Molecular Physics, the scope of which is outside this document. In the astrophysical sense, a line is considered as the result of a transition between two energy levels. On the basis of this assumption, a whole set of objects and attributes has been derived to properly define the necessary information to describe lines appearing in astrophysical contexts. The document has been written taking into account available information from many different Line data providers (see acknowledgments section).

  2. Exploiting salient semantic analysis for information retrieval

    NASA Astrophysics Data System (ADS)

    Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui

    2016-11-01

    Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representation for words or documents. However, its feasibility and effectiveness in information retrieval is mostly unknown. In this paper, we study how to efficiently use SSA to improve the information retrieval performance, and propose a SSA-based retrieval method under the language model framework. First, SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations can be used in combination to estimate the language models of queries and documents. The proposed method is evaluated on several standard text retrieval conference (TREC) collections. Experiment results on standard TREC collections show the proposed models consistently outperform the existing Wikipedia-based retrieval methods.
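
    The combination step described above can be pictured as a linear interpolation of two smoothed unigram language models, one estimated from bag-of-words counts and one from concept (SSA-style) representations. The concept labels, mixing weight and additive smoothing below are assumptions for illustration, not the paper's actual estimators.

        # Sketch: query-likelihood scoring with a document language model that
        # interpolates a bag-of-words estimate and a concept-based (SSA-style)
        # estimate.  Concept annotations and the weight lam are illustrative.
        from collections import Counter

        def unigram_lm(tokens, vocab, alpha=0.1):
            # Maximum-likelihood unigram model with additive smoothing.
            counts = Counter(tokens)
            total = sum(counts.values())
            return {w: (counts[w] + alpha) / (total + alpha * len(vocab)) for w in vocab}

        def score(query_words, query_concepts, doc_words, doc_concepts, lam=0.6):
            vocab_w = set(doc_words) | set(query_words)
            vocab_c = set(doc_concepts) | set(query_concepts)
            lm_w = unigram_lm(doc_words, vocab_w)
            lm_c = unigram_lm(doc_concepts, vocab_c)
            s = 1.0
            for qw, qc in zip(query_words, query_concepts):
                s *= lam * lm_w[qw] + (1 - lam) * lm_c[qc]
            return s

        doc_words = "the probe measured the magnetic field of the planet".split()
        doc_concepts = ["SPACECRAFT", "MEASUREMENT", "MAGNETISM", "PLANET"]
        print(score(["planet", "field"], ["PLANET", "MAGNETISM"],
                    doc_words, doc_concepts))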

  3. 76 FR 5467 - Airworthiness Directives; Pilatus Aircraft Ltd. Models PC-6, PC-6-H1, PC-6-H2, PC-6/350, PC-6/350...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-01

    ... models does not include a Chapter 04 in the Airworthiness Limitations Section (ALS). For PC-6 models other than B2-H2 and B2- H4, no ALS at all is included in the AMM. With the latest Revision 12 of the... other than B2-H2 and B2-H4, a new ALS document has been implemented as well. These documents include the...

  4. Documentation of the GLAS fourth order general circulation model. Volume 3: Vectorized code for the Cyber 205

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, D.; Pfaendtner, J.; Takacs, L.; Takano, K.

    1983-01-01

    Volume 3 of a three-volume technical memorandum containing documentation of the GLAS fourth-order general circulation model is presented. The volume contains the CYBER 205 scalar and vector codes of the model, lists of variables, and cross references. A dictionary of FORTRAN variables used in the scalar version, and listings of the FORTRAN code compiled with the C-option, are included. Cross-reference maps of local variables are included for each subroutine.

  5. Space station ECLSS integration analysis: Simplified General Cluster Systems Model, ECLS System Assessment Program enhancements

    NASA Technical Reports Server (NTRS)

    Ferguson, R. E.

    1985-01-01

    The data base verification of the ECLS Systems Assessment Program (ESAP) is documented, along with changes made to enhance the flexibility of the water recovery subsystem simulations. All changes made to the data base values are described, as are the software enhancements performed. The refined model documented herein constitutes the submittal of the General Cluster Systems Model. A source listing of the current version of ESAP is provided in Appendix A.

  6. Documentation and virtual reconstruction of historical objects in Peru damaged by an earthquake and climatic events

    NASA Astrophysics Data System (ADS)

    Hanzalová, K.; Pavelka, K.

    2013-07-01

    This paper deals with the possibilities of creating a 3-D model and a visualization technique for presenting historical buildings and sites in Peru. The Nasca/CTU project documents historical objects using several techniques. This paper describes the documentation and visualization of two historical churches (the San Jose and San Xavier Churches) and the pre-Hispanic archaeological site La Ciudad Perdida de Huayuri (abandoned town near Huayuri) in the Nasca region using photogrammetry and remote sensing. Both churches were damaged by an earthquake. Different processes were used to document these objects. First, PhotoModeler software was used for the photogrammetric processing of the acquired images. The subsequent modelling of the two churches also differed: Google SketchUp software was used for the San Jose Church, while the 3-D model of the San Xavier Church was created in MicroStation software. For the modelling of the "abandoned town" near Huayuri, which was destroyed by a climatic event (El Niño), terrestrial photogrammetry, satellite data, and GNSS measurements were applied. The general output of the project is a thematic map of this archaeological site; the C14 method was used for dating.

  7. Improving automation standards via semantic modelling: Application to ISA88.

    PubMed

    Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès

    2017-03-01

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on the efficient modelling of the addressed systems. The work presented here stems from the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and support the improvement of technical document consistency. The formalization of conceptual models and the subsequent writing of technical standards are simultaneously analyzed, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
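
    As a rough illustration of the kind of consistency check discussed above (not the methodology used in the paper), the sketch below encodes a simplified ISA88 physical-model hierarchy as a parent/child table and flags extracted concept relations that contradict it; the extracted relations are invented for the example.

```python
# Simplified ISA88 physical-model hierarchy (the standard also allows, e.g.,
# control modules to sit directly under units); child -> assumed parent.
ISA88_HIERARCHY = {
    "Site": "Enterprise",
    "Area": "Site",
    "ProcessCell": "Area",
    "Unit": "ProcessCell",
    "EquipmentModule": "Unit",
    "ControlModule": "EquipmentModule",
}

def check_consistency(extracted_relations):
    """Flag (child, parent) pairs that contradict the assumed hierarchy.

    `extracted_relations` is a list of (child, parent) tuples that would,
    in practice, come from the semi-automatic ontology construction step.
    """
    issues = []
    for child, parent in extracted_relations:
        expected = ISA88_HIERARCHY.get(child)
        if expected is not None and parent != expected:
            issues.append(f"{child} is attached to {parent}, expected {expected}")
    return issues

# Invented example relations, e.g. parsed from a paragraph of a technical document
relations = [("Unit", "Area"), ("ControlModule", "EquipmentModule")]
for problem in check_consistency(relations):
    print(problem)
```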

  8. An ontology model for nursing narratives with natural language generation technology.

    PubMed

    Min, Yul Ha; Park, Hyeoun-Ae; Jeon, Eunjoo; Lee, Joo Yun; Jo, Soo Jung

    2013-01-01

    The purpose of this study was to develop an ontology model to generate nursing narratives that read as naturally as human language from the entity-attribute-value triplets of a detailed clinical model using natural language generation technology. The model was based on the types of information and on the time at which each type is documented along the nursing process. The types of information are data characterizing the patient status, inferences made by the nurse from the patient data, and nursing actions selected by the nurse to change the patient status. This information was linked to the nursing process based on the time of documentation. We describe a case study illustrating the application of this model in an acute-care setting. The proposed model provides a strategy for designing an electronic nursing record system.
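
    A minimal sketch of the underlying idea, turning entity-attribute-value triplets into narrative sentences, is given below; the triplets, templates, and phase names are hypothetical and are not taken from the paper's ontology.

```python
# Hypothetical templates keyed by nursing-process phase (assessment data,
# nurse inference, nursing action), in the spirit of the ontology described above.
TEMPLATES = {
    "data": "Patient {entity} {attribute} was {value}.",
    "inference": "Nurse inferred {value} based on {entity} {attribute}.",
    "action": "Performed {value} for {entity} {attribute}.",
}

def generate_narrative(triplets):
    """Render entity-attribute-value triplets as simple English sentences."""
    sentences = []
    for phase, entity, attribute, value in triplets:
        template = TEMPLATES.get(phase, "{entity} {attribute}: {value}.")
        sentences.append(template.format(entity=entity, attribute=attribute, value=value))
    return " ".join(sentences)

# Invented example record
record = [
    ("data", "pain", "score", "7/10"),
    ("inference", "pain", "status", "acute pain"),
    ("action", "pain", "management", "administration of prescribed analgesic"),
]
print(generate_narrative(record))
```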

  9. PV_LIB Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-11

    While an organized source of reference information on PV performance modeling is certainly valuable, there is nothing to match the availability of actual examples of modeling algorithms being used in practice. To meet this need, Sandia has developed a PV performance modeling toolbox (PV_LIB) for Matlab. It contains a set of well-documented, open source functions and example scripts showing the functions being used in practical examples. This toolbox is meant to help make the multi-step process of modeling a PV system more transparent and provide the means for model users to validate and understand the models they use and/or develop. It is fully integrated into Matlab's help and documentation utilities. The PV_LIB Toolbox provides more than 30 functions that are sorted into four categories.
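
    To give a flavour of the kind of well-documented function such a toolbox contains, the sketch below implements a simplified PVWatts-style DC power estimate from irradiance and cell temperature. It is a generic illustration in Python, not a PV_LIB function, and the coefficient values are typical assumptions.

```python
def pvwatts_dc(effective_irradiance, cell_temperature, pdc0, gamma_pdc=-0.004, temp_ref=25.0):
    """Simplified PVWatts-style DC power model.

    effective_irradiance : plane-of-array irradiance reaching the cells (W/m^2)
    cell_temperature     : cell temperature (deg C)
    pdc0                 : array DC rating at 1000 W/m^2 and 25 deg C (W)
    gamma_pdc            : temperature coefficient of power (1/deg C), typical value assumed
    """
    return (effective_irradiance / 1000.0) * pdc0 * (
        1 + gamma_pdc * (cell_temperature - temp_ref)
    )

# Example: a 5 kW array at 800 W/m^2 and 45 deg C cell temperature
print(round(pvwatts_dc(800.0, 45.0, 5000.0), 1), "W")
```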

  10. Generic solar photovoltaic system dynamic simulation model specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellis, Abraham; Behnke, Michael Robert; Elliott, Ryan Thomas

    This document is intended to serve as a specification for generic solar photovoltaic (PV) system positive-sequence dynamic models to be implemented by software developers and approved by the WECC MVWG for use in bulk system dynamic simulations in accordance with NERC MOD standards. Two specific dynamic models are included in the scope of this document. The first, a Central Station PV System model, is intended to capture the most important dynamic characteristics of large scale (> 10 MW) PV systems with a central Point of Interconnection (POI) at the transmission level. The second, a Distributed PV System model, is intended to represent an aggregation of smaller, distribution-connected systems that comprise a portion of a composite load that might be modeled at a transmission load bus.

  11. Electrical utilities model for determining electrical distribution capacity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fritz, R. L.

    1997-09-03

    In its simplest form, this model was to obtain meaningful data on the current state of the Site's electrical transmission and distribution assets, and turn this vast collection of data into useful information. The resulting product is an Electrical Utilities Model for Determining Electrical Distribution Capacity which provides: the current state of the electrical transmission and distribution systems; critical Hanford Site needs based on outyear planning documents; and a decision factor model. This model will enable Electrical Utilities management to improve forecasting requirements for service levels, budget, schedule, scope, and staffing, and recommend the best path forward to satisfy customer demands at the minimum risk and least cost to the government. A dynamic document, the model will be updated annually to reflect changes in Hanford Site activities.

  12. Improving the Interoperability of Disaster Models: a Case Study of Proposing Fireml for Forest Fire Model

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Wang, F.; Meng, Q.; Li, Z.; Liu, B.; Zheng, X.

    2018-04-01

    This paper presents a new standardized data format named Fire Markup Language (FireML), which extends the Geography Markup Language (GML) of the OGC, to describe the fire hazard model. The proposed FireML standardizes the input and output documents of a fire model so that it can communicate effectively with different disaster management systems and ensure good interoperability. To demonstrate the usage of FireML and verify its feasibility, a forest fire spread model compatible with FireML is described, and a 3D GIS disaster management system is developed to simulate the dynamic procedure of forest fire spread using the defined FireML documents. The proposed approach can inform the standardization of other disaster models.

  13. Structuring Legacy Pathology Reports by openEHR Archetypes to Enable Semantic Querying.

    PubMed

    Kropf, Stefan; Krücken, Peter; Mueller, Wolf; Denecke, Kerstin

    2017-05-18

    Clinical information is often stored as free text, e.g. in discharge summaries or pathology reports. These documents are semi-structured using section headers, numbered lists, items and classification strings. However, it is still challenging to retrieve relevant documents since keyword searches applied on complete unstructured documents result in many false positive retrieval results. We are concentrating on the processing of pathology reports as an example for unstructured clinical documents. The objective is to transform reports semi-automatically into an information structure that enables an improved access and retrieval of relevant data. The data is expected to be stored in a standardized, structured way to make it accessible for queries that are applied to specific sections of a document (section-sensitive queries) and for information reuse. Our processing pipeline comprises information modelling, section boundary detection and section-sensitive queries. For enabling a focused search in unstructured data, documents are automatically structured and transformed into a patient information model specified through openEHR archetypes. The resulting XML-based pathology electronic health records (PEHRs) are queried by XQuery and visualized by XSLT in HTML. Pathology reports (PRs) can be reliably structured into sections by a keyword-based approach. The information modelling using openEHR allows saving time in the modelling process since many archetypes can be reused. The resulting standardized, structured PEHRs allow accessing relevant data by retrieving data matching user queries. Mapping unstructured reports into a standardized information model is a practical solution for a better access to data. Archetype-based XML enables section-sensitive retrieval and visualisation by well-established XML techniques. Focussing the retrieval to particular sections has the potential of saving retrieval time and improving the accuracy of the retrieval.
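
    A minimal sketch of the keyword-based section boundary detection described above is given below; the section headers are invented examples, and real pathology reports (and the subsequent openEHR archetype mapping) would require a richer keyword list and post-processing.

```python
# Hypothetical section headers that might open sections in a pathology report
SECTION_KEYWORDS = {
    "clinical information": "clinical_information",
    "macroscopy": "macroscopic_findings",
    "microscopy": "microscopic_findings",
    "diagnosis": "diagnosis",
}

def split_into_sections(report_text):
    """Split a semi-structured report into {section_id: text} using header keywords."""
    sections, current = {}, None
    for line in report_text.splitlines():
        header = line.strip().rstrip(":").lower()
        if header in SECTION_KEYWORDS:
            current = SECTION_KEYWORDS[header]
            sections[current] = []
        elif current is not None:
            sections[current].append(line.strip())
    return {k: " ".join(v).strip() for k, v in sections.items()}

report = """Clinical information:
Skin lesion, left forearm.
Microscopy:
Sections show a benign compound naevus.
Diagnosis:
Benign compound naevus."""
print(split_into_sections(report)["diagnosis"])
```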

  14. INEEL Subregional Conceptual Model Report Volume 2: Summary of Existing Knowledge of Geochemical Influences on the Fate and Transport of Contaminants in the Subsurface at the INEEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul L. Wichlacz; Robert C. Starr; Brennon Orr

    2003-09-01

    This document summarizes previous descriptions of geochemical system conceptual models for the vadose zone and groundwater zone (aquifer) beneath the Idaho National Engineering and Environmental Laboratory (INEEL). The primary focus is on groundwater because contaminants derived from wastes disposed at INEEL are present in groundwater, because groundwater provides a pathway for potential migration to receptors, and because geochemical characteristics of, and processes in, the aquifer can substantially affect the movement, attenuation, and toxicity of contaminants. The secondary emphasis is on perched water bodies in the vadose zone. Perched water eventually reaches the regional groundwater system, and thus processes that affect contaminants in the perched water bodies are important relative to the migration of contaminants into groundwater. Similarly, processes that affect solutes during transport from near-surface disposal facilities downward through the vadose zone to the aquifer are relevant. Sediments in the vadose zone can affect both water and solute transport by restricting the downward migration of water sufficiently that a perched water body forms, and by retarding solute migration via ion exchange. Geochemical conceptual models have been prepared by a variety of researchers for different purposes. They have been published in documents prepared by INEEL contractors, the United States Geological Survey (USGS), academic researchers, and others. The documents themselves are INEEL and USGS reports, and articles in technical journals. The documents reviewed were selected from citation lists generated by searching the INEEL Technical Library, the INEEL Environmental Restoration Optical Imaging System, and the ISI Web of Science databases. The citation lists were generated using the keywords ground water, groundwater, chemistry, geochemistry, contaminant, INEL, INEEL, and Idaho. In addition, a list of USGS documents that pertain to the INEEL was obtained and manually searched. The documents that appeared to be the most pertinent were selected for further review. These documents are tabulated in the citation list. This report summarizes existing geochemical conceptual models, but does not attempt to generate a new conceptual model or select the "right" model. This document is organized as follows. Geochemical models are described in general in Section 2. Geochemical processes that control the transport and fate of contaminants introduced into groundwater are described in Section 3. The natural geochemistry of the Eastern Snake River Plain Aquifer (SRPA) is described in Section 4. The effect of waste disposal on the INEEL subsurface is described in Section 5. The geochemical behavior of the major contaminants is described in Section 6. Section 7 describes the site-specific geochemical models developed for various INEEL facilities.

  15. Methodological Framework for Analysis of Buildings-Related Programs with BEAMS, 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, Douglas B.; Dirks, James A.; Hostick, Donna J.

    The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official “benefits estimates” for each of its major programs using its Planning, Analysis, and Evaluation (PAE) Team. PAE conducts an annual integrated modeling and analysis effort to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. These estimates are part of EERE’s budget request and are also used in the formulation of EERE’s performance measures. Two of EERE’s major programs are the Building Technologies Program (BT) and the Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports PAE by developing the program characterizations and other market information necessary to provide input to the EERE integrated modeling analysis as part of PAE’s Portfolio Decision Support (PDS) effort. PNNL also supports BT by providing line-item estimates for the Program’s internal use. PNNL uses three modeling approaches to perform these analyses. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits using one of those methods: the Building Energy Analysis and Modeling System (BEAMS). BEAMS is a PC-based accounting model that was built in Visual Basic by PNNL specifically for estimating the benefits of buildings-related projects. It allows various types of projects to be characterized, including whole-building, envelope, lighting, and equipment projects. This document contains an overview section that describes the estimation process and the models used to estimate energy savings. The body of the document describes the algorithms used within the BEAMS software. This document serves both as stand-alone documentation for BEAMS, and also as a supplemental update of a previous document, Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort (Elliott et al. 2004b). The areas most changed since the publication of that previous document are those discussing the calculation of lighting and HVAC interactive effects (for both lighting and envelope/whole-building projects). This report does not attempt to convey inputs to BEAMS or the methodology of their derivation.

  16. Linguistic Extensions of Topic Models

    ERIC Educational Resources Information Center

    Boyd-Graber, Jordan

    2010-01-01

    Topic models like latent Dirichlet allocation (LDA) provide a framework for analyzing large datasets where observations are collected into groups. Although topic modeling has been fruitfully applied to problems in social science, biology, and computer vision, it has been most widely used to model datasets where documents are modeled as exchangeable…
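
    For readers unfamiliar with the basic setup, the sketch below fits a plain LDA model to a toy document collection with scikit-learn; it illustrates the standard model only, not the linguistic extensions that are the subject of this work.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

documents = [
    "vote election party government policy",
    "party coalition election campaign voters",
    "gene protein cell expression pathway",
    "protein binding cell membrane receptor",
]

# Bag-of-words counts, then a 2-topic LDA fit
vectorizer = CountVectorizer().fit(documents)
X = vectorizer.transform(documents)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Top words per topic
terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-3:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```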

  17. Documenting and Examining Practices in Creating Learning Communities: Exemplars and Non-Exemplars.

    ERIC Educational Resources Information Center

    Hipp, Kristine A.; Huffman, Jane B.

    Though promising as an instrument for school reform, few professional learning communities (PLCs) have turned the vision into reality. This paper looks at how PLCs develop and maintain momentum. It examines and documents efforts in schools actively engaged in creating PLCs based on Hord's model. Hord's model employs external change agents, or…

  18. ENHANCED STREAM WATER QUALITY MODELS QUAL2E AND QUAL2E-UNCAS: DOCUMENTATION AND USER MANUAL

    EPA Science Inventory

    The manual is a major revision of the original QUAL2E program documentation released in 1985. It includes a description of the recent modifications and improvements to the widely used water quality models QUAL-II and QUAL2E. The enhancements include an extensive capability for un...

  19. Liquid Fuels Market Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Defines the objectives of the Liquid Fuels Market Model (LFMM), describes its basic approach, and provides detail on how it works. This report is intended as a reference document for model analysts, users, and the public. This edition of the LFMM reflects changes made to the module over the past two years for the Annual Energy Outlook 2016.

  20. Structured Hypermedia Application Development Model (SHADM): A structured Model for Technical Documentation Application Design

    DTIC Science & Technology

    1991-12-01

    effective (19:15). Figure 2 details a flowchart of the basic steps in prototyping. The basic concept behind prototyping is to quickly produce a working... One approach to overcoming this is to structure the document relative to the experience level of the user (14:49). A "novice" or beginner would...

  1. 2005-2008

    ERIC Educational Resources Information Center

    Council of Chief State School Officers, 2009

    2009-01-01

    The purpose of this document is to describe the US ED growth model pilot program from its inception to the point at which the program was no longer a pilot and was opened up to all states. In addition, the paper is designed to help states in the process of planning to submit a growth model proposal. The information in this document is mostly…

  2. Documentation of a finite-element two-layer model for simulation of ground-water flow

    USGS Publications Warehouse

    Mallory, Michael J.

    1979-01-01

    This report documents a finite-element model for simulation of ground-water flow in a two-aquifer system where the two aquifers are coupled by a leakage term that represents flow through a confining layer separating the two aquifers. The model was developed by Timothy J. Durbin (U.S. Geological Survey) for use in ground-water investigations in southern California. The documentation assumes that the reader is familiar with the physics of ground-water flow, numerical methods of solving partial-differential equations, and the FORTRAN IV computer language. It was prepared as part of the investigations made by the U.S. Geological Survey in cooperation with the San Bernardino Valley Municipal Water District. (Kosco-USGS)
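
    For orientation, the leakage coupling described above can be written in a standard form. The equations below are the generic leaky two-aquifer formulation (storage coefficients S_i, transmissivities T_i, confining-layer conductivity K' and thickness b', source/sink terms W_i) and may differ in detail from the equations documented in the report.

```latex
% Generic two-aquifer system coupled by leakage through a confining layer
S_1 \frac{\partial h_1}{\partial t}
  = \nabla \cdot \left( T_1 \nabla h_1 \right)
  + \frac{K'}{b'}\,(h_2 - h_1) + W_1 ,
\qquad
S_2 \frac{\partial h_2}{\partial t}
  = \nabla \cdot \left( T_2 \nabla h_2 \right)
  + \frac{K'}{b'}\,(h_1 - h_2) + W_2 .
```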

  3. An Inter-Personal Information Sharing Model Based on Personalized Recommendations

    NASA Astrophysics Data System (ADS)

    Kamei, Koji; Funakoshi, Kaname; Akahani, Jun-Ichi; Satoh, Tetsuji

    In this paper, we propose an inter-personal information sharing model among individuals based on personalized recommendations. In the proposed model, we define an information resource as shared between people when both of them consider it important --- not merely when they both possess it. In other words, the model defines the importance of information resources based on personalized recommendations from identifiable acquaintances. The proposed method is based on a collaborative filtering system that focuses on evaluations from identifiable acquaintances. It utilizes both user evaluations of documents and their contents. Specifically, each user profile is represented as a matrix of credibility assigned to other users' evaluations in each domain of interest. We extended the content-based collaborative filtering method to distinguish the other users to whom documents should be recommended. We also applied a concept-based vector space model to represent the domains of interest, instead of the previous method, which represented them with a term-based vector space model. We introduce a personalized concept-base compiled from each user's information repository to improve information retrieval in the user's environment. Furthermore, the concept-spaces change from user to user since they reflect the personalities of the users. Because the concept-spaces differ, the similarity between a document and a user's interest varies for each user. As a result, a user receives recommendations from other users who have different viewpoints, achieving inter-personal information sharing based on personalized recommendations. This paper also describes an experimental simulation of our information sharing model. In our laboratory, five participants accumulated a personal repository of e-mails and web pages from which they built their own concept-bases. Then we estimated the user profiles according to the personalized concept-bases and the sets of documents that others evaluated. We simulated inter-personal recommendation based on the user profiles and evaluated the performance of the recommendation method by comparing the recommended documents to the results of content-based collaborative filtering.
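
    The role of the personalized concept-base can be illustrated with a small sketch (an invented simplification, not the authors' system): each user projects documents into their own concept space, so the same document and interest profile yield different similarities for different users.

```python
from math import sqrt

def cosine(u, v):
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in u)
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

def to_concept_space(term_weights, concept_base):
    """Project a term-weight vector into a user's personal concept space.

    `concept_base` maps concept -> {term: weight}; it would be compiled from
    the user's own e-mail and web repository, as described above.
    """
    return {
        concept: sum(term_weights.get(t, 0.0) * w for t, w in terms.items())
        for concept, terms in concept_base.items()
    }

# Invented concept-bases for two users, one shared document, one interest profile
doc = {"telescope": 1.0, "survey": 0.5}
interest = {"telescope": 0.7, "star": 0.3}
user_a = {"astronomy": {"telescope": 1.0, "star": 0.8}, "statistics": {"survey": 1.0}}
user_b = {"polling": {"survey": 1.0, "vote": 0.9}}

for name, base in (("A", user_a), ("B", user_b)):
    sim = cosine(to_concept_space(doc, base), to_concept_space(interest, base))
    print(f"user {name}: similarity {sim:.3f}")
```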

  4. A process-based model for cattle manure compost windrows: Model performance and application

    USDA-ARS?s Scientific Manuscript database

    A model was developed and incorporated in the Integrated Farm System Model (IFSM, v.4.3) that simulates important processes occurring during windrow composting of manure. The model, documented in an accompanying paper, predicts changes in windrow properties and conditions and the resulting emissions...

  5. DEVELOPMENT OF GUIDELINES FOR CALIBRATING, VALIDATING, AND EVALUATING HYDROLOGIC AND WATER QUALITY MODELS: ASABE ENGINEERING PRACTICE 621

    USDA-ARS?s Scientific Manuscript database

    Information to support application of hydrologic and water quality (H/WQ) models abounds, yet modelers commonly use arbitrary, ad hoc methods to conduct, document, and report model calibration, validation, and evaluation. Consistent methods are needed to improve model calibration, validation, and e...

  6. FARSITE: Fire Area Simulator-model development and evaluation

    Treesearch

    Mark A. Finney

    1998-01-01

    A computer simulation model, FARSITE, includes existing fire behavior models for surface, crown, spotting, point-source fire acceleration, and fuel moisture. The model's components and assumptions are documented. Simulations were run for simple conditions that illustrate the effect of individual fire behavior models on two-dimensional fire growth.

  7. Visual Modelling of Learning Processes

    ERIC Educational Resources Information Center

    Copperman, Elana; Beeri, Catriel; Ben-Zvi, Nava

    2007-01-01

    This paper introduces various visual models for the analysis and description of learning processes. The models analyse learning on two levels: the dynamic level (as a process over time) and the functional level. Two types of model for dynamic modelling are proposed: the session trace, which documents a specific learner in a particular learning…

  8. Environmental Models as a Service: Enabling Interoperability ...

    EPA Pesticide Factsheets

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
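
    A minimal sketch of the verb-noun pattern described above, using Flask (an assumption; any web framework would do) to expose a toy environmental model as a RESTful resource:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
runs = {}  # in-memory store of model runs; a real service would persist these

def toy_runoff_model(rainfall_mm, runoff_coefficient=0.3):
    """Placeholder 'environmental model': runoff as a fixed fraction of rainfall."""
    return rainfall_mm * runoff_coefficient

@app.route("/runs", methods=["POST"])
def create_run():
    # POST /runs consumes inputs and creates a new model-run resource
    inputs = request.get_json()
    run_id = str(len(runs) + 1)
    runs[run_id] = {
        "inputs": inputs,
        "outputs": {"runoff_mm": toy_runoff_model(inputs["rainfall_mm"])},
    }
    return jsonify({"id": run_id}), 201

@app.route("/runs/<run_id>", methods=["GET"])
def get_run(run_id):
    # GET /runs/<id> returns the stored inputs and outputs of that run
    return jsonify(runs[run_id])

if __name__ == "__main__":
    app.run()
```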

  9. C3 System Performance Simulation and User Manual. Getting Started: Guidelines for Users

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This document is a User's Manual describing the C3 Simulation capabilities. The subject work was designed to simulate the communications involved in the flight of a Remotely Operated Aircraft (ROA) using the Opnet software. Opnet provides a comprehensive development environment supporting the modeling of communication networks and distributed systems. It has tools for model design, simulation, data collection, and data analysis. Opnet models are hierarchical -- consisting of a project which contains node models which in turn contain process models. Nodes can be fixed, mobile, or satellite. Links between nodes can be physical or wireless. Communications are packet based. The model is very generic in its current form. Attributes such as frequency and bandwidth can easily be modified to better reflect a specific platform. The model is not fully developed at this stage -- there are still more enhancements to be added. Current issues are documented throughout this guide.

  10. Integrating O/S models during conceptual design, part 2

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1994-01-01

    This report documents the procedures for utilizing and maintaining the Reliability & Maintainability Model (RAM) developed by the University of Dayton for the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) under NASA research grant NAG-1-1327. The purpose of the grant is to provide support to NASA in establishing operational and support parameters and costs of proposed space systems. As part of this research objective, the model described here was developed. Additional documentation concerning the development of this model may be found in Part 1 of this report. This is the 2nd part of a 3 part technical report.

  11. A method for three-dimensional modeling of wind-shear environments for flight simulator applications

    NASA Technical Reports Server (NTRS)

    Bray, R. S.

    1984-01-01

    A computational method for modeling severe wind shears of the type that have been documented during severe convective atmospheric conditions is offered for use in research and training flight simulation. The procedure was developed with the objectives of operational flexibility and minimum computer load. From one to five simple downburst wind models can be configured and located to produce the wind field desired for specific simulated flight scenarios. A definition of related turbulence parameters is offered as an additional product of the computations. The use of the method to model several documented examples of severe wind shear is demonstrated.

  12. Link-topic model for biomedical abbreviation disambiguation.

    PubMed

    Kim, Seonho; Yoon, Juntae

    2015-02-01

    The ambiguity of biomedical abbreviations is one of the challenges in biomedical text mining systems. In particular, the handling of term variants and abbreviations without nearby definitions is a critical issue. In this study, we adopt the concepts of topic of document and word link to disambiguate biomedical abbreviations. We newly suggest the link topic model inspired by the latent Dirichlet allocation model, in which each document is perceived as a random mixture of topics, where each topic is characterized by a distribution over words. Thus, the most probable expansions with respect to abbreviations of a given abstract are determined by word-topic, document-topic, and word-link distributions estimated from a document collection through the link topic model. The model allows two distinct modes of word generation to incorporate semantic dependencies among words, particularly long form words of abbreviations and their sentential co-occurring words; a word can be generated either dependently on the long form of the abbreviation or independently. The semantic dependency between two words is defined as a link and a new random parameter for the link is assigned to each word as well as a topic parameter. Because the link status indicates whether the word constitutes a link with a given specific long form, it has the effect of determining whether a word forms a unigram or a skipping/consecutive bigram with respect to the long form. Furthermore, we place a constraint on the model so that a word has the same topic as a specific long form if it is generated in reference to the long form. Consequently, documents are generated from the two hidden parameters, i.e. topic and link, and the most probable expansion of a specific abbreviation is estimated from the parameters. Our model relaxes the bag-of-words assumption of the standard topic model in which the word order is neglected, and it captures a richer structure of text than does the standard topic model by considering unigrams and semantically associated bigrams simultaneously. The addition of semantic links improves the disambiguation accuracy without removing irrelevant contextual words and reduces the parameter space of massive skipping or consecutive bigrams. The link topic model achieves 98.42% disambiguation accuracy on 73,505 MEDLINE abstracts with respect to 21 three letter abbreviations and their 139 distinct long forms. Copyright © 2014 Elsevier Inc. All rights reserved.
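
    The sketch below is a deliberately simplified stand-in for the link topic model (a plain smoothed bag-of-words likelihood over context words, with no topics or links); it is intended only to show how the most probable long form of an abbreviation can be scored from annotated examples, and the training data are invented.

```python
from collections import Counter, defaultdict
from math import log

class SimpleExpander:
    """Naive context-word likelihood model for abbreviation expansion (illustrative only)."""

    def __init__(self):
        self.context_counts = defaultdict(Counter)
        self.expansion_counts = Counter()

    def train(self, examples):
        # examples: (long_form, context_words) pairs where the long form is known
        for long_form, context in examples:
            self.expansion_counts[long_form] += 1
            self.context_counts[long_form].update(context)

    def disambiguate(self, abbreviation_context, candidates):
        def score(long_form):
            counts = self.context_counts[long_form]
            total = sum(counts.values()) + len(counts) + 1
            s = log(self.expansion_counts[long_form] + 1)
            for w in abbreviation_context:
                s += log((counts[w] + 1) / total)   # add-one smoothing
            return s
        return max(candidates, key=score)

# Invented training examples for the abbreviation "CT"
expander = SimpleExpander()
expander.train([
    ("computed tomography", ["scan", "imaging", "contrast"]),
    ("cycle threshold", ["pcr", "amplification", "fluorescence"]),
])
print(expander.disambiguate(["pcr", "assay"], ["computed tomography", "cycle threshold"]))
```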

  13. Predicate Argument Structure Analysis for Use Case Description Modeling

    NASA Astrophysics Data System (ADS)

    Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira

    In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.

  14. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Dixon

    The purpose of this Model Report (REV02) is to document the unsaturated zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrological-chemical (THC) processes on UZ flow and transport. This Model Report has been developed in accordance with the "Technical Work Plan for: Performance Assessment Unsaturated Zone" (Bechtel SAIC Company, LLC (BSC) 2002 [160819]). The technical work plan (TWP) describes planning information pertaining to the technical scope, content, and management of this Model Report in Section 1.12, Work Package AUZM08, "Coupled Effects on Flow and Seepage". The plan for validation of the models documented in this Model Report is given in Attachment I, Model Validation Plans, Section I-3-4, of the TWP. Except for variations in acceptance criteria (Section 4.2), there were no deviations from this TWP. This report was developed in accordance with AP-SIII.10Q, "Models". This Model Report documents the THC Seepage Model and the Drift Scale Test (DST) THC Model. The THC Seepage Model is a drift-scale process model for predicting the composition of gas and water that could enter waste emplacement drifts and the effects of mineral alteration on flow in rocks surrounding drifts. The DST THC model is a drift-scale process model relying on the same conceptual model and much of the same input data (i.e., physical, hydrological, thermodynamic, and kinetic) as the THC Seepage Model. The DST THC Model is the primary method for validating the THC Seepage Model. The DST THC Model compares predicted water and gas compositions, as well as mineral alteration patterns, with observed data from the DST. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal-loading conditions, and predict the evolution of mineral alteration and fluid chemistry around potential waste emplacement drifts. The DST THC Model is used solely for the validation of the THC Seepage Model and is not used for calibration to measured data.

  15. A user's guide to the combined stand prognosis and Douglas-fir tussock moth outbreak model

    Treesearch

    Robert A. Monserud; Nicholas L. Crookston

    1982-01-01

    Documentation is given for using a simulation model combining the Stand Prognosis Model and the Douglas-fir Tussock Moth Outbreak Model. Four major areas are addressed: (1) an overview and discussion of the combined model; (2) a description of input options; (3) a discussion of model output; and (4) numerous examples illustrating model behavior and sensitivity.

  16. Phase I Hydrologic Data for the Groundwater Flow and Contaminant Transport Model of Corrective Action Unit 97: Yucca Flat/Climax Mine, Nevada Test Site, Nye County, Nevada, Rev. No.: 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John McCord

    2006-06-01

    The U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) initiated the Underground Test Area (UGTA) Project to assess and evaluate the effects of the underground nuclear weapons tests on groundwater beneath the Nevada Test Site (NTS) and vicinity. The framework for this evaluation is provided in Appendix VI, Revision No. 1 (December 7, 2000) of the Federal Facility Agreement and Consent Order (FFACO, 1996). Section 3.0 of Appendix VI "Corrective Action Strategy" of the FFACO describes the process that will be used to complete corrective actions specifically for the UGTA Project. The objective of the UGTA corrective action strategy is to define contaminant boundaries for each UGTA corrective action unit (CAU) where groundwater may have become contaminated from the underground nuclear weapons tests. The contaminant boundaries are determined based on modeling of groundwater flow and contaminant transport. A summary of the FFACO corrective action process and the UGTA corrective action strategy is provided in Section 1.5. The FFACO (1996) corrective action process for the Yucca Flat/Climax Mine CAU 97 was initiated with the Corrective Action Investigation Plan (CAIP) (DOE/NV, 2000a). The CAIP included a review of existing data on the CAU and proposed a set of data collection activities to collect additional characterization data. These recommendations were based on a value of information analysis (VOIA) (IT, 1999), which evaluated the value of different possible data collection activities, with respect to reduction in uncertainty of the contaminant boundary, through simplified transport modeling. The Yucca Flat/Climax Mine CAIP identifies a three-step model development process to evaluate the impact of underground nuclear testing on groundwater to determine a contaminant boundary (DOE/NV, 2000a). The three steps are as follows: (1) Data compilation and analysis that provides the necessary modeling data, completed in two parts: the first addressing the groundwater flow model, and the second the transport model. (2) Development of a groundwater flow model. (3) Development of a groundwater transport model. This report presents the results of the first part of the first step, documenting the data compilation, evaluation, and analysis for the groundwater flow model. The second part, documentation of transport model data, will be the subject of a separate report. The purpose of this document is to present the compilation and evaluation of the available hydrologic data and information relevant to the development of the Yucca Flat/Climax Mine CAU groundwater flow model, which is a fundamental tool in the prediction of the extent of contaminant migration. Where appropriate, data and information documented elsewhere are summarized with reference to the complete documentation. The specific task objectives for hydrologic data documentation are as follows: (1) Identify and compile available hydrologic data and supporting information required to develop and validate the groundwater flow model for the Yucca Flat/Climax Mine CAU. (2) Assess the quality of the data and associated documentation, and assign qualifiers to denote levels of quality. (3) Analyze the data to derive expected values or spatial distributions and estimates of the associated uncertainty and variability.

  17. User Delay Cost Model and Facilities Maintenance Cost Model for a Terminal Control Area : Volume 3. User's Manual and Program Documentation for the Facilities Maintenance Cost Model

    DOT National Transportation Integrated Search

    1978-05-01

    The Facilities Maintenance Cost Model (FMCM) is an analytic model designed to calculate expected annual labor costs of maintenance within a given FAA maintenance sector. The model is programmed in FORTRAN IV and has been demonstrated on the CDC Krono...

  18. Reading, Writing, and Documentation and Managing the Development of User Documentation.

    ERIC Educational Resources Information Center

    Lindberg, Wayne; Hoffman, Terrye

    1987-01-01

    The first of two articles addressing the issue of user documentation for computer software discusses the need to teach users how to read documentation. The second presents a guide for writing documentation that is based on the instructional systems design model, and makes suggestions for the desktop publishing of user manuals. (CLB)

  19. Entity Profiling for Intelligence Using the Graphical Overview of Social and Semantic Interactions of People (GOSSIP) Software Tool

    DTIC Science & Technology

    2010-11-01

    DRDC Toronto TR 2010-188; Defence R&D Canada - Toronto; November 2010. Introduction or background: In general, the intelligence analyst or... Figure 4 (continued): Profiles for famous names generated by subjects and the model...

  20. Software Engineering Laboratory (SEL) relationships, models, and management rules

    NASA Technical Reports Server (NTRS)

    Decker, William; Hendrick, Robert; Valett, Jon D.

    1991-01-01

    Over 50 individual Software Engineering Laboratory (SEL) research results, extracted from a review of published SEL documentation, that can be applied directly to managing software development projects are captured. Four basic categories of results are defined and discussed - environment profiles, relationships, models, and management rules. In each category, research results are presented as a single page that summarizes the individual result, lists potential uses of the result by managers, and references the original SEL documentation where the result was found. The document serves as a concise reference summary of applicable research for SEL managers.

  1. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3, Part 4.

    DTIC Science & Technology

    1983-09-01

    GENERAL ELECTROMAGNETIC MODEL FOR THE ANALYSIS OF COMPLEX SYSTEMS (GEMACS) Computer Code Documentation (Version 3), the BDM Corporation. Final Technical Report, February 81 - July 83. ... the t1 and t2 directions on the source patch. METHOD: The electric field at a segment observation point due to the source patch j is given by ...

  2. Modelo de Alfabetizacion: A Poblacion Urbana y Rural. Documento General (Literacy Model: Urban and Rural Populations. General Document).

    ERIC Educational Resources Information Center

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    This document describes literacy models for urban and rural populations in Mexico. It contains four sections. The first two sections (generalizations about the population and considerations about the teaching of adults) discuss the environment that creates illiterate adults and also describe some of the conditions under which learning takes place…

  3. "Role Models Are Real People": Speakers and Field Trips for Chicago's American Indian Elementary School Children.

    ERIC Educational Resources Information Center

    Hill, Lola L.

    This two-part document describes the background and development of "Role Models Are Real People," a speakers' program for at-risk American Indian students, grades 6-8, in Chicago. The first part of the document includes the program proposal, outlining dropout statistics and other data showing reason for concern about American Indian…

  4. SPRAYTRAN USER'S GUIDE: A GIS-BASED ATMOSPHERIC SPRAY DROPLET DISPERSION MODELING SYSTEM

    EPA Science Inventory

    The offsite drift of pesticide from spray operations is an ongoing source of concern. The SPRAY TRANsport (SPRAYTRAN) system, documented in this report, incorporates the near-field spray application model, AGDISP, into a meso-scale atmospheric transport model. The AGDISP model ...

  5. Studying the Accuracy of Software Process Elicitation: The User Articulated Model

    ERIC Educational Resources Information Center

    Crabtree, Carlton A.

    2010-01-01

    Process models are often the basis for demonstrating improvement and compliance in software engineering organizations. A descriptive model is a type of process model describing the human activities in software development that actually occur. The purpose of a descriptive model is to provide a documented baseline for further process improvement…

  6. User's guide to the weather model: a component of the western spruce budworm modeling system.

    Treesearch

    W. P. Kemp; N. L. Crookston; P. W. Thomas

    1989-01-01

    A stochastic model developed by Bruhn and others for simulating daily maximum and minimum temperature and precipitation has been adapted for use in the western spruce budworm modeling system. This document describes how to use the weather model and illustrates some aspects of its behavior.
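
    For readers who want a sense of how such a stochastic weather model works, the sketch below is a generic first-order Markov-chain precipitation-occurrence generator with Gaussian temperatures; the transition probabilities and temperature statistics are invented, and the sketch does not reproduce the Bruhn formulation used in the budworm system.

```python
import random

random.seed(42)

# Invented parameters: P(wet | previous day wet/dry), temperature statistics (deg C)
P_WET_GIVEN_WET = 0.6
P_WET_GIVEN_DRY = 0.25
TMAX_MEAN, TMAX_SD = 22.0, 4.0
TMIN_MEAN, TMIN_SD = 8.0, 3.0
MEAN_WET_DAY_PRECIP = 6.0   # mm, mean of an exponential distribution on wet days

def simulate(days):
    """Generate (tmax, tmin, precip) tuples for a run of days."""
    wet = False
    series = []
    for _ in range(days):
        p_wet = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
        wet = random.random() < p_wet
        precip = random.expovariate(1.0 / MEAN_WET_DAY_PRECIP) if wet else 0.0
        tmax = random.gauss(TMAX_MEAN, TMAX_SD)
        tmin = min(random.gauss(TMIN_MEAN, TMIN_SD), tmax)  # keep tmin <= tmax
        series.append((round(tmax, 1), round(tmin, 1), round(precip, 1)))
    return series

for day in simulate(5):
    print(day)
```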

  7. FOSSIL2 energy policy model documentation: FOSSIL2 documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1980-10-01

    This report discusses the structure, derivations, assumptions, and mathematical formulation of the FOSSIL2 model. Each major facet of the model - supply/demand interactions, industry financing, and production - has been designed to parallel closely the actual cause/effect relationships determining the behavior of the United States energy system. The data base for the FOSSIL2 program is large, as is appropriate for a system dynamics simulation model. When possible, all data were obtained from sources well known to experts in the energy field. Cost and resource estimates are based on DOE data whenever possible. This report presents the FOSSIL2 model at several levels. Volumes II and III of this report list the equations that comprise the FOSSIL2 model, along with variable definitions and a cross-reference list of the model variables. Volume II provides the model equations with each of their variables defined, while Volume III lists the equations, and a one line definition for equations, in a shorter, more readable format.

  8. The GFDL global atmosphere and land model AM4.0/LM4.0: 2. Model description, sensitivity studies, and tuning strategies

    USGS Publications Warehouse

    Zhao, M.; Golaz, J.-C.; Held, I. M.; Guo, H.; Balaji, V.; Benson, R.; Chen, J.-H.; Chen, X.; Donner, L. J.; Dunne, J. P.; Dunne, Krista A.; Durachta, J.; Fan, S.-M.; Freidenreich, S. M.; Garner, S. T.; Ginoux, P.; Harris, L. M.; Horowitz, L. W.; Krasting, J. P.; Langenhorst, A. R.; Liang, Z.; Lin, P.; Lin, S.-J.; Malyshev, S. L.; Mason, E.; Milly, Paul C.D.; Ming, Y.; Naik, V.; Paulot, F.; Paynter, D.; Phillipps, P.; Radhakrishnan, A.; Ramaswamy, V.; Robinson, T.; Schwarzkopf, D.; Seman, C. J.; Shevliakova, E.; Shen, Z.; Shin, H.; Silvers, L.; Wilson, J. R.; Winton, M.; Wittenberg, A. T.; Wyman, B.; Xiang, B.

    2018-01-01

    In Part 2 of this two‐part paper, documentation is provided of key aspects of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). The quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode has been provided in Part 1. Part 2 provides documentation of key components and some sensitivities to choices of model formulation and values of parameters, highlighting the convection parameterization and orographic gravity wave drag. The approach taken to tune the model's clouds to observations is a particular focal point. Care is taken to describe the extent to which aerosol effective forcing and Cess sensitivity have been tuned through the model development process, both of which are relevant to the ability of the model to simulate the evolution of temperatures over the last century when coupled to an ocean model.

  9. The GFDL Global Atmosphere and Land Model AM4.0/LM4.0: 2. Model Description, Sensitivity Studies, and Tuning Strategies

    DOE PAGES

    Zhao, Ming; Golaz, J. -C.; Held, I. M.; ...

    2018-02-19

    Here, in Part 2 of this two-part paper, documentation is provided of key aspects of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). The quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode has been provided in Part 1. Part 2 provides documentation of key components and some sensitivities to choices of model formulation and values of parameters, highlighting the convection parameterization and orographic gravity wave drag. The approach taken to tune the model's clouds to observations is a particular focal point. Care is taken to describe the extent to which aerosol effective forcing and Cess sensitivity have been tuned through the model development process, both of which are relevant to the ability of the model to simulate the evolution of temperatures over the last century when coupled to an ocean model.

  10. The GFDL Global Atmosphere and Land Model AM4.0/LM4.0: 2. Model Description, Sensitivity Studies, and Tuning Strategies

    NASA Astrophysics Data System (ADS)

    Zhao, M.; Golaz, J.-C.; Held, I. M.; Guo, H.; Balaji, V.; Benson, R.; Chen, J.-H.; Chen, X.; Donner, L. J.; Dunne, J. P.; Dunne, K.; Durachta, J.; Fan, S.-M.; Freidenreich, S. M.; Garner, S. T.; Ginoux, P.; Harris, L. M.; Horowitz, L. W.; Krasting, J. P.; Langenhorst, A. R.; Liang, Z.; Lin, P.; Lin, S.-J.; Malyshev, S. L.; Mason, E.; Milly, P. C. D.; Ming, Y.; Naik, V.; Paulot, F.; Paynter, D.; Phillipps, P.; Radhakrishnan, A.; Ramaswamy, V.; Robinson, T.; Schwarzkopf, D.; Seman, C. J.; Shevliakova, E.; Shen, Z.; Shin, H.; Silvers, L. G.; Wilson, J. R.; Winton, M.; Wittenberg, A. T.; Wyman, B.; Xiang, B.

    2018-03-01

    In Part 2 of this two-part paper, documentation is provided of key aspects of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). The quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode has been provided in Part 1. Part 2 provides documentation of key components and some sensitivities to choices of model formulation and values of parameters, highlighting the convection parameterization and orographic gravity wave drag. The approach taken to tune the model's clouds to observations is a particular focal point. Care is taken to describe the extent to which aerosol effective forcing and Cess sensitivity have been tuned through the model development process, both of which are relevant to the ability of the model to simulate the evolution of temperatures over the last century when coupled to an ocean model.

  11. The GFDL Global Atmosphere and Land Model AM4.0/LM4.0: 2. Model Description, Sensitivity Studies, and Tuning Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Ming; Golaz, J. -C.; Held, I. M.

    Here, in Part 2 of this two-part paper, documentation is provided of key aspects of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). The quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode has been provided in Part 1. Part 2 provides documentation of key components and some sensitivities to choices of model formulation and values of parameters, highlighting the convection parameterization and orographic gravity wave drag. The approach taken to tune the model's clouds to observations is a particular focal point. Care is taken to describe the extent to which aerosol effective forcing and Cess sensitivity have been tuned through the model development process, both of which are relevant to the ability of the model to simulate the evolution of temperatures over the last century when coupled to an ocean model.

  12. Appropriate evidence sources for populating decision analytic models within health technology assessment (HTA): a systematic review of HTA manuals and health economic guidelines.

    PubMed

    Zechmeister-Koss, Ingrid; Schnell-Inderst, Petra; Zauner, Günther

    2014-04-01

    An increasing number of evidence sources are relevant for populating decision analytic models. What is needed is detailed methodological advice on which type of data is to be used for what type of model parameter. We aim to identify standards in health technology assessment manuals and economic (modeling) guidelines on appropriate evidence sources and on the role different types of data play within a model. Documents were identified via a call among members of the International Network of Agencies for Health Technology Assessment and by hand search. We included documents from Europe, the United States, Canada, Australia, and New Zealand as well as transnational guidelines written in English or German. We systematically summarized in a narrative manner information on appropriate evidence sources for model parameters, their advantages and limitations, data identification methods, and data quality issues. A large variety of evidence sources for populating models are mentioned in the 28 documents included. They comprise research- and non-research-based sources. Valid and less appropriate sources are identified for informing different types of model parameters, such as clinical effect size, natural history of disease, resource use, unit costs, and health state utility values. Guidelines do not provide structured and detailed advice on this issue. The article does not include information from guidelines in languages other than English or German, and the information is not tailored to specific modeling techniques. The usability of guidelines and manuals for modeling could be improved by addressing the issue of evidence sources in a more structured and comprehensive format.

  13. Quantifying Selection Bias in National Institute of Health Stroke Scale Data Documented in an Acute Stroke Registry.

    PubMed

    Thompson, Michael P; Luo, Zhehui; Gardiner, Joseph; Burke, James F; Nickles, Adrienne; Reeves, Mathew J

    2016-05-01

    As a measure of stroke severity, the National Institutes of Health Stroke Scale (NIHSS) is an important predictor of patient- and hospital-level outcomes, yet is often undocumented. The purpose of this study is to quantify and correct for potential selection bias in observed NIHSS data. Data were obtained from the Michigan Stroke Registry and included 10 262 patients with ischemic stroke aged ≥65 years discharged from 23 hospitals from 2009 to 2012, of whom 74.6% had a documented NIHSS. We estimated models predicting NIHSS documentation and NIHSS score and used the Heckman selection model to estimate a correlation coefficient (ρ) between the 2 model error terms, which quantifies the degree of selection bias in the documentation of NIHSS. The Heckman model found modest, but significant, selection bias (ρ=0.19; 95% confidence interval: 0.09, 0.29; P<0.001), indicating that as the NIHSS score increased (i.e., as strokes became more severe), the probability of documentation also increased. We also estimated a selection bias-corrected population mean NIHSS score of 4.8, which was substantially lower than the observed mean NIHSS score of 7.4. Evidence of selection bias was also identified using hospital-level analysis, where increased NIHSS documentation was correlated with lower mean NIHSS scores (r=-0.39; P<0.001). We demonstrate modest, but important, selection bias in documented NIHSS data, which are missing more often in patients with less severe stroke. The population mean NIHSS score was overestimated by >2 points, which could significantly alter the risk profile of hospitals treating patients with ischemic stroke and subsequent hospital risk-adjusted outcomes. © 2016 American Heart Association, Inc.
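
    To make the selection-correction idea concrete, the sketch below runs a classical two-step Heckman procedure (a probit selection equation, then an outcome regression augmented with the inverse Mills ratio) on synthetic data. It is a generic illustration with invented variables, not the registry analysis itself, which may have used a different estimator.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 5000

# Synthetic data: a latent factor drives both the outcome and whether it is documented
x = rng.normal(size=n)                      # covariate (e.g., age, standardized)
z = rng.normal(size=n)                      # selection-only covariate (exclusion restriction)
u = rng.multivariate_normal([0, 0], [[1, 0.4], [0.4, 1]], size=n)  # correlated errors
y = 5 + 2 * x + u[:, 0]                     # outcome (e.g., a severity score)
documented = (0.5 + 0.8 * z + 0.3 * x + u[:, 1]) > 0

# Step 1: probit model for documentation, then the inverse Mills ratio
X_sel = sm.add_constant(np.column_stack([x, z]))
probit = sm.Probit(documented.astype(int), X_sel).fit(disp=False)
xb = X_sel @ probit.params
mills = norm.pdf(xb) / norm.cdf(xb)

# Step 2: outcome regression on the documented subsample, augmented with the Mills ratio
X_out = sm.add_constant(np.column_stack([x[documented], mills[documented]]))
ols = sm.OLS(y[documented], X_out).fit()
print(ols.params)   # slope on x close to 2; the Mills-ratio term captures the selection effect
```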

  14. A method for integrating and ranking the evidence for biochemical pathways by mining reactions from text

    PubMed Central

    Miwa, Makoto; Ohta, Tomoko; Rak, Rafal; Rowley, Andrew; Kell, Douglas B.; Pyysalo, Sampo; Ananiadou, Sophia

    2013-01-01

    Motivation: To create, verify and maintain pathway models, curators must discover and assess knowledge distributed over the vast body of biological literature. Methods supporting these tasks must understand both the pathway model representations and the natural language in the literature. These methods should identify and order documents by relevance to any given pathway reaction. No existing system has addressed all aspects of this challenge. Method: We present novel methods for associating pathway model reactions with relevant publications. Our approach extracts the reactions directly from the models and then turns them into queries for three text mining-based MEDLINE literature search systems. These queries are executed, and the resulting documents are combined and ranked according to their relevance to the reactions of interest. We manually annotate document-reaction pairs with the relevance of the document to the reaction and use this annotation to study several ranking methods, using various heuristic and machine-learning approaches. Results: Our evaluation shows that the annotated document-reaction pairs can be used to create a rule-based document ranking system, and that machine learning can be used to rank documents by their relevance to pathway reactions. We find that a Support Vector Machine-based system outperforms several baselines and matches the performance of the rule-based system. The successful query extraction and ranking methods are used to update our existing pathway search system, PathText. Availability: An online demonstration of PathText 2 and the annotated corpus are available for research purposes at http://www.nactem.ac.uk/pathtext2/. Contact: makoto.miwa@manchester.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23813008
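
    As a rough illustration of the machine-learning ranking component (not the PathText 2 implementation), the sketch below trains a linear SVM on TF-IDF features of annotated document texts and ranks unseen candidates by the decision function; the toy texts and labels are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Invented document texts with relevance labels for a reaction of interest (1 = relevant)
train_texts = [
    "MEK phosphorylates ERK in the MAPK cascade",
    "ERK activation downstream of MEK was measured",
    "unrelated study of crop irrigation scheduling",
    "survey of hospital staffing levels",
]
train_labels = [1, 1, 0, 0]

vectorizer = TfidfVectorizer()
X_train = vectorizer.fit_transform(train_texts)
svm = LinearSVC().fit(X_train, train_labels)

# Rank new candidate documents for the reaction "MEK -> ERK" by decision score
candidates = [
    "we show MEK directly phosphorylates ERK",
    "rainfall variability and irrigation demand",
]
scores = svm.decision_function(vectorizer.transform(candidates))
for text, score in sorted(zip(candidates, scores), key=lambda p: -p[1]):
    print(round(float(score), 2), text)
```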

  15. iFlorida model deployment final evaluation report.

    DOT National Transportation Integrated Search

    2009-01-01

    This document is the final report for the evaluation of the USDOT-sponsored Surface Transportation Security and Reliability Information System Model Deployment, or iFlorida Model Deployment. This report discusses findings in the following areas: ITS ...

  17. Short-Term Energy Outlook Model Documentation: Regional Residential Propane Price Model

    EIA Publications

    2009-01-01

    The regional residential propane price module of the Short-Term Energy Outlook (STEO) model is designed to provide residential retail price forecasts for the 4 Census regions: Northeast, South, Midwest, and West.

  18. A Study of Acute and Chronic Tissue Changes in Surgical and Traumatically-Induced Experimental Models of Knee Joint Injury Using Magnetic Resonance Imaging and Micro-Computed Tomography

    PubMed Central

    Fischenich, Kristine M.; Pauly, Hannah M.; Button, Keith D.; Fajardo, Ryan S.; DeCamp, Charles E.; Haut, Roger C.; Haut Donahue, Tammy L.

    2016-01-01

    Objective The objective of this study was to monitor the progression of joint damage in two animal models of knee joint trauma using two non-invasive, clinically available imaging modalities. Methods A 3-T clinical magnet and micro-computed tomography (mCT) were used to document changes immediately following injury (acute) and post-injury (chronic) at time points of 4, 8, or 12 weeks. Joint damage was recorded at dissection and compared to the chronic magnetic resonance imaging (MRI) record. Fifteen Flemish Giant rabbits were subjected to a single tibiofemoral compressive impact (ACLF), and 18 underwent a combination of anterior cruciate ligament (ACL) and meniscal transection (mACLT). Results All ACLF animals experienced ACL rupture, and 13 also experienced acute meniscal damage. All ACLF and mACLT animals showed meniscal and articular cartilage damage at dissection. Meniscal damage was documented as early as 4 weeks and worsened in 87% of the ACLF animals and 71% of the mACLT animals. Acute cartilage damage also developed further and increased in occurrence with time in both models. A progressive decrease in bone quantity and quality was documented in both models. The MRI data closely aligned with dissection notes, suggesting this clinical tool may be a non-invasive method for documenting joint damage in lapine models of knee joint trauma. Conclusions The study investigates the acute to chronic progression of meniscal and cartilage damage at various time points, and chronic changes to the underlying bone in two models of posttraumatic osteoarthritis (PTOA), and highlights the dependency of the model on the location, type, and progression of damage over time. PMID:27756698

  19. Building Information Modelling for Cultural Heritage: A review

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Delinasiou, A.; Stylianidis, E.

    2015-08-01

    We discuss the evolution and state of the art of the use of Building Information Modelling (BIM) in the field of cultural heritage documentation. BIM is a topical theme involving different aspects, including principles, technology, and even privacy rights for cultural heritage objects. In recent years, modern documentation needs have highlighted the potential of BIM. Many architects, archaeologists, conservationists, and engineers regard BIM as a disruptive force, changing the way professionals can document and manage a cultural heritage structure. There have been many recent developments in the BIM field, and the resulting technology and methods have challenged the cultural heritage community within the documentation framework. In this review article, following a brief historical background on BIM, we review the recent developments with a focus on the cultural heritage documentation perspective.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crain, Steven P.; Yang, Shuang-Hong; Zha, Hongyuan

    Access to health information by consumers is hampered by a fundamental language gap. Current attempts to close the gap leverage consumer oriented health information, which does not, however, have good coverage of slang medical terminology. In this paper, we present a Bayesian model to automatically align documents with different dialects (slang, common and technical) while extracting their semantic topics. The proposed diaTM model enables effective information retrieval, even when the query contains slang words, by explicitly modeling the mixtures of dialects in documents and the joint influence of dialects and topics on word selection. Simulations using consumer questions to retrieve medical information from a corpus of medical documents show that diaTM achieves a 25% improvement in information retrieval relevance by nDCG@5 over an LDA baseline.
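
    The nDCG@5 metric quoted above can be computed as in the following minimal sketch, using made-up relevance grades for a single query; the exact gain function used in the paper is not specified here, so the standard rel/log2(rank+1) form is assumed.

    ```python
    import numpy as np

    def dcg_at_k(relevances, k=5):
        rel = np.asarray(relevances, dtype=float)[:k]
        discounts = np.log2(np.arange(2, rel.size + 2))   # log2(rank + 1)
        return float(np.sum(rel / discounts))

    def ndcg_at_k(relevances, k=5):
        ideal = sorted(relevances, reverse=True)
        idcg = dcg_at_k(ideal, k)
        return dcg_at_k(relevances, k) / idcg if idcg > 0 else 0.0

    # Made-up graded relevance of retrieved medical documents, in ranked order.
    print(ndcg_at_k([3, 2, 0, 1, 2, 0, 3], k=5))
    ```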

  1. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Sonnenthal; N. Spycher

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are required to fully document and address the requirements of the TWPs.

  2. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Sonnenthal

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [1534471]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are required to fully document and address the requirements of the TWPs.

  3. Lake Representations in Global Climate Models: An End-User Perspective

    NASA Astrophysics Data System (ADS)

    Rood, R. B.; Briley, L.; Steiner, A.; Wells, K.

    2017-12-01

    The weather and climate in the Great Lakes region of the United States and Canada are strongly influenced by the lakes. Within global climate models, lakes are incorporated in many ways. If one is interested in quantitative climate information for the Great Lakes, then end-users of climate model simulation data, whether scientists or practitioners, need to know if and how lakes are incorporated into models. We pose the basic question: how are lakes represented in CMIP models? Despite significant efforts by the climate community to document and publish basic information about climate models, it is unclear how to answer this question. With significant knowledge of the practice of the field, a reasonable starting point is the ES-DOC Comparator (https://compare.es-doc.org/ ). Once at this interface to model information, the end-user is faced with the need for more knowledge about the practice and culture of the discipline. For example, lakes are often categorized as a type of land, a counterintuitive concept. In some models, though, lakes are specified in ocean models. There is little evidence and little confidence that the information obtained through this process is complete or accurate. In fact, it is verifiably not accurate. This experience, then, motivates identifying either human experts or technical documentation for each model. The conclusion from this exercise is that it can take months or longer to provide a defensible answer to if and how lakes are represented in climate models. Our experience with finding lake information suggests that it is not unique. This talk documents our experience and explores barriers we have identified and strategies for reducing those barriers.

  4. Beyond the rhetoric: what do we mean by a 'model of care'?

    PubMed

    Davidson, Patricia; Halcomb, Elizabeth; Hickman, L; Phillips, J; Graham, B

    2006-01-01

    Contemporary health care systems are constantly challenged to revise traditional methods of health care delivery. These challenges are multifaceted and stem from: (1) novel pharmacological and non-pharmacological treatments; (2) changes in consumer demands and expectations; (3) fiscal and resource constraints; (4) changes in societal demographics, in particular the ageing of society; (5) an increasing burden of chronic disease; (6) documentation of limitations in traditional health care delivery; (7) increased emphasis on transparency, accountability, evidence-based practice (EBP) and clinical governance structures; and (8) the increasing cultural diversity of the community. These challenges provoke discussion of potential alternative models of care, with scant reference to defining what constitutes a model of care. This paper aims to define what is meant by the term 'model of care' and document the pragmatic systems and processes necessary to develop, plan, implement and evaluate novel models of care delivery. Searches of electronic databases, the reference lists of published materials, policy documents and the Internet were conducted using key words including 'model*', 'framework*', 'models, theoretical' and 'nursing models, theoretical'. The collated material was then analysed and synthesised into this review. This review determined that, in addition to key conceptual and theoretical perspectives, quality improvement theory (e.g. collaborative methodology), project management methods and change management theory inform both pragmatic and conceptual elements of a model of care. Crucial elements in changing health care delivery through the development of innovative models of care include the planning, development, implementation, evaluation and assessment of the sustainability of the new model. Regardless of whether change in health care delivery is attempted on a micro basis (e.g. ward level) or a macro basis (e.g. national or state system), a well-planned, systematic process is essential to achieve sustainable, effective and efficient change.

  5. National Centers for Environmental Prediction

    Science.gov Websites

  6. USERS MANUAL: LANDFILL GAS EMISSIONS MODEL - VERSION 2.0

    EPA Science Inventory

    The document is a user's guide for a computer model, Version 2.0 of the Landfill Gas Emissions Model (LandGEM), for estimating air pollution emissions from municipal solid waste (MSW) landfills. The model can be used to estimate emission rates for methane, carbon dioxide, nonmet...

  7. Modeling and performance data for heaving buoy wave energy converter with a compressible degree of freedom

    DOE Data Explorer

    Bacelli, Giorgio

    2016-09-28

    Modeling and performance data in a Matlab data file (.mat) containing 3 structures (WEC model, simRes_sr and simRes_fix), and a PDF document describing the model, the simulations, and the analysis that has been carried out.

  8. Microdamage healing in asphalt and asphalt concrete, volume 3 : a micromechanics fracture and healing model for asphalt concrete.

    DOT National Transportation Integrated Search

    2001-06-01

    Volume 3 documents the development of a micromechanics fracture and healing model for asphalt concrete. This model can be used to calculate the density and growth of microcracks during repeated direct tensile controlled-strain loading. The model ...

  9. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    ERIC Educational Resources Information Center

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  10. NCAR CSM ocean model by the NCAR oceanography section. Technical note

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This technical note documents the ocean component of the NCAR Climate System Model (CSM). The ocean code has been developed from the Modular Ocean Model (version 1.1) which was developed and maintained at the NOAA Geophysical Fluid Dynamics Laboratory in Princeton. As a tribute to Mike Cox, and because the material is still relevant, the first four sections of this technical note are a straight reproduction from the GFDL Technical Report that Mike wrote in 1984. The remaining sections document how the NCAR Oceanography Section members have developed the MOM 1.1 code, and how it is forced, in order to produce the NCAR CSM Ocean Model.

  11. Generating Models of Surgical Procedures using UMLS Concepts and Multiple Sequence Alignment

    PubMed Central

    Meng, Frank; D’Avolio, Leonard W.; Chen, Andrew A.; Taira, Ricky K.; Kangarloo, Hooshang

    2005-01-01

    Surgical procedures can be viewed as a process composed of a sequence of steps performed on, by, or with the patient’s anatomy. This sequence is typically the pattern followed by surgeons when generating surgical report narratives for documenting surgical procedures. This paper describes a methodology for semi-automatically deriving a model of conducted surgeries, utilizing a sequence of derived Unified Medical Language System (UMLS) concepts for representing surgical procedures. A multiple sequence alignment was computed from a collection of such sequences and was used for generating the model. These models have the potential of being useful in a variety of informatics applications such as information retrieval and automatic document generation. PMID:16779094

  12. An analysis of electronic document management in oncology care.

    PubMed

    Poulter, Thomas; Gannon, Brian; Bath, Peter A

    2012-06-01

    In this research in progress, a reference model for the use of electronic patient record (EPR) systems in oncology is described. The model, termed CICERO, comprises technical and functional components, and emphasises usability, clinical safety and user acceptance. One of the functional components of the model, an electronic document and records management (EDRM) system, is monitored in the course of its deployment at a leading oncology centre in the UK. Specifically, the user requirements and design of the EDRM solution are described. The study is interpretative and forms part of a wider research programme to define and validate the CICERO model. Preliminary conclusions confirm the importance of a socio-technical perspective in Onco-EPR system design.

  13. Latent Dirichlet Allocation (LDA) Model and kNN Algorithm to Classify Research Project Selection

    NASA Astrophysics Data System (ADS)

    Safi’ie, M. A.; Utami, E.; Fatta, H. A.

    2018-03-01

    Universitas Sebelas Maret has a teaching staff of more than 1500 people, and one of its tasks is to carry out research. On the other hand, the funding available for research and community service is limited, so submissions for research and community service (P2M) proposals need to be evaluated and selected. At the selection stage, research proposal documents are collected as unstructured data, and the volume of data stored is very large. Text mining technology is required to extract the information contained in these documents; it is applied to gain knowledge from the documents by automating information extraction. In this article we use Latent Dirichlet Allocation (LDA) on the documents as a model in the feature extraction process, to obtain terms that represent each document. We then use the k-Nearest Neighbour (kNN) algorithm to classify the documents based on these terms.
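
    A minimal sketch of this pipeline, assuming scikit-learn and toy proposal texts (not the actual P2M data): LDA topic proportions serve as the document features and a kNN classifier assigns the category.

    ```python
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline

    # Toy stand-ins for proposal documents and their categories.
    docs = [
        "machine learning model for crop disease detection",
        "deep learning model for medical image analysis",
        "community service program for rural health education",
        "training workshop on financial literacy for villages",
    ]
    labels = ["research", "research", "P2M", "P2M"]

    clf = make_pipeline(
        CountVectorizer(),                                           # bag-of-words counts
        LatentDirichletAllocation(n_components=2, random_state=0),   # topic proportions as features
        KNeighborsClassifier(n_neighbors=1),                         # classify by nearest neighbour
    )
    clf.fit(docs, labels)
    print(clf.predict(["health education outreach for rural communities"]))
    ```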

  14. Mapping annotations with textual evidence using an scLDA model.

    PubMed

    Jin, Bo; Chen, Vicky; Chen, Lujia; Lu, Xinghua

    2011-01-01

    Most of the knowledge regarding genes and proteins is stored in biomedical literature as free text. Extracting information from complex biomedical texts demands techniques capable of inferring biological concepts from local text regions and mapping them to controlled vocabularies. To this end, we present a sentence-based correspondence latent Dirichlet allocation (scLDA) model which, when trained with a corpus of PubMed documents with known GO annotations, performs the following tasks: 1) learning major biological concepts from the corpus, 2) inferring the biological concepts existing within text regions (sentences), and 3) identifying the text regions in a document that provides evidence for the observed annotations. When applied to new gene-related documents, a trained scLDA model is capable of predicting GO annotations and identifying text regions as textual evidence supporting the predicted annotations. This study uses GO annotation data as a testbed; the approach can be generalized to other annotated data, such as MeSH and MEDLINE documents.

  15. Use of Image Based Modelling for Documentation of Intricately Shaped Objects

    NASA Astrophysics Data System (ADS)

    Marčiš, M.; Barták, P.; Valaška, D.; Fraštia, M.; Trhan, O.

    2016-06-01

    In the documentation of cultural heritage, we can encounter three-dimensional shapes and structures that are complicated to measure. Examples of such objects are spiral staircases, timber roof trusses, historical furniture or folk costumes, for which it is nearly impossible to use traditional surveying or terrestrial laser scanning effectively due to the shape of the object, its dimensions and the crowded environment. Current methods of digital photogrammetry, with their emphasis on automated processing of extensive image data, can be very helpful in such cases. The resulting high-resolution 3D models and 2D orthophotos are very important for the documentation of architectural elements, and they can serve as an ideal base for vectorization and 2D drawing documentation. This contribution describes the various uses of image-based modelling for specific interior spaces and specific objects. The advantages and disadvantages of the photogrammetric measurement of such objects in comparison to other surveying methods are reviewed.

  16. Implementation of a next-generation electronic nursing records system based on detailed clinical models and integration of clinical practice guidelines.

    PubMed

    Min, Yul Ha; Park, Hyeoun-Ae; Chung, Eunja; Lee, Hyunsook

    2013-12-01

    The purpose of this paper is to describe the components of a next-generation electronic nursing records system ensuring full semantic interoperability and integrating evidence into the nursing records system. A next-generation electronic nursing records system based on detailed clinical models and clinical practice guidelines was developed at Seoul National University Bundang Hospital in 2013. This system has two components, a terminology server and a nursing documentation system. The terminology server manages nursing narratives generated from entity-attribute-value triplets of detailed clinical models using a natural language generation system. The nursing documentation system provides nurses with a set of nursing narratives arranged around the recommendations extracted from clinical practice guidelines. An electronic nursing records system based on detailed clinical models and clinical practice guidelines was successfully implemented in a hospital in Korea. The next-generation electronic nursing records system can support nursing practice and nursing documentation, which in turn will improve data quality.

  17. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    PubMed

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: a probabilistic model, InL2 (Divergence from Randomness), and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach applied to a related-document network built from social links. We call this network, constructed from documents and the social information provided by each of them, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrate that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
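
    A minimal sketch of the graph-analysis step, assuming networkx and an invented set of document links (the real DGD is built from social links in the INEX collection): PageRank scores over the directed document graph are blended with assumed retrieval-model scores for reranking.

    ```python
    import networkx as nx

    # Invented directed graph of documents (nodes) and social links (edges).
    edges = [("bookA", "bookB"), ("bookB", "bookC"), ("bookC", "bookA"),
             ("bookD", "bookA"), ("bookD", "bookC")]
    dgd = nx.DiGraph(edges)

    pagerank = nx.pagerank(dgd, alpha=0.85)                                # link-analysis score
    retrieval = {"bookA": 1.2, "bookB": 0.7, "bookC": 0.9, "bookD": 0.4}   # assumed retrieval scores

    # Simple interpolated reranking of the candidate documents.
    combined = {d: 0.5 * retrieval[d] + 0.5 * pagerank[d] for d in dgd.nodes}
    print(sorted(combined, key=combined.get, reverse=True))
    ```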

  18. Constitutive relations in TRAC-P1A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohatgi, U.S.; Saha, P.

    1980-08-01

    The purpose of this document is to describe the basic thermal-hydraulic models and correlations that are in the TRAC-P1A code, as released in March 1979. It is divided into two parts, A and B. Part A describes the models in the three-dimensional vessel module of TRAC, whereas Part B focuses on the loop components that are treated by one-dimensional formulations. The report follows the format of the questions prepared by the Analysis Development Branch of USNRC and the questionnaire has been attached to this document for completeness. Concerted efforts have been made in understanding the present models in TRAC-P1A by going through the FORTRAN listing of the code. Some discrepancies between the code and the TRAC-P1A manual have been found. These are pointed out in this document. Efforts have also been made to check the TRAC references for the range of applicability of the models and correlations used in the code. 26 refs., 5 figs., 1 tab.

  19. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    PubMed Central

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: a probabilistic model, InL2 (Divergence from Randomness), and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach applied to a related-document network built from social links. We call this network, constructed from documents and the social information provided by each of them, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrate that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments. PMID:26504899

  20. Archaeological predictive model set.

    DOT National Transportation Integrated Search

    2015-03-01

    This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  1. Overview of Computer-Based Models Applicable to Freight Car Utilization

    DOT National Transportation Integrated Search

    1977-10-01

    This report documents a study performed to identify and analyze twenty-two of the important computer-based models of railroad operations. The models are divided into three categories: network simulations, yard simulations, and network optimizations. ...

  2. Short-Term Energy Outlook Model Documentation: Regional Residential Heating Oil Price Model

    EIA Publications

    2009-01-01

    The regional residential heating oil price module of the Short-Term Energy Outlook (STEO) model is designed to provide residential retail price forecasts for the 4 census regions: Northeast, South, Midwest, and West.

  3. SWAT Model Configuration, Calibration and Validation for Lake Champlain Basin

    EPA Pesticide Factsheets

    The Soil and Water Assessment Tool (SWAT) model was used to develop phosphorus loading estimates for sources in the Lake Champlain Basin. This document describes the model setup and parameterization, and presents calibration results.

  4. A Microsoft Project-Based Planning, Tracking, and Management Tool for the National Transonic Facility's Model Changeover Process

    NASA Technical Reports Server (NTRS)

    Vairo, Daniel M.

    1998-01-01

    The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process having a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used as a tool to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use and as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.

  5. Dynamic response characteristics of two transport models tested in the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Young, Clarence P., Jr.

    1993-01-01

    This paper documents recent experiences with measuring the dynamic response characteristics of a commercial transport and a military transport model during full scale Reynolds number tests in the National Transonic Facility. Both models were limited in angle of attack while testing at full scale Reynolds number and cruise Mach number due to pitch or stall buffet response. Roll buffet (wing buzz) was observed for both models at certain Mach numbers while testing at high Reynolds number. Roll buffet was more severe and more repeatable for the military transport model at cruise Mach number. Miniature strain-gage type accelerometers were used for the first time for obtaining dynamic data as a part of the continuing development of miniature dynamic measurements instrumentation for cryogenic applications. This paper presents the results of vibration measurements obtained for both the commercial and military transport models and documents the experience gained in the use of miniature strain gage type accelerometers.

  6. Validation of the Activities of Community Transportation model for individuals with cognitive impairments.

    PubMed

    Sohlberg, McKay Moore; Fickas, Stephen; Lemoncello, Rik; Hung, Pei-Fang

    2009-01-01

    To develop a theoretical, functional model of community navigation for individuals with cognitive impairments: the Activities of Community Transportation (ACTs). Iterative design using qualitative methods (i.e. document review, focus groups and observations). Four agencies providing travel training to adults with cognitive impairments in the USA participated in the validation study. A thorough document review and series of focus groups led to the development of a comprehensive model (ACTs Wheels) delineating the requisite steps and skills for community navigation. The model was validated and updated based on observations of 395 actual trips by travellers with navigational challenges from the four participating agencies. Results revealed that the 'ACTs Wheel' models were complete and comprehensive. The 'ACTs Wheels' represent a comprehensive model of the steps needed to navigate to destinations using paratransit and fixed-route public transportation systems for travellers with cognitive impairments. Suggestions are made for future investigations of community transportation for this population.

  7. The relationship of attitude, subjective norm, and behavioral intent to the documentation behavior of nurses.

    PubMed

    Renfroe, D H; O'Sullivan, P S; McGee, G W

    1990-01-01

    Ajzen and Fishbein's theory of reasoned action was used to assess the relationship of nurses' attitude, subjective norm, and behavioral intention to their documentation behavior. Attitudes, subjective norms, and behavioral intentions toward documentation were elicited from 108 staff nurses. Documentation behavior was based on what should be documented in any hospitalized patient's chart during a shift. This exploratory model was analyzed with LISREL VI. The overall fit of the final model to the data was good, as judged by a chi-square (df = 7, p = .845). The total coefficient of determination for the structural equation was .461. Attitude toward documentation did not relate significantly to intention to document optimally. Subjective norm did have a significant effect on behavioral intent. Attitude and subjective norm accounted for 46.1% of the variance in behavioral intent. Behavioral intent had a significant effect on documentation behavior, accounting for 15.2% of the variance. It appears that subjective norm, which is the influence of others, is what directs the intention to document and thus relates to subsequent documentation. Recommendations for practice include the communication of high ideals and expectations of important others to the staff nurse in order to improve the quality of documentation.

  8. 75 FR 39991 - Notice of Availability of the Proposed Models For Plant-Specific Adoption of Technical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-13

    ... Agencywide Documents Access and Management System (ADAMS) under Accession Number ML080510164. The proposed..., Maryland. NRC's Agencywide Documents Access and Management System (ADAMS): Publicly available documents..., Revision 3. You can access publicly available documents related to this notice using the following methods...

  9. Content Recognition and Context Modeling for Document Analysis and Retrieval

    ERIC Educational Resources Information Center

    Zhu, Guangyu

    2009-01-01

    The nature and scope of available documents are changing significantly in many areas of document analysis and retrieval as complex, heterogeneous collections become accessible to virtually everyone via the web. The increasing level of diversity presents a great challenge for document image content categorization, indexing, and retrieval.…

  10. Automatic Classification Using Supervised Learning in a Medical Document Filtering Application.

    ERIC Educational Resources Information Center

    Mostafa, J.; Lam, W.

    2000-01-01

    Presents a multilevel model of the information filtering process that permits document classification. Evaluates a document classification approach based on a supervised learning algorithm, measures the accuracy of the algorithm in a neural network that was trained to classify medical documents on cell biology, and discusses filtering…

  11. A Compositional Relevance Model for Adaptive Information Retrieval

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Chen, James; Lu, Henry, Jr. (Technical Monitor)

    1994-01-01

    There is a growing need for rapid and effective access to information in large electronic documentation systems. Access can be facilitated if information relevant in the current problem solving context can be automatically supplied to the user. This includes information relevant to particular user profiles, tasks being performed, and problems being solved. However most of this knowledge on contextual relevance is not found within the contents of documents, and current hypermedia tools do not provide any easy mechanism to let users add this knowledge to their documents. We propose a compositional relevance network to automatically acquire the context in which previous information was found relevant. The model records information on the relevance of references based on user feedback for specific queries and contexts. It also generalizes such information to derive relevant references for similar queries and contexts. This model lets users filter information by context of relevance, build personalized views of documents over time, and share their views with other users. It also applies to any type of multimedia information. Compared to other approaches, it is less costly and doesn't require any a priori statistical computation, nor an extended training period. It is currently being implemented into the Computer Integrated Documentation system which enables integration of various technical documents in a hypertext framework.

  12. Biotea: RDFizing PubMed Central in support for the paper as an interface to the Web of Data

    PubMed Central

    2013-01-01

    Background The World Wide Web has become a dissemination platform for scientific and non-scientific publications. However, most of the information remains locked up in discrete documents that are not always interconnected or machine-readable. The connectivity tissue provided by RDF technology has not yet been widely used to support the generation of self-describing, machine-readable documents. Results In this paper, we present our approach to the generation of self-describing machine-readable scholarly documents. We understand the scientific document as an entry point and interface to the Web of Data. We have semantically processed the full-text, open-access subset of PubMed Central. Our RDF model and resulting dataset make extensive use of existing ontologies and semantic enrichment services. We expose our model, services, prototype, and datasets at http://biotea.idiginfo.org/ Conclusions The semantic processing of biomedical literature presented in this paper embeds documents within the Web of Data and facilitates the execution of concept-based queries against the entire digital library. Our approach delivers a flexible and adaptable set of tools for metadata enrichment and semantic processing of biomedical documents. Our model delivers a semantically rich and highly interconnected dataset with self-describing content so that software can make effective use of it. PMID:23734622

  13. Learning About Wisconsin: Activities, Historical Documents, and Resources Linked to Wisconsin's Model Academic Standards for Social Studies in Grades 4-12. Bulletin No. 99238.

    ERIC Educational Resources Information Center

    Fortier, John D.; Grady, Susan M.; Prickette, Karen R.

    Wisconsin's Model Academic Standards for Social Studies provide direction for curriculum, instruction, assessment, and professional development. The standards identify eras and themes in Wisconsin history. Many of these standards can be taught using content related to the study of Wisconsin. The sample lessons included in this document identify…

  14. Models of Financing the Continuing Vocational Training of Employees and Unemployed. Documentation of a LEONARDO-Project in Cooperation with Denmark, Germany, the Netherlands and Norway.

    ERIC Educational Resources Information Center

    Grunewald, Uwe, Ed.; Moraal, Dick, Ed.

    This document contains papers from an international project in which models of financing the continuing vocational training (CVT) in Denmark, Germany, the Netherlands, and Norway were identified and examined. The following are among the papers included: "Important Results of the LEONARDO-Project (contributions by all project-partners)";…

  15. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation report for ATDM program. [supporting datasets - Pasadena Testbed

    DOT National Transportation Integrated Search

    2017-07-26

    This zip file contains POSTDATA.ATT (.ATT); Print to File (.PRN); Portable Document Format (.PDF); and document (.DOCX) files of data to support FHWA-JPO-16-385, Analysis, modeling, and simulation (AMS) testbed development and evaluation to support d...

  16. An open annotation ontology for science on web 3.0

    PubMed Central

    2011-01-01

    Background There is currently a gap between the rich and expressive collection of published biomedical ontologies, and the natural language expression of biomedical papers consumed on a daily basis by scientific researchers. The purpose of this paper is to provide an open, shareable structure for dynamic integration of biomedical domain ontologies with the scientific document, in the form of an Annotation Ontology (AO), thus closing this gap and enabling application of formal biomedical ontologies directly to the literature as it emerges. Methods Initial requirements for AO were elicited by analysis of integration needs between biomedical web communities, and of needs for representing and integrating results of biomedical text mining. Analysis of strengths and weaknesses of previous efforts in this area was also performed. A series of increasingly refined annotation tools were then developed along with a metadata model in OWL, and deployed to users at a major pharmaceutical company and a major academic center for feedback and additional requirements on the ontology. Further requirements and critiques of the model were also elicited through discussions with many colleagues and incorporated into the work. Results This paper presents Annotation Ontology (AO), an open ontology in OWL-DL for annotating scientific documents on the web. AO supports both human and algorithmic content annotation. It enables “stand-off” or independent metadata anchored to specific positions in a web document by any one of several methods. In AO, the document may be annotated but is not required to be under update control of the annotator. AO contains a provenance model to support versioning, and a set model for specifying groups and containers of annotation. AO is freely available under open source license at http://purl.org/ao/, and extensive documentation including screencasts is available on AO’s Google Code page: http://code.google.com/p/annotation-ontology/ . Conclusions The Annotation Ontology meets critical requirements for an open, freely shareable model in OWL, of annotation metadata created against scientific documents on the Web. We believe AO can become a very useful common model for annotation metadata on Web documents, and will enable biomedical domain ontologies to be used quite widely to annotate the scientific literature. Potential collaborators and those with new relevant use cases are invited to contact the authors. PMID:21624159

  17. An open annotation ontology for science on web 3.0.

    PubMed

    Ciccarese, Paolo; Ocana, Marco; Garcia Castro, Leyla Jael; Das, Sudeshna; Clark, Tim

    2011-05-17

    There is currently a gap between the rich and expressive collection of published biomedical ontologies, and the natural language expression of biomedical papers consumed on a daily basis by scientific researchers. The purpose of this paper is to provide an open, shareable structure for dynamic integration of biomedical domain ontologies with the scientific document, in the form of an Annotation Ontology (AO), thus closing this gap and enabling application of formal biomedical ontologies directly to the literature as it emerges. Initial requirements for AO were elicited by analysis of integration needs between biomedical web communities, and of needs for representing and integrating results of biomedical text mining. Analysis of strengths and weaknesses of previous efforts in this area was also performed. A series of increasingly refined annotation tools were then developed along with a metadata model in OWL, and deployed to users at a major pharmaceutical company and a major academic center for feedback and additional requirements on the ontology. Further requirements and critiques of the model were also elicited through discussions with many colleagues and incorporated into the work. This paper presents Annotation Ontology (AO), an open ontology in OWL-DL for annotating scientific documents on the web. AO supports both human and algorithmic content annotation. It enables "stand-off" or independent metadata anchored to specific positions in a web document by any one of several methods. In AO, the document may be annotated but is not required to be under update control of the annotator. AO contains a provenance model to support versioning, and a set model for specifying groups and containers of annotation. AO is freely available under open source license at http://purl.org/ao/, and extensive documentation including screencasts is available on AO's Google Code page: http://code.google.com/p/annotation-ontology/ . The Annotation Ontology meets critical requirements for an open, freely shareable model in OWL, of annotation metadata created against scientific documents on the Web. We believe AO can become a very useful common model for annotation metadata on Web documents, and will enable biomedical domain ontologies to be used quite widely to annotate the scientific literature. Potential collaborators and those with new relevant use cases are invited to contact the authors.

  18. Developing Teachers' Models for Assessing Students' Competence in Mathematical Modelling through Lesson Study

    ERIC Educational Resources Information Center

    Aydogan Yenmez, Arzu; Erbas, Ayhan Kursat; Cakiroglu, Erdinc; Alacaci, Cengiz; Cetinkaya, Bulent

    2017-01-01

    Applications and modelling have gained a prominent role in mathematics education reform documents and curricula. Thus, there is a growing need for studies focusing on the effective use of mathematical modelling in classrooms. Assessment is an integral part of using modelling activities in classrooms, since it allows teachers to identify and manage…

  19. Cascading disaster models in postburn flash flood

    Treesearch

    Fred May

    2007-01-01

    A useful method of modeling threats from hazards and documenting their disaster causation sequences is called “cascading threat modeling.” This type of modeling enables emergency planners to address hazard and risk assessments systematically. This paper describes a cascading threat modeling and analysis process. Wildfire and an associated postburn flash flood disaster...

  20. What "best practice" could be in Palliative Care: an analysis of statements on practice and ethics expressed by the main Health Organizations

    PubMed Central

    2010-01-01

    Background In palliative care it is necessary to refer to a model of practice. Nevertheless, there seem to be no official statements that state and describe such a model. We carried out an analysis of the statements on practice and ethics of palliative care expressed by the main health organizations to show which dimensions of end-of-life care are taken into consideration. Methods The official documents issued by the most representative health organisations committed to the definition of policies and guidelines for palliative and end-of-life care were considered. The documents were analysed through a framework of the components of end-of-life care derived from the literature, composed of 4 main "areas" and 12 "sub-areas". Results Overall, 34 organizations were identified: 7 international organisations and 27 organisations operating at the national level in four different countries (Australia, Canada, the UK and the United States). In total, 56 documents were selected and analysed. Most of them (38) are position statements. Relevant quotations from the documents are presented by "areas" and "sub-areas". In general, the "sub-areas" of symptom control, as well as those referring to relational and social issues, are more widely covered by the documents than the "sub-areas" related to "preparation" and to "existential condition". Indeed, the consistency of end-of-life choices with the patient's wishes, as well as completion and meaningfulness at the end of life, are given only minor relevance. Conclusions An integrated model of best palliative care practice is generally lacking in the documents. It might be argued that the lack of a fixed and coherent model is due to the relevance of unavoidable context issues in palliative care, such as specific cultural settings, patient-centred variables, and family specificity. The implication is that palliative care staff have to continuously adapt their model of caring to the specific needs and values of each patient, rather than applying a fixed, although perhaps comprehensive, care model. PMID:20205778

  1. Peridynamics with LAMMPS : a user guide.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehoucq, Richard B.; Silling, Stewart Andrew; Seleson, Pablo

    Peridynamics is a nonlocal extension of classical continuum mechanics. The discrete peridynamic model has the same computational structure as a molecular dynamics model. This document provides a brief overview of the peridynamic model of a continuum, then discusses how the peridynamic model is discretized within LAMMPS. An example problem is also included.
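
    To illustrate the point that the discretized peridynamic model shares the pairwise, bond-loop structure of a molecular dynamics force kernel, here is a one-dimensional sketch of a bond-based (prototype microelastic brittle) force evaluation; the parameter values are illustrative and this is not the LAMMPS implementation.

    ```python
    import numpy as np

    def pmb_forces(x_ref, u, horizon, c, volume):
        """Nodal force densities for a 1D bond-based (PMB) peridynamic lattice."""
        n = len(x_ref)
        f = np.zeros(n)
        for i in range(n):                                  # loop over nodes ...
            for j in range(n):                              # ... and their candidate bond partners
                if i == j:
                    continue
                xi = x_ref[j] - x_ref[i]                    # reference bond vector
                if abs(xi) > horizon:                       # only bonds within the horizon interact
                    continue
                eta = u[j] - u[i]                           # relative displacement
                y = xi + eta                                # deformed bond
                stretch = (abs(y) - abs(xi)) / abs(xi)      # bond stretch
                f[i] += c * stretch * np.sign(y) * volume   # pairwise force density contribution
        return f

    dx = 0.01
    x_ref = np.arange(0.0, 1.0, dx)        # node positions along a bar
    u = 0.001 * x_ref                      # small uniform stretch of the bar
    print(pmb_forces(x_ref, u, horizon=3 * dx, c=1.0e4, volume=dx)[:5])
    ```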

  2. Integrated corridor management (ICM) analysis, modeling, and simulation (AMS) for Minneapolis site : model calibration and validation report.

    DOT National Transportation Integrated Search

    2010-02-01

    This technical report documents the calibration and validation of the baseline (2008) mesoscopic model for the I-394 Minneapolis, Minnesota, Pioneer Site. DynusT was selected as the mesoscopic model for analyzing operating conditions in the I-394 cor...

  3. Revisions to the Wharton EFA Automobile Demand Model : The Wharton EFA Motor Vehicle Demand Model (Mark I)

    DOT National Transportation Integrated Search

    1980-12-01

    The report documents revisions made to the Wharton EFA Automobile Demand Model to produce the Wharton EFA Motor Vehicle Demand Model (Mark I). Equations are reestimated for the total desired stock of autos and for desired shares by size class, includ...

  4. An Overview of Customer Satisfaction Models.

    ERIC Educational Resources Information Center

    Hom, Willard

    This document is a report on how California community colleges can incorporate customer satisfaction models and theories from business to better serve students. Emphasis is given to two levels of customer satisfaction: macro- and micro-models. Macro-models look at how customer satisfaction relates to other elements or priorities of community…

  5. Representing Uncertainty on Model Analysis Plots

    ERIC Educational Resources Information Center

    Smith, Trevor I.

    2016-01-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model.…

  6. 7 CFR 1718.104 - Availability of model loan contract.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., DEPARTMENT OF AGRICULTURE LOAN SECURITY DOCUMENTS FOR ELECTRIC BORROWERS Loan Contracts With Distribution Borrowers § 1718.104 Availability of model loan contract. Single copies of the model loan contract (RUS... 7 Agriculture 11 2010-01-01 2010-01-01 false Availability of model loan contract. 1718.104 Section...

  7. Human Judgment and Decision Making: Models and Applications.

    ERIC Educational Resources Information Center

    Loke, Wing Hong

    This document notes that researchers study the processes involved in judgment and decision making and prescribe theories and models that reflect the behavior of the decision makers. It addresses the various models that are used to represent judgment and decision making, with particular interest in models that more accurately represent human…

  8. Microdamage healing in asphalt and asphalt concrete, Volume 3 : a micromechanics fracture and healing model for asphalt concrete

    DOT National Transportation Integrated Search

    2001-06-01

    Volume 3 documents the development of a micromechanics fracture and healing model for asphalt concrete. This model can be used to calculate the density and growth of microcracks during repeated direct tensile controlled-strain loading. The model is b...

  9. Comprehensive Career Guidance. Postsecondary & Adult. Programs and Model.

    ERIC Educational Resources Information Center

    Moore, Earl J.; Miller, Thomas B.

    Divided into four parts, this document describes a comprehensive career guidance model for postsecondary and adult programs. In part 1, the rationale for extending career guidance and counseling into the lifelong learning perspective is explained, the Georgia Life Career Development Model is described, and the components of a process model for…

  10. Differential Topic Models.

    PubMed

    Chen, Changyou; Buntine, Wray; Ding, Nan; Xie, Lexing; Du, Lan

    2015-02-01

    In applications we may want to compare different document collections: they could have shared content but also different and unique aspects in particular collections. This task has been called comparative text mining or cross-collection modeling. We present a differential topic model for this application that models both topic differences and similarities. For this we use hierarchical Bayesian nonparametric models. Moreover, we found it was important to properly model power-law phenomena in topic-word distributions and thus we used the full Pitman-Yor process rather than just a Dirichlet process. Furthermore, we propose the transformed Pitman-Yor process (TPYP) to incorporate prior knowledge such as vocabulary variations in different collections into the model. To deal with the non-conjugate issue between model prior and likelihood in the TPYP, we thus propose an efficient sampling algorithm using a data augmentation technique based on the multinomial theorem. Experimental results show the model discovers interesting aspects of different collections. We also show the proposed MCMC based algorithm achieves a dramatically reduced test perplexity compared to some existing topic models. Finally, we show our model outperforms the state-of-the-art for document classification/ideology prediction on a number of text collections.

  11. Appendices to the model description document for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A Model Description Document for the Emulation Simulation Computer Model was already published. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from test. Also, slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system. The second consists of a potential air revitalization system.

  12. Application of multi-resolution 3D techniques in crime scene documentation with bloodstain pattern analysis.

    PubMed

    Hołowko, Elwira; Januszkiewicz, Kamil; Bolewicki, Paweł; Sitnik, Robert; Michoński, Jakub

    2016-10-01

    In forensic documentation with bloodstain pattern analysis (BPA) it is highly desirable to obtain non-invasively an overall documentation of a crime scene, but also to register single evidence objects, such as bloodstains, in high resolution. In this study, we propose a hierarchical 3D scanning platform designed according to the top-down approach known from traditional forensic photography. The overall 3D model of a scene is obtained via integration of laser scans registered from different positions. Parts of a scene that are particularly interesting are documented using a midrange scanner, and the smallest details are added in the highest resolution as close-up scans. The scanning devices are controlled using developed software equipped with advanced algorithms for point cloud processing. To verify the feasibility and effectiveness of multi-resolution 3D scanning in crime scene documentation, we applied our platform to document a murder scene simulated by the BPA experts from the Central Forensic Laboratory of the Police R&D, Warsaw, Poland. Applying the 3D scanning platform proved beneficial in the documentation of a crime scene combined with BPA. The multi-resolution 3D model enables virtual exploration of a scene in a three-dimensional environment and distance measurement, and gives a more realistic preservation of the evidence together with its surroundings. Moreover, high-resolution close-up scans aligned in a 3D model can be used to analyze bloodstains revealed at the crime scene. The results of BPA, such as trajectories and the area of origin, are visualized and analyzed in an accurate model of the scene. At this stage, a simplified approach considering the trajectory of a blood drop as a straight line is applied. Although the 3D scanning platform offers a new quality of crime scene documentation with BPA, some of the limitations of the technique are also mentioned. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
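
    For the simplified straight-line trajectory analysis mentioned above, the classic relations can be sketched as follows: the impact angle is estimated from the width-to-length ratio of an elliptical stain, and the height of the area of origin from the tangent of that angle and the distance to the convergence point; the numbers are illustrative only and do not come from the study.

    ```python
    import math

    def impact_angle_deg(width_mm, length_mm):
        """Impact angle of an elliptical bloodstain: alpha = arcsin(width / length)."""
        return math.degrees(math.asin(width_mm / length_mm))

    def origin_height_m(distance_to_convergence_m, angle_deg):
        """Straight-line (tangent) estimate of the height of the area of origin."""
        return distance_to_convergence_m * math.tan(math.radians(angle_deg))

    # Illustrative stain measured on the 3D model: 6 mm wide, 11 mm long,
    # located 0.80 m from the 2D convergence point of several stain trajectories.
    alpha = impact_angle_deg(6.0, 11.0)
    print(round(alpha, 1), "deg; estimated origin height:", round(origin_height_m(0.80, alpha), 2), "m")
    ```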

  13. Determining Fuzzy Membership for Sentiment Classification: A Three-Layer Sentiment Propagation Model

    PubMed Central

    Zhao, Chuanjun; Wang, Suge; Li, Deyu

    2016-01-01

Enormous quantities of review documents exist in forums, blogs, Twitter accounts, and shopping web sites. Analysis of the sentiment information hidden in these review documents is very useful for consumers and manufacturers. The sentiment orientation and sentiment intensity of a review can be described in more detail by using a sentiment score than by using bipolar sentiment polarity. Existing methods for calculating review sentiment scores frequently use a sentiment lexicon or the locations of features in a sentence, a paragraph, and a document. In order to achieve more accurate sentiment scores of review documents, a three-layer sentiment propagation model (TLSPM) is proposed that uses three kinds of interrelations, those among documents, topics, and words. First, we use nine pairwise relationship matrices among documents, topics, and words. In TLSPM, we suppose that sentiment neighbors tend to have the same sentiment polarity and similar sentiment intensity in the sentiment propagation network. Then, we implement the sentiment propagation processes among the documents, topics, and words in turn. Finally, we can obtain the steady sentiment scores of documents by a continuous iteration process. Intuition might suggest that documents with strong sentiment intensity make larger contributions to classification than those with weak sentiment intensity. Therefore, we use the fuzzy membership of documents obtained by TLSPM as the weight of the text to train a fuzzy support vector machine model (FSVM). As compared with a support vector machine (SVM) and four other fuzzy membership determination methods, the results show that FSVM trained with TLSPM can enhance the effectiveness of sentiment classification. In addition, FSVM trained with TLSPM can reduce the mean square error (MSE) on seven sentiment rating prediction data sets. PMID:27846225
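
    The fuzzy-membership weighting idea can be sketched as follows. The TLSPM propagation itself is not reproduced here; the memberships are assumed to be already computed, and scikit-learn's per-sample weights stand in for a full FSVM implementation, so this is an approximation rather than the authors' exact method. The corpus and membership values are illustrative.

    ```python
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.svm import SVC

    docs = ["great phone, love the battery", "terrible screen, waste of money",
            "battery life is excellent", "screen died after a week"]
    labels = np.array([1, 0, 1, 0])                   # 1 = positive, 0 = negative
    memberships = np.array([0.9, 0.8, 0.6, 0.4])      # hypothetical TLSPM fuzzy memberships

    X = TfidfVectorizer().fit_transform(docs)
    clf = SVC(kernel="linear")
    clf.fit(X, labels, sample_weight=memberships)     # strong-sentiment documents weigh more
    print(clf.predict(X))
    ```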

  14. Documentation of the Retail Price Model

    EPA Pesticide Factsheets

The Retail Price Model (RPM) provides a first‐order estimate of average retail electricity prices using information from the EPA Base Case v.5.13 or other scenarios for each of the 64 Integrated Planning Model (IPM) regions.

  15. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    EPA Science Inventory

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  16. Freight Transportation Energy Use : Volume 2. Methodology and Program Documentation.

    DOT National Transportation Integrated Search

    1978-07-01

    The structure and logic of the transportation network model component of the TSC Freight Energy Model are presented. The model assigns given origin-destination commodity flows to specific transport modes and routes, thereby determining the traffic lo...

  17. National Centers for Environmental Prediction

    Science.gov Websites

  18. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
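
    As an illustration of a first-order Markov birth/death population model, a minimal Gillespie-style simulation is sketched below; the rates and initial population are arbitrary assumptions, and the original program's interactive and error-checking features are not reproduced.

    ```python
    import random

    def simulate_birth_death(n0=50, birth_rate=0.12, death_rate=0.10, t_end=100.0, seed=1):
        """Continuous-time Markov birth/death process with per-capita rates and
        exponentially distributed waiting times between events."""
        random.seed(seed)
        t, n, path = 0.0, n0, [(0.0, n0)]
        while t < t_end and n > 0:
            total = (birth_rate + death_rate) * n
            t += random.expovariate(total)            # time to next birth or death
            n += 1 if random.random() < birth_rate / (birth_rate + death_rate) else -1
            path.append((t, n))
        return path

    print(simulate_birth_death()[-1])                 # (time, population) at end of run
    ```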

  19. Explicit Pharmacokinetic Modeling: Tools for Documentation, Verification, and Portability

    EPA Science Inventory

    Quantitative estimates of tissue dosimetry of environmental chemicals due to multiple exposure pathways require the use of complex mathematical models, such as physiologically-based pharmacokinetic (PBPK) models. The process of translating the abstract mathematics of a PBPK mode...

  20. The potential of artificial aging for modelling of natural aging processes of ballpoint ink.

    PubMed

    Weyermann, Céline; Spengler, Bernhard

    2008-08-25

Artificial aging has been used to reproduce natural aging processes at an accelerated pace. Questioned documents were exposed to light or high temperature in a well-defined manner in order to simulate an increased age. This may be used to study the aging processes or to date documents by reproducing their aging curve. Ink was studied especially because it is deposited on the paper when a document, such as a contract, is produced. Once on the paper, aging processes start through degradation of dyes, drying of solvents and polymerisation of resins. Modelling of dye and solvent aging was attempted. These processes, however, follow complex pathways, influenced by many factors which can be classified into three major groups: ink composition, paper type and storage conditions. The influence of these factors is such that different aging states can be obtained for an identical point in time. Storage conditions in particular are difficult to simulate, as they are dependent on environmental conditions (e.g. intensity and dose of light, temperature, air flow, humidity) and cannot be controlled in the natural aging of questioned documents. The problem therefore lies more in the variety of different conditions a questioned document might be exposed to during its natural aging, rather than in the simulation of such conditions in the laboratory. Nevertheless, a precise modelling of natural aging curves based on artificial aging curves is obtained when performed on the same paper and ink. A standard model for aging processes of ink on paper is therefore presented that is based on a fit of aging curves to a power law of solvent concentrations as a function of time. A mathematical transformation of artificial aging curves into modelled natural aging curves results in excellent overlap with data from real natural aging processes.
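
    A minimal sketch of fitting a power law of solvent concentration as a function of time is shown below. The functional form, parameter names, and data points are illustrative assumptions, not the paper's fitted model.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def power_law(t, c0, k, b):
        """Assumed solvent-aging law: c(t) = c0 * (1 + t)**(-k) + b."""
        return c0 * (1.0 + t) ** (-k) + b

    t_days = np.array([0, 1, 3, 7, 14, 30, 60, 120], dtype=float)    # hypothetical sampling times
    conc   = np.array([100, 62, 41, 28, 19, 12, 8, 6], dtype=float)  # hypothetical solvent levels

    params, _ = curve_fit(power_law, t_days, conc, p0=(100.0, 0.5, 5.0))
    c0, k, b = params
    print(f"fitted power law: c(t) = {c0:.1f}*(1+t)^(-{k:.2f}) + {b:.1f}")
    ```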

  1. a Historical Timber Frame Model for Diagnosis and Documentation Before Building Restoration

    NASA Astrophysics Data System (ADS)

    Koehl, M.; Viale, A.; Reeb, S.

    2013-09-01

The aim of the project described in this paper was to define a four-level timber frame survey mode for a historical building: the so-called "Andlau's Seigniory", Alsace, France. This historical building (domain) was built in the late XVIth century and is now being renovated to become a heritage interpretation centre. The measurement methods used combine total station measurements, photogrammetry and 3D terrestrial laser scanning. Different modelling workflows were tested and compared according to the data acquisition method, but also according to the characteristics of the reconstructed model in terms of accuracy and level of detail. 3D geometric modelling of the entire structure was performed, with the degree of detail adapted to the needs. The 3D timber framework now exists in different versions, from a theoretical and geometrical one up to a very detailed one, in which measurement and evaluation of deformation over time are possible. The virtual models, produced with archaeologists, architects, historians and specialists in historical crafts, are intended to be used during the four stages of the project: (i) knowledge of the current state, needed for diagnosis and understanding of former construction techniques; (ii) preparation and evaluation of restoration steps; (iii) knowledge and documentation concerning the archaeological object; (iv) transmission and dissemination of knowledge through the implementation of museum animations. The generated models also include documentation of the site in the form of virtual tours created from panoramic photographs taken before and during the restoration works. Finally, the timber framework model was structured and integrated into a 3D GIS, where descriptive and complementary digital documents could be associated with it. Both offer tools supporting diagnosis, understanding of the structure, knowledge dissemination, documentation and the creation of educational activities. The integration of these measurements in a historical information system will lead to an interactive model and a digital visual display unit for consultation. It will be offered to the public to explain interactively the art of constructing a Renaissance structure, with detailed photos, descriptive texts and graphics. The 3D digital model of the framework will be used directly in the interpretation path, within the space dedicated to the "Seigniory" of Andlau. An interactive touch-screen will be installed, incorporating several levels of engagement (playful, evocative and teaching). In a virtual way, it will address the different stages of building a wooden framework and clarify the art of construction.

  2. Dealing with Multiple Documents on the WWW: The Role of Metacognition in the Formation of Documents Models

    ERIC Educational Resources Information Center

    Stadtler, Marc; Bromme, Rainer

    2007-01-01

    Drawing on the theory of documents representation (Perfetti et al., Toward a theory of documents representation. In: H. v. Oostendorp & S. R. Goldman (Eds.), "The construction of mental representations during reading." Mahwah, NJ: Erlbaum, 1999), we argue that successfully dealing with multiple documents on the World Wide Web requires readers to…

  3. External Tank Liquid Hydrogen (LH2) Prepress Regression Analysis Independent Review Technical Consultation Report

    NASA Technical Reports Server (NTRS)

Parsons, Vickie S.

    2009-01-01

The request to conduct an independent review of regression models, developed for determining the expected Launch Commit Criteria (LCC) External Tank (ET)-04 cycle count for the Space Shuttle ET tanking process, was submitted to the NASA Engineering and Safety Center (NESC) on September 20, 2005. The NESC team performed an independent review of regression models documented in "Prepress Regression Analysis" (Tom Clark and Angela Krenn, 10/27/05). This consultation consisted of a peer review by statistical experts of the proposed regression models provided in the Prepress Regression Analysis. This document is the consultation's final report.

  4. Overview of Threats and Failure Models for Safety-Relevant Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This document presents a high-level overview of the threats to safety-relevant computer-based systems, including (1) a description of the introduction and activation of physical and logical faults; (2) the propagation of their effects; and (3) function-level and component-level error and failure mode models. These models can be used in the definition of fault hypotheses (i.e., assumptions) for threat-risk mitigation strategies. This document is a contribution to a guide currently under development that is intended to provide a general technical foundation for designers and evaluators of safety-relevant systems.

  5. Disability in Mexico: a comparative analysis between descriptive models and historical periods using a timeline.

    PubMed

    Sandoval, Hugo; Pérez-Neri, Iván; Martínez-Flores, Francisco; Valle-Cabrera, Martha Griselda Del; Pineda, Carlos

    2017-01-01

Some interpretations frequently argue that three Disability Models (DM) (Charity, Medical/Rehabilitation, and Social) correspond to historical periods in terms of chronological succession. These views permeate, a priori, major official documents on the subject in Mexico. This paper intends to test whether this association is plausible by applying a timeline method. A document search was made with inclusion and exclusion criteria in databases to select representative studies with which to depict milestones in the timelines for each period. The following is demonstrated: 1) models should be considered as categories of analysis and not as historical periods, in that elements of all three models remain prevalent to date, and 2) the association between disability models and historical periods results in teleological interpretations of the history of disability in Mexico.

  6. Temperature and solute-transport simulation in streamflow using a Lagrangian reference frame

    USGS Publications Warehouse

    Jobson, Harvey E.

    1980-01-01

    A computer program for simulating one-dimensional, unsteady temperature and solute transport in a river has been developed and documented for general use. The solution approach to the convective-diffusion equation uses a moving reference frame (Lagrangian) which greatly simplifies the mathematics of the solution procedure and dramatically reduces errors caused by numerical dispersion. The model documentation is presented as a series of four programs of increasing complexity. The conservative transport model can be used to route a single conservative substance. The simplified temperature model is used to predict water temperature in rivers when only temperature and windspeed data are available. The complete temperature model is highly accurate but requires rather complete meteorological data. Finally, the 10-parameter model can be used to route as many as 10 interacting constituents through a river reach. (USGS)
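
    The advantage of the Lagrangian (moving reference frame) approach is that parcels are followed with the flow, so no advection term has to be differenced on a fixed grid. Below is a minimal sketch for a single conservative constituent under an assumed steady, uniform velocity; the documented USGS programs are considerably more general.

    ```python
    # Minimal Lagrangian routing sketch: follow parcels with the flow instead of
    # differencing an advection term on a fixed grid (steady, uniform velocity assumed).
    def route_parcels(release_times_h, concentrations, velocity_m_s, reach_length_m):
        """Return (arrival_time_h, concentration) for conservative parcels at the reach outlet."""
        travel_h = reach_length_m / velocity_m_s / 3600.0
        return [(t0 + travel_h, c) for t0, c in zip(release_times_h, concentrations)]

    # Hypothetical release schedule at the upstream boundary of a 10 km reach, v = 0.5 m/s.
    print(route_parcels([0.0, 1.0, 2.0], [12.0, 15.0, 9.0], 0.5, 10_000.0))
    ```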

  7. 39 CFR 3050.23 - Documentation supporting incremental cost estimates in the Postal Service's section 3652 report.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... incremental cost model shall be reported. ... 39 Postal Service 1 2010-07-01 2010-07-01 false Documentation supporting incremental cost... REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.23 Documentation supporting incremental cost...

  8. 75 FR 29588 - Notice of Availability of the Models for Plant-Specific Adoption of Technical Specifications Task...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-26

    ... Documents Access and Management System (ADAMS) under Accession Number ML090510686. The proposed changes... Documents Access and Management System (ADAMS): Publicly available documents created or received at the NRC... expedited approval of plant-specific adoption of TSTF- 501, Revision 1. Documents: You can access publicly...

  9. 75 FR 26294 - Notice of Availability of the Models for Plant-Specific Adoption of Technical Specifications Task...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-11

    ... an errata sheet are available in the Agencywide Documents Access and Management System (ADAMS) under... Agencywide Documents Access and Management System (ADAMS): Publicly available documents created or received... facilitate expedited approval of plant-specific adoption of TSTF-493, Revision 4. Documents: You can access...

  10. U-10Mo Baseline Fuel Fabrication Process Description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hubbard, Lance R.; Arendt, Christina L.; Dye, Daniel F.

This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly.

  11. Semantic Clinical Guideline Documents

    PubMed Central

    Eriksson, Henrik; Tu, Samson W.; Musen, Mark

    2005-01-01

    Decision-support systems based on clinical practice guidelines can support physicians and other health-care personnel in the process of following best practice consistently. A knowledge-based approach to represent guidelines makes it possible to encode computer-interpretable guidelines in a formal manner, perform consistency checks, and use the guidelines directly in decision-support systems. Decision-support authors and guideline users require guidelines in human-readable formats in addition to computer-interpretable ones (e.g., for guideline review and quality assurance). We propose a new document-oriented information architecture that combines knowledge-representation models with electronic and paper documents. The approach integrates decision-support modes with standard document formats to create a combined clinical-guideline model that supports on-line viewing, printing, and decision support. PMID:16779037

  12. DOUBLE SHELL TANK (DST) HYDROXIDE DEPLETION MODEL FOR CARBON DIOXIDE ABSORPTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    OGDEN DM; KIRCH NW

    2007-10-31

    This document generates a supernatant hydroxide ion depletion model based on mechanistic principles. The carbon dioxide absorption mechanistic model is developed in this report. The report also benchmarks the model against historical tank supernatant hydroxide data and vapor space carbon dioxide data. A comparison of the newly generated mechanistic model with previously applied empirical hydroxide depletion equations is also performed.
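
    The report's rate expressions are not reproduced in this abstract, so the sketch below is only a generic illustration of a hydroxide depletion calculation: absorption is assumed proportional to vapor-space CO2 partial pressure, with two moles of hydroxide consumed per mole of CO2 (carbonate formation). The lumped coefficient and tank dimensions are hypothetical.

    ```python
    def deplete_hydroxide(oh_molar, p_co2_atm, k_abs, area_m2, volume_L, dt_days, steps):
        """Illustrative forward-Euler depletion for CO2 + 2 OH- -> CO3-- + H2O.
        k_abs is an assumed lumped mass-transfer coefficient [mol / (m^2 * atm * day)]."""
        history = [oh_molar]
        for _ in range(steps):
            co2_mol = k_abs * area_m2 * p_co2_atm * dt_days   # CO2 absorbed this step
            oh_molar = max(oh_molar - 2.0 * co2_mol / volume_L, 0.0)
            history.append(oh_molar)
        return history

    # Hypothetical tank: 400 m^2 liquid surface, 4e6 L supernatant, ambient CO2 ~ 4e-4 atm.
    print(deplete_hydroxide(0.5, 4e-4, 5.0, 400.0, 4.0e6, 30.0, 12)[-1])
    ```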

  13. Process Engineering with the Evolutionary Spiral Process Model. Version 01.00.06

    DTIC Science & Technology

    1994-01-01

program. Process Definition and Modeling Guidebook (SPC-92041-CMC): provides methods for defining and documenting processes so they can be analyzed, modified...and Program Evaluation and Review Technique (PERT) support the activity of developing a project schedule. A variety of automated tools, such as...keep the organization from becoming disoriented during the improvement program (Curtis, Kellner, and Over 1992). Analyzing and documenting how

  14. Short-Term Energy Outlook Model Documentation: Macro Bridge Procedure to Update Regional Macroeconomic Forecasts with National Macroeconomic Forecasts

    EIA Publications

    2010-01-01

The Regional Short-Term Energy Model (RSTEM) uses macroeconomic variables such as income, employment, industrial production and consumer prices at both the national and regional levels as explanatory variables in the generation of the Short-Term Energy Outlook (STEO). This documentation explains how national macroeconomic forecasts are used to update regional macroeconomic forecasts through the RSTEM Macro Bridge procedure.

  15. Re-Conceptualizing Teachers' Continuous Professional Development within a New Paradigm of Change in the Indian Context: An Analysis of Literature and Policy Documents

    ERIC Educational Resources Information Center

    Subitha, G. V.

    2018-01-01

    Located within the context of Indian education reforms, this study is a critique of the current model of continuous professional development of teachers. The study, by reviewing national policy documents and research literature, argues that there is a need to re-conceptualize and re-define the current model of professional development of teachers.…

  16. CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 3

    DTIC Science & Technology

    2012-06-01

OMG) standard Business Process Modeling and Notation (BPMN) [6] graphical notation. I will address each of these: identify and document steps...to a value stream map using BPMN and textual process narratives. The resulting process narratives or process metadata include key information...objectives. Once the processes are identified we can graphically document them, capturing the process using BPMN (see Figure 1). The BPMN models

  17. Summary of fuel economy performance

    DOT National Transportation Integrated Search

    2009-03-30

    This report contains estimated fleet production numbers and CAFE figures obtained from pre-model year (source 1) and mid-model year (source 2) documents assembled prior to or during the model year. The actual mpg values reported to EPA at the end of ...

  18. Summary of fuel economy performance

    DOT National Transportation Integrated Search

    2010-04-20

This report contains estimated fleet production numbers and CAFE figures obtained from pre-model year (source 1) and mid-model year (source 2) documents assembled prior to or during the model year. The actual mpg values reported to EPA at the end of ...

  19. Summary of fuel economy performance

    DOT National Transportation Integrated Search

    2009-12-09

This report contains estimated fleet production numbers and CAFE figures obtained from pre-model year (source 1) and mid-model year (source 2) documents assembled prior to or during the model year. The actual mpg values reported to EPA at the end of ...

  20. Manual for LS-DYNA Wood Material Model 143

    DOT National Transportation Integrated Search

    2007-08-01

    An elastoplastic damage model with rate effects was developed for wood and was implemented into LS-DYNA, a commercially available finite element code. This manual documents the theory of the wood material model, describes the LS-DYNA input and output...

  1. BEHAVE: fire behavior prediction and fuel modeling system--FUEL subsystem

    Treesearch

    Robert E. Burgan; Richard C. Rothermel

    1984-01-01

    This manual documents the fuel modeling procedures of BEHAVE--a state-of-the-art wildland fire behavior prediction system. Described are procedures for collecting fuel data, using the data with the program, and testing and adjusting the fuel model.

  2. Short-Term Energy Outlook Model Documentation: Motor Gasoline Consumption Model

    EIA Publications

    2011-01-01

The motor gasoline consumption module of the Short-Term Energy Outlook (STEO) model is designed to provide forecasts of total U.S. consumption of motor gasoline based on estimates of vehicle miles traveled and average vehicle fuel economy.

  3. The Compass Rose Effectiveness Model

    ERIC Educational Resources Information Center

    Spiers, Cynthia E.; Kiel, Dorothy; Hohenrink, Brad

    2008-01-01

    The effectiveness model focuses the institution on mission achievement through assessment and improvement planning. Eleven mission criteria, measured by key performance indicators, are aligned with the accountability interest of internal and external stakeholders. A Web-based performance assessment application supports the model, documenting the…

  4. Observation model and parameter partials for the JPL VLBI parameter estimation software MASTERFIT-1987

    NASA Technical Reports Server (NTRS)

    Sovers, O. J.; Fanselow, J. L.

    1987-01-01

This report is a revision of the document of the same title (1986), dated August 1, which it supersedes. Model changes during 1986 and 1987 included corrections for antenna feed rotation, refraction in modelling antenna axis offsets, and an option to employ improved values of the semiannual and annual nutation amplitudes. Partial derivatives of the observables with respect to an additional parameter (surface temperature) are now available. New versions of two figures representing the geometric delay are incorporated. The expressions for the partial derivatives with respect to the nutation parameters have been corrected to include contributions from the dependence of UT1 on nutation. The authors hope to publish revisions of this document in the future, as modeling improvements warrant.

  5. Observation model and parameter partials for the JPL VLBI parameter estimation software MASTERFIT-1987

    NASA Astrophysics Data System (ADS)

    Sovers, O. J.; Fanselow, J. L.

    1987-12-01

This report is a revision of the document of the same title (1986), dated August 1, which it supersedes. Model changes during 1986 and 1987 included corrections for antenna feed rotation, refraction in modelling antenna axis offsets, and an option to employ improved values of the semiannual and annual nutation amplitudes. Partial derivatives of the observables with respect to an additional parameter (surface temperature) are now available. New versions of two figures representing the geometric delay are incorporated. The expressions for the partial derivatives with respect to the nutation parameters have been corrected to include contributions from the dependence of UT1 on nutation. The authors hope to publish revisions of this document in the future, as modeling improvements warrant.
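
    The leading term of the observable modeled by MASTERFIT is the geometric delay. A minimal sketch of just that term is given below (the baseline and source direction are illustrative); the nutation, axis-offset, and refraction corrections discussed in the report are deliberately omitted.

    ```python
    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def geometric_delay(baseline_m, source_unit_vec):
        """Leading-order VLBI geometric delay tau = -(B . s_hat) / c, in seconds.
        Ignores the nutation, axis-offset, and refraction terms treated in the report."""
        return -np.dot(baseline_m, source_unit_vec) / C

    # Hypothetical 6000 km baseline and a source 40 degrees above the baseline direction.
    baseline = np.array([6.0e6, 0.0, 0.0])
    s_hat = np.array([np.cos(np.radians(40.0)), 0.0, np.sin(np.radians(40.0))])
    print(f"geometric delay: {geometric_delay(baseline, s_hat) * 1e3:.3f} ms")
    ```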

  6. Analysis of structural dynamic data from Skylab. Volume 1: Technical discussion

    NASA Technical Reports Server (NTRS)

    Demchak, L.; Harcrow, H.

    1976-01-01

    The results of a study to analyze data and document dynamic program highlights of the Skylab Program are presented. Included are structural model sources, illustration of the analytical models, utilization of models and the resultant derived data, data supplied to organization and subsequent utilization, and specifications of model cycles.

  7. SSDA code to apply data assimilation in soil water flow modeling: Documentation and user manual

    USDA-ARS?s Scientific Manuscript database

    Soil water flow models are based on simplified assumptions about the mechanisms, processes, and parameters of water retention and flow. That causes errors in soil water flow model predictions. Data assimilation (DA) with the ensemble Kalman filter (EnKF) corrects modeling results based on measured s...
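
    A minimal sketch of the stochastic ensemble Kalman filter analysis step is shown below with generic state and observation shapes; it is not the SSDA code itself, and the soil-moisture numbers are illustrative.

    ```python
    import numpy as np

    def enkf_update(ensemble, obs, obs_err_std, H):
        """Stochastic EnKF analysis step.
        ensemble: (n_state, n_members); obs: (n_obs,); H: (n_obs, n_state) observation operator."""
        n_obs, n_members = obs.size, ensemble.shape[1]
        A = ensemble - ensemble.mean(axis=1, keepdims=True)     # ensemble perturbations
        HA = H @ A
        P_HT = A @ HA.T / (n_members - 1)                       # sample cross-covariance
        S = HA @ HA.T / (n_members - 1) + np.diag(np.full(n_obs, obs_err_std**2))
        K = P_HT @ np.linalg.inv(S)                             # Kalman gain from ensemble stats
        obs_pert = obs[:, None] + obs_err_std * np.random.randn(n_obs, n_members)
        return ensemble + K @ (obs_pert - H @ ensemble)

    # Hypothetical 3-layer soil-moisture state, 20 members, surface layer observed.
    rng = np.random.default_rng(0)
    ens = 0.25 + 0.05 * rng.standard_normal((3, 20))
    H = np.array([[1.0, 0.0, 0.0]])
    print(enkf_update(ens, np.array([0.30]), 0.02, H).mean(axis=1))
    ```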

  8. Description of the General Equilibrium Model of Ecosystem Services (GEMES)

    Treesearch

    Travis Warziniack; David Finnoff; Jenny Apriesnig

    2017-01-01

    This paper serves as documentation for the General Equilibrium Model of Ecosystem Services (GEMES). GEMES is a regional computable general equilibrium model that is composed of values derived from natural capital and ecosystem services. It models households, producing sectors, and governments, linked to one another through commodity and factor markets. GEMES was...

  9. Mathematics in Marine Botany: Examples of the Modelling Process. Part II: Continuous Models.

    ERIC Educational Resources Information Center

    Nyman, Melvin A.; Brown, Murray T.

    1996-01-01

    Describes some continuous models for growth of the seaweed Macrocystis pyrifera. Uses observed growth rates over several months to derive first-order differential equations as models for growth rates of individual fronds. The nature of the solutions is analyzed and comparison between these theoretical results and documented characteristics of…
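
    One plausible first-order growth law of the kind described, a logistic equation for frond length, can be integrated as follows; the rate constant, carrying capacity, and initial length are illustrative, not the authors' fitted values.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def logistic_growth(t, L, r, L_max):
        """dL/dt = r * L * (1 - L / L_max): one plausible first-order frond growth law."""
        return r * L * (1.0 - L / L_max)

    r, L_max, L0 = 0.06, 25.0, 0.5          # per day, metres, metres (illustrative values)
    sol = solve_ivp(logistic_growth, (0.0, 120.0), [L0], args=(r, L_max),
                    t_eval=np.linspace(0.0, 120.0, 7))
    for t, L in zip(sol.t, sol.y[0]):
        print(f"day {t:5.1f}: frond length {L:5.2f} m")
    ```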

  10. UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.

    PubMed

    Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun

    2013-12-01

Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely-used methods based on probabilistic modeling have drawbacks in terms of consistency from multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.
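
    UTOPIAN's interactive, semi-supervised steering is beyond a short sketch, but the nonnegative matrix factorization at its core can be illustrated with scikit-learn on a toy corpus; the corpus and the deterministic initialization choice are assumptions for the example, not the system's implementation.

    ```python
    from sklearn.decomposition import NMF
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = [
        "visual analytics of document collections",
        "interactive topic steering by the user",
        "matrix factorization for topic modeling",
        "probabilistic graphical models such as LDA",
    ]
    tfidf = TfidfVectorizer(stop_words="english")
    X = tfidf.fit_transform(docs)

    nmf = NMF(n_components=2, init="nndsvd", random_state=0)  # deterministic init -> consistent runs
    W = nmf.fit_transform(X)         # document-topic weights
    H = nmf.components_              # topic-term weights
    terms = tfidf.get_feature_names_out()
    for k, row in enumerate(H):
        top = [terms[i] for i in row.argsort()[::-1][:3]]
        print(f"topic {k}: {top}")
    ```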

  11. Report of the LSPI/NASA Workshop on Lunar Base Methodology Development

    NASA Technical Reports Server (NTRS)

    Nozette, Stewart; Roberts, Barney

    1985-01-01

    Groundwork was laid for computer models which will assist in the design of a manned lunar base. The models, herein described, will provide the following functions for the successful conclusion of that task: strategic planning; sensitivity analyses; impact analyses; and documentation. Topics addressed include: upper level model description; interrelationship matrix; user community; model features; model descriptions; system implementation; model management; and plans for future action.

  12. Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Application of Model View Definition Attributes

    DTIC Science & Technology

    2013-06-01

Building information exchange (COBie), Building Information Modeling (BIM)...to develop a life-cycle building model have resulted in the definition of a "core" building information model that contains general information de...develop an information-exchange Model View Definition (MVD) for building electrical systems. The objective of the current work was to document the

  13. Probabilistic topic modeling for the analysis and classification of genomic sequences

    PubMed Central

    2015-01-01

Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies are focusing on the so-called barcode genes, representing a well-defined region of the whole genome. Recently, alignment-free techniques are gaining more importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequence clustering and classification is proposed. The method is based on k-mers representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied on DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and the Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method reaches very similar results to the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra short sequences and it exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
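
    The k-mer-plus-topic-model pipeline can be sketched with standard tools: count fixed-length k-mers per sequence, fit LDA, and use the resulting topic mixtures as classification features. The sequences, the value of k, and the use of scikit-learn are illustrative assumptions, not the authors' implementation.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    seqs = ["ACGTACGTGGCA", "ACGTACGTGGTT", "TTGGCCAATTGG", "TTGGCCAATACG"]

    # Represent each sequence by overlapping k-mer counts (k = 4 here, chosen arbitrarily).
    vec = CountVectorizer(analyzer="char", ngram_range=(4, 4), lowercase=False)
    X = vec.fit_transform(seqs)

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    theta = lda.fit_transform(X)     # per-sequence topic mixtures, usable as classification features
    print(theta.round(2))
    ```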

  14. Fact Sheet: Documenting Ground-Water Models Selection at Site Contaminated with Radioactive Substance

    EPA Pesticide Factsheets

    This fact sheet summarizes the report by a joint Interagency Environmental Pathway Modeling Working Group. It was designed to be used by technical staff responsible for identifying and implementing flow and transport models to support cleanup decisions.

  15. A review of computer evacuation models and their data needs.

    DOT National Transportation Integrated Search

    1994-05-01

This document reviews the history and current status of computer models of the evacuation of an airliner cabin. Basic concepts upon which evacuation models are based are discussed, followed by a review of the Civil Aerospace Medical Institute's effor...

  16. Heliport noise model (HNM) version 1 user's guide

    DOT National Transportation Integrated Search

    1988-02-01

    This document contains the instructions to execute the Heliport Noise Model (HNM), Version 1. HNM Version 1 is a computer tool for determining the total impact of helicopter noise at and around heliports. The model runs on IBM PC/XT/AT personal compu...

  17. Stochastic Lanchester Air-to-Air Campaign Model: Model Description and Users Guides

    DTIC Science & Technology

    2009-01-01

Stochastic Lanchester Air-to-Air Campaign Model: Model Description and Users Guides—2009 (Report PA702T1, Robert V. Hemm Jr. and David A. Lee, LMI, January 2009). Executive Summary: This report documents the latest version of the Stochastic Lanchester Air-to-Air Campaign Model (SLAACM), developed by LMI for
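
    SLAACM's internals are not described in this snippet; purely as an illustration of the stochastic Lanchester idea named in the title, the sketch below simulates square-law attrition as a Markov chain with arbitrary kill rates and force sizes.

    ```python
    import random

    def stochastic_lanchester(blue=20, red=18, blue_rate=0.05, red_rate=0.04, seed=7):
        """Square-law attrition as a Markov chain: each side's loss rate is proportional
        to the opposing force size. Returns (blue_remaining, red_remaining)."""
        random.seed(seed)
        while blue > 0 and red > 0:
            blue_loss_rate = red_rate * red          # events that remove a blue unit
            red_loss_rate = blue_rate * blue         # events that remove a red unit
            if random.random() < blue_loss_rate / (blue_loss_rate + red_loss_rate):
                blue -= 1
            else:
                red -= 1
        return blue, red

    print(stochastic_lanchester())
    ```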

  18. A Model of Research Paper Writing Instructional Materials for Academic Writing Course: "Needs & Documents Analysis and Model Design"

    ERIC Educational Resources Information Center

    Ghufron, M. Ali; Saleh, Mursid; Warsono; Sofwan, Ahmad

    2016-01-01

    This study aimed at designing a model of instructional materials for Academic Writing Course focusing on research paper writing. The model was designed based on the Curriculum at the English Education Study Program, Faculty of Language and Art Education of IKIP PGRI Bojonegoro, East Java, Indonesia. This model was developed in order to improve…

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Celia, Michael A.

This report documents the accomplishments achieved during the project titled "Model complexity and choice of model approaches for practical simulations of CO2 injection, migration, leakage and long-term fate" funded by the US Department of Energy, Office of Fossil Energy. The objective of the project was to investigate modeling approaches of various levels of complexity relevant to geologic carbon storage (GCS) modeling, with the goal of establishing guidelines on the choice of modeling approach.

  20. 10 CFR 490.704 - Procedures and documentation.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Procedures and documentation. 490.704 Section 490.704 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ALTERNATIVE FUEL TRANSPORTATION PROGRAM Biodiesel Fuel... include written documentation stating the quantity of biodiesel purchased, for the given model year, for...

  1. 10 CFR 490.704 - Procedures and documentation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Procedures and documentation. 490.704 Section 490.704 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ALTERNATIVE FUEL TRANSPORTATION PROGRAM Biodiesel Fuel... include written documentation stating the quantity of biodiesel purchased, for the given model year, for...

  2. 10 CFR 490.704 - Procedures and documentation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Procedures and documentation. 490.704 Section 490.704 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ALTERNATIVE FUEL TRANSPORTATION PROGRAM Biodiesel Fuel... include written documentation stating the quantity of biodiesel purchased, for the given model year, for...

  3. 10 CFR 490.704 - Procedures and documentation.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Procedures and documentation. 490.704 Section 490.704 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ALTERNATIVE FUEL TRANSPORTATION PROGRAM Biodiesel Fuel... include written documentation stating the quantity of biodiesel purchased, for the given model year, for...

  4. 10 CFR 490.704 - Procedures and documentation.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Procedures and documentation. 490.704 Section 490.704 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ALTERNATIVE FUEL TRANSPORTATION PROGRAM Biodiesel Fuel... include written documentation stating the quantity of biodiesel purchased, for the given model year, for...

  5. A Deep and Autoregressive Approach for Topic Modeling of Multimodal Data.

    PubMed

    Zheng, Yin; Zhang, Yu-Jin; Larochelle, Hugo

    2016-06-01

    Topic modeling based on latent Dirichlet allocation (LDA) has been a framework of choice to deal with multimodal data, such as in image annotation tasks. Another popular approach to model the multimodal data is through deep neural networks, such as the deep Boltzmann machine (DBM). Recently, a new type of topic model called the Document Neural Autoregressive Distribution Estimator (DocNADE) was proposed and demonstrated state-of-the-art performance for text document modeling. In this work, we show how to successfully apply and extend this model to multimodal data, such as simultaneous image classification and annotation. First, we propose SupDocNADE, a supervised extension of DocNADE, that increases the discriminative power of the learned hidden topic features and show how to employ it to learn a joint representation from image visual words, annotation words and class label information. We test our model on the LabelMe and UIUC-Sports data sets and show that it compares favorably to other topic models. Second, we propose a deep extension of our model and provide an efficient way of training the deep model. Experimental results show that our deep model outperforms its shallow version and reaches state-of-the-art performance on the Multimedia Information Retrieval (MIR) Flickr data set.

  6. Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0

    NASA Technical Reports Server (NTRS)

    Schmidt, Conrad K.

    2013-01-01

    Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.

  7. Multi-Topic Tracking Model for dynamic social network

    NASA Astrophysics Data System (ADS)

    Li, Yuhua; Liu, Changzheng; Zhao, Ming; Li, Ruixuan; Xiao, Hailing; Wang, Kai; Zhang, Jun

    2016-07-01

The topic tracking problem has attracted much attention in the last decades. However, existing approaches rarely consider network structures and textual topics together. In this paper, we propose a novel statistical model based on dynamic Bayesian networks, namely the Multi-Topic Tracking Model for Dynamic Social Network (MTTD). It takes the influence phenomenon, the selection phenomenon, the document generative process and the evolution of textual topics into account. Specifically, in our MTTD model, a Gibbs random field is defined to model the influence of the historical status of users in the network and the interdependency between them, in order to account for the influence phenomenon. To address the selection phenomenon, a stochastic block model is used to model the link generation process based on the users' interests in topics. Probabilistic Latent Semantic Analysis (PLSA) is used to describe the document generative process according to the users' interests. Finally, the dependence on the historical topic status is also considered to ensure the continuity of the topic itself in the topic evolution model. The Expectation Maximization (EM) algorithm is utilized to estimate parameters in the proposed MTTD model. Empirical experiments on real datasets show that the MTTD model performs better than Popular Event Tracking (PET) and the Dynamic Topic Model (DTM) in generalization performance, topic interpretability, topic content evolution and topic popularity evolution performance.

  8. Reproducibility in Computational Neuroscience Models and Simulations

    PubMed Central

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  9. Combining dictionary techniques with extensible markup language (XML)--requirements to a new approach towards flexible and standardized documentation.

    PubMed Central

    Altmann, U.; Tafazzoli, A. G.; Noelle, G.; Huybrechts, T.; Schweiger, R.; Wächter, W.; Dudeck, J. W.

    1999-01-01

In oncology various international and national standards exist for the documentation of different aspects of a disease. Since elements of these standards are repeated in different contexts, a common data dictionary could support consistent representation in any context. For the construction of such a dictionary, existing documents have to be worked up in a complex procedure that considers aspects of hierarchical decomposition of documents and of domain control, as well as aspects of user presentation and of the underlying model of patient data. In contrast to other thesauri, text chunks like definitions or explanations are very important and have to be preserved, since oncologic documentation often means coding and classification on an aggregate level, and the safe use of coding systems is an important precondition for comparability of data. This paper discusses the potentials of the use of XML in combination with a dictionary for the promotion and development of standard-conformable applications for tumor documentation. PMID:10566311

  10. The Use of Object-Oriented Analysis Methods in Surety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  11. Deep Unfolding for Topic Models.

    PubMed

    Chien, Jen-Tzung; Lee, Chao-Hsi

    2018-02-01

Deep unfolding provides an approach to integrate the probabilistic generative models and the deterministic neural networks. Such an approach benefits from deep representation, easy interpretation, flexible learning and stochastic modeling. This study develops the unsupervised and supervised learning of deep unfolded topic models for document representation and classification. Conventionally, the unsupervised and supervised topic models are inferred via the variational inference algorithm where the model parameters are estimated by maximizing the lower bound of logarithm of marginal likelihood using input documents without and with class labels, respectively. The representation capability or classification accuracy is constrained by the variational lower bound and the tied model parameters across inference procedure. This paper aims to relax these constraints by directly maximizing the end performance criterion and continuously untying the parameters in learning process via deep unfolding inference (DUI). The inference procedure is treated as the layer-wise learning in a deep neural network. The end performance is iteratively improved by using the estimated topic parameters according to the exponentiated updates. Deep learning of topic models is therefore implemented through a back-propagation procedure. Experimental results show the merits of DUI with increasing number of layers compared with variational inference in unsupervised as well as supervised topic models.

  12. MODFLOW-2000, The U.S. Geological Survey Modular Ground-Water Model -- GMG Linear Equation Solver Package Documentation

    USGS Publications Warehouse

    Wilson, John D.; Naff, Richard L.

    2004-01-01

A geometric multigrid solver (GMG), based on the preconditioned conjugate gradient algorithm, has been developed for solving systems of equations resulting from applying the cell-centered finite difference algorithm to flow in porous media. This solver has been adapted to the U.S. Geological Survey ground-water flow model MODFLOW-2000. The documentation herein is a description of the solver and the adaptation to MODFLOW-2000.
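
    The multigrid preconditioner itself is too long to sketch here, but the preconditioned conjugate gradient outer iteration that GMG is based on looks like the following; a simple Jacobi preconditioner stands in for the multigrid cycle, and the test matrix is illustrative.

    ```python
    import numpy as np

    def pcg(A, b, precond, tol=1e-8, max_iter=200):
        """Preconditioned conjugate gradients for a symmetric positive-definite matrix A."""
        x = np.zeros_like(b)
        r = b - A @ x
        z = precond(r)
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = precond(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # Small SPD test system; a Jacobi preconditioner stands in for the multigrid cycle.
    A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
    b = np.array([1.0, 2.0, 3.0])
    x = pcg(A, b, precond=lambda r: r / np.diag(A))
    print(x, np.allclose(A @ x, b))
    ```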

  13. Grassroots Genealogy: Exploring, Documenting and Preserving Black Family History. A Pilot Workshop. (Greensboro, North Carolina, January 28-31, 1981 and February 9-10, 23, 1981).

    ERIC Educational Resources Information Center

    Young, Tommie Morton, Ed.

    A workshop model focuses on using lesser-known resources to document black family history and lineage. Although designed for use in North Carolina, this model can be adapted for use in any state or community. Following an introduction which summarizes the workshop format is an overview which outlines the goals, objectives, and focus of the…

  14. FMS: A Format Manipulation System for Automatic Production of Natural Language Documents, Second Edition. Final Report.

    ERIC Educational Resources Information Center

    Silver, Steven S.

    FMS/3 is a system for producing hard copy documentation at high speed from free format text and command input. The system was originally written in assembler language for a 12K IBM 360 model 20 using a high speed 1403 printer with the UCS-TN chain option (upper and lower case). Input was from an IBM 2560 Multi-function Card Machine. The model 20…

  15. Data Warehouse Design from HL7 Clinical Document Architecture Schema.

    PubMed

    Pecoraro, Fabrizio; Luzi, Daniela; Ricci, Fabrizio L

    2015-01-01

    This paper proposes a semi-automatic approach to extract clinical information structured in a HL7 Clinical Document Architecture (CDA) and transform it in a data warehouse dimensional model schema. It is based on a conceptual framework published in a previous work that maps the dimensional model primitives with CDA elements. Its feasibility is demonstrated providing a case study based on the analysis of vital signs gathered during laboratory tests.
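
    A minimal sketch of the mapping idea is shown below: observations are pulled out of a CDA-like document and emitted as fact rows keyed by patient, time, and measure dimensions. The XML fragment is a simplified stand-in, not a conformant HL7 CDA structure, and the dimension names are assumptions.

    ```python
    import xml.etree.ElementTree as ET

    # Simplified, non-conformant stand-in for a CDA vital-signs section (illustration only).
    cda = """<doc patient="P001">
      <observation code="8480-6" name="Systolic BP" value="128" unit="mmHg" time="2015-03-01T08:30"/>
      <observation code="8462-4" name="Diastolic BP" value="82" unit="mmHg" time="2015-03-01T08:30"/>
    </doc>"""

    root = ET.fromstring(cda)
    patient_key = root.attrib["patient"]              # patient dimension key
    fact_rows = [
        {
            "patient_key": patient_key,
            "time_key": obs.attrib["time"],           # time dimension key
            "measure_key": obs.attrib["code"],        # measure dimension key (LOINC-like code)
            "value": float(obs.attrib["value"]),
            "unit": obs.attrib["unit"],
        }
        for obs in root.iter("observation")
    ]
    print(fact_rows)
    ```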

  16. Command Center Library Model Document. Comprehensive Approach to Reusable Defense Software (CARDS)

    DTIC Science & Technology

    1992-05-31

system, and functionality for specifying the layout of the document. 3.7.16.1 FrameMaker: FrameMaker is a Commercial Off The Shelf (COTS) component...facilitating WYSIWYG creation of formatted reports with embedded graphics. FrameMaker is an advanced publishing tool that integrates word processing...available for the component FrameMaker: product evaluation reports in ASCII and PostScript formats; product assessment online in the model; product

  17. Training for the Future. How Can Trainees Meet Current and Future Needs of Industry? Guidelines and Models for the Development of Interdisciplinary Assignments Based on the Concept of Key Technologies.

    ERIC Educational Resources Information Center

    Bolton, William; Clyde, Albert

    This document provides guidelines for the development of interdisciplinary assignments to help prepare learners for the developing needs of industry; it also contains a collection of model assignments produced by 12 British colleges. An introduction explains how to use the document and offers a checklist for the development of interdisciplinary…

  18. A Model for Indexing Medical Documents Combining Statistical and Symbolic Knowledge.

    PubMed Central

    Avillach, Paul; Joubert, Michel; Fieschi, Marius

    2007-01-01

OBJECTIVES: To develop and evaluate an information processing method based on terminologies, in order to index medical documents in any given documentary context. METHODS: We designed a model using both symbolic general knowledge extracted from the Unified Medical Language System (UMLS) and statistical knowledge extracted from a domain of application. Using statistical knowledge allowed us to contextualize the general knowledge for every particular situation. For each document studied, the extracted terms are ranked to highlight the most significant ones. The model was tested on a set of 17,079 French standardized discharge summaries (SDSs). RESULTS: The most important ICD-10 term of each SDS was ranked 1st or 2nd by the method in nearly 90% of the cases. CONCLUSIONS: The use of several terminologies leads to more precise indexing. The improvement achieved in the model's performance as a result of using semantic relationships is encouraging. PMID:18693792
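
    The abstract does not give the exact scoring formula, so the sketch below uses a TF-IDF-style weight as a stand-in for the statistical (contextual) component that is combined with terminology knowledge; the terms and document frequencies are hypothetical.

    ```python
    import math
    from collections import Counter

    def rank_terms(doc_terms, corpus_term_doc_freq, n_docs):
        """Rank extracted terms by tf * idf; a simple stand-in for the statistical
        (contextual) knowledge combined with terminology mappings in the paper."""
        tf = Counter(doc_terms)
        scores = {
            term: count * math.log(n_docs / (1 + corpus_term_doc_freq.get(term, 0)))
            for term, count in tf.items()
        }
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    # Hypothetical terms extracted from one discharge summary, with corpus document frequencies.
    doc = ["pneumonia", "antibiotic", "fever", "pneumonia", "discharge"]
    df = {"pneumonia": 120, "antibiotic": 800, "fever": 900, "discharge": 17000}
    print(rank_terms(doc, df, n_docs=17079))
    ```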

  19. Development and evaluation of nursing user interface screens using multiple methods.

    PubMed

    Hyun, Sookyung; Johnson, Stephen B; Stetson, Peter D; Bakken, Suzanne

    2009-12-01

    Building upon the foundation of the Structured Narrative Electronic Health Record (EHR) model, we applied theory-based (combined Technology Acceptance Model and Task-Technology Fit Model) and user-centered methods to explore nurses' perceptions of functional requirements for an electronic nursing documentation system, design user interface screens reflective of the nurses' perspectives, and assess nurses' perceptions of the usability of the prototype user interface screens. The methods resulted in user interface screens that were perceived to be easy to use, potentially useful, and well-matched to nursing documentation tasks associated with Nursing Admission Assessment, Blood Administration, and Nursing Discharge Summary. The methods applied in this research may serve as a guide for others wishing to implement user-centered processes to develop or extend EHR systems. In addition, some of the insights obtained in this study may be informative to the development of safe and efficient user interface screens for nursing document templates in EHRs.

  20. Model Validation Status Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E.L. Hardin

The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped into 21 model areas representing the modeling of processes affecting the natural and engineered barriers, plus the TSPA model itself. Description of the model areas is provided in Section 3, and the documents reviewed are described in Section 4. The responsible manager for the Model Validation Status Review was the Chief Science Officer (CSO) for Bechtel-SAIC Co. (BSC). The team lead was assigned by the CSO. A total of 32 technical specialists were engaged to evaluate model validation status in the 21 model areas. The technical specialists were generally independent of the work reviewed, meeting technical qualifications as discussed in Section 5.

  1. Modeling Complex Cross-Systems Software Interfaces Using SysML

    NASA Technical Reports Server (NTRS)

    Mandutianu, Sanda; Morillo, Ron; Simpson, Kim; Liepack, Otfrid; Bonanne, Kevin

    2013-01-01

    The complex flight and ground systems for NASA human space exploration are designed, built, operated and managed as separate programs and projects. However, each system relies on one or more of the other systems in order to accomplish specific mission objectives, creating a complex, tightly coupled architecture. Thus, there is a fundamental need to understand how each system interacts with the other. To determine if a model-based system engineering approach could be utilized to assist with understanding the complex system interactions, the NASA Engineering and Safety Center (NESC) sponsored a task to develop an approach for performing cross-system behavior modeling. This paper presents the results of applying Model Based Systems Engineering (MBSE) principles using the System Modeling Language (SysML) to define cross-system behaviors and how they map to cross-system software interfaces documented in system-level Interface Control Documents (ICDs).

  2. Air Quality Modeling

    EPA Pesticide Factsheets

    In this technical support document (TSD) EPA describes the air quality modeling performed to support the Environmental Protection Agency’s Transport Rule proposal (now known as the Cross-State Air Pollution Rule).

  3. CISNET: Model Documentation

    Cancer.gov

    The Publications pages provide lists of all CISNET publications since the inception of CISNET. Publications are listed by Cancer Site or by Research Topic. The Publication Support and Modeling Resources pages provide access to technical modeling information, raw data, and publication extensions stemming from the work of the CISNET consortium.

  4. Development and validation of deterioration models for concrete bridge decks - phase 1 : artificial intelligence models and bridge management system.

    DOT National Transportation Integrated Search

    2013-06-01

    This research documents the development and evaluation of artificial neural network (ANN) models to predict the condition ratings of concrete highway bridge decks in Michigan. Historical condition assessments chronicled in the national bridge invento...

  5. GREET 1.5 : transportation fuel-cycle model. Vol. 1 : methodology, development, use, and results.

    DOT National Transportation Integrated Search

    1999-10-01

    This report documents the development and use of the most recent version (Version 1.5) of the Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET) model. The model, developed in a spreadsheet format, estimates the full fuel...

  6. KABAM Version 1.0 User's Guide and Technical Documentation - Appendix A - Description of Bioaccumulation Model

    EPA Pesticide Factsheets

    The purpose of this model is to estimate chemical concentrations (CB) and BCF and BAF values for aquatic ecosystems. KABAM is a simulation model used to predict pesticide concentrations in aquatic regions for use in exposure assessments.
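
    As a rough illustration of the kind of kinetic calculation a bioaccumulation model performs (not KABAM's actual equations or parameter values, which are given in the appendix itself), the sketch below integrates a generic one-compartment model in which tissue concentration grows by uptake from water and diet and declines through elimination, growth dilution, and metabolism; every rate constant and concentration here is a hypothetical placeholder.

        def bioaccumulation_timeseries(c_water, c_diet, days,
                                       k1=500.0, kd=0.05, k2=0.1,
                                       ke=0.01, kg=0.005, km=0.0):
            """Euler integration (1-day steps) of a generic one-compartment model:
            dC_B/dt = k1*C_W + kd*C_D - (k2 + ke + kg + km)*C_B
            Rate constants are illustrative placeholders, not KABAM values."""
            cb = 0.0
            for _ in range(days):
                cb += k1 * c_water + kd * c_diet - (k2 + ke + kg + km) * cb
            ratio = cb / c_water if c_water else float("nan")  # tissue/water concentration ratio
            return cb, ratio

        # Hypothetical exposure: 0.001 mg/L in water, 0.1 mg/kg in diet, one year.
        print(bioaccumulation_timeseries(c_water=1e-3, c_diet=0.1, days=365))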

  7. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...

  8. National Centers for Environmental Prediction

    Science.gov Websites


  9. ESPVI 4.0 ELECTROSTATIC PRECIPITATOR V-1 AND PERFORMANCE MODEL: USER'S MANUAL

    EPA Science Inventory

    The manual is the companion document for the microcomputer program ESPVI 4.0, Electrostatic Precipitation VI and Performance Model. The program was developed to provide a user- friendly interface to an advanced model of electrostatic precipitation (ESP) performance. The program i...

  10. Evaluation of atmospheric density models and preliminary functional specifications for the Langley Atmospheric Information Retrieval System (LAIRS)

    NASA Technical Reports Server (NTRS)

    Lee, T.; Boland, D. F., Jr.

    1980-01-01

    This document presents the results of an extensive survey and comparative evaluation of current atmosphere and wind models for inclusion in the Langley Atmospheric Information Retrieval System (LAIRS). It includes recommended models for use in LAIRS, estimated accuracies for the recommended models, and functional specifications for the development of LAIRS.

  11. Hyper-Book: A Formal Model for Electronic Books.

    ERIC Educational Resources Information Center

    Catenazzi, Nadia; Sommaruga, Lorenzo

    1994-01-01

    Presents a model for electronic books based on the paper book metaphor. Discussion includes how the book evolves under the effects of its functional components; the use and impact of the model for organizing and presenting electronic documents in the context of electronic publishing; and the possible applications of a system based on the model.…

  12. The Devereux Model: Intensive Intervention with Children for Competency and Life Enhancement.

    ERIC Educational Resources Information Center

    Silverman, Wade H.; And Others

    The four sections in this document on the Devereux Model were written by different authors who work at the Devereux Center in Georgia and were presented as part of a symposium on the model. "The Devereux Model: Intensive Intervention with Children for Competency and Life Enhancement" (Wade H. Silverman) describes the Devereux Center as a…

  13. 7 CFR Exhibit E to Subpart A of... - Voluntary National Model Building Codes

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 12 2013-01-01 2013-01-01 false Voluntary National Model Building Codes E Exhibit E... National Model Building Codes The following documents address the health and safety aspects of buildings and related structures and are voluntary national model building codes as defined in § 1924.4(h)(2) of...

  14. 7 CFR Exhibit E to Subpart A of... - Voluntary National Model Building Codes

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 12 2014-01-01 2013-01-01 true Voluntary National Model Building Codes E Exhibit E to... Model Building Codes The following documents address the health and safety aspects of buildings and related structures and are voluntary national model building codes as defined in § 1924.4(h)(2) of this...

  15. 7 CFR Exhibit E to Subpart A of... - Voluntary National Model Building Codes

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 12 2012-01-01 2012-01-01 false Voluntary National Model Building Codes E Exhibit E... National Model Building Codes The following documents address the health and safety aspects of buildings and related structures and are voluntary national model building codes as defined in § 1924.4(h)(2) of...

  16. Review and verification of CARE 3 mathematical model and code

    NASA Technical Reports Server (NTRS)

    Rose, D. M.; Altschul, R. E.; Manke, J. W.; Nelson, D. L.

    1983-01-01

    This report documents the CARE III mathematical model and code verification performed by Boeing Computer Services. The mathematical model was verified for permanent and intermittent faults. The transient fault model was not addressed. The code verification was performed on CARE III, Version 3. A CARE III Version 4, which corrects deficiencies identified in Version 3, is being developed.

  17. Modeling-Oriented Assessment in K-12 Science Education: A Synthesis of Research from 1980 to 2013 and New Directions

    ERIC Educational Resources Information Center

    Namdar, Bahadir; Shen, Ji

    2015-01-01

    Scientific modeling has been advocated as one of the core practices in recent science education policy initiatives. In modeling-based instruction (MBI), students use, construct, and revise models to gain scientific knowledge and inquiry skills. Oftentimes, the benefits of MBI have been documented using assessments targeting students' conceptual…

  18. [Documenting a rehabilitation program using a logic model: an advantage to the assessment process].

    PubMed

    Poncet, Frédérique; Swaine, Bonnie; Pradat-Diehl, Pascale

    2017-03-06

    The cognitive and behavioral disorders after brain injury can result in severe limitations of activities and restrictions of participation. An interdisciplinary rehabilitation program was developed in physical medicine and rehabilitation at the Pitié-Salpêtrière Hospital, Paris, France. Clinicians believe this program decreases activity limitations and improves participation in patients. However, the program’s effectiveness had never been assessed. To do this, we first had to define and describe the program. However, rehabilitation programs are holistic and thus complex, making them difficult to describe. Therefore, to facilitate the evaluation of complex programs, including those for rehabilitation, we illustrate the use of a theoretical logic model, as proposed by Champagne, through the process of documenting a specific complex and interdisciplinary rehabilitation program. Through participatory/collaborative research, the rehabilitation program was analyzed using three “submodels” of the logic model of intervention: the causal model, the intervention model, and the program theory model. This should facilitate the evaluation of programs, including those for rehabilitation.

  19. Crisis Management Systems: A Case Study for Aspect-Oriented Modeling

    NASA Astrophysics Data System (ADS)

    Kienzle, Jörg; Guelfi, Nicolas; Mustafiz, Sadaf

    The intent of this document is to define a common case study for the aspect-oriented modeling research community. The domain of the case study is crisis management systems, i.e., systems that help in identifying, assessing, and handling a crisis situation by orchestrating the communication between all parties involved in handling the crisis, by allocating and managing resources, and by providing access to relevant crisis-related information to authorized users. This document contains informal requirements of crisis management systems (CMSs) in general, a feature model for a CMS product line, use case models for a car crash CMS (CCCMS), a domain model for the CCCMS, an informal physical architecture description of the CCCMS, as well as some design models of a possible object-oriented implementation of parts of the CCCMS backend. AOM researchers who want to demonstrate the power of their AOM approach or technique can hence apply the approach at the most appropriate level of abstraction.

  20. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    NASA Technical Reports Server (NTRS)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important portion in validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementations and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched and a Java based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  1. Feasibility Study of Low-Cost Image-Based Heritage Documentation in Nepal

    NASA Astrophysics Data System (ADS)

    Dhonju, H. K.; Xiao, W.; Sarhosis, V.; Mills, J. P.; Wilkinson, S.; Wang, Z.; Thapa, L.; Panday, U. S.

    2017-02-01

    Cultural heritage structural documentation is of great importance in terms of historical preservation, tourism, educational and spiritual values. Cultural heritage across the world, and in Nepal in particular, is at risk from various natural hazards (e.g., earthquakes, flooding and rainfall), poor maintenance and preservation, and even human destruction. This paper evaluates the feasibility of low-cost photogrammetric modelling of cultural heritage sites, and explores the practicality of using photogrammetry in Nepal. The full pipeline of 3D modelling for heritage documentation and conservation, including visualisation, reconstruction, and structure analysis, is proposed. In addition, crowdsourcing is discussed as a method of data collection of growing prominence.

  2. The IRMIS object model and services API.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saunders, C.; Dohan, D. A.; Arnold, N. D.

    2005-01-01

    The relational model developed for the Integrated Relational Model of Installed Systems (IRMIS) toolkit has been successfully used to capture the Advanced Photon Source (APS) control system software (EPICS process variables and their definitions). The relational tables are populated by a crawler script that parses each Input/Output Controller (IOC) start-up file when an IOC reboot is detected. User interaction is provided by a Java Swing application that acts as a desktop for viewing the process variable information. Mapping between the display objects and the relational tables was carried out with the Hibernate Object Relational Modeling (ORM) framework. Work is well underway at the APS to extend the relational modeling to include control system hardware. For this work, due in part to the complex user interaction required, the primary application development environment has shifted from the relational database view to the object oriented (Java) perspective. With this approach, the business logic is executed in Java rather than in SQL stored procedures. This paper describes the object model used to represent control system software, hardware, and interconnects in IRMIS. We also describe the services API used to encapsulate the required behaviors for creating and maintaining the complex data. In addition to the core schema and object model, many important concepts in IRMIS are captured by the services API. IRMIS is an ambitious collaborative effort for defining and developing a relational database and associated applications to comprehensively document the large and complex EPICS-based control systems of today's accelerators. The documentation effort includes process variables, control system hardware, and interconnections. The approach could also be used to document all components of the accelerator, including mechanical, vacuum, power supplies, etc. One key aspect of IRMIS is that it is a documentation framework, not a design and development tool. We do not generate EPICS control system configurations from IRMIS, and hence do not impose any additional requirements on EPICS developers.
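
    IRMIS itself maps Java display objects to its relational schema with Hibernate; purely to illustrate that object-relational pattern (and not the actual IRMIS schema, whose table and column names are not reproduced here), the sketch below expresses a minimal IOC-to-process-variable mapping with SQLAlchemy, a comparable ORM for Python.

        from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
        from sqlalchemy.orm import Session, declarative_base, relationship

        Base = declarative_base()

        class IOC(Base):
            """One row per Input/Output Controller found by the crawler."""
            __tablename__ = "ioc"
            id = Column(Integer, primary_key=True)
            name = Column(String, unique=True)
            pvs = relationship("ProcessVariable", back_populates="ioc")

        class ProcessVariable(Base):
            """EPICS process variables parsed from an IOC start-up file."""
            __tablename__ = "process_variable"
            id = Column(Integer, primary_key=True)
            name = Column(String, index=True)
            record_type = Column(String)
            ioc_id = Column(Integer, ForeignKey("ioc.id"))
            ioc = relationship("IOC", back_populates="pvs")

        engine = create_engine("sqlite:///:memory:")
        Base.metadata.create_all(engine)

        with Session(engine) as session:
            session.add(IOC(name="iocExample",
                            pvs=[ProcessVariable(name="S1:P1", record_type="ai")]))
            session.commit()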

  3. Updated Results for the Wake Vortex Inverse Model

    NASA Technical Reports Server (NTRS)

    Robins, Robert E.; Lai, David Y.; Delisi, Donald P.; Mellman, George R.

    2008-01-01

    NorthWest Research Associates (NWRA) has developed an Inverse Model for inverting aircraft wake vortex data. The objective of the inverse modeling is to obtain estimates of the vortex circulation decay and crosswind vertical profiles, using time history measurements of the lateral and vertical position of aircraft vortices. The Inverse Model performs iterative forward model runs using estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Iterations are performed until a user-defined criterion is satisfied. Outputs from an Inverse Model run are the best estimates of the time history of the vortex circulation derived from the observed data, the vertical crosswind profile, and several vortex parameters. The forward model, named SHRAPA, used in this inverse modeling is a modified version of the Shear-APA model, and it is described in Section 2 of this document. Details of the Inverse Model are presented in Section 3. The Inverse Model was applied to lidar-observed vortex data at three airports: FAA acquired data from San Francisco International Airport (SFO) and Denver International Airport (DEN), and NASA acquired data from Memphis International Airport (MEM). The results are compared with observed data. This Inverse Model validation is documented in Section 4. A summary is given in Section 5. A user's guide for the inverse wake vortex model is presented in a separate NorthWest Research Associates technical report (Lai and Delisi, 2007a).
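
    The abstract describes the general pattern of the Inverse Model: repeatedly run a forward model with candidate parameters and adjust them until the predicted vortex tracks match the observed ones. The sketch below illustrates only that fitting loop; the forward model here is a toy stand-in (it is not SHRAPA or the report's parameterization), and the parameter names, bounds, and decay law are invented for illustration.

        import numpy as np
        from scipy.optimize import least_squares

        def forward_model(params, times):
            """Toy stand-in for a forward wake model: predicts lateral (y) and
            vertical (z) vortex positions from (initial circulation, decay rate,
            crosswind). A real run would use the SHRAPA forward model."""
            gamma0, decay, crosswind = params
            y = crosswind * times                                         # lateral drift
            z = -0.005 * gamma0 / decay * (1.0 - np.exp(-decay * times))  # toy descent law
            return np.column_stack([y, z])

        def invert(observed_yz, times, first_guess, bounds):
            """Adjust parameters until predictions best match the observations."""
            def residuals(p):
                return (forward_model(p, times) - observed_yz).ravel()
            fit = least_squares(residuals, first_guess, bounds=bounds)
            return fit.x, fit.cost

        # Hypothetical usage with synthetic "observations".
        t = np.linspace(1.0, 60.0, 30)
        observed = forward_model([400.0, 0.02, 2.5], t)
        params, misfit = invert(observed, t,
                                first_guess=[300.0, 0.05, 1.0],
                                bounds=([10.0, 1e-3, -10.0], [1000.0, 0.5, 10.0]))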

  4. L3:PHI.CMD.P13.02 Support for CILC L1 Milestone Using STAR-CCM+

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slattery, Stuart R.; Gurecky, William L.

    2016-10-07

    This report documents work performed to support Consortium for the Advanced Simulation of LWRs (CASL) modeling of Chalk River Unidentified Deposit (CRUD) Induced Power Shift (CIPS) and CRUD Induced Local Corrosion (CILC) using the Cicada package. The work documented here is intended to complement current and future CIPS and CILC modeling activities in CASL. We provide tools for crud and corrosion-related simulation and analysis by developing a better understanding of the interplay between the coupled physics that describe the phenomena at different time and length scales. We intend to use these models to better inform future simulation capability and development.

  5. Evaluating Nextgen Closely Spaced Parallel Operations Concepts with Validated Human Performance Models: Flight Deck Guidelines

    NASA Technical Reports Server (NTRS)

    Hooey, Becky Lee; Gore, Brian Francis; Mahlstedt, Eric; Foyle, David C.

    2013-01-01

    The objectives of the current research were to develop valid human performance models (HPMs) of approach and land operations; use these models to evaluate the impact of NextGen Closely Spaced Parallel Operations (CSPO) on pilot performance; and draw conclusions regarding flight deck display design and pilot-ATC roles and responsibilities for NextGen CSPO concepts. This document presents guidelines and implications for flight deck display designs and candidate roles and responsibilities. A companion document (Gore, Hooey, Mahlstedt, & Foyle, 2013) provides complete scenario descriptions and results including predictions of pilot workload, visual attention and time to detect off-nominal events.

  6. AQUATOX Data Sources Documents

    EPA Pesticide Factsheets

    Contains the data sources for parameter values of the AQUATOX model including: a bibliography for the AQUATOX data libraries and the compendia of parameter values for US Army Corps of Engineers models.

  7. AN EVALUATION OF HANFORD SITE TANK FARM SUBSURFACE CONTAMINATION FY2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MANN, F.M.

    2007-07-10

    The Tank Farm Vadose Zone (TFVZ) Project conducts activities to characterize and analyze the long-term environmental and human health impacts from tank waste releases to the vadose zone. The project also implements interim measures to mitigate impacts, and plans the remediation of waste releases from tank farms and associated facilities. The scope of this document is to report data needs that are important to estimating long-term human health and environmental risks. The scope does not include technologies needed to remediate contaminated soils and facilities, technologies needed to close tank farms, or management and regulatory decisions that will impact remediation and closure. This document is an update of ''A Summary and Evaluation of Hanford Site Tank Farm Subsurface Contamination''. That 1998 document summarized knowledge of subsurface contamination beneath the tank farms at the time. It included a preliminary conceptual model for migration of tank wastes through the vadose zone and an assessment of data and analysis gaps needed to update the conceptual model. This document provides a status of the data and analysis gaps previously defined and discussion of the gaps and needs that currently exist to support the stated mission of the TFVZ Project. The first data-gaps document provided the basis for TFVZ Project activities over the previous eight years. Fourteen of the nineteen knowledge gaps identified in the previous document have been investigated to the point that the project defines the current status as acceptable. In the process of filling these gaps, significant accomplishments were made in field work and characterization, laboratory investigations, modeling, and implementation of interim measures. The current data gaps are organized in groups that reflect components of the tank farm vadose zone conceptual model: inventory, release, recharge, geohydrology, geochemistry, and modeling. The inventory and release components address residual wastes that will remain in the tanks and tank-farm infrastructure after closure and potential losses from leaks during waste retrieval. Recharge addresses the impacts of current conditions in the tank farms (i.e. gravel covers that affect infiltration and recharge) as well as the impacts of surface barriers. The geohydrology and geochemistry components address the extent of the existing subsurface contaminant inventory and drivers and pathways for contaminants to be transported through the vadose zone and groundwater. Geochemistry addresses the mobility of key reactive contaminants such as uranium. Modeling addresses conceptual models and how they are simulated in computers. The data gaps will be used to provide input to planning (including the upcoming C Farm Data Quality Objective meetings scheduled this year).

  8. THE MARK I BUSINESS SYSTEM SIMULATION MODEL

    DTIC Science & Technology

    …of a large-scale business simulation model as a vehicle for doing research in management controls. The major results of the program were the development of the Mark I business simulation model and the Simulation Package (SIMPAC). SIMPAC is a method and set of programs facilitating the construction of large simulation models. The object of this document is to describe the Mark I Corporation model, state why parts of the business were modeled as they were, and indicate the research applications of the model. (Author)

  9. PubMed related articles: a probabilistic topic-based model for content similarity

    PubMed Central

    Lin, Jimmy; Wilbur, W John

    2007-01-01

    Background We present a probabilistic topic-based model for content similarity called pmra that underlies the related article search feature in PubMed. Whether or not a document is about a particular topic is computed from term frequencies, modeled as Poisson distributions. Unlike previous probabilistic retrieval models, we do not attempt to estimate relevance, but rather our focus is "relatedness", the probability that a user would want to examine a particular document given known interest in another. We also describe a novel technique for estimating parameters that does not require human relevance judgments; instead, the process is based on the existence of MeSH® in MEDLINE®. Results The pmra retrieval model was compared against bm25, a competitive probabilistic model that shares theoretical similarities. Experiments using the test collection from the TREC 2005 genomics track show a small but statistically significant improvement of pmra over bm25 in terms of precision. Conclusion Our experiments suggest that the pmra model provides an effective ranking algorithm for related article search. PMID:17971238
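
    A rough sketch of the underlying idea is given below: term occurrence is modeled with a two-Poisson ("eliteness") mixture and relatedness is accumulated over shared terms. This is not the published pmra weighting itself; the rate constants, prior, and combination rule are illustrative assumptions, not PubMed's parameters.

        import math
        from collections import Counter

        def eliteness(tf, doc_len, lam=0.02, mu=0.005, prior=0.5):
            """P(document is "about" a term | its frequency), from a two-Poisson
            mixture. lam/mu are assumed per-word rates for documents that are /
            are not about the term; all constants are illustrative."""
            def pois(rate):
                m = rate * doc_len
                return math.exp(-m) * m ** tf / math.factorial(tf)
            p_elite = prior * pois(lam)
            p_other = (1.0 - prior) * pois(mu)
            return p_elite / (p_elite + p_other)

        def relatedness(tokens_a, tokens_b):
            """Accumulate evidence over the terms the two documents share."""
            tf_a, tf_b = Counter(tokens_a), Counter(tokens_b)
            return sum(eliteness(tf_a[t], len(tokens_a)) *
                       eliteness(tf_b[t], len(tokens_b))
                       for t in set(tf_a) & set(tf_b))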

  10. The Methodology of Interactive Parametric Modelling of Construction Site Facilities in BIM Environment

    NASA Astrophysics Data System (ADS)

    Kozlovská, Mária; Čabala, Jozef; Struková, Zuzana

    2014-11-01

    Information technology is becoming a strong tool in different industries, including construction. The recent trend in building design is leading to the creation of the most comprehensive possible virtual building model (Building Information Model) in order to solve all the problems relating to the project as early as the design phase. Building information modelling is a new way of approaching the design of building project documentation. Currently, the building site layout, as a part of the building design documents, has very little support in the BIM environment. Recently, research into designing the construction process conditions has centred on improvement of general practice in planning and on new approaches to construction site layout planning. The state of the art in the field of designing construction process conditions indicated an unexplored problem related to connecting a knowledge system with construction site facilities (CSF) layout through interactive modelling. The goal of the paper is to present the methodology for execution of a 3D construction site facility allocation model (3D CSF-IAM), based on principles of parametric and interactive modelling.

  11. 2013 RFP Ports Initiative Supporting Documents

    EPA Pesticide Factsheets

    Documents include: RFPs, Project Narrative, Application fleet description (AFD), Priority County List, Model years for eligible nonroad engines (nonroad remaining useful life), Sample Drayage, Marine Engine Eligibility, FAQs, and Webinar slides

  12. 2014 RFP Ports Initiative Supporting Documents

    EPA Pesticide Factsheets

    Documents include: RFPs, Project Narrative, Application fleet description (AFD), Priority County List, Model years for eligible nonroad engines (nonroad remaining useful life), Sample Drayage, Marine Engine Eligibility, FAQs, and Webinar slides

  13. Renewable Fuels Module - NEMS Documentation

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the Annual Energy Outlook forecasts.

  14. The Earth System Documentation (ES-DOC) Software Process

    NASA Astrophysics Data System (ADS)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable, standards-based documentation ecosystem that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation ecosystem and is currently supporting the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * Iteratively develops and releases working software; * Captures user requirements via a narrative based approach; * Uses online collaboration tools (e.g. Earth System CoG) to manage progress; * Prototypes applications to validate their feasibility; * Leverages meta-programming techniques where appropriate; * Automates testing whenever sensibly feasible; * Streamlines complex deployments to a single command; * Extensively leverages GitHub and Pivotal Tracker; * Enforces strict separation of the UI from underlying API's; * Conducts code reviews.

  15. THE LAKE MICHIGAN MASS BALANCE PROJECT: QUALITY ASSURANCE PLAN FOR MATHEMATICAL MODELLING

    EPA Science Inventory

    This report documents the quality assurance process for the development and application of the Lake Michigan Mass Balance Models. The scope includes the overall modeling framework as well as the specific submodels that are linked to form a comprehensive synthesis of physical, che...

  16. Short-Term Energy Outlook Model Documentation: Other Petroleum Products Consumption Model

    EIA Publications

    2011-01-01

    The other petroleum products consumption module of the Short-Term Energy Outlook (STEO) model is designed to provide U.S. consumption forecasts for six petroleum product categories: asphalt and road oil, petrochemical feedstocks, petroleum coke, refinery still gas, unfinished oils, and other miscellaneous products.

  17. FHWA travel analysis framework : development of VMT forecasting models for use by the Federal Highway Administration

    DOT National Transportation Integrated Search

    2014-05-12

    This document details the process that the Volpe National Transportation Systems Center (Volpe) used to develop travel forecasting models for the Federal Highway Administration (FHWA). The purpose of these models is to allow FHWA to forecast future c...

  18. Systems Operation Studies for Automated Guideway Transit Systems: Feeder Systems Model Functional Specification

    DOT National Transportation Integrated Search

    1981-01-01

    This document specifies the functional requirements for the AGT-SOS Feeder Systems Model (FSM), the type of hardware required, and the modeling techniques employed by the FSM. The objective of the FSM is to map the zone-to-zone transit patronage dema...

  19. Operation of the computer model for microenvironment solar exposure

    NASA Technical Reports Server (NTRS)

    Gillis, J. R.; Bourassa, R. J.; Gruenbaum, P. E.

    1995-01-01

    A computer model for microenvironmental solar exposure was developed to predict solar exposure to satellite surfaces which may shadow or reflect on one another. This document describes the technical features of the model as well as instructions for the installation and use of the program.

  20. Matrix Population Model for Estimating Effects from Time-Varying Aquatic Exposures: Technical Documentation

    EPA Science Inventory

    The Office of Pesticide Programs models daily aquatic pesticide exposure values for 30 years in its risk assessments. However, only a fraction of that information is typically used in these assessments. The population model employed herein is a deterministic, density-dependent pe...
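
    The record is truncated, but the class of model it names (a deterministic, density-dependent matrix population model) has a standard form: stage abundances are projected forward by repeated multiplication with a matrix of survival and fecundity rates, and exposure effects enter by modifying those rates through time. The sketch below shows only the generic projection step; the stage structure and vital rates are invented, not the Office of Pesticide Programs' parameterization.

        import numpy as np

        # Hypothetical 3-stage projection matrix: top row holds fecundities,
        # lower entries hold survival/transition probabilities between stages.
        A = np.array([[0.0, 1.5, 2.4],
                      [0.5, 0.0, 0.0],
                      [0.0, 0.7, 0.8]])

        n = np.array([100.0, 20.0, 10.0])       # starting abundance per stage

        for step in range(30):
            # A time-varying exposure effect would rescale entries of A here,
            # e.g. reducing survival on days when modeled concentrations are high.
            n = A @ n

        lam = max(abs(np.linalg.eigvals(A)))    # asymptotic growth rate of the fixed matrix
        print(n, lam)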

  1. ICCE/ICCAI 2000 Full & Short Papers (Student Modeling).

    ERIC Educational Resources Information Center

    2000

    This document contains the following full and short papers on student modeling from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Computational Model for Learner's Motivation States in Individualized Tutoring System" (Behrouz H. Far and Anete H.…

  2. Watershed Management Optimization Support Tool (WMOST) v1: Theoretical Documentation

    EPA Science Inventory

    The Watershed Management Optimization Support Tool (WMOST) is a screening model that is spatially lumped with options for a daily or monthly time step. It is specifically focused on modeling the effect of management decisions on the watershed. The model considers water flows and ...

  3. COMPUTER PROGRAM DOCUMENTATION FOR THE ENHANCED STREAM WATER QUALITY MODEL QUAL2E

    EPA Science Inventory

    Presented in the manual are recent modifications and improvements to the widely used stream water quality model QUAL-II. Called QUAL2E, the enhanced model incorporates improvements in eight areas: (1) algal, nitrogen, phosphorus, and dissolved oxygen interactions; (2) algal growt...

  4. Improved compliance by BPM-driven workflow automation.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. BPMN 2.0, as an automation language able to control every kind of activity or subprocess, is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). This means that, with the BPM standard, a communication method for sharing process knowledge among laboratories is also available. © 2014 Society for Laboratory Automation and Screening.

  5. DIVWAG Model Documentation. Volume II. Programmer/Analyst Manual. Part 4.

    DTIC Science & Technology

    1976-07-01

    [Table-of-contents fragments only, covering the Movement Model and Airmobile Model constant data deck structures and program descriptions (Appendices A and B of sections IV-13 and IV-15).]

  6. An Empirical Model-based MOE for Friction Reduction by Slot-Ejected Polymer Solutions in an Aqueous Environment

    DTIC Science & Technology

    2007-12-21

    …of hydrodynamics and the physical characteristics of the polymers. The physics models include both analytical models and numerical simulations… the experimental observations. The numerical simulations also succeed in replicating some experimental measurements. However, there is still no… become quite significant. The complete model is coded in MATLAB; in the model, all units are cgs, so distances are in…

  7. Observation model and parameter partials for the JPL VLBI parameter estimation software MODEST, 19 94

    NASA Technical Reports Server (NTRS)

    Sovers, O. J.; Jacobs, C. S.

    1994-01-01

    This report is a revision of the document Observation Model and Parameter Partials for the JPL VLBI Parameter Estimation Software 'MODEST'---1991, dated August 1, 1991. It supersedes that document and its four previous versions (1983, 1985, 1986, and 1987). A number of aspects of the very long baseline interferometry (VLBI) model were improved from 1991 to 1994. Treatment of tidal effects is extended to model the effects of ocean tides on universal time and polar motion (UTPM), including a default model for nearly diurnal and semidiurnal ocean tidal UTPM variations, and partial derivatives for all (solid and ocean) tidal UTPM amplitudes. The time-honored 'K(sub 1) correction' for solid earth tides has been extended to include analogous frequency-dependent response of five tidal components. Partials of ocean loading amplitudes are now supplied. The Zhu-Mathews-Oceans-Anisotropy (ZMOA) 1990-2 and Kinoshita-Souchay models of nutation are now two of the modeling choices to replace the increasingly inadequate 1980 International Astronomical Union (IAU) nutation series. A rudimentary model of antenna thermal expansion is provided. Two more troposphere mapping functions have been added to the repertoire. Finally, corrections among VLBI observations via the model of Treuhaft and Lanyi improve modeling of the dynamic troposphere. A number of minor misprints in Rev. 4 have been corrected.

  8. Emergency Response Capability Baseline Needs Assessment - Requirements Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharry, John A.

    This document was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection and reviewed by LLNL Emergency Management Department Head James Colson. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only addresses emergency response.

  9. The Tech Prep Handbook: Essential Documents To Promote Effective Tech Prep Policies and Practices.

    ERIC Educational Resources Information Center

    Hensley, Oliver D., Ed.; And Others

    Developed during a project to document and analyze the tech prep initiative in Texas, this handbook contains exemplary documents associated with the model programs in the state. This second edition of the handbook organizes documents in sections (sections A, C, D, and G) that correspond to the major impact sectors identified during the research…

  10. Development of 3D Oxide Fuel Mechanics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, B. W.; Casagranda, A.; Pitts, S. A.

    This report documents recent work to improve the accuracy and robustness of the mechanical constitutive models used in the BISON fuel performance code. These developments include migration of the fuel mechanics models to be based on the MOOSE Tensor Mechanics module, improving the robustness of the smeared cracking model, implementing a capability to limit the time step size based on material model response, and improving the robustness of the return mapping iterations used in creep and plasticity models.
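
    Of the items listed, "return mapping" is a standard predictor-corrector update in computational plasticity and creep: take an elastic trial step, check the yield condition, and, if it is violated, solve for the plastic correction that returns the stress to the yield surface. The one-dimensional sketch below (linear isotropic hardening, closed-form correction, invented material constants) illustrates that structure only; BISON's models are three-dimensional and generally require Newton iterations for the correction.

        def radial_return_1d(strain_inc, stress, eps_p,
                             E=2.0e11, yield_stress=3.0e8, H=1.0e9):
            """One return-mapping update for 1D linear isotropic hardening.
            Returns the updated stress and accumulated plastic strain."""
            stress_trial = stress + E * strain_inc                 # elastic predictor
            f = abs(stress_trial) - (yield_stress + H * eps_p)     # trial yield function
            if f <= 0.0:
                return stress_trial, eps_p                         # step stays elastic
            dgamma = f / (E + H)                                   # plastic multiplier (closed form in 1D)
            sign = 1.0 if stress_trial >= 0.0 else -1.0
            return stress_trial - E * dgamma * sign, eps_p + dgamma  # back on the yield surface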

  11. Architectural Heritage Documentation by Using Low Cost Uav with Fisheye Lens: Otag-I Humayun in Istanbul as a Case Study

    NASA Astrophysics Data System (ADS)

    Yastikli, N.; Özerdem, Ö. Z.

    2017-11-01

    The digital documentation of architectural heritage is important for monitoring, preservation and management, as well as for 3D BIM modelling and time-space VR (virtual reality) applications. Unmanned aerial vehicles (UAVs) have been widely used in these applications thanks to rapid developments in technology which enable high-resolution images at the millimetre level. Moreover, it has become possible to produce highly accurate 3D point clouds with structure from motion (SfM) and multi-view stereo (MVS), and to obtain a surface reconstruction of a realistic 3D architectural heritage model by using high-overlap images and 3D modelling software such as ContextCapture, Pix4Dmapper and Photoscan. In this study, digital documentation of Otag-i Humayun (the Ottoman Empire Sultan's summer palace), located in Davutpaşa, Istanbul/Turkey, is undertaken using a low-cost UAV. Data collection was performed with a low-cost 3DR Solo UAV carrying a GoPro Hero 4 with a fisheye lens. The data processing was accomplished by using the commercial Pix4D software. Dense point clouds, a true orthophoto and a 3D solid model of Otag-i Humayun were produced as results. A quality check of the produced point clouds was performed. The results obtained from Otag-i Humayun in Istanbul proved that a low-cost UAV with a fisheye lens can be successfully used for architectural heritage documentation.

  12. Documentation for MeshKit - Reactor Geometry (&mesh) Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, Rajeev; Mahadevan, Vijay

    2015-09-30

    This report gives documentation for using MeshKit’s Reactor Geometry (and mesh) Generator (RGG) GUI and also briefly documents other algorithms and tools available in MeshKit. RGG is a program designed to aid in modeling and meshing of complex/large hexagonal and rectilinear reactor cores. RGG uses Argonne’s SIGMA interfaces, Qt and VTK to produce an intuitive user interface. By integrating a 3D view of the reactor with the meshing tools and combining them into one user interface, RGG streamlines the task of preparing a simulation mesh and enables real-time feedback that reduces accidental scripting mistakes that could waste hours of meshing. RGG interfaces with MeshKit tools to consolidate the meshing process, meaning that going from model to mesh is as easy as a button click. This report is designed to explain the RGG v 2.0 interface and provide users with the knowledge and skills to pilot RGG successfully. Brief documentation of MeshKit source code, tools and other algorithms available is also presented for developers to extend and add new algorithms to MeshKit. RGG tools work in serial and parallel and have been used to model complex reactor core models consisting of conical pins, load pads, several thousands of axially varying material properties of instrumentation pins and other interstices meshes.

  13. International Space Station Human Behavior and Performance Competency Model: Volume I

    NASA Technical Reports Server (NTRS)

    Schmidt, Lacey

    2008-01-01

    This document defines Human Behavior and Performance (HBP) competencies that are recommended to be included as requirements to participate in international long-duration missions. They were developed in response to the Multilateral Crew Operations Panel (MMOP) request to develop HBP training requirements for the International Space Station (ISS). The competency model presented here was developed by the ITCB HBPT WG and forms the basis for determining the HBP training curriculum for long-duration crewmembers. This document lists specific HBP competencies and behaviors required of astronauts/cosmonauts who participate in ISS expedition and other international long-duration missions. Please note that this model does not encompass all competencies required. For example, outside the scope of this document are cognitive skills and abilities, including but not limited to concentration, memorization, perception, imagination, and thinking. It is assumed that these skills, which are crucial in terms of human behavior and performance, are considered during the selection phase, since such professionally significant qualities of the operator should be taken into consideration in order to ensure sufficient baseline levels that can be further improved during general astronaut training. Also, technical competencies, even though critical for crewmembers, are beyond the scope of this document. It should also be noted that the competencies in this model (and subsequent objectives) are not intended to limit the internal activities or training programs of any international partner.

  14. The Software Design Document: More than a User's Manual.

    ERIC Educational Resources Information Center

    Bowers, Dennis

    1989-01-01

    Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…

  15. Information Model for Reusability in Clinical Trial Documentation

    ERIC Educational Resources Information Center

    Bahl, Bhanu

    2013-01-01

    In clinical research, New Drug Application (NDA) to health agencies requires generation of a large number of documents throughout the clinical development life cycle, many of which are also submitted to public databases and external partners. Current processes to assemble the information, author, review and approve the clinical research documents,…

  16. A Structured Model for Software Documentation.

    ERIC Educational Resources Information Center

    Swigger, Keith

    The concept of "structured programming" was developed to facilitate software production, but it has not carried over to documentation design. Two concepts of structure are relevant to user documentation for computer programs. The first is based on programming techniques that emphasize decomposition of tasks into discrete modules, while the second…

  17. Topic Models for Link Prediction in Document Networks

    ERIC Educational Resources Information Center

    Kataria, Saurabh

    2012-01-01

    Recent explosive growth of interconnected document collections such as citation networks, network of web pages, content generated by crowd-sourcing in collaborative environments, etc., has posed several challenging problems for data mining and machine learning community. One central problem in the domain of document networks is that of "link…

  18. Helicopter fuel burn modeling in AEDT.

    DOT National Transportation Integrated Search

    2011-08-01

    This report documents work done to enhance helicopter fuel consumption modeling in the Federal Aviation Administration's Aviation Environmental Design Tool (AEDT). Fuel consumption and flight performance data were collected from helicopter flig...

  19. PRZM-3, A MODEL FOR PREDICTING PESTICIDE AND NITROGEN FATE IN THE CROP ROOT AND UNSATURATED SOIL ZONES: USER'S MANUAL FOR RELEASE 3.12.2

    EPA Science Inventory

    This publication contains documentation for the PRZM-3 model. PRZM-3 is the most recent version of a modeling system that links two subordinate models, PRZM and VADOFT, in order to predict pesticide transport and transformation down through the crop root and unsaturated soil zone...

  20. Cultural Resource Predictive Modeling

    DTIC Science & Technology

    2017-10-01

    …property to manage? (a) Yes. 2) Do you use CRPM (Cultural Resource Predictive Modeling)? No, but I use predictive modelling informally. For example… resource program and provide support to the test ranges for their missions. This document will provide information such as lessons learned, points of contact, and resources to the range cultural resource managers. Objective/Scope: identify existing cultural resource predictive models and…

  1. User Delay Cost Model and Facilities Maintenance Cost Model for a Terminal Control Area : Volume 2. User's Manual and Program Documentation for the User Delay Cost Model

    DOT National Transportation Integrated Search

    1978-05-01

    The User Delay Cost Model (UDCM) is a Monte Carlo simulation of certain classes of movement of air traffic in the Boston Terminal Control Area (TCA). It incorporates a weather module, an aircraft generation module, a facilities module, and an air con...

  2. COVER: A user's guide to the CANOPY and SHRUBS extension of the Stand Prognosis Model

    Treesearch

    Melinda Moeur

    1985-01-01

    The COVER model predicts vertical and horizontal tree canopy closure, tree foliage biomass, and the probability of occurrence, height, and cover of shrubs in forest stands. This paper documents use of the COVER program, an adjunct to the Stand Prognosis Model. Preparation of input, interpretation of output, program control, model characteristics, and example...

  3. Scientific Ballooning Technologies Workshop STO-2 Thermal Design and Analysis

    NASA Technical Reports Server (NTRS)

    Ferguson, Doug

    2016-01-01

    The heritage thermal model for the full STO-2 (Stratospheric Terahertz Observatory II) vehicle has been updated to model the CSBF (Columbia Scientific Balloon Facility) SIP-14 (Scientific Instrument Package) in detail. Analysis of this model has been performed for the Antarctica FY2017 launch season. Model temperature predictions are compared to previous results from STO-2 review documents.

  4. FPL-PELPS : a price endogenous linear programming system for economic modeling, supplement to PELPS III, version 1.1.

    Treesearch

    Patricia K. Lebow; Henry Spelter; Peter J. Ince

    2003-01-01

    This report provides documentation and user information for FPL-PELPS, a personal computer price endogenous linear programming system for economic modeling. Originally developed to model the North American pulp and paper industry, FPL-PELPS follows its predecessors in allowing the modeling of any appropriate sector to predict consumption, production and capacity by...
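
    FPL-PELPS solves price-endogenous (spatial equilibrium) linear programs; the toy model below is far simpler, a fixed-demand, cost-minimizing linear program, but it shows the production/capacity structure that such sector models optimize. The mill names, costs, capacities, and the use of scipy's linprog are illustrative assumptions, not part of FPL-PELPS.

        from scipy.optimize import linprog

        # Toy sector model: two mills supply one demand region at minimum cost.
        costs = [55.0, 62.0]            # delivered cost per unit from mill A and mill B
        capacity = [400.0, 300.0]       # mill capacities
        demand = 500.0                  # regional demand that must be met

        result = linprog(
            c=costs,
            A_ub=[[-1.0, -1.0]], b_ub=[-demand],   # x_A + x_B >= demand
            bounds=[(0.0, capacity[0]), (0.0, capacity[1])],
            method="highs",
        )
        print(result.x, result.fun)     # optimal shipments and total cost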

  5. Development of the Play Experience Model to Enhance Desirable Qualifications of Early Childhood

    ERIC Educational Resources Information Center

    Panpum, Watchara; Soonthornrojana, Wimonrat; Nakunsong, Thatsanee

    2015-01-01

    The objectives of this research were to develop the play experience model and to study the effect of usage in play experience model for enhancing the early childhood's desirable qualification. There were 3 phases of research: 1) the document and context in experience management were studied, 2) the play experience model was developed, and 3) the…

  6. User Manual for SAHM package for VisTrails

    USGS Publications Warehouse

    Talbert, C.B.; Talbert, M.K.

    2012-01-01

    The Software for Assisted Habitat Modeling (SAHM) has been created to both expedite habitat modeling and help maintain a record of the various input data, pre- and post-processing steps and modeling options incorporated in the construction of a species distribution model. The four main advantages to using the combined VisTrails:SAHM package for species distribution modeling are: 1. formalization and tractable recording of the entire modeling process; 2. easier collaboration through a common modeling framework; 3. a user-friendly graphical interface to manage file input, model runs, and output; 4. extensibility to incorporate future and additional modeling routines and tools. This user manual provides detailed information on each module within the SAHM package, their input, output, common connections, optional arguments, and default settings. This information can also be accessed for individual modules by right-clicking on the documentation button for any module in VisTrails, or by right-clicking on any input or output for a module and selecting view documentation. This user manual is intended to accompany the user guide, which provides detailed instructions on how to install the SAHM package within VisTrails and then presents information on the use of the package.

  7. Engineered Barrier System: Physical and Chemical Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Dixon

    2004-04-26

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  8. National Centers for Environmental Prediction

    Science.gov Websites


  9. Two Models for Implementing Senior Mentor Programs in Academic Medical Settings

    ERIC Educational Resources Information Center

    Corwin, Sara J.; Bates, Tovah; Cohan, Mary; Bragg, Dawn S.; Roberts, Ellen

    2007-01-01

    This paper compares two models of undergraduate geriatric medical education utilizing senior mentoring programs. A descriptive, comparative multiple-case study was employed, analyzing program documents, archival records, and focus group data. Themes were compared for similarities and differences between the two program models. Findings indicate that…

  10. New Models and Metaphors for Human Resource Development.

    ERIC Educational Resources Information Center

    1999

    This document contains two reports from a poster session on new ideas and models in human resource development (HRD). The first presentation, "Two-way Customer-Service Provider Cycle" (Harriet V. Lawrence, Albert K. Wiswell), discusses a two-way supply cycle model that illustrates relational issues in customer service, including needs…

  11. Modeling of unit operating considerations in generating-capacity reliability evaluation. Volume 2. Computer-program documentation. Final report. [GENESIS, OPCON and OPPLAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Singh, C.

    1982-07-01

    This report describes the structure and operation of prototype computer programs developed for a Monte Carlo simulation model, GENESIS, and for two analytical models, OPCON and OPPLAN. It includes input data requirements and sample test cases.
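
    As an illustration of the basic sampling idea behind a generating-capacity reliability simulation (not the actual GENESIS algorithm, which also represents unit operating considerations), the sketch below estimates a loss-of-load probability by drawing random forced outages and counting how often the available capacity falls short of load; all unit data are invented.

        import random

        def lolp_monte_carlo(unit_capacities, forced_outage_rates, load,
                             n_samples=100_000, seed=0):
            """Fraction of sampled system states in which available capacity < load."""
            rng = random.Random(seed)
            shortfalls = 0
            for _ in range(n_samples):
                available = sum(cap for cap, q in zip(unit_capacities, forced_outage_rates)
                                if rng.random() > q)     # unit is in service with probability 1-q
                if available < load:
                    shortfalls += 1
            return shortfalls / n_samples

        # Three hypothetical units (MW) with 5-8% forced-outage rates, 380 MW load.
        print(lolp_monte_carlo([200.0, 150.0, 100.0], [0.05, 0.08, 0.06], load=380.0))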

  12. Southwest University's No-Fee Teacher-Training Model

    ERIC Educational Resources Information Center

    Chen, Shijian; Yang, Shuhan; Li, Linyuan

    2013-01-01

    The training model for Southwest University's no-fee teacher education program has taken shape over several years. Based on a review of the documentation and interviews with administrators and no-fee preservice students from different specialties, this article analyzes Southwest University's no-fee teacher-training model in terms of three main…

  13. ESTIMATION OF INFILTRATION RATE IN THE VADOSE ZONE: COMPILATION OF SIMPLE MATHEMATICAL MODELS - VOLUME I

    EPA Science Inventory

    The unsaturated or vadose zone provides a complex system for the simulation of water movement and contaminant transport and fate. Numerous models are available for performing simulations related to the movement of water. There exists extensive documentation of these models. Ho...
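
    One commonly cited example of the kind of simple mathematical infiltration model such a compilation covers (not necessarily a formulation taken from the EPA volume itself) is the Green-Ampt equation, in which the infiltration rate declines as the cumulative infiltrated depth grows. A minimal sketch, with illustrative soil parameters, solves the implicit Green-Ampt relation by fixed-point iteration:

        import math

        def green_ampt_cumulative(t, K=1.0, psi=11.0, d_theta=0.3, iters=50):
            """Cumulative ponded infiltration F (cm) after time t (hr), from
            F = K*t + psi*d_theta*ln(1 + F/(psi*d_theta)), solved by fixed-point
            iteration. Soil parameters here are illustrative placeholders."""
            s = psi * d_theta               # suction head times moisture deficit
            F = K * t                       # starting guess
            for _ in range(iters):
                F = K * t + s * math.log(1.0 + F / s)
            return F

        def green_ampt_rate(F, K=1.0, psi=11.0, d_theta=0.3):
            """Instantaneous infiltration rate f = K*(1 + psi*d_theta/F)."""
            return K * (1.0 + psi * d_theta / F)

        F2 = green_ampt_cumulative(2.0)     # cumulative infiltration after 2 hours
        print(F2, green_ampt_rate(F2))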

  14. The Carrera Model: A Success in Pregnancy Prevention.

    ERIC Educational Resources Information Center

    Elling, Duane M.

    This document outlines the development, evaluation, and replication of the Carrera model for pregnancy prevention. The Carrera model helps teens avoid pregnancy by empowering them to develop and reach personal goals, and by providing them with information on sexual issues, including abstinence, contraception, and the consequences of sexual…

  15. PREDICTING ATTENUATION OF VIRUSES DURING PERCOLATION IN SOILS: 2. USER'S GUIDE TO THE VIRULO 1.0 COMPUTER MODEL

    EPA Science Inventory

    In the EPA document Predicting Attenuation of Viruses During Percolation in Soils 1. Probabilistic Model the conceptual, theoretical, and mathematical foundations for a predictive screening model were presented. In this current volume we present a User's Guide for the computer mo...
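
    The VIRULO record above describes a probabilistic screening model for virus attenuation during percolation. The sketch below is not the VIRULO algorithm; it only illustrates the general screening idea of combining first-order inactivation with an uncertain travel time in a Monte Carlo loop. The rate and travel-time ranges are assumptions made up for the example.

```python
import math
import random

# Hypothetical screening sketch: log10 reduction of viruses during percolation,
# assuming first-order inactivation over an uncertain travel time.
# Distributions and parameter ranges are illustrative assumptions only.

def sample_log10_reduction(rng: random.Random) -> float:
    inactivation_per_day = rng.uniform(0.02, 0.2)  # assumed first-order rate, 1/day
    travel_time_days = rng.uniform(10.0, 100.0)    # assumed unsaturated-zone travel time
    # N(t) = N0 * exp(-lambda * t)  =>  log10 reduction = lambda * t / ln(10)
    return inactivation_per_day * travel_time_days / math.log(10)

def prob_of_meeting_target(target_log10: float, trials: int = 50_000) -> float:
    rng = random.Random(0)
    hits = sum(sample_log10_reduction(rng) >= target_log10 for _ in range(trials))
    return hits / trials

if __name__ == "__main__":
    print(f"P(>= 4-log removal): {prob_of_meeting_target(4.0):.3f}")
```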

  16. Teaching through Modeling: Four Schools' Experiences in Sustainability Education

    ERIC Educational Resources Information Center

    Higgs, Amy Lyons; McMillan, Victoria M.

    2006-01-01

    In this article, the authors examine how 4 innovative secondary schools model sustainable practices to their students. During school visits, the authors conducted interviews, observed daily life, and reviewed school documents. They found that modeling is a valuable approach to sustainability education, promoting both learning about sustainability…

  17. Tiered Pricing: Implications for Library Collections

    ERIC Educational Resources Information Center

    Hahn, Karla

    2005-01-01

    In recent years an increasing number of publishers have adopted tiered pricing of journals. The design and implications of tiered-pricing models, however, are poorly understood. Tiered pricing can be modeled using several variables. A survey of current tiered-pricing models documents the range of key variables used. A sensitivity analysis…
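
    The abstract above notes that tiered pricing can be modeled with a few key variables and examined through sensitivity analysis. A minimal sketch of that idea follows, using a hypothetical enrollment-based tier table; the tier boundaries and prices are invented for illustration and are not drawn from the survey.

```python
# Hypothetical tiered-pricing sketch: the journal price depends on which
# enrollment tier an institution falls into; a simple sensitivity check varies
# enrollment around the tier boundaries. All numbers are illustrative.

TIERS = [          # (upper bound on FTE enrollment, annual price in USD)
    (5_000, 1_200),
    (15_000, 2_500),
    (30_000, 4_000),
    (float("inf"), 6_000),
]

def price_for_enrollment(fte: int) -> int:
    """Return the annual price for an institution of the given FTE enrollment."""
    for upper_bound, price in TIERS:
        if fte <= upper_bound:
            return price
    raise AssertionError("unreachable: last tier is unbounded")

if __name__ == "__main__":
    # Sensitivity check: how does the price respond near each tier boundary?
    for fte in (4_800, 5_200, 14_500, 15_500, 29_000, 31_000):
        print(f"FTE {fte:>6} -> ${price_for_enrollment(fte):,}")
```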

  18. INTEGRATION OF AN ECONOMY UNDER IMPERFECT COMPETITION WITH A TWELVE-CELL ECOLOGICAL MODEL

    EPA Science Inventory

    This report documents the scientific research work done to date on developing a generalized mathematical model depicting a combined economic-ecological-social system with the goal of making it available to the scientific community. The model is preliminary and has not been tested...

  19. 7 CFR Exhibit E to Subpart A of... - Voluntary National Model Building Codes

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 7 (Agriculture), 2010 edition. Exhibit E, Voluntary National Model Building Codes, applies to the Rural Housing Service, Rural Business-Cooperative Service, Rural Utilities Service, and Farm Service Agency. The following documents address the health and safety aspects of buildings...

  20. National Centers for Environmental Prediction

    Science.gov Websites

  1. National Centers for Environmental Prediction

    Science.gov Websites

  2. ADVANCED UTILITY SIMULATION MODEL DOCUMENTATION OF SYSTEM DESIGN STATE LEVEL MODEL (VERSION 1.0)

    EPA Science Inventory

    The report is one of 11 in a series describing the initial development of the Advanced Utility Simulation Model (AUSM) by the Universities Research Group on Energy (URGE) and its continued development by the Science Applications International Corporation (SAIC) research team. The...

  3. Exploring Term Dependences in Probabilistic Information Retrieval Model.

    ERIC Educational Resources Information Center

    Cho, Bong-Hyun; Lee, Changki; Lee, Gary Geunbae

    2003-01-01

    Describes a theoretic process to apply Bahadur-Lazarsfeld expansion (BLE) to general probabilistic models and the state-of-the-art 2-Poisson model. Through experiments on two standard document collections, one in Korean and one in English, it is demonstrated that incorporation of term dependences using BLE significantly contributes to performance…
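
    The record above concerns adding term dependences to probabilistic retrieval via the Bahadur-Lazarsfeld expansion. The sketch below does not implement the BLE itself; it only shows the independence-based baseline (an RSJ/idf-style weight summed over matching query terms) that such expansions extend with co-occurrence corrections. The toy collection and query are invented.

```python
import math
from collections import Counter

# Minimal sketch of an independence-based probabilistic retrieval score:
# an idf-style term weight summed over query terms that occur in the document.

DOCS = {
    "d1": "retrieval model term dependence experiment",
    "d2": "probabilistic retrieval two poisson model",
    "d3": "korean english document collection experiment",
}

def term_weights(docs: dict[str, str]) -> dict[str, float]:
    """Compute log((N - n + 0.5) / (n + 0.5)) for each term with document frequency n."""
    n_docs = len(docs)
    df = Counter(term for text in docs.values() for term in set(text.split()))
    return {t: math.log((n_docs - n + 0.5) / (n + 0.5)) for t, n in df.items()}

def score(query: str, text: str, weights: dict[str, float]) -> float:
    """Sum the weights of query terms present in the document."""
    doc_terms = set(text.split())
    return sum(weights.get(t, 0.0) for t in query.split() if t in doc_terms)

if __name__ == "__main__":
    w = term_weights(DOCS)
    query = "probabilistic retrieval model"
    for doc_id, text in DOCS.items():
        print(doc_id, round(score(query, text, w), 3))
```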

  4. Model Learner Outcomes for Technology Education/Industrial Technology.

    ERIC Educational Resources Information Center

    Minnesota State Dept. of Education, St. Paul.

    This guide provides model learner outcomes used by communities and schools to improve learning experiences in trade and industrial education. It contains a mission statement for public education in Minnesota and 13 learner goals that must be incorporated into each district's goal statements. The bulk of this document contains model learner…

  5. A Custody Evaluation Model for Pre-School Children.

    ERIC Educational Resources Information Center

    Roseby, Vivienne

    This document addresses the needs of mental health consultants involved in decision-making in custody disputes. A psycho-ecological model for assessing contexts of development in cases involving preschool children is presented, and the theoretical basis and rationale for the model are discussed. Issues, instruments, and findings of recent…

  6. INDOOR AIR QUALITY MODEL VERSION 1.0 DOCUMENTATION

    EPA Science Inventory

    The report presents a multiroom model for estimating the impact of various sources on indoor air quality (IAQ). The model is written for use on IBM-PC and compatible microcomputers. It is easy to use with a menu-driven user interface. Data are entered using a fill-in-a-form inter...
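
    The IAQ record above describes a multiroom indoor air quality model. A minimal sketch of the underlying idea, a well-mixed mass balance for two connected rooms with a constant source in one of them, is given below; the volumes, flows, and emission rate are assumptions for illustration, not parameters of the EPA model.

```python
# Two-room well-mixed mass balance, integrated with a simple Euler step.
# dc1/dt = (SOURCE_1 + Q_OUT*C_OUT + Q_12*c2 - (Q_OUT + Q_12)*c1) / V1
# dc2/dt = (Q_12*c1 - Q_12*c2) / V2
# All volumes, flows, and the source term are illustrative assumptions.

V1, V2 = 30.0, 50.0  # room volumes, m^3
Q_OUT = 20.0         # outdoor-air exchange flow through room 1, m^3/h
Q_12 = 40.0          # interzonal flow between rooms 1 and 2, m^3/h
C_OUT = 0.0          # outdoor concentration, mg/m^3
SOURCE_1 = 5.0       # emission rate in room 1, mg/h

def simulate(hours: float = 8.0, dt: float = 0.01) -> tuple[float, float]:
    """Return the room concentrations (mg/m^3) after the given number of hours."""
    c1 = c2 = 0.0
    for _ in range(int(hours / dt)):
        dc1 = (SOURCE_1 + Q_OUT * C_OUT + Q_12 * c2 - (Q_OUT + Q_12) * c1) / V1
        dc2 = (Q_12 * c1 - Q_12 * c2) / V2
        c1 += dc1 * dt
        c2 += dc2 * dt
    return c1, c2

if __name__ == "__main__":
    c1, c2 = simulate()
    print(f"After 8 h: room 1 = {c1:.3f} mg/m^3, room 2 = {c2:.3f} mg/m^3")
```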

  7. Sedimentation Solutions for Military Ocean Terminal Sunny Point (MOTSU), North Carolina

    DTIC Science & Technology

    2012-07-01

    quality at MOTSU at the request of US Army Engineer District–Wilmington (USAED-SAW). The objective was achieved through numerical modeling, literature review, and sediment forecasting. This report documents the results of the numerical modeling study only. Two advantageous approaches for...

  8. SMOKE TOOL FOR MODELS-3 VERSION 4.1 STRUCTURE AND OPERATION DOCUMENTATION

    EPA Science Inventory

    The SMOKE Tool is a part of the Models-3 system, a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. The SMOKE Tool is an input processor for SMOKE (Sparse Matrix Operator Kernel Emissio...

  9. Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karali, Nihan; Xu, Tengfang; Sathaye, Jayant

    2012-12-12

    The goal of this study is to develop a new bottom-up industry-sector energy-modeling framework that addresses least-cost regional and global carbon-reduction strategies, improves on the capabilities and limitations of existing models, and allows trading across regions and countries as an alternative.
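
    As a hedged, toy illustration of the "least cost with trading" idea mentioned above (and not the ISEEM formulation), the sketch below allocates a joint abatement target across two regions with different marginal costs using a linear program; all costs, potentials, and the target are invented.

```python
from scipy.optimize import linprog

# Toy least-cost allocation: two regions must jointly abate 100 units of CO2.
# Region A abates at $40/unit (up to 70 units), region B at $60/unit (up to 80).
# With trading, the cheaper region abates first; all numbers are illustrative.

costs = [40.0, 60.0]                 # $ per unit abated in regions A and B
A_eq = [[1.0, 1.0]]                  # total abatement must meet the joint target
b_eq = [100.0]
bounds = [(0.0, 70.0), (0.0, 80.0)]  # regional abatement potentials

result = linprog(c=costs, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")

if result.success:
    abate_a, abate_b = result.x
    print(f"Region A abates {abate_a:.0f}, region B abates {abate_b:.0f}, "
          f"total cost ${result.fun:,.0f}")
```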

  10. FMCSA safety program effectiveness measurement compliance review effectiveness model results for carriers with compliance reviews in fiscal year 2009 : [analysis brief].

    DOT National Transportation Integrated Search

    2014-04-01

    This Analysis Brief documents the methodology and results from the Compliance Review Effectiveness Model (CREM) for carriers receiving CRs in fiscal year (FY) 2009. The model measures the effectiveness of the compliance review (CR) program, one of th...

  11. Transportation Sector Model of the National Energy Modeling System. Volume 2 -- Appendices: Part 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The attachments contained within this appendix provide additional details about the model development and estimation process which do not easily lend themselves to incorporation in the main body of the model documentation report. The information provided in these attachments is not integral to the understanding of the model's operation, but provides the reader with the opportunity to gain a deeper understanding of some of the model's underlying assumptions. There will be a slight degree of replication of materials found elsewhere in the documentation, made unavoidable by the dictates of internal consistency. Each attachment is associated with a specific component of the transportation model; the presentation follows the same sequence of modules employed in Volume 1. The following attachments are contained in Appendix F: Fuel Economy Model (FEM)--provides a discussion of the FEM vehicle demand and performance by size class models; Alternative Fuel Vehicle (AFV) Model--describes data input sources and extrapolation methodologies; Light-Duty Vehicle (LDV) Stock Model--discusses the fuel economy gap estimation methodology; Light Duty Vehicle Fleet Model--presents the data development for business, utility, and government fleet vehicles; Light Commercial Truck Model--describes the stratification methodology and data sources employed in estimating the stock and performance of LCTs; Air Travel Demand Model--presents the derivation of the demographic index, used to modify estimates of personal travel demand; and Airborne Emissions Model--describes the derivation of emissions factors used to associate transportation measures with levels of airborne emissions of several pollutants.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. A. Wasiolek

    The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
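
    Step (7) above, verifying the implementation against hand calculations, is a generic practice that can be illustrated independently of ERMYN. The sketch below checks a simple ingestion-dose product (concentration × annual intake × dose coefficient) against an independently worked value; the numbers are illustrative assumptions, not ERMYN parameters.

```python
import math

# Hedged sketch of a "verify against a hand calculation" check: compute a
# simple annual ingestion dose in code and compare it with a value worked by
# hand. All inputs are illustrative assumptions.

def ingestion_dose_sv(conc_bq_per_kg: float, intake_kg_per_yr: float,
                      dose_coeff_sv_per_bq: float) -> float:
    """Annual committed dose (Sv) from ingesting one contaminated foodstuff."""
    return conc_bq_per_kg * intake_kg_per_yr * dose_coeff_sv_per_bq

if __name__ == "__main__":
    computed = ingestion_dose_sv(2.0, 50.0, 1.3e-8)  # 2 Bq/kg, 50 kg/yr, 1.3e-8 Sv/Bq
    hand_value = 1.3e-6                               # 2 * 50 * 1.3e-8, worked by hand
    assert math.isclose(computed, hand_value, rel_tol=1e-9)
    print(f"computed = {computed:.3e} Sv/yr (matches hand calculation)")
```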

  13. NCPP's Use of Standard Metadata to Promote Open and Transparent Climate Modeling

    NASA Astrophysics Data System (ADS)

    Treshansky, A.; Barsugli, J. J.; Guentchev, G.; Rood, R. B.; DeLuca, C.

    2012-12-01

    The National Climate Predictions and Projections (NCPP) Platform is developing comprehensive regional and local information about the evolving climate to inform decision making and adaptation planning. This includes both creating and providing tools to create metadata about the models and processes used to create its derived data products. NCPP is using the Common Information Model (CIM), an ontology developed by a broad set of international partners in climate research, as its metadata language. This use of a standard ensures interoperability within the climate community as well as permitting access to the ecosystem of tools and services emerging alongside the CIM. The CIM itself is divided into a general-purpose (UML & XML) schema which structures metadata documents, and a project or community-specific (XML) Controlled Vocabulary (CV) which constrains the content of metadata documents. NCPP has already modified the CIM Schema to accommodate downscaling models, simulations, and experiments. NCPP is currently developing a CV for use by the downscaling community. Incorporating downscaling into the CIM will lead to several benefits: easy access to the existing CIM Documents describing CMIP5 models and simulations that are being downscaled, access to software tools that have been developed in order to search, manipulate, and visualize CIM metadata, and coordination with national and international efforts such as ES-DOC that are working to make climate model descriptions and datasets interoperable. Providing detailed metadata descriptions which include the full provenance of derived data products will contribute to making that data (and the models and processes which generated that data) more open and transparent to the user community.
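
    The abstract above distinguishes a general schema that structures metadata documents from a controlled vocabulary (CV) that constrains their content. The sketch below illustrates that split in miniature: a small XML fragment is checked against a CV of allowed terms. The element names and vocabulary values are invented for illustration and are not the actual CIM schema or CV.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified illustration of "schema vs. controlled vocabulary":
# the document structure carries the fields, while the CV restricts the values
# a community allows in those fields. Names and terms below are invented.

DOWNSCALING_METHOD_CV = {"statistical", "dynamical", "hybrid"}

DOC = """
<simulation>
  <model>ExampleRegionalModel</model>
  <downscalingMethod>statistical</downscalingMethod>
</simulation>
"""

def validate_against_cv(xml_text: str) -> list[str]:
    """Return a list of CV violations found in the metadata fragment."""
    root = ET.fromstring(xml_text)
    errors = []
    method = root.findtext("downscalingMethod")
    if method not in DOWNSCALING_METHOD_CV:
        errors.append(f"downscalingMethod '{method}' not in controlled vocabulary")
    return errors

if __name__ == "__main__":
    problems = validate_against_cv(DOC)
    print("valid" if not problems else problems)
```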

  14. An Introduction to Transient Engine Applications Using the Numerical Propulsion System Simulation (NPSS) and MATLAB

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey C.; Csank, Jeffrey T.; Haller, William J.; Seidel, Jonathan A.

    2016-01-01

    This document outlines methodologies designed to improve the interface between the Numerical Propulsion System Simulation framework and various control and dynamic analyses developed in the Matlab and Simulink environment. Although NPSS is most commonly used for steady-state modeling, this paper is intended to supplement the relatively sparse documentation on its transient analysis functionality. Matlab has become an extremely popular engineering environment, and better methodologies are necessary to develop tools that leverage the benefits of these disparate frameworks. Transient analysis is not a new feature of the Numerical Propulsion System Simulation (NPSS), but transient considerations are becoming more pertinent as multidisciplinary trade-offs begin to play a larger role in advanced engine designs. This paper also covers the budding convergence between NPSS and Matlab-based modeling toolsets. The following sections explore various design patterns to rapidly develop transient models. Each approach starts with a base model built with NPSS, and assumes the reader already has a basic understanding of how to construct a steady-state model. The second half of the paper focuses on further enhancements required to subsequently interface NPSS with Matlab codes. The first method is the simplest and most straightforward but performance constrained; the last is the most abstract. These methods aren't mutually exclusive, and the specific implementation details could vary greatly based on the designer's discretion. Basic recommendations are provided to organize model logic in a format most easily amenable to integration with existing Matlab control toolsets.
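
    The NPSS record above is about coupling a simulation framework with an external analysis environment. Because the real NPSS and Matlab interfaces are not reproduced here, the sketch below shows only a generic pattern that the simplest file-exchange approach resembles: write an input file, run an external solver as a subprocess, and read its output back. The executable name, input keys, and output format are hypothetical placeholders.

```python
import csv
import json
import subprocess
from pathlib import Path

# Generic file-exchange pattern for coupling an external simulation executable
# with analysis code: write an input file, run the solver as a subprocess, then
# read back its output for further processing. The executable name, input keys,
# and output format are hypothetical placeholders, not a real NPSS interface.

def run_external_transient(case: dict, workdir: Path) -> list[dict]:
    """Run one transient case in an external solver and return its time history."""
    workdir.mkdir(parents=True, exist_ok=True)
    (workdir / "case_input.json").write_text(json.dumps(case, indent=2))

    # Hypothetical solver invocation; replace with the real command line.
    subprocess.run(["transient_solver", "case_input.json"], cwd=workdir, check=True)

    with open(workdir / "time_history.csv", newline="") as f:
        return list(csv.DictReader(f))

if __name__ == "__main__":
    history = run_external_transient(
        {"throttle_profile": [1.0, 0.7, 1.0], "dt_s": 0.05},
        Path("run_001"),
    )
    print(f"read {len(history)} time steps")
```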

  15. Is an advance care planning model feasible in community palliative care? A multi-site action research approach.

    PubMed

    Blackford, Jeanine; Street, Annette

    2012-09-01

    This article reports a study to determine the feasibility of an advance care planning model developed with Australian community palliative care services. An effective advance care planning programme involves an organization-wide commitment and preparedness for health service reform to embed advance care planning into routine practice. Internationally, such programmes have been implemented predominantly in aged and acute care, with more recent work in primary care. A multi-site action research study was conducted over a 16-month period in 2007-2009 with three Victorian community palliative care services. Using mixed-method data collection strategies to assess feasibility, we conducted a baseline audit of staff and clients; analysed relevant documents (client records, policies, procedures and quality improvement strategies) pre-implementation and post-implementation; and conducted key informant interviews (n = 9). The settings were three community palliative care services: one regional and two metropolitan, in Victoria, Australia. The services demonstrated that it was feasible to embed the Model into their organizational structures. Advance care planning conversations and the involvement of family, rather than the completion rate of advance care planning documents, emerged as the important outcome measure in community settings. Services adapted and applied their own concept of community, which widened the impact of the model. Changes to quality audit processes were essential to consolidate the model into routine palliative care practice. An advance care planning model is feasible for community palliative care services. Quality audit processes are an essential component of the Model, with documentation of advance care planning discussion established as an important outcome measure. © 2011 Blackwell Publishing Ltd.

  16. Fuel burn modeling of turboprop aircraft.

    DOT National Transportation Integrated Search

    2011-08-01

    This report documents work done to enhance turbo-propeller aircraft fuel consumption modeling in the Federal Aviation Administration's Aviation Environmental Design Tool (AEDT). Fuel consumption and flight performance data were collected from aircr...

  17. 2005 v4.2 Technical Support Document

    EPA Pesticide Factsheets

    The Technical Support Document for the Final Transport Rule describes how updated 2005 NEI version 2 emissions were processed for air quality modeling in support of the Cross-State Air Pollution Rule (CSAPR).

  18. TransGuide : model deployment initiative design report

    DOT National Transportation Integrated Search

    1998-09-01

    This report documents the high-level design of the TransGuide MDI project and discusses the design trade-off decisions. A detailed, specific project-level design is provided in each project's System Design document.

  19. NPS national transit inventory, 2013

    DOT National Transportation Integrated Search

    2014-07-31

    This document summarizes key highlights from the National Park Service (NPS) 2013 National Transit Inventory, and presents data for NPS transit systems system-wide. The document discusses statistics related to ridership, business models, fleet charac...

  20. ENVIRONMENTAL INFORMATION MANAGEMENT SYSTEM (EIMS)

    EPA Science Inventory

    The Environmental Information Management System (EIMS) organizes descriptive information (metadata) for data sets, databases, documents, models, projects, and spatial data. The EIMS design provides a repository for scientific documentation that can be easily accessed with standar...
