Sample records for model description user

  1. Stochastic Lanchester Air-to-Air Campaign Model: Model Description and Users Guides

    DTIC Science & Technology

    2009-01-01

STOCHASTIC LANCHESTER AIR-TO-AIR CAMPAIGN MODEL: MODEL DESCRIPTION AND USERS GUIDES—2009 REPORT PA702T1 Robert V. Hemm Jr., David A. Lee...LMI © 2009. ALL RIGHTS RESERVED. Stochastic Lanchester Air-to-Air Campaign Model: Model Description and Users Guides—2009 PA702T1/JANUARY...2009 Executive Summary This report documents the latest version of the Stochastic Lanchester Air-to-Air Campaign Model (SLAACM), developed by LMI for
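    The abstract above does not reproduce SLAACM's equations. For orientation only, the following is a minimal sketch of a stochastic square-law Lanchester engagement (a continuous-time Markov chain view of attrition), not LMI's actual formulation; the force sizes and kill rates are assumptions.

    ```python
    import random

    def stochastic_lanchester(blue, red, blue_rate, red_rate, rng=random.Random(0)):
        """Simulate one stochastic square-law engagement.

        blue_rate / red_rate are per-shooter kill rates; the next kill is
        attributed to each side with probability proportional to its total
        firepower (a Markov-chain view of Lanchester attrition).
        """
        while blue > 0 and red > 0:
            blue_power = blue * blue_rate   # total blue kill intensity
            red_power = red * red_rate      # total red kill intensity
            if rng.random() < blue_power / (blue_power + red_power):
                red -= 1                    # blue scores the next kill
            else:
                blue -= 1                   # red scores the next kill
        return blue, red

    # Average blue survivors over many replications (illustrative numbers only)
    results = [stochastic_lanchester(20, 18, 0.9, 1.0) for _ in range(1000)]
    print(sum(b for b, _ in results) / len(results))
    ```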

  2. Format( )MEDIC( )Input

    NASA Astrophysics Data System (ADS)

    Foster, K.

    1994-09-01

    This document is a description of a computer program called Format( )MEDIC( )Input. The purpose of this program is to allow the user to quickly reformat wind velocity data in the Model Evaluation Database (MEDb) into a reasonable 'first cut' set of MEDIC input files (MEDIC.nml, StnLoc.Met, and Observ.Met). The user is cautioned that these resulting input files must be reviewed for correctness and completeness. This program will not format MEDb data into a Problem Station Library or Problem Metdata File. A description of how the program reformats the data is provided, along with a description of the required and optional user input and a description of the resulting output files. A description of the MEDb is not provided here but can be found in the RAS Division Model Evaluation Database Description document.

  3. Shuttle operations simulation model programmers'/users' manual

    NASA Technical Reports Server (NTRS)

    Porter, D. G.

    1972-01-01

    The prospective user of the shuttle operations simulation (SOS) model is given sufficient information to enable him to perform simulation studies of the space shuttle launch-to-launch operations cycle. The procedures used for modifying the SOS model to meet user requirements are described. The various control card sequences required to execute the SOS model are given. The report is written for users with varying computer simulation experience. A description of the components of the SOS model is included that presents both an explanation of the logic involved in the simulation of the shuttle operations cycle and a description of the routines used to support the actual simulation.

  4. Thermal APU/hydraulics analysis program. User's guide and programmer's manual

    NASA Technical Reports Server (NTRS)

    Deluna, T. A.

    1976-01-01

This document provides the user's guide information and program description necessary to run, and gain a general understanding of, the Thermal APU/Hydraulics Analysis Program (TAHAP). This information consists of general descriptions of the APU/hydraulic system and the TAHAP model, input and output data descriptions, and specific subroutine requirements. Deck setups and input data formats are included, and other necessary and/or helpful information for using TAHAP is given. The math model descriptions for the driver program and each of its supporting subroutines are outlined.

  5. Downsizer - A Graphical User Interface-Based Application for Browsing, Acquiring, and Formatting Time-Series Data for Hydrologic Modeling

    USGS Publications Warehouse

    Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.

    2009-01-01

    The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.

  6. ModelArchiver—A program for facilitating the creation of groundwater model archives

    USGS Publications Warehouse

    Winston, Richard B.

    2018-03-01

ModelArchiver is a program designed to facilitate the creation of groundwater model archives that meet the requirements of the U.S. Geological Survey (USGS) policy (Office of Groundwater Technical Memorandum 2016.02, https://water.usgs.gov/admin/memo/GW/gw2016.02.pdf, https://water.usgs.gov/ogw/policy/gw-model/). ModelArchiver version 1.0 leads the user step-by-step through the process of creating a USGS groundwater model archive. The user specifies the contents of each of the subdirectories within the archive and provides descriptions of the archive contents. Descriptions of some files can be specified automatically using file extensions. Descriptions also can be specified individually. Those descriptions are added to a readme.txt file provided by the user. ModelArchiver moves the content of the archive to the archive folder and compresses some folders into .zip files. As part of the archive, the modeler must create a metadata file describing the archive. The program has a built-in metadata editor and provides links to websites that can aid in creation of the metadata. The built-in metadata editor is also available as a stand-alone program named FgdcMetaEditor version 1.0, which also is described in this report. ModelArchiver updates the metadata file provided by the user with descriptions of the files in the archive. An optional archive list file generated automatically by ModelMuse can streamline the creation of archives by identifying input files, output files, model programs, and ancillary files for inclusion in the archive.

  7. LDA-Based Unified Topic Modeling for Similar TV User Grouping and TV Program Recommendation.

    PubMed

    Pyo, Shinjee; Kim, Eunhui; Kim, Munchurl

    2015-08-01

    Social TV is a social media service via TV and social networks through which TV users exchange their experiences about TV programs that they are viewing. For social TV service, two technical aspects are envisioned: grouping of similar TV users to create social TV communities and recommending TV programs based on group and personal interests for personalizing TV. In this paper, we propose a unified topic model based on grouping of similar TV users and recommending TV programs as a social TV service. The proposed unified topic model employs two latent Dirichlet allocation (LDA) models. One is a topic model of TV users, and the other is a topic model of the description words for viewed TV programs. The two LDA models are then integrated via a topic proportion parameter for TV programs, which enforces the grouping of similar TV users and associated description words for watched TV programs at the same time in a unified topic modeling framework. The unified model identifies the semantic relation between TV user groups and TV program description word groups so that more meaningful TV program recommendations can be made. The unified topic model also overcomes an item ramp-up problem such that new TV programs can be reliably recommended to TV users. Furthermore, from the topic model of TV users, TV users with similar tastes can be grouped as topics, which can then be recommended as social TV communities. To verify our proposed method of unified topic-modeling-based TV user grouping and TV program recommendation for social TV services, in our experiments, we used real TV viewing history data and electronic program guide data from a seven-month period collected by a TV poll agency. The experimental results show that the proposed unified topic model yields an average 81.4% precision for 50 topics in TV program recommendation and its performance is an average of 6.5% higher than that of the topic model of TV users only. For TV user prediction with new TV programs, the average prediction precision was 79.6%. Also, we showed the superiority of our proposed model in terms of both topic modeling performance and recommendation performance compared to two related topic models such as polylingual topic model and bilingual topic model.
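    The unified model itself is not reproduced in the abstract. The sketch below, assuming the gensim library is available, fits a plain LDA over hypothetical viewing histories and groups users by their dominant topic; the paper's contribution, coupling this user-side LDA with a second LDA over program description words via a shared topic-proportion parameter, is not shown here.

    ```python
    from gensim import corpora
    from gensim.models import LdaModel

    # Each "document" is one user's viewing history, expressed as program identifiers
    # (invented data for illustration).
    histories = [
        ["news_at_9", "world_report", "politics_today"],
        ["cartoon_hour", "kids_quiz", "cartoon_hour"],
        ["news_at_9", "politics_today", "election_special"],
    ]

    dictionary = corpora.Dictionary(histories)
    corpus = [dictionary.doc2bow(h) for h in histories]

    # Plain LDA over users; the paper additionally couples an LDA over program
    # description words through a shared topic-proportion parameter.
    lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2, random_state=0)

    # Group users by dominant topic (a crude stand-in for "social TV communities").
    for user_id, bow in enumerate(corpus):
        topic, weight = max(lda.get_document_topics(bow), key=lambda t: t[1])
        print(f"user {user_id}: topic {topic} ({weight:.2f})")
    ```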

  8. A tool for multi-scale modelling of the renal nephron

    PubMed Central

    Nickerson, David P.; Terkildsen, Jonna R.; Hamilton, Kirk L.; Hunter, Peter J.

    2011-01-01

    We present the development of a tool, which provides users with the ability to visualize and interact with a comprehensive description of a multi-scale model of the renal nephron. A one-dimensional anatomical model of the nephron has been created and is used for visualization and modelling of tubule transport in various nephron anatomical segments. Mathematical models of nephron segments are embedded in the one-dimensional model. At the cellular level, these segment models use models encoded in CellML to describe cellular and subcellular transport kinetics. A web-based presentation environment has been developed that allows the user to visualize and navigate through the multi-scale nephron model, including simulation results, at the different spatial scales encompassed by the model description. The Zinc extension to Firefox is used to provide an interactive three-dimensional view of the tubule model and the native Firefox rendering of scalable vector graphics is used to present schematic diagrams for cellular and subcellular scale models. The model viewer is embedded in a web page that dynamically presents content based on user input. For example, when viewing the whole nephron model, the user might be presented with information on the various embedded segment models as they select them in the three-dimensional model view. Alternatively, the user chooses to focus the model viewer on a cellular model located in a particular nephron segment in order to view the various membrane transport proteins. Selecting a specific protein may then present the user with a description of the mathematical model governing the behaviour of that protein—including the mathematical model itself and various simulation experiments used to validate the model against the literature. PMID:22670210

  9. The NASA/MSFC global reference atmospheric model: 1990 version (GRAM-90). Part 1: Technical/users manual

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Alyea, F. N.; Cunnold, D. M.; Jeffries, W. R., III; Johnson, D. L.

    1991-01-01

    A technical description of the NASA/MSFC Global Reference Atmospheric Model 1990 version (GRAM-90) is presented with emphasis on the additions and new user's manual descriptions of the program operation aspects of the revised model. Some sample results for the new middle atmosphere section and comparisons with results from a three dimensional circulation model are provided. A programmer's manual with more details for those wishing to make their own GRAM program adaptations is also presented.

  10. TOTAL user manual

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.; Boerschlein, David P.

    1994-01-01

    Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in the model of a complex system can be devastatingly tedious and error-prone. Even with tools such as the Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST), the user must describe a system by specifying the rules governing the behavior of the system in order to generate the model. With the Table Oriented Translator to the ASSIST Language (TOTAL), the user can specify the components of a typical system and their attributes in the form of a table. The conditions that lead to system failure are also listed in a tabular form. The user can also abstractly specify dependencies with causes and effects. The level of information required is appropriate for system designers with little or no background in the details of reliability calculations. A menu-driven interface guides the user through the system description process, and the program updates the tables as new information is entered. The TOTAL program automatically generates an ASSIST input description to match the system description.
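    TOTAL and ASSIST generate semi-Markov models in the SURE input language, which the abstract does not show. For orientation, the following is an illustrative continuous-time Markov reliability model of a hypothetical duplex system solved with SciPy; the failure and recovery rates are assumptions, not values from the report.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # States: 0 = both units good, 1 = one unit failed, 2 = system failed (absorbing).
    lam = 1e-4   # per-hour failure rate of a single unit (assumed)
    mu = 1e-2    # per-hour recovery rate (assumed)

    # Generator matrix Q: rows sum to zero, Q[i, j] is the transition rate from i to j.
    Q = np.array([
        [-2 * lam,      2 * lam,   0.0],
        [      mu, -(mu + lam),    lam],
        [     0.0,          0.0,   0.0],
    ])

    p0 = np.array([1.0, 0.0, 0.0])       # start with both units operational
    t = 10.0                              # mission time in hours
    p_t = p0 @ expm(Q * t)                # state probabilities at time t
    print("probability of system failure:", p_t[2])
    ```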

  11. Functional Requirements of a Target Description System for Vulnerability Analysis

    DTIC Science & Technology

    1979-11-01

called GIFT.1,2 Together the COMGEOM description model and GIFT codes make up the BRL’s target description system. The significance of a target...and modifying target descriptions are described. 1 Lawrence W. Bain, Jr. and Mathew J. Reisinger, "The GIFT Code User Manual; Volume 1..." "The GIFT Code User Manual; Volume II, The Output Options," unpublished draft of BRL report. II. UNDERLYING PHILOSOPHY The BRL has a computer

  12. Blade loss transient dynamics analysis. Volume 3: User's manual for TETRA program

    NASA Technical Reports Server (NTRS)

    Black, G. R.; Gallardo, V. C.; Storace, A. S.; Sagendorph, F.

    1981-01-01

    The users manual for TETRA contains program logic, flow charts, error messages, input sheets, modeling instructions, option descriptions, input variable descriptions, and demonstration problems. The process of obtaining a NASTRAN 17.5 generated modal input file for TETRA is also described with a worked sample.

  13. NASA AVOSS Fast-Time Models for Aircraft Wake Prediction: User's Guide (APA3.8 and TDP2.1)

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew J.; Limon Duparcmeur, Fanny M.

    2016-01-01

    NASA's current distribution of fast-time wake vortex decay and transport models includes APA (Version 3.8) and TDP (Version 2.1). This User's Guide provides detailed information on the model inputs, file formats, and model outputs. A brief description of the Memphis 1995, Dallas/Fort Worth 1997, and the Denver 2003 wake vortex datasets is given along with the evaluation of models. A detailed bibliography is provided which includes publications on model development, wake field experiment descriptions, and applications of the fast-time wake vortex models.

  14. Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsdell, J.V. Jr.; Simonen, C.A.; Burk, K.W.

    1994-02-01

The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses that individuals may have received from operations at the Hanford Site since 1944. This report deals specifically with the atmospheric transport model, Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). RATCHET is a major rework of the MESOILT2 model used in the first phase of the HEDR Project; only the bookkeeping framework escaped major changes. Changes to the code include (1) significant changes in the representation of atmospheric processes and (2) incorporation of Monte Carlo methods for representing uncertainty in input data, model parameters, and coefficients. To a large extent, the revisions to the model are based on recommendations of a peer working group that met in March 1991. Technical bases for other portions of the atmospheric transport model are addressed in two other documents. This report has three major sections: a description of the model, a user's guide, and a programmer's guide. These sections discuss RATCHET from three different perspectives. The first provides a technical description of the code with emphasis on details such as the representation of the model domain, the data required by the model, and the equations used to make the model calculations. The technical description is followed by a user's guide to the model with emphasis on running the code. The user's guide contains information about the model input and output. The third section is a programmer's guide to the code. It discusses the hardware and software required to run the code. The programmer's guide also discusses program structure and each of the program elements.

  15. Airport-Noise Levels and Annoyance Model (ALAMO) user's guide

    NASA Technical Reports Server (NTRS)

    Deloach, R.; Donaldson, J. L.; Johnson, M. J.

    1986-01-01

    A guide for the use of the Airport-Noise Level and Annoyance MOdel (ALAMO) at the Langley Research Center computer complex is provided. This document is divided into 5 primary sections, the introduction, the purpose of the model, and an in-depth description of the following subsystems: baseline, noise reduction simulation and track analysis. For each subsystem, the user is provided with a description of architecture, an explanation of subsystem use, sample results, and a case runner's check list. It is assumed that the user is familiar with the operations at the Langley Research Center (LaRC) computer complex, the Network Operating System (NOS 1.4) and CYBER Control Language. Incorporated within the ALAMO model is a census database system called SITE II.

[A new model for the evaluation of measurements of the neurocranium].

    PubMed

    Seidler, H; Wilfing, H; Weber, G; Traindl-Prohazka, M; zur Nedden, D; Platzer, W

    1993-12-01

    A simple and user-friendly model for trigonometric description of the neurocranium based on newly defined points of measurement is presented. This model not only provides individual description, but also allows for an evaluation of developmental and phylogenetic aspects.

  17. Serious Games for Health: The Potential of Metadata.

    PubMed

    Göbel, Stefan; Maddison, Ralph

    2017-02-01

    Numerous serious games and health games exist, either as commercial products (typically with a focus on entertaining a broad user group) or smaller games and game prototypes, often resulting from research projects (typically tailored to a smaller user group with a specific health characteristic). A major drawback of existing health games is that they are not very well described and attributed with (machine-readable, quantitative, and qualitative) metadata such as the characterizing goal of the game, the target user group, or expected health effects well proven in scientific studies. This makes it difficult or even impossible for end users to find and select the most appropriate game for a specific situation (e.g., health needs). Therefore, the aim of this article was to motivate the need and potential/benefit of metadata for the description and retrieval of health games and to describe a descriptive model for the qualitative description of games for health. It was not the aim of the article to describe a stable, running system (portal) for health games. This will be addressed in future work. Building on previous work toward a metadata format for serious games, a descriptive model for the formal description of games for health is introduced. For the conceptualization of this model, classification schemata of different existing health game repositories are considered. The classification schema consists of three levels: a core set of mandatory descriptive fields relevant for all games for health application areas, a detailed level with more comprehensive, optional information about the games, and so-called extension as level three with specific descriptive elements relevant for dedicated health games application areas, for example, cardio training. A metadata format provides a technical framework to describe, find, and select appropriate health games matching the needs of the end user. Future steps to improve, apply, and promote the metadata format in the health games market are discussed.
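    The article's actual field names are not given in the abstract. The hypothetical record below only illustrates the three-level structure described (mandatory core fields, optional detail, application-area extension); every field name and value is invented for illustration.

    ```python
    # Hypothetical serious-game metadata record; field names are illustrative only,
    # structured after the three levels described in the abstract.
    health_game_record = {
        "core": {                      # mandatory fields for all games for health
            "title": "StepQuest",
            "characterizing_goal": "increase daily physical activity",
            "target_user_group": "adults 40-65 with a sedentary lifestyle",
            "platform": ["Android", "iOS"],
        },
        "detail": {                    # optional, more comprehensive information
            "play_time_minutes": 15,
            "evidence": ["randomized controlled trial, n=120"],
        },
        "extension": {                 # application-area specific elements
            "area": "cardio training",
            "target_heart_rate_zone": "moderate",
        },
    }

    def matches(record, goal_keyword):
        """Toy retrieval: find games whose characterizing goal mentions a keyword."""
        return goal_keyword.lower() in record["core"]["characterizing_goal"].lower()

    print(matches(health_game_record, "physical activity"))
    ```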

  18. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    USGS Publications Warehouse

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.
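    The OUI control-file schema is not reproduced in the abstract. The sketch below parses a hypothetical XML control file with Python's standard library purely to illustrate the idea of a user-modifiable, text-based control file written in XML; the element and attribute names are invented and are not the actual OUI schema.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical control file; element and attribute names are illustrative only.
    control_xml = """
    <oui_project name="example_basin">
      <model id="prms" executable="prms.exe">
        <data source="downsizer_output.csv" type="time_series"/>
      </model>
      <tool id="map_viewer" class="gov.usgs.oui.MapViewer"/>
    </oui_project>
    """

    root = ET.fromstring(control_xml)
    for model in root.findall("model"):
        print("model:", model.get("id"), "->", model.get("executable"))
        for data in model.findall("data"):
            print("  data source:", data.get("source"))
    ```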

  19. Studying the Accuracy of Software Process Elicitation: The User Articulated Model

    ERIC Educational Resources Information Center

    Crabtree, Carlton A.

    2010-01-01

    Process models are often the basis for demonstrating improvement and compliance in software engineering organizations. A descriptive model is a type of process model describing the human activities in software development that actually occur. The purpose of a descriptive model is to provide a documented baseline for further process improvement…

  20. Airport Landside. Volume II. The Airport Landside Simulation Model (ALSIM) Description and Users Guide.

    DOT National Transportation Integrated Search

    1982-06-01

    This volume provides a general description of the Airport Landside Simulation Model. A summary of simulated passenger and vehicular processing through the landside is presented. Program operating characteristics and assumptions are documented and a c...

  1. The Loyal Opposition Comments on Plan Domain Description Languages

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Golden, Keith; Jonsson, Ari

    2003-01-01

    In this paper we take a critical look at PDDL 2.1 as designers and users of plan domain description languages. We describe planning domains that have features which are hard to model using PDDL 2.1. We then offer some suggestions on domain description language design, and describe how these suggestions make modeling our chosen domains easier.

  2. Predicting Facebook users' online privacy protection: risk, trust, norm focus theory, and the theory of planned behavior.

    PubMed

    Saeri, Alexander K; Ogilvie, Claudette; La Macchia, Stephen T; Smith, Joanne R; Louis, Winnifred R

    2014-01-01

    The present research adopts an extended theory of the planned behavior model that included descriptive norms, risk, and trust to investigate online privacy protection in Facebook users. Facebook users (N = 119) completed a questionnaire assessing their attitude, subjective injunctive norm, subjective descriptive norm, perceived behavioral control, implicit perceived risk, trust of other Facebook users, and intentions toward protecting their privacy online. Behavior was measured indirectly 2 weeks after the study. The data show partial support for the theory of planned behavior and strong support for the independence of subjective injunctive and descriptive norms. Risk also uniquely predicted intentions over and above the theory of planned behavior, but there were no unique effects of trust on intentions, nor of risk or trust on behavior. Implications are discussed.
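    The authors' statistical analysis is not reproduced in the abstract. As a rough sketch of the kind of model implied (intentions regressed on the theory-of-planned-behavior constructs plus risk and trust), assuming respondent-level data in a pandas DataFrame with invented column names and values:

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical respondent-level data; column names and values are illustrative only.
    df = pd.DataFrame({
        "intention":        [5.2, 3.1, 4.4, 6.0, 2.5, 4.9, 3.8, 5.5],
        "attitude":         [5.0, 3.0, 4.0, 6.5, 2.0, 5.5, 3.5, 5.0],
        "injunctive_norm":  [4.5, 2.8, 4.2, 5.5, 3.0, 4.8, 3.2, 5.1],
        "descriptive_norm": [4.0, 3.5, 3.8, 5.0, 2.5, 4.2, 3.0, 4.6],
        "pbc":              [5.5, 3.2, 4.6, 6.0, 2.8, 5.0, 3.9, 5.3],
        "risk":             [4.8, 2.0, 3.5, 5.8, 2.2, 4.5, 3.1, 4.9],
        "trust":            [3.0, 4.5, 3.8, 2.5, 5.0, 3.2, 4.0, 2.8],
    })

    # Intentions regressed on TPB constructs plus risk and trust.
    model = smf.ols(
        "intention ~ attitude + injunctive_norm + descriptive_norm + pbc + risk + trust",
        data=df,
    ).fit()
    print(model.params)
    ```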

  3. Model Description and Proposed Application for the Enlisted Personnel Inventory, Cost, and Compensation Model

    DTIC Science & Technology

    1994-07-01

provide additional information for the user/policy analyst: Eichers, D., Sola, M., McLernan, G., EPICC User’s Manual, Systems Research and Applications...maintenance, and a set of on-line help screens. Each is further discussed below, and a full discussion is included in the EPICC User’s Manual. Menu Based...written documentation (user’s manual) that will be provided with the model. The next chapter discusses the validation of the inventory projection and

  4. HYDROLOGIC EVALUATION OF LANDFILL PERFORMANCE (HELP) MODEL - USER'S GUIDE FOR VERSION 3

    EPA Science Inventory

    This report documents the solution methods and process descriptions used in the Version 3 of the HELP model. Program documentation including program options, system and operating requirements, file structures, program structure and variable descriptions are provided in a separat...

  5. Unified aeroacoustics analysis for high speed turboprop aerodynamics and noise. Volume 4: Computer user's manual for UAAP turboprop aeroacoustic code

    NASA Astrophysics Data System (ADS)

    Menthe, R. W.; McColgan, C. J.; Ladden, R. M.

    1991-05-01

    The Unified AeroAcoustic Program (UAAP) code calculates the airloads on a single rotation prop-fan, or propeller, and couples these airloads with an acoustic radiation theory, to provide estimates of near-field or far-field noise levels. The steady airloads can also be used to calculate the nonuniform velocity components in the propeller wake. The airloads are calculated using a three dimensional compressible panel method which considers the effects of thin, cambered, multiple blades which may be highly swept. These airloads may be either steady or unsteady. The acoustic model uses the blade thickness distribution and the steady or unsteady aerodynamic loads to calculate the acoustic radiation. The users manual for the UAAP code is divided into five sections: general code description; input description; output description; system description; and error codes. The user must have access to IMSL10 libraries (MATH and SFUN) for numerous calls made for Bessel functions and matrix inversion. For plotted output users must modify the dummy calls to plotting routines included in the code to system-specific calls appropriate to the user's installation.

  6. Unified aeroacoustics analysis for high speed turboprop aerodynamics and noise. Volume 4: Computer user's manual for UAAP turboprop aeroacoustic code

    NASA Technical Reports Server (NTRS)

    Menthe, R. W.; Mccolgan, C. J.; Ladden, R. M.

    1991-01-01

    The Unified AeroAcoustic Program (UAAP) code calculates the airloads on a single rotation prop-fan, or propeller, and couples these airloads with an acoustic radiation theory, to provide estimates of near-field or far-field noise levels. The steady airloads can also be used to calculate the nonuniform velocity components in the propeller wake. The airloads are calculated using a three dimensional compressible panel method which considers the effects of thin, cambered, multiple blades which may be highly swept. These airloads may be either steady or unsteady. The acoustic model uses the blade thickness distribution and the steady or unsteady aerodynamic loads to calculate the acoustic radiation. The users manual for the UAAP code is divided into five sections: general code description; input description; output description; system description; and error codes. The user must have access to IMSL10 libraries (MATH and SFUN) for numerous calls made for Bessel functions and matrix inversion. For plotted output users must modify the dummy calls to plotting routines included in the code to system-specific calls appropriate to the user's installation.

  7. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.

  8. Research-Based Model for Adult Consumer-Homemaking Education.

    ERIC Educational Resources Information Center

    Ball State Univ., Muncie, IN.

    This model is designed to be used as a guide by all teachers and designers of adult vocational consumer and homemaking courses who usually function as program planners. Chapter 1 contains an operational definition, the rationale, and description of intended users. Chapter 2 presents the model description with an overview and discussion of the…

  9. User's manual for the Simulated Life Analysis of Vehicle Elements (SLAVE) model

    NASA Technical Reports Server (NTRS)

    Paul, D. D., Jr.

    1972-01-01

    The simulated life analysis of vehicle elements model was designed to perform statistical simulation studies for any constant loss rate. The outputs of the model consist of the total number of stages required, stages successfully completing their lifetime, and average stage flight life. This report contains a complete description of the model. Users' instructions and interpretation of input and output data are presented such that a user with little or no prior programming knowledge can successfully implement the program.
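    The SLAVE code itself is not available from the abstract. Below is a minimal Monte Carlo sketch of stage attrition at a constant loss rate, producing the three outputs the abstract names (total stages, stages completing their lifetime, and average stage flight life); the loss rate and lifetime values are assumptions.

    ```python
    import random

    def simulate_stage_lives(n_stages, loss_rate, max_flights, rng=random.Random(1)):
        """Monte Carlo sketch of vehicle-stage attrition at a constant loss rate.

        Each stage flies until it is lost (probability loss_rate per flight) or
        reaches max_flights; returns the number of flights completed by each stage.
        """
        lives = []
        for _ in range(n_stages):
            flights = 0
            while flights < max_flights and rng.random() >= loss_rate:
                flights += 1
            lives.append(flights)
        return lives

    lives = simulate_stage_lives(n_stages=1000, loss_rate=0.02, max_flights=20)
    survivors = sum(1 for f in lives if f == 20)     # stages completing their lifetime
    print("stages completing lifetime:", survivors)
    print("average stage flight life:", sum(lives) / len(lives))
    ```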

  10. Solid rocket booster performance evaluation model. Volume 2: Users manual

    NASA Technical Reports Server (NTRS)

    1974-01-01

    This users manual for the solid rocket booster performance evaluation model (SRB-II) contains descriptions of the model, the program options, the required program inputs, the program output format and the program error messages. SRB-II is written in FORTRAN and is operational on both the IBM 370/155 and the MSFC UNIVAC 1108 computers.

  11. User Interface Models for Multidisciplinary Bibliographic Information Dissemination Centers.

    ERIC Educational Resources Information Center

    Zipperer, W. C.

    Two information dissemination centers at University of California at Los Angeles and University of Georgia studied the interactions between computer based search facilities and their users. The study, largely descriptive in nature, investigated the interaction processes between data base users and profile analysis or information specialists in…

  12. TIM Version 3.0 beta Technical Description and User Guide - Appendix A - User's Guidance for TIM v.3.0(beta)

    EPA Pesticide Factsheets

Provides detailed guidance to the user on how to select input parameters for running the Terrestrial Investigation Model (TIM) and recommendations for default values that can be used when no chemical-specific or species-specific information is available.

  13. Report of the LSPI/NASA Workshop on Lunar Base Methodology Development

    NASA Technical Reports Server (NTRS)

    Nozette, Stewart; Roberts, Barney

    1985-01-01

    Groundwork was laid for computer models which will assist in the design of a manned lunar base. The models, herein described, will provide the following functions for the successful conclusion of that task: strategic planning; sensitivity analyses; impact analyses; and documentation. Topics addressed include: upper level model description; interrelationship matrix; user community; model features; model descriptions; system implementation; model management; and plans for future action.

  14. Service-oriented model-encapsulation strategy for sharing and integrating heterogeneous geo-analysis models in an open web environment

    NASA Astrophysics Data System (ADS)

    Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian

    2016-04-01

    Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that the use of a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies should be conducted that are aimed at shielding original execution differences to create services which can be reused in the web environment. Although some model service standards (such as Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to fully represent within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service into another computer still encounter problems (e.g., they cannot access the model deployment dependencies information). This study presents a strategy for encapsulating geo-analysis models to reduce problems encountered when sharing models between model providers and model users and supports the tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. Based on the model description information, the methods for encapsulating the model-execution program to model services and for describing model-service deployment information are also included in the proposed strategy. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that it is more convenient for modellers to share and integrate heterogeneous geo-analysis models in cloud computing platforms.
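    As a concrete illustration of the WPS operations named above (GetCapabilities, DescribeProcess, Execute), the following Python sketch issues WPS 1.0.0 key-value-pair requests; the endpoint URL, process identifier, and data inputs are hypothetical.

    ```python
    import requests

    # Hypothetical WPS endpoint and process identifier; the KVP parameters follow
    # the OGC WPS 1.0.0 operations mentioned in the abstract.
    WPS_URL = "http://example.org/wps"

    capabilities = requests.get(WPS_URL, params={
        "service": "WPS", "version": "1.0.0", "request": "GetCapabilities",
    })

    description = requests.get(WPS_URL, params={
        "service": "WPS", "version": "1.0.0", "request": "DescribeProcess",
        "identifier": "runoff_model",           # hypothetical geo-analysis model
    })

    execution = requests.get(WPS_URL, params={
        "service": "WPS", "version": "1.0.0", "request": "Execute",
        "identifier": "runoff_model",
        "datainputs": "rainfall_mm=25;catchment_area_km2=12.5",
    })
    print(capabilities.status_code, description.status_code, execution.status_code)
    ```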

  15. OCD: The offshore and coastal dispersion model. Volume 1. User's guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiCristofaro, D.C.; Hanna, S.R.

    1989-11-01

    The Offshore and Coastal Dispersion (OCD) Model has been developed to simulate the effect of offshore emissions from point, area, or line sources on the air quality of coastal regions. The OCD model was adapted from the EPA guideline model MPTER (EPA, 1980). Modifications were made to incorporate overwater plume transport and dispersion as well as changes that occur as the plume crosses the shoreline. This is a revised OCD model, the fourth version to date. The volume is the User's Guide which includes a Model overview, technical description, user's instructions, and notes on model evaluation and results.

  16. User's guide to the SEPHIS computer code for calculating the Thorex solvent extraction system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, S.B.; Rainey, R.H.

    1979-05-01

    The SEPHIS computer program was developed to simulate the countercurrent solvent extraction process. The code has now been adapted to model the Acid Thorex flow sheet. This report represents a practical user's guide to SEPHIS - Thorex containing a program description, user information, program listing, and sample input and output.

  17. Design of Software for Design of Finite Element for Structural Analysis. Ph.D. Thesis - Stuttgart Univ., 22 Nov. 1983

    NASA Technical Reports Server (NTRS)

    Helfrich, Reinhard

    1987-01-01

The concepts of software engineering which allow a user of the finite element method to describe a model, to collect and to check the model data in a data base as well as to form the matrices required for a finite element calculation are examined. Next, the components of the model description are conceived, including the mesh tree, the topology, the configuration, the kinematic boundary conditions, the data for each element, and the loads. The possibilities for description and review of the data are considered. The concept of the segments for the modularization of the programs follows the components of the model description. The significance of the mesh tree as a global guiding structure will be understood in view of the principle of the unity of the model, the mesh tree, and the data base. The user-friendly aspects of the software system will be summarized: the principle of language communication, the data generators, error processing, and data security.

  18. Wake Vortex Inverse Model User's Guide

    NASA Technical Reports Server (NTRS)

    Lai, David; Delisi, Donald

    2008-01-01

    NorthWest Research Associates (NWRA) has developed an inverse model for inverting landing aircraft vortex data. The data used for the inversion are the time evolution of the lateral transport position and vertical position of both the port and starboard vortices. The inverse model performs iterative forward model runs using various estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Forward model predictions of lateral transport and altitude are then compared with the observed data. Differences between the data and model predictions guide the choice of vortex parameter values, crosswind profile and circulation evolution in the next iteration. Iterations are performed until a user-defined criterion is satisfied. Currently, the inverse model is set to stop when the improvement in the rms deviation between the data and model predictions is less than 1 percent for two consecutive iterations. The forward model used in this inverse model is a modified version of the Shear-APA model. A detailed description of this forward model, the inverse model, and its validation are presented in a different report (Lai, Mellman, Robins, and Delisi, 2007). This document is a User's Guide for the Wake Vortex Inverse Model. Section 2 presents an overview of the inverse model program. Execution of the inverse model is described in Section 3. When executing the inverse model, a user is requested to provide the name of an input file which contains the inverse model parameters, the various datasets, and directories needed for the inversion. A detailed description of the list of parameters in the inversion input file is presented in Section 4. A user has an option to save the inversion results of each lidar track in a mat-file (a condensed data file in Matlab format). These saved mat-files can be used for post-inversion analysis. A description of the contents of the saved files is given in Section 5. An example of an inversion input file, with preferred parameters values, is given in Appendix A. An example of the plot generated at a normal completion of the inversion is shown in Appendix B.
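    The Shear-APA forward model and the inversion input-file format are documented elsewhere (Lai, Mellman, Robins, and Delisi, 2007). The sketch below only illustrates the generic iterate-compare-stop structure described above, including the stated stopping rule (rms improvement below 1 percent for two consecutive iterations); the forward model and proposal step are toy stand-ins, not the actual vortex model.

    ```python
    import numpy as np

    def rms(a, b):
        return float(np.sqrt(np.mean((a - b) ** 2)))

    def invert(observed, forward_model, initial_params, propose, max_iter=100):
        """Generic iterative inversion with the stopping rule described in the guide:
        stop when the rms improvement is below 1% for two consecutive iterations."""
        params = dict(initial_params)
        best_rms = rms(observed, forward_model(params))
        small_improvements = 0
        for _ in range(max_iter):
            candidate = propose(params)                      # adjust parameter estimates
            candidate_rms = rms(observed, forward_model(candidate))
            improvement = (best_rms - candidate_rms) / best_rms if best_rms else 0.0
            if candidate_rms < best_rms:
                params, best_rms = candidate, candidate_rms
            small_improvements = small_improvements + 1 if improvement < 0.01 else 0
            if small_improvements >= 2:
                break
        return params, best_rms

    # Toy usage: recover a lateral-transport rate from noisy synthetic "observations".
    rng = np.random.default_rng(0)
    t = np.linspace(0, 60, 30)
    observed = 2.5 * t + rng.normal(0, 5, t.size)
    forward = lambda p: p["rate"] * t                          # toy forward model
    propose = lambda p: {"rate": p["rate"] + rng.normal(0, 0.1)}
    print(invert(observed, forward, {"rate": 1.0}, propose))
    ```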

  19. The 1973 NASA payload model: Space opportunities 1973 - 1991. [characteristics of payloads and requirements of user community

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The tables of schedules and descriptions which portray the 1973 NASA Payload Model are presented. The schedules cover all NASA programs and the anticipated requirements of the user community, not including the Department of Defense, for the 1973 to 1991 period. The descriptions give an indication of what the payload is expected to accomplish, its characteristics, and where it is going. The payload flight schedules shown for each of the discipline areas indicate the time frame in which individual payloads will be launched, serviced, or retrieved. These do not necessarily constitute shuttle flights, however, since more than one payload can be flown on a single shuttle flight depending on size, weight, orbital destination, and the suitability of combining them. The weight, dimension, and destination data represent approximations of the payload characteristics as estimated by the Program Offices. Payload codes are provided for easy correlation between the schedules and descriptions of the Payload Model and subsequent documentation which may reference this model.

  20. User's manual for generalized ILSGLD-ILS glide slope performance prediction : multipath scattering

    DOT National Transportation Integrated Search

    1976-11-01

This manual presents the computer program package for the generalized ILSGLD scattering model. The text includes a complete description of the program itself as well as 3 brief descriptions of the ILS system and antenna patterns. The program listin...

  1. TIM Version 3.0 beta Technical Description and User Guide - Appendix B - Example input file for TIMv3.0

    EPA Pesticide Factsheets

    Terrestrial Investigation Model, TIM, has several appendices to its user guide. This is the appendix that includes an example input file in its preserved format. Both parameters and comments defining them are included.

  2. Users matter : multi-agent systems model of high performance computing cluster users.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Hood, C. S.; Decision and Information Sciences

    2005-01-01

High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.
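    The study's agent specifications are not reproduced in the abstract. The toy sketch below only illustrates the user-level idea, adaptive agents changing their job-submission behavior in response to cluster congestion; the agent rule, user count, and cluster capacity are invented.

    ```python
    import random

    rng = random.Random(0)

    class ClusterUser:
        """Toy user agent: adapts how many jobs it submits based on recent waits."""
        def __init__(self):
            self.jobs_per_step = 2

        def act(self, last_wait):
            # Adaptive behaviour: submit fewer jobs after long waits, more after short ones.
            if last_wait > 5:
                self.jobs_per_step = max(1, self.jobs_per_step - 1)
            else:
                self.jobs_per_step += 1
            return self.jobs_per_step

    users = [ClusterUser() for _ in range(10)]
    queue = 0
    capacity = 15            # jobs the cluster completes per step (assumed)
    for step in range(20):
        wait = queue / capacity
        queue += sum(u.act(wait) for u in users)
        queue = max(0, queue - capacity)
        print(f"step {step}: queue length {queue}")
    ```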

  3. Building Program Models Incrementally from Informal Descriptions.

    DTIC Science & Technology

    1979-10-01

specified at each step. Since the user controls the interaction, the user may determine the order in which information flows into PMB. Information is received...until only ten years ago the term "automatic programming" referred to the development of the assemblers, macro expanders, and compilers for these

  4. Software Technology for Adaptable, Reliable Systems (STARS). Repository Integration AdaKNET Software User’s Manual

    DTIC Science & Technology

    1990-10-03

4.1. Mapping the Conceptual Model to the Implementation ... 4.2. Overview of...browser-editor application. Finally, appendix A provides a detailed description of the AdaKNET conceptual model; users of AdaKNET should fami...provide a brief summary of the semantics of the underlying conceptual model implemented by AdaKNET, use of the AdaKNET ADT will require a more thorough

  5. State transition storyboards: A tool for designing the Goldstone solar system radar data acquisition system user interface software

    NASA Technical Reports Server (NTRS)

    Howard, S. D.

    1987-01-01

    Effective user interface design in software systems is a complex task that takes place without adequate modeling tools. By combining state transition diagrams and the storyboard technique of filmmakers, State Transition Storyboards were developed to provide a detailed modeling technique for the Goldstone Solar System Radar Data Acquisition System human-machine interface. Illustrations are included with a description of the modeling technique.

  6. Solid rocket booster performance evaluation model. Volume 1: Engineering description

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The space shuttle solid rocket booster performance evaluation model (SRB-II) is made up of analytical and functional simulation techniques linked together so that a single pass through the model will predict the performance of the propulsion elements of a space shuttle solid rocket booster. The available options allow the user to predict static test performance, predict nominal and off nominal flight performance, and reconstruct actual flight and static test performance. Options selected by the user are dependent on the data available. These can include data derived from theoretical analysis, small scale motor test data, large motor test data and motor configuration data. The user has several options for output format that include print, cards, tape and plots. Output includes all major performance parameters (Isp, thrust, flowrate, mass accounting and operating pressures) as a function of time as well as calculated single point performance data. The engineering description of SRB-II discusses the engineering and programming fundamentals used, the function of each module, and the limitations of each module.

  7. Social Image Captioning: Exploring Visual Attention and User Attention.

    PubMed

    Wang, Leiquan; Chu, Xiaoliang; Zhang, Weishan; Wei, Yiwei; Sun, Weichen; Wu, Chunlei

    2018-02-22

Image captioning with a natural language has been an emerging trend. However, the social image, associated with a set of user-contributed tags, has been rarely investigated for a similar task. The user-contributed tags, which could reflect the user attention, have been neglected in conventional image captioning. Most existing image captioning models cannot be applied directly to social image captioning. In this work, a dual attention model is proposed for social image captioning by combining the visual attention and user attention simultaneously. Visual attention is used to compress a large amount of salient visual information, while user attention is applied to adjust the description of the social images with user-contributed tags. Experiments conducted on the Microsoft (MS) COCO dataset demonstrate the superiority of the proposed method of dual attention.
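    The paper's network architecture is not given in the abstract. The toy NumPy sketch below only illustrates the dual-attention idea, one attention distribution over image-region features and another over user-tag embeddings, fused before predicting the next caption word; the dimensions and features are random placeholders.

    ```python
    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    rng = np.random.default_rng(0)
    d = 8                                    # embedding size (toy)
    visual_feats = rng.normal(size=(5, d))   # 5 image regions
    tag_embeds = rng.normal(size=(3, d))     # 3 user-contributed tags
    query = rng.normal(size=d)               # decoder state at the current word

    # Visual attention: weight image regions by relevance to the decoder state.
    visual_ctx = softmax(visual_feats @ query) @ visual_feats
    # User attention: weight user-contributed tags the same way.
    user_ctx = softmax(tag_embeds @ query) @ tag_embeds

    # Dual attention: fuse the two contexts before predicting the next caption word.
    fused = np.concatenate([visual_ctx, user_ctx])
    print(fused.shape)   # (16,)
    ```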

  8. Social Image Captioning: Exploring Visual Attention and User Attention

    PubMed Central

    Chu, Xiaoliang; Zhang, Weishan; Wei, Yiwei; Sun, Weichen; Wu, Chunlei

    2018-01-01

Image captioning with a natural language has been an emerging trend. However, the social image, associated with a set of user-contributed tags, has been rarely investigated for a similar task. The user-contributed tags, which could reflect the user attention, have been neglected in conventional image captioning. Most existing image captioning models cannot be applied directly to social image captioning. In this work, a dual attention model is proposed for social image captioning by combining the visual attention and user attention simultaneously. Visual attention is used to compress a large amount of salient visual information, while user attention is applied to adjust the description of the social images with user-contributed tags. Experiments conducted on the Microsoft (MS) COCO dataset demonstrate the superiority of the proposed method of dual attention. PMID:29470409

  9. Development of fuel oil management system software: Phase 1, Tank management module. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lange, H.B.; Baker, J.P.; Allen, D.

    1992-01-01

The Fuel Oil Management System (FOMS) is a micro-computer based software system being developed to assist electric utilities that use residual fuel oils with oil purchase and end-use decisions. The Tank Management Module (TMM) is the first FOMS module to be produced. TMM enables the user to follow the mixing status of oils contained in a number of oil storage tanks. The software contains a computational model of residual fuel oil mixing which addresses mixing that occurs as one oil is added to another in a storage tank and also purposeful mixing of the tank by propellers, recirculation or convection. The model also addresses the potential for sludge formation due to incompatibility of oils being mixed. Part 1 of the report presents a technical description of the mixing model and a description of its development. Steps followed in developing the mixing model included: (1) definition of ranges of oil properties and tank design factors used by utilities; (2) review and adaption of prior applicable work; (3) laboratory development; and (4) field verification. Also, a brief laboratory program was devoted to exploring the suitability of suggested methods for predicting viscosities, flash points and pour points of oil mixtures. Part 2 of the report presents a functional description of the TMM software and a description of its development. The software development program consisted of the following steps: (1) on-site interviews at utilities to prioritize needs and characterize user environments; (2) construction of the user interface; and (3) field testing the software.

  10. Development of fuel oil management system software: Phase 1, Tank management module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lange, H.B.; Baker, J.P.; Allen, D.

    1992-01-01

The Fuel Oil Management System (FOMS) is a micro-computer based software system being developed to assist electric utilities that use residual fuel oils with oil purchase and end-use decisions. The Tank Management Module (TMM) is the first FOMS module to be produced. TMM enables the user to follow the mixing status of oils contained in a number of oil storage tanks. The software contains a computational model of residual fuel oil mixing which addresses mixing that occurs as one oil is added to another in a storage tank and also purposeful mixing of the tank by propellers, recirculation or convection. The model also addresses the potential for sludge formation due to incompatibility of oils being mixed. Part 1 of the report presents a technical description of the mixing model and a description of its development. Steps followed in developing the mixing model included: (1) definition of ranges of oil properties and tank design factors used by utilities; (2) review and adaption of prior applicable work; (3) laboratory development; and (4) field verification. Also, a brief laboratory program was devoted to exploring the suitability of suggested methods for predicting viscosities, flash points and pour points of oil mixtures. Part 2 of the report presents a functional description of the TMM software and a description of its development. The software development program consisted of the following steps: (1) on-site interviews at utilities to prioritize needs and characterize user environments; (2) construction of the user interface; and (3) field testing the software.
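    The mixture-property methods evaluated in the report are not identified in the abstract. One widely used rule for estimating the viscosity of an oil blend is the Refutas viscosity blending index, shown below as an illustrative sketch; it is not necessarily the method adopted in this work, and the component viscosities and mass fractions are made up.

    ```python
    import math

    def refutas_vbi(viscosity_cst):
        """Viscosity Blending Index (Refutas) for a kinematic viscosity in cSt."""
        return 14.534 * math.log(math.log(viscosity_cst + 0.8)) + 10.975

    def blend_viscosity(components):
        """Estimate the kinematic viscosity of a blend from (mass_fraction, cSt) pairs."""
        vbi = sum(w * refutas_vbi(v) for w, v in components)
        return math.exp(math.exp((vbi - 10.975) / 14.534)) - 0.8

    # 60/40 blend of a heavy residual oil (380 cSt) and a lighter cutter stock (12 cSt)
    print(round(blend_viscosity([(0.6, 380.0), (0.4, 12.0)]), 1))
    ```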

  11. Chapter 4: Variant descriptions

    Treesearch

    Duncan C. Lutes; Donald C. E. Robinson

    2003-01-01

    The Fire and Fuels Extension (FFE) to the Forest Vegetation Simulator (FVS) simulates fuel dynamics and potential fire behavior over time, in the context of stand development and management. This report documents differences between geographic variants of the FFE. It is a companion document to the FFE "Model Description" and "User's Guide."...

  12. Mental models of a water management system in a green building.

    PubMed

    Kalantzis, Anastasia; Thatcher, Andrew; Sheridan, Craig

    2016-11-01

    This intergroup case study compared users' mental models with an expert design model of a water management system in a green building. The system incorporates a constructed wetland component and a rainwater collection pond that together recycle water for re-use in the building and its surroundings. The sample consisted of five building occupants and the cleaner (6 users) and two experts who were involved with the design of the water management system. Users' mental model descriptions and the experts' design model were derived from in-depth interviews combined with self-constructed (and verified) diagrams. Findings from the study suggest that there is considerable variability in the user mental models that could impact the efficient functioning of the water management system. Recommendations for improvements are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. The Delta Launch Vehicle Model 2914 Series

    NASA Technical Reports Server (NTRS)

    Gunn, C. R.

    1973-01-01

    The newest Delta launch vehicle configuration, Model 2914 is described for potential users together with recent flight results. A functional description of the vehicle, its performance, flight profile, flight environment, injection accuracy, spacecraft integration requirements, user organizational interfaces, launch operations, costs and reimbursable users payment plan are provided. The versatile, relatively low cost Delta has a flight demonstrated reliability record of 92 percent that has been established in 96 launches over twelve years while concurrently undergoing ten major upratings to keep pace with the ever increasing performance and reliability requirements of its users. At least 40 more launches are scheduled over the next three years from the Eastern and Western Test Ranges.

  14. Financial constraints in capacity planning: a national utility regulatory model (NUREG). Volume III of III: software description. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1981-10-29

    This volume is the software description for the National Utility Regulatory Model (NUREG). This is the third of three volumes provided by ICF under contract number DEAC-01-79EI-10579. These three volumes are: a manual describing the NUREG methodology; a users guide; and a description of the software. This manual describes the software which has been developed for NUREG. This includes a listing of the source modules. All computer code has been written in FORTRAN.

  15. BIBLIO: A Computer System Designed to Support the Near-Library User Model of Information Retrieval.

    ERIC Educational Resources Information Center

    Belew, Richard K.; Holland, Maurita Peterson

    1988-01-01

    Description of the development of the Information Exchange Facility, a prototype microcomputer-based personal bibliographic facility, covers software selection, user selection, overview of the system, and evaluation. The plan for an integrated system, BIBLIO, and the future role of libraries are discussed. (eight references) (MES)

  16. Strategic Control Algorithm Development : Volume 4A. Computer Program Report.

    DOT National Transportation Integrated Search

    1974-08-01

    A description of the strategic algorithm evaluation model is presented, both at the user and programmer levels. The model representation of an airport configuration, environmental considerations, the strategic control algorithm logic, and the airplan...

  17. Strategic Control Algorithm Development : Volume 4B. Computer Program Report (Concluded)

    DOT National Transportation Integrated Search

    1974-08-01

    A description of the strategic algorithm evaluation model is presented, both at the user and programmer levels. The model representation of an airport configuration, environmental considerations, the strategic control algorithm logic, and the airplan...

  18. Home Energy Scoring Tools (website) and Application Programming Interfaces, APIs (aka HEScore)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Evan; Bourassa, Norm; Rainer, Leo

A web-based residential energy rating tool with APIs that runs on the LBNL website. It provides customized estimates of residential energy use and energy bills based on building description information provided by the user. Energy use is estimated using engineering models developed at LBNL. Space heating and cooling use is based on the DOE-2.1E building simulation model. Other end uses (water heating, appliances, lighting, and misc. equipment) are based on engineering models developed by LBNL.

  19. Home Energy Scoring Tools (website) and Application Programming Interfaces, APIs (aka HEScore)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Evan; Bourassa, Norm; Rainer, Leo

    2016-04-22

    A web-based residential energy rating tool with APIs that runs on the LBNL website: provides customized estimates of residential energy use and energy bills based on building description information provided by the user. Energy use is estimated using engineering models developed at LBNL. Space heating and cooling use is based on the DOE-2.1E building simulation model. Other end uses (water heating, appliances, lighting, and miscellaneous equipment) are based on engineering models developed by LBNL.

  20. UPDATE ON EPA'S URBAN WATERSHED MANAGEMENT BRANCH MODELING ACTIVITIES

    EPA Science Inventory

    This paper provides the Stormwater Management Model (SWMM) user community with a description of the Environmental Protection Agency (EPA's) Office of Research and Development (ORD) approach to urban watershed modeling research and provides an update on current ORD SWMM-related pr...

  1. The GIS weasel - An interface for the development of spatial information in modeling

    USGS Publications Warehouse

    Viger, R.J.; Markstrom, S.M.; Leavesley, G.H.; ,

    2005-01-01

    The GIS Weasel is a map- and Graphical User Interface (GUI)-driven tool developed to aid modelers in the delineation and characterization of geographic features and in their parameterization for use in distributed or lumped-parameter physical process models. The interface does not require user expertise in geographic information systems (GIS), but the user does need to know how the model will use the output from the GIS Weasel. The GIS Weasel uses Workstation ArcInfo and its Grid extension, and it will run on any platform that Workstation ArcInfo supports (i.e., numerous flavors of Unix and Microsoft Windows). The GIS Weasel requires an input ArcInfo grid containing some topographic description of the Area of Interest (AOI). This is normally a digital elevation model, but it can be the surface of a ground-water table or any other surface from which flow direction can be resolved. The user may define the AOI as a custom drainage area based on an interactively specified watershed outlet point, or use a previously created map. The user can then apply any combination of the GIS Weasel's tools to create one or more maps depicting different kinds of geographic features. Once the spatial feature maps have been prepared, the GIS Weasel's many parameterization routines can be used to create descriptions of each element in each of the user's maps. Over 200 parameterization routines currently exist, generating information about shape, area, and topological association with other features of the same or different maps, as well as many types of information based on ancillary data layers such as soil and vegetation properties. These tools easily integrate other similarly formatted data sets.

  2. Use of Unified Modeling Language (UML) in Model-Based Development (MBD) For Safety-Critical Applications

    DTIC Science & Technology

    2014-12-01

    appears that UML is becoming the de facto MBD language. OMG® states the following on the MDA® FAQ page: “Although not formally required [for MBD], UML...a known limitation [42], so UML users should plan accordingly, especially for safety-critical programs. For example, “models are not used to...description of the MBD tool chain can be produced. That description could be resident in a Plan for Software Aspects of Certification (PSAC) or Software

  3. SHAWNEE LIME/LIMESTONE SCRUBBING COMPUTERIZED DESIGN/COST-ESTIMATE MODEL USERS MANUAL

    EPA Science Inventory

    The manual gives a general description of the Shawnee lime/limestone scrubbing computerized design/cost-estimate model and detailed procedures for using it. It describes all inputs and outputs, along with available options. The model, based on Shawnee Test Facility scrubbing data...

  4. KABAM Version 1.0 User's Guide and Technical Documentation - Appendix A - Description of Bioaccumulation Model

    EPA Pesticide Factsheets

    The purpose of this model is to estimate chemical concentrations (CB) and BCF and BAF values for aquatic ecosystems. KABAM is a simulation model used to predict pesticide concentrations in aquatic regions for use in exposure assessments.
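
    For context, BCF (bioconcentration factor) and BAF (bioaccumulation factor) are standard ratios; the generic definitions below are not KABAM's full food-web equations, which are given in the appendix itself:

      \mathrm{BCF} = \frac{C_B}{C_W} \quad \text{(water-only, laboratory-type exposure)}, \qquad \mathrm{BAF} = \frac{C_B}{C_W} \quad \text{(field exposure, respiratory plus dietary uptake)},

    where C_B is the chemical concentration in the organism and C_W is the freely dissolved concentration in water.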

  5. Universal model for collective access patterns in the Internet traffic dynamics: A superstatistical approach

    NASA Astrophysics Data System (ADS)

    Tamazian, A.; Nguyen, V. D.; Markelov, O. A.; Bogachev, M. I.

    2016-07-01

    We suggest a universal phenomenological description for the collective access patterns in the Internet traffic dynamics both at local and wide area network levels that takes into account erratic fluctuations imposed by cooperative user behaviour. Our description is based on the superstatistical approach and leads to the q-exponential inter-session time and session size distributions that are also in perfect agreement with empirical observations. The validity of the proposed description is confirmed explicitly by the analysis of complete 10-day traffic traces from the WIDE backbone link and from the local campus area network downlink from the Internet Service Provider. Remarkably, the same functional forms have been observed in the historic access patterns from single WWW servers. The suggested approach effectively accounts for the complex interplay of both “calm” and “bursty” user access patterns within a single-model setting. It also provides average sojourn time estimates with reasonable accuracy, as indicated by queuing system performance simulations, thereby largely overcoming the failure of Poisson modelling of the Internet traffic dynamics.
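
    For reference, the q-exponential referred to above is the standard Tsallis-type generalization of the exponential distribution; the exact parameterization used by the authors may differ from this generic form:

      P(x) \propto \left[ 1 + (q-1)\,\frac{x}{x_0} \right]^{-\frac{1}{q-1}}, \qquad q > 1,

    which recovers the ordinary exponential distribution P(x) \propto e^{-x/x_0} in the limit q \to 1.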

  6. Recent Updates of A Multi-Phase Transport (AMPT) Model

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Wei

    2008-10-01

    We will present recent updates to the AMPT model, a Monte Carlo transport model for high-energy heavy-ion collisions, since its first public release in 2004 and the corresponding detailed descriptions in Phys. Rev. C 72, 064901 (2005). The updates often result from user requests. Some of these updates expand the physics processes or descriptions in the model, while others improve the usability of the model, such as providing the initial parton distributions or helping avoid crashes on some operating systems. We will also explain how the AMPT model is being maintained and updated.

  7. Users guide for the hydroacoustic coverage assessment model (HydroCAM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrell, T., LLNL

    1997-12-01

    A model for predicting the detection and localization performance of hydroacoustic monitoring networks has been developed. The model accounts for major factors affecting global-scale acoustic propagation in the ocean, including horizontal refraction, travel time variability due to spatial and temporal fluctuations in the ocean, and detailed characteristics of the source. Graphical user interfaces are provided to set up the models and visualize the results. The model produces maps of network detection coverage and localization area of uncertainty, as well as intermediate results such as predicted path amplitudes, travel time, and travel time variance. This Users Guide for the model is organized into three sections. First, a summary of the functionality available in the model is presented, including example output products. The second section provides detailed descriptions of each of the models contained in the system. The last section describes how to run the model, including a summary of each data input form in the user interface.

  8. Manual for obscuration code with space station applications

    NASA Technical Reports Server (NTRS)

    Marhefka, R. J.; Takacs, L.

    1986-01-01

    The Obscuration Code, referred to as SHADOW, is a user-oriented computer code to determine the cast shadow of an antenna in a complex environment onto the far-zone sphere. The surrounding structure can be composed of multiple composite cone frustums and multiply sided flat plates. These structural pieces are ideal for modeling space station configurations. The means of describing the geometry input is compatible with the NEC-Basic Scattering Code. In addition, an interactive mode of operation has been provided for DEC VAX computers. The first part of this document is a user's manual designed to give a description of the method used to obtain the shadow map, to provide an overall view of the operation of the computer code, to instruct a user in how to model structures, and to give examples of inputs and outputs. The second part is a code manual that details how to set up the interactive and non-interactive modes of the code and provides a listing and brief description of each of the subroutines.

  9. An Overview of the GIS Weasel

    USGS Publications Warehouse

    Viger, Roland J.

    2008-01-01

    This fact sheet provides a high-level description of the GIS Weasel, a software system designed to aid users in preparing spatial information as input to lumped and distributed parameter environmental simulation models (ESMs). The GIS Weasel provides geographic information system (GIS) tools to help create maps of geographic features relevant to the application of a user's ESM and to generate parameters from those maps. The operation of the GIS Weasel does not require a user to be a GIS expert, only that a user has an understanding of the spatial information requirements of the model. The GIS Weasel software system provides a GIS-based graphical user interface (GUI), C programming language executables, and general utility scripts. The software will run on any computing platform where ArcInfo Workstation (version 8.1 or later) and the GRID extension are accessible. The user controls the GIS Weasel by interacting with menus, maps, and tables.

  10. RAMONA-4B a computer code with three-dimensional neutron kinetics for BWR and SBWR system transients - user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.

    This document is the User's Manual for the Boiling Water Reactor (BWR) and Simplified Boiling Water Reactor (SBWR) systems transient code RAMONA-4B. The code uses a three-dimensional neutron-kinetics model coupled with a multichannel, nonequilibrium, drift-flux, two-phase flow model of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients. Chapter 1 gives an overview of the code's capabilities and limitations; Chapter 2 describes the code's structure, lists major subroutines, and discusses the computer requirements. Chapter 3 covers the code, auxiliary codes, and instructions for running RAMONA-4B on Sun SPARC and IBM workstations. Chapter 4 contains component descriptions and detailed card-by-card input instructions. Chapter 5 provides samples of the tabulated output for the steady-state and transient calculations and discusses the plotting procedures for the steady-state and transient calculations. Three appendices contain important user and programmer information: lists of plot variables (Appendix A), listings of the input deck for the sample problem (Appendix B), and a description of the plotting program PAD (Appendix C). 24 refs., 18 figs., 11 tabs.

  11. Predictive minimum description length principle approach to inferring gene regulatory networks.

    PubMed

    Chaitankar, Vijender; Zhang, Chaoyang; Ghosh, Preetam; Gong, Ping; Perkins, Edward J; Deng, Youping

    2011-01-01

    Reverse engineering of gene regulatory networks using information theory models has received much attention due to its simplicity, low computational cost, and capability of inferring large networks. One of the major problems with information theory models is determining the threshold that defines the regulatory relationships between genes. The minimum description length (MDL) principle has been implemented to overcome this problem. The description length of the MDL principle is the sum of the model length and the data encoding length. A user-specified fine-tuning parameter is used as a control mechanism between model and data encoding, but it is difficult to find the optimal parameter. In this work, we propose a new inference algorithm that incorporates mutual information (MI), conditional mutual information (CMI), and the predictive minimum description length (PMDL) principle to infer gene regulatory networks from DNA microarray data. In this algorithm, the information theoretic quantities MI and CMI determine the regulatory relationships between genes, and the PMDL principle method attempts to determine the best MI threshold without the need for a user-specified fine-tuning parameter. The performance of the proposed algorithm is evaluated using both synthetic time series data sets and a biological time series data set (Saccharomyces cerevisiae). The results show that the proposed algorithm produced fewer false edges and significantly improved the precision when compared to the existing MDL algorithm.
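
    As a rough illustration of the information-theoretic step described above, the sketch below computes pairwise mutual information between expression profiles and keeps edges whose MI exceeds a threshold. It is not the authors' algorithm: the fixed `threshold` argument stands in for the value that the PMDL procedure would select automatically, and the CMI refinement step is omitted.

      import numpy as np
      from itertools import combinations

      def mutual_information(x, y, bins=3):
          """Estimate MI (in nats) between two discretized expression profiles."""
          joint, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      def infer_edges(expr, threshold=0.5):
          """expr: (genes x samples) array; return gene pairs whose MI exceeds the threshold."""
          return [(i, j, mi)
                  for i, j in combinations(range(expr.shape[0]), 2)
                  if (mi := mutual_information(expr[i], expr[j])) > threshold]

    A synthetic expression array of shape (genes, time points) can be passed directly; choosing the threshold is exactly the problem the PMDL principle is meant to address.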

  12. NASA/MSFC multilayer diffusion models and computer program for operational prediction of toxic fuel hazards

    NASA Technical Reports Server (NTRS)

    Dumbauld, R. K.; Bjorklund, J. R.; Bowers, J. F.

    1973-01-01

    The NASA/MSFC multilayer diffusion models are described which are used in applying meteorological information to the estimation of toxic fuel hazards resulting from the launch of rocket vehicles and from accidental cold spills and leaks of toxic fuels. Background information, definitions of terms, and a description of the multilayer concept are presented, along with formulas for determining the buoyant rise of hot exhaust clouds or plumes from conflagrations and descriptions of the multilayer diffusion models. A brief description of the computer program is given, and sample problems and their solutions are included. Derivations of the cloud rise formulas, user instructions, and computer program output lists are also included.

  13. Documentation of the GLAS fourth order general circulation model. Volume 1: Model documentation

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, J.; Pfaendtner, J.; Takacs, L.; Takano, K.

    1983-01-01

    Volume 1 of a three-volume technical memorandum documenting the GLAS Fourth Order General Circulation Model is presented. Volume 1 contains the model documentation, a description of the stratospheric/tropospheric extension, a user's guide, climatological boundary data, and some climate simulation studies.

  14. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    NASA Astrophysics Data System (ADS)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    Developments in computer technology over the last decade have changed the ways computers are used. Emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform and environment) makes the development of such ubiquitous applications for smart environments, and especially of their user interfaces, a challenging and time-consuming task. We propose a model-based approach which allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language defining the fundamentals and constraints of the user interface, a runtime architecture exploits this description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting the applications to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  15. The NASTRAN User's Manual (Level 15)

    NASA Technical Reports Server (NTRS)

    Mccormick, C. W. (Editor)

    1972-01-01

    The User's manual for the NASA Structural Analysis (NASTRAN) program is presented. The manual contains all information needed to solve problems with NASTRAN. The volume is instructional and encyclopedic. The manual includes instruction in structural modeling techniques, instruction in input preparation, and information to assist the interpretation of the output. Descriptions of all input data cards, restart procedures, and diagnostic messages are developed.

  16. A quasi-current representation for information needs inspired by Two-State Vector Formalism

    NASA Astrophysics Data System (ADS)

    Wang, Panpan; Hou, Yuexian; Li, Jingfei; Zhang, Yazhou; Song, Dawei; Li, Wenjie

    2017-09-01

    Recently, a number of quantum theory (QT)-based information retrieval (IR) models have been proposed for modeling the session search task, in which users issue queries continuously in order to describe their evolving information needs (IN). However, the standard formalism of QT cannot provide a complete description of users' current IN, in the sense that it does not take 'future' information into consideration. Therefore, to seek a more proper and complete representation of users' IN, we construct a representation of the quasi-current IN inspired by the emerging Two-State Vector Formalism (TSVF). Motivated by the completeness of TSVF, a "two-state vector" derived from the 'future' (the current query) and the 'history' (the previous query) is employed to describe users' quasi-current IN in a more complete way. Extensive experiments are conducted on the session tracks of TREC 2013 & 2014 and show that our model outperforms a series of compared IR models.

  17. User's guide to resin infusion simulation program in the FORTRAN language

    NASA Technical Reports Server (NTRS)

    Weideman, Mark H.; Hammond, Vince H.; Loos, Alfred C.

    1992-01-01

    RTMCL is a user-friendly computer code which simulates the manufacture of fabric composites by the resin infusion process. The computer code is based on the process simulation model described in reference 1. Included in the user's guide is a detailed step-by-step description of how to run the program and how to enter and modify the input data set. Sample input and output files are included along with an explanation of the results. Finally, a complete listing of the program is provided.

  18. Users manual and modeling improvements for axial turbine design and performance computer code TD2-2

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    Computer code TD2 computes design-point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade the modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the user's manual for the upgraded code, which is named TD2-2.

  19. A user's guide to the combined stand prognosis and Douglas-fir tussock moth outbreak model

    Treesearch

    Robert A. Monserud; Nicholas L. Crookston

    1982-01-01

    Documentation is given for using a simulation model combining the Stand Prognosis Model and the Douglas-fir Tussock Moth Outbreak Model. Four major areas are addressed: (1) an overview and discussion of the combined model; (2) description of input options; (3) discussion of model output; and (4) numerous examples illustrating model behavior and sensitivity.

  20. On a basic model of circulatory, fluid, and electrolyte regulation in the human system based upon the model of Guyton

    NASA Technical Reports Server (NTRS)

    White, R. J.

    1973-01-01

    A detailed description of Guyton's model and the modifications made to it is provided. Also included are descriptions of several typical experiments which the model can simulate, illustrating the model's general utility. A discussion of the problems associated with interfacing the model to other models, such as respiratory and thermal regulation models, is also included; this is of prime importance since these stimuli are not present in the current model. A user's guide for the operation of the model on the Xerox Sigma 3 computer is provided, and two programs are described. A verification plan and a procedure for performing experiments are also presented.

  1. COMPUTERIZED SHAWNEE LIME/LIMESTONE SCRUBBING MODEL USERS MANUAL

    EPA Science Inventory

    The manual gives a general description of a computerized model for estimating design and cost of lime or limestone scrubber systems for flue gas desulfurization (FGD). It supplements PB80-123037 by extending the number of scrubber options which can be evaluated. It includes spray...

  2. Institutional Transformation 2.5 Building Module Help Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villa, Daniel

    The Institutional Transformation (IX) building module is a software tool developed at Sandia National Laboratories to evaluate energy conservation measures (ECMs) on hundreds of DOE-2 building energy models simultaneously. In IX, ECMs can be designed by parameterizing DOE-2 building models and doing further processing via Visual Basic for Applications subroutines. IX provides the functionality to handle multiple building models for different years, which enables incrementally changing a site of hundreds of buildings over time. It also enables evaluation of the effects of changing climate, comparisons between data and modeling results, and energy use of centralized utility buildings (CUBs). IX consists of a Microsoft Excel(r) user interface, a Microsoft Access(r) database, and a Microsoft Excel(r) CUB build utility, whose functionalities are described in detail in this report. In addition to descriptions of the user interfaces, descriptions of every ECM already designed in IX are included. SAND2016-8983 IX 2.5 Help Manual

  3. Understanding Nomophobia: Structural Equation Modeling and Semantic Network Analysis of Smartphone Separation Anxiety.

    PubMed

    Han, Seunghee; Kim, Ki Joon; Kim, Jang Hyun

    2017-07-01

    This study explicates nomophobia by developing a research model that identifies several determinants of smartphone separation anxiety and by conducting semantic network analyses on smartphone users' verbal descriptions of the meaning of their smartphones. Structural equation modeling of the proposed model indicates that personal memories evoked by smartphones encourage users to extend their identity onto their devices. When users perceive smartphones as their extended selves, they are more likely to get attached to the devices, which, in turn, leads to nomophobia by heightening the phone proximity-seeking tendency. This finding is also supplemented by the results of the semantic network analyses revealing that the words related to memory, self, and proximity-seeking are indeed more frequently used in the high, compared with low, nomophobia group.

  4. Sandia National Laboratories environmental fluid dynamics code. Marine Hydrokinetic Module User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, Scott Carlton; Roberts, Jesse D.

    2014-03-01

    This document describes the marine hydrokinetic (MHK) input file and subroutines for the Sandia National Laboratories Environmental Fluid Dynamics Code (SNL-EFDC), which is a combined hydrodynamic, sediment transport, and water quality model based on the Environmental Fluid Dynamics Code (EFDC) developed by John Hamrick [1], formerly sponsored by the U.S. Environmental Protection Agency and now maintained by Tetra Tech, Inc. SNL-EFDC has been previously enhanced with the incorporation of the SEDZLJ sediment dynamics model developed by Ziegler, Lick, and Jones [2-4]. SNL-EFDC has also been upgraded to more accurately simulate algae growth, with specific application to optimizing biomass in an open-channel raceway for biofuels production [5]. A detailed description of the input file containing data describing the MHK device/array is provided, along with a description of the MHK FORTRAN routine. Both a theoretical description of the MHK dynamics as incorporated into SNL-EFDC and an explanation of the source code are provided. This user manual is meant to be used in conjunction with the original EFDC [6] and sediment dynamics SNL-EFDC manuals [7]. Through this document, the authors provide information for users who wish to model the effects of an MHK device (or array of devices) on a flow system with EFDC and who also seek a clear understanding of the source code, which is available from staff in the Water Power Technologies Department at Sandia National Laboratories, Albuquerque, New Mexico.

  5. Generating User-Tailored Descriptions of Online Educational Resources

    ERIC Educational Resources Information Center

    Bental, Diana; Cawsey, Alison; Eddy, Bruce

    2004-01-01

    Tailored descriptions of online educational resources can support users searching for educational resources on the World Wide Web (WWW) by helping them to assess for themselves the relevance and suitability of each resource. Suitable descriptions can be derived from the online metadata stored with each resource. The descriptions take into account…

  6. TERSSE: Definition of the Total Earth Resources System for the Shuttle Era. Volume 2: An Assessment of the Current State-of-the-Art

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Results of a state-of-the-art assessment of technology areas which affect the Earth Resources Program are presented along with a functional description of the basic earth resources system. Major areas discussed include: spacecraft flight hardware, remote sensors, data processing techniques and hardware, user models, user interfaces, and operations technology.

  7. SABRINA: an interactive solid geometry modeling program for Monte Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, J.T.

    SABRINA is a fully interactive three-dimensional geometry modeling program for MCNP. In SABRINA, a user interactively constructs either body-geometry or surface-geometry models and interactively debugs spatial descriptions for the resulting objects. This enhanced capability significantly reduces the effort of constructing and debugging complicated three-dimensional geometry models for Monte Carlo analysis.

  8. Aircraft/Air Traffic Management Functional Analysis Model: Technical Description. 2.0

    NASA Technical Reports Server (NTRS)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a technical description of FAM 2.0 and its computer files to enable the modeler and programmer to make enhancements or modifications to the model. Those interested in a guide for using the model in analysis should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Users Manual.

  9. User's manual for the Composite HTGR Analysis Program (CHAP-1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.

    1977-03-01

    CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.

  10. A design space of visualization tasks.

    PubMed

    Schulz, Hans-Jörg; Nocke, Thomas; Heitzler, Magnus; Schumann, Heidrun

    2013-12-01

    Knowledge about visualization tasks plays an important role in choosing or building suitable visual representations to pursue them. Yet, tasks are a multi-faceted concept and it is thus not surprising that the many existing task taxonomies and models all describe different aspects of tasks, depending on what these task descriptions aim to capture. This results in a clear need to bring these different aspects together under the common hood of a general design space of visualization tasks, which we propose in this paper. Our design space consists of five design dimensions that characterize the main aspects of tasks and that have so far been distributed across different task descriptions. We exemplify its concrete use by applying our design space in the domain of climate impact research. To this end, we propose interfaces to our design space for different user roles (developers, authors, and end users) that allow users of different levels of expertise to work with it.

  11. Rooftop Unit Comparison Calculator User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, James D.

    This document serves as a user manual for the Packaged rooftop air conditioners and heat pump units comparison calculator (RTUCC) and is an aggregation of the calculator’s website documentation. Content ranges from new-user guide material like the “Quick Start” to the more technical/algorithmic descriptions of the “Methods Pages.” There is also a section listing all the context-help topics that support the features on the “Controls” page. The appendix has a discussion of the EnergyPlus runs that supported the development of the building-response models.

  12. Pinyon, Version 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Logan; Hackenberg, Robert

    2017-02-13

    Pinyon is a tool that stores steps involved in creating a model derived from a collection of data. The main function of Pinyon is to store descriptions of calculations used to analyze or visualize the data in a database, and allow users to view the results of these calculations via a web interface. Additionally, users may also use the web interface to make adjustments to the calculations and rerun the entire collection of analysis steps automatically.

  13. Study of a tracking and data acquisition system for the 1990's. Volume 3: TDAS Communication Mission Model

    NASA Technical Reports Server (NTRS)

    Mccreary, T.

    1983-01-01

    A parametric description of the communication channels required between the user spacecraft to be supported and the user ground data systems is developed. Scenarios of mission models, which reflect a range of free-flyer versus space-platform usage as well as levels of NASA activity and potential support for military missions, are covered, along with potential channel requirements which identify: (1) bounds on TDAS forward and return link data communication demand, and (2) the additional demand for providing navigation/tracking support.

  14. Computer program for Stirling engine performance calculations

    NASA Technical Reports Server (NTRS)

    Tew, R. C., Jr.

    1983-01-01

    The thermodynamic characteristics of the Stirling engine were analyzed and modeled on a computer to support its development as a possible alternative to the automobile spark ignition engine. The computer model is documented. The documentation includes a user's manual, symbols list, a test case, comparison of model predictions with test results, and a description of the analytical equations used in the model.

  15. Adolescent Substance Abuse Treatment in the United States: Exemplary Models from a National Evaluation Study.

    ERIC Educational Resources Information Center

    Stevens, Sally J.; Morral, Andrew R.

    This book provides detailed descriptions of exemplary adolescent drug treatment models and gives the latest information on substance use and its consequences. The treatment models examined in this book include programs serving adolescent substance users from a wide range of ethnic and cultural backgrounds. Chapters include: (1)…

  16. TIM Version 3.0 beta - Technical Description and User's Guidance

    EPA Pesticide Factsheets

    Provides technical information on version 3.0 of the Terrestrial Investigation Model (TIM v.3.0). Describes how TIM derives joint distributions of exposure and toxicity to calculate the risk of mortality to birds.

  17. An automated data management/analysis system for space shuttle orbiter tiles. [stress analysis

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Ballas, M.

    1982-01-01

    An engineering data management system was combined with a nonlinear stress analysis program to provide a capability for analyzing a large number of tiles on the space shuttle orbiter. Tile geometry data and all data necessary to define the tile loads environment are accessed automatically as needed for the analysis of a particular tile or set of tiles. User documentation provided includes: (1) a description of the computer programs and data files contained in the system; (2) definitions of all engineering data stored in the data base; (3) characteristics of the tile analytical model; (4) instructions for preparation of user input; and (5) a sample problem to illustrate use of the system. Descriptions of the data, computer programs, and analytical models of the tile are sufficiently detailed to guide extension of the system to include additional zones of tiles and/or additional types of analyses.

  18. SRB-3D Solid Rocket Booster performance prediction program. Volume 1: Engineering description/users information manual

    NASA Technical Reports Server (NTRS)

    Winkler, J. C.

    1976-01-01

    The modified Solid Rocket Booster Performance Evaluation Model (SRB-3D) was developed as an extension to the internal ballistics module of the SRB-2 performance program. This manual contains the engineering description of SRB-3D which describes the approach used to develop the 3D concept and an explanation of the modifications which were necessary to implement these concepts.

  19. The Helicopter Antenna Radiation Prediction Code (HARP)

    NASA Technical Reports Server (NTRS)

    Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.

    1990-01-01

    The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user-friendly interface, employing modern computer graphics, to aid the user in describing the helicopter geometry, selecting the method of computation, constructing the desired high- or low-frequency model, and displaying the results.

  20. CDMBE: A Case Description Model Based on Evidence

    PubMed Central

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

    By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suited to the continental law system, is proposed to describe criminal cases. The logic of the model adopts credibility-based logical reasoning and performs evidence-based reasoning quantitatively. In order to be consistent with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a figure and that the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006

  1. A sectionwise defined model for the material description of 100Cr6 in the thixotropic state

    NASA Astrophysics Data System (ADS)

    Behrens, B.-A.; Chugreev, A.; Hootak, M.

    2018-05-01

    A sectionwise defined material model has been developed for the numerical description of thixoforming processes. It consists of two sections. The first describes the material behaviour below the solidus temperature and comprises an approach from structural mechanics, whereas the second describes the thixotropic behaviour above the solidus temperature based on the Ostwald-de Waele power law. The material model has been implemented in the commercial FE software Simufact Forming by means of user-defined subroutines. Numerical and experimental investigations of special upsetting tests have been designed and carried out with Armco iron-coated specimens. Finally, the model parameters were fitted by reverse engineering.
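
    For reference, the Ostwald-de Waele power law mentioned above is the standard shear-rate-dependent constitutive relation; the coefficients fitted for 100Cr6 in the semi-solid state are reported in the paper, not here:

      \tau = K\,\dot{\gamma}^{\,n}, \qquad \eta_{\mathrm{app}} = K\,\dot{\gamma}^{\,n-1},

    where \tau is the shear stress, \dot{\gamma} the shear rate, K the consistency coefficient, and n the flow behaviour index (n < 1 in the shear-thinning, thixotropic regime).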

  2. FRED user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shilling, J.

    1984-02-01

    FRED, the friendly editor, is a screen-based structured editor. This manual is intended to serve the needs of a wide range of users of the FRED text editor. Most users will find it sufficient to read the introductory material in Section 2, supplemented with the full command set description in Section 3. Advanced users may wish to change the keystroke sequences which invoke editor commands; Section 4 describes how to change key bindings and how to define command macros. Some users may need to modify a language description or create an entirely new language description for use with FRED. Section 5 describes the format of the language descriptions used by the editor and describes how to construct a language grammar. Section 6 describes known portability problems of the FRED editor and should concern only system installation personnel. The editor points out syntax errors in the file being edited and does automatic pretty printing.

  3. CIRCAL-2 - General-purpose on-line circuit design.

    NASA Technical Reports Server (NTRS)

    Dertouzos, M. L.; Jessel, G. P.; Stinger, J. R.

    1972-01-01

    CIRCAL-2 is a second-generation general-purpose on-line circuit-design program with the following main features: (1) multiple-analysis capability; (2) uniform and general data structures for handling text editing, network representations, and output results, regardless of analysis; (3) special techniques and structures for minimizing and controlling user-program interaction; (4) use of functionals for the description of hysteresis and heat effects; and (5) ability to define optimization procedures that 'replace' the user. The paper discusses the organization of CIRCAL-2, the aforementioned main features, and their consequences, such as a set of network elements and models general enough for most analyses and a set of functions tailored to circuit-design requirements. The presentation is descriptive, concentrating on conceptual rather than on program implementation details.

  4. Proposing Electronic Health Record Usability Requirements Based on Enriched ISO 9241 Metric Usability Model

    PubMed Central

    Farzandipour, Mehrdad; Riazi, Hossein; Jabali, Monireh Sadeqi

    2018-01-01

    Introduction: System usability assessment is among the important aspects of assessing the quality of clinical information technology, especially when the end users of the system are concerned. This study aims at providing a comprehensive list of system usability requirements. Methods: This is a descriptive cross-sectional study conducted using the Delphi technique in three phases in 2013. After experts' ideas were gathered, the final version of the questionnaire, comprising 163 items across the three phases, was presented to 40 users of information systems in hospitals. Grading ranged from 0 to 4. Data analysis was conducted using SPSS software. Those requirements with a mean score of three or higher were finally confirmed. Results: The list of system usability requirements for electronic health records was designed and confirmed in nine areas, including suitability for the task (24 items), self-descriptiveness (22 items), controllability (19 items), conformity with user expectations (25 items), error tolerance (21 items), suitability for individualization (7 items), suitability for learning (19 items), visual clarity (18 items) and auditory presentation (8 items). Conclusion: A relatively comprehensive model including useful requirements for using EHRs was presented, which can increase functionality, effectiveness and users' satisfaction. Thus, it is suggested that the present model be adopted by system designers and healthcare institutions to assess those systems. PMID:29719310

  5. Myokit: A simple interface to cardiac cellular electrophysiology.

    PubMed

    Clerx, Michael; Collins, Pieter; de Lange, Enno; Volders, Paul G A

    2016-01-01

    Myokit is a new powerful and versatile software tool for modeling and simulation of cardiac cellular electrophysiology. Myokit consists of an easy-to-read modeling language, a graphical user interface, single- and multi-cell simulation engines, and a library of advanced analysis tools accessible through a Python interface. Models can be loaded from Myokit's native file format or imported from CellML. Model export is provided to C, MATLAB, CellML, CUDA and OpenCL. Patch-clamp data can be imported and used to estimate model parameters. In this paper, we review existing tools to simulate the cardiac cellular action potential and find that current tools do not cater specifically to model development and that there is a gap between easy-to-use but limited software and powerful tools that require strong programming skills from their users. We then describe Myokit's capabilities, focusing on its model description language, simulation engines and import/export facilities in detail. Using three examples, we show how Myokit can be used for clinically relevant investigations, multi-model testing and parameter estimation in Markov models, all with minimal programming effort from the user. This way, Myokit bridges a gap between performance, versatility and user-friendliness. Copyright © 2015 Elsevier Ltd. All rights reserved.
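
    As a rough sketch of the workflow the abstract describes (load a model, run a single-cell simulation, inspect results through the Python interface), something like the following is typical; the file name 'example.mmt' is a placeholder, and the logged variable names depend on the model actually loaded:

      import myokit

      # Load a model, pacing protocol, and embedded script from a Myokit .mmt file
      model, protocol, _ = myokit.load('example.mmt')   # placeholder file name

      # Create a single-cell simulation and run it for 1000 ms
      sim = myokit.Simulation(model, protocol)
      log = sim.run(1000)

      # The returned log is dict-like; keys are qualified variable names such as 'engine.time'
      print(list(log.keys())[:5])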

  6. Program listing for the REEDM (Rocket Exhaust Effluent Diffusion Model) computer program

    NASA Technical Reports Server (NTRS)

    Bjorklund, J. R.; Dumbauld, R. K.; Cheney, C. S.; Geary, H. V.

    1982-01-01

    The program listing for the REEDM Computer Program is provided. A mathematical description of the atmospheric dispersion models, cloud-rise models, and other formulas used in the REEDM model; vehicle and source parameters, other pertinent physical properties of the rocket exhaust cloud and meteorological layering techniques; user's instructions for the REEDM computer program; and worked example problems are contained in NASA CR-3646.

  7. A Test of Strategies for Enhanced Learning of AP Descriptive Chemistry

    ERIC Educational Resources Information Center

    Kotcherlakota, Suhasini; Brooks, David W.

    2008-01-01

    The Advanced Placement (AP) Descriptive Chemistry Website allows users to practice chemistry problems. This study involved the redesign of the Website using worked examples to enhance learner performance. The population sample for the study includes users (students and teachers) interested in learning descriptive chemistry materials. The users…

  8. Extended parametric representation of compressor fans and turbines. Volume 2: Part user's manual (parametric turbine)

    NASA Technical Reports Server (NTRS)

    Coverse, G. L.

    1984-01-01

    A turbine modeling technique has been developed which enables the user to obtain consistent and rapid off-design performance from design-point input. This technique is applicable to both axial- and radial-flow turbines with flow sizes ranging from about one pound per second to several hundred pounds per second. The axial flow turbines may or may not include variable geometry in the first stage nozzle. A user-specified option will also permit the calculation of design-point cooling flow levels and corresponding changes in efficiency for the axial flow turbines. The modeling technique has been incorporated into a time-sharing program in order to facilitate its use. Because this report contains a description of the input and output data, values of typical inputs, and example cases, it is suitable as a user's manual. This report is the second of a three-volume set. The titles of the three volumes are as follows: (1) Volume 1, CMGEN User's Manual (Parametric Compressor Generator); (2) Volume 2, PART User's Manual (Parametric Turbine); (3) Volume 3, MODFAN User's Manual (Parametric Modulation Flow Fan).

  9. Digital Avionics Information System (DAIS): Life Cycle Cost Impact Modeling System Reliability, Maintainability, and Cost Model (RMCM)--Description. Users Guide. Final Report.

    ERIC Educational Resources Information Center

    Goclowski, John C.; And Others

    The Reliability, Maintainability, and Cost Model (RMCM) described in this report is an interactive mathematical model with a built-in sensitivity analysis capability. It is a major component of the Life Cycle Cost Impact Model (LCCIM), which was developed as part of the DAIS advanced development program to be used to assess the potential impacts…

  10. Improved Point Analysis Model (IPAM) (Users Guide)

    DTIC Science & Technology

    1991-02-01

    Descriptions: 7 - Stratus fractus of bad weather, or Cumulus fractus of bad weather, or both (pannus), usually below Altostratus or Nimbostratus. 8 - ... Cumulonimbus without anvil or fibrous upper part, Cumulus, Stratocumulus, Stratus or pannus. / Stratocumulus, Stratus, Cumulus and Cumulonimbus invisible owing

  11. Advanced space system analysis software. Technical, user, and programmer guide

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.; Zimbelman, H. F.

    1981-01-01

    The LASS computer program provides a tool for interactive preliminary and conceptual design of LSS. Eight program modules were developed, including four automated model geometry generators, an associated mass properties module, an appendage synthesizer module, an rf analysis module, and an orbital transfer analysis module. The existing rigid body controls analysis module was modified to permit analysis of effects of solar pressure on orbital performance. A description of each module, user instructions, and programmer information are included.

  12. Numerical simulation of dynamics of brushless dc motors for aerospace and other applications. Volume 2: User's guide to computer EMA model

    NASA Technical Reports Server (NTRS)

    Demerdash, N. A. O.; Nehl, T. W.

    1979-01-01

    A description and user's guide of the computer program developed to simulate the dynamics of an electromechanical actuator for aerospace applications are presented. The effects of the stator phase currents on the permanent magnets of the rotor are examined. The voltage and current waveforms present in the power conditioner network during the motoring, regenerative braking, and plugging modes of operation are presented and discussed.

  13. CELCAP: A Computer Model for Cogeneration System Analysis

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A description of the CELCAP cogeneration analysis program is presented, including a detailed description of the methodology used by the Naval Civil Engineering Laboratory in developing the CELCAP code and the procedures for analyzing cogeneration systems for a given user. The four engines modeled in CELCAP are: gas turbine with exhaust heat boiler, diesel engine with waste heat boiler, single automatic-extraction steam turbine, and back-pressure steam turbine. Both design-point and part-load performance are taken into account in the engine models. The load model describes how the hourly electric and steam demand of the user is represented by 24 hourly profiles. The economic model describes how the annual and life-cycle operating costs, which include the costs of fuel, purchased electricity, and operation and maintenance of engines and boilers, are calculated. The CELCAP code structure and principal functions of the code are described to show how the various components of the code are related to each other. Three examples of the application of the CELCAP code are given to illustrate the versatility of the code. The examples shown represent cases of system selection, system modification, and system optimization.
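
    Purely as an illustration of the kind of bookkeeping the economic model performs (the actual CELCAP formulas are not reproduced here), an annual operating cost can be accumulated from hourly fuel and purchased-electricity costs plus O&M roughly as follows; all variable names and rates are hypothetical:

      def annual_operating_cost(hourly_fuel_mmbtu, hourly_purchased_kwh,
                                fuel_price, electricity_price, annual_om_cost):
          """Sum hourly fuel and purchased-electricity costs over a year, then add O&M.

          hourly_fuel_mmbtu, hourly_purchased_kwh: sequences of 8760 hourly values
          fuel_price: $/MMBtu, electricity_price: $/kWh, annual_om_cost: $/yr
          """
          fuel_cost = sum(hourly_fuel_mmbtu) * fuel_price
          electricity_cost = sum(hourly_purchased_kwh) * electricity_price
          return fuel_cost + electricity_cost + annual_om_cost

      # Example: constant profiles standing in for the 24-hour load model
      cost = annual_operating_cost([10.0] * 8760, [500.0] * 8760,
                                   fuel_price=8.0, electricity_price=0.12,
                                   annual_om_cost=50_000.0)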

  14. A New Browser-based, Ontology-driven Tool for Generating Standardized, Deep Descriptions of Geoscience Models

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; Kelbert, A.; Rudan, S.; Stoica, M.

    2016-12-01

    Standardized metadata for models is the key to reliable and greatly simplified coupling in model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System). This model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. While having this kind of standardized metadata for each model in a repository opens up a wide range of exciting possibilities, it is difficult to collect, and a carefully conceived "data model" or schema is needed to store it. Automated harvesting and scraping methods can provide some useful information, but they often result in metadata that is inaccurate or incomplete, which is not sufficient to enable the desired capabilities. In order to address this problem, we have developed a browser-based tool called the MCM Tool (Model Component Metadata) which runs on notebooks, tablets and smart phones. This tool was partially inspired by the TurboTax software, which greatly simplifies the necessary task of preparing tax documents. It allows a model developer or advanced user to provide a standardized, deep description of a computational geoscience model, including hydrologic models. Under the hood, the tool uses a new ontology for models built on the CSDMS Standard Names, expressed as a collection of RDF (Resource Description Framework) files. This ontology is based on core concepts such as variables, objects, quantities, operations, processes and assumptions. The purpose of this talk is to present details of the new ontology and then to demonstrate the MCM Tool for several hydrologic models.

  15. ENHANCED STREAM WATER QUALITY MODELS QUAL2E AND QUAL2E-UNCAS: DOCUMENTATION AND USER MANUAL

    EPA Science Inventory

    The manual is a major revision of the original QUAL2E program documentation released in 1985. It includes a description of the recent modifications and improvements to the widely used water quality models QUAL-II and QUAL2E. The enhancements include an extensive capability for un...

  16. FY 1992-1993 RDT&E Descriptive Summaries: DARPA

    DTIC Science & Technology

    1991-02-01

    combining natural language and user workflow model information. * Determine effectiveness of auditory models as preprocessors for robust speech...for indexing and retrieving design knowledge. * Evaluate ability of message understanding systems to extract crisis-situation data from news wires...energy effects, underwater vehicles, neutrino detection, speech, tailored nuclear weapons, hypervelocity, nanosecond timing, and MAD/RPV. FY 1991 Planned

  17. The Dynamics of Mobile Learning Utilization in Vocational Education: Frame Model Perspective Review

    ERIC Educational Resources Information Center

    Mahande, Ridwan Daud; Susanto, Adhi; Surjono, Herman Dwi

    2017-01-01

    This study aimed to describe the dynamics of the content aspects, user aspects and social aspects of mobile learning (m-learning) utilization in vocational education from the FRAME Model perspective. This was a quantitative descriptive study. The population in this study was teachers and students of state vocational school and private…

  18. An Associative Index Model for the Results List Based on Vannevar Bush's Selection Concept

    ERIC Educational Resources Information Center

    Cole, Charles; Julien, Charles-Antoine; Leide, John E.

    2010-01-01

    Introduction: We define the results list problem in information search and suggest the "associative index model", an ad-hoc, user-derived indexing solution based on Vannevar Bush's description of an associative indexing approach for his memex machine. We further define what selection means in indexing terms with reference to Charles…

  19. Digital Troposcatter Performance Model

    DTIC Science & Technology

    1983-12-01

    Communications, Control and Information Systems ...for digital troposcatter communication system design is described. Propagation and modem performance are modeled. These include Path Loss and RSL...designing digital troposcatter systems. A User's Manual Report discusses the use of the computer program TROPO. The description of the structure and logical

  20. Proceedings of the 3rd Annual Conference on Aerospace Computational Control, volume 1

    NASA Technical Reports Server (NTRS)

    Bernard, Douglas E. (Editor); Man, Guy K. (Editor)

    1989-01-01

    Conference topics included definition of tool requirements, advanced multibody component representation descriptions, model reduction, parallel computation, real time simulation, control design and analysis software, user interface issues, testing and verification, and applications to spacecraft, robotics, and aircraft.

  1. THE U.S. ENVIRONMENTAL PROTECTION AGENCY VERSION OF POSITIVE MATRIX FACTORIZATION

    EPA Science Inventory

    The abstract describes some of the special features of the EPA's version of Positive Matrix Factorization that is freely distributed. Features include descriptions of the Graphical User Interface, an approach for estimating errors in the modeled solutions, and future development...

  2. Information technology acceptance in health information management.

    PubMed

    Abdekhoda, M; Ahmadi, M; Dehnad, A; Hosseini, A F

    2014-01-01

    User acceptance of information technology has been a significant area of research for more than two decades in the field of information technology. This study assessed the acceptance of information technology in the context of Health Information Management (HIM) by utilizing the Technology Acceptance Model (TAM), which was modified and applied to assess user acceptance of health information technology as well as the viability of TAM as a research construct in the context of HIM. This was a descriptive-analytical study in which a sample of 187 personnel, from a population of 363 personnel working in the medical records departments of hospitals affiliated with Tehran University of Medical Sciences, was selected. Users' perception of applying information technology was studied using a researcher-developed questionnaire. Collected data were analyzed with SPSS software (version 16) using descriptive statistics and regression analysis. The results suggest that TAM is a useful construct to assess user acceptance of information technology in the context of HIM. The findings also showed that perceived ease of use (PEOU) and perceived usefulness (PU) were positively associated with favorable users' attitudes towards HIM. PU was more strongly associated (r = 0.22, p = 0.05) than PEOU (r = 0.014, p = 0.05) with favorable user attitudes towards HIM. Users' perceptions of usefulness and ease of use are important determinants providing the incentive for users to accept information technologies when the application of a successful HIM system is attempted. The findings of the present study suggest that user acceptance is a key element and should subsequently be the major concern of health organizations and health policy makers.

  3. Small Business Innovations

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The purpose of QASE RT is to enable system analysts and software engineers to evaluate performance and reliability implications of design alternatives. The program resulted from two Small Business Innovation Research (SBIR) projects. After receiving a description of the system architecture and workload from the user, QASE RT translates the system description into simulation models and executes them. Simulation provides detailed performance evaluation. The results of the evaluations are service and response times, offered load and device utilizations and functional availability.

  4. The Lake Tahoe Basin Land Use Simulation Model

    USGS Publications Warehouse

    Forney, William M.; Oldham, I. Benson

    2011-01-01

    This U.S. Geological Survey Open-File Report describes the final modeling product for the Tahoe Decision Support System project for the Lake Tahoe Basin funded by the Southern Nevada Public Land Management Act and the U.S. Geological Survey's Geographic Analysis and Monitoring Program. This research was conducted by the U.S. Geological Survey Western Geographic Science Center. The purpose of this report is to describe the basic elements of the novel Lake Tahoe Basin Land Use Simulation Model, publish samples of the data inputs, basic outputs of the model, and the details of the Python code. The results of this report include a basic description of the Land Use Simulation Model, descriptions and summary statistics of model inputs, two figures showing the graphical user interface from the web-based tool, samples of the two input files, seven tables of basic output results from the web-based tool and descriptions of their parameters, and the fully functional Python code.

  5. Computer Aided Design of Polyhedron Solids to Model Air in Com-Geom Descriptions

    DTIC Science & Technology

    1983-08-01

    "The GIFT Code User Manual, Volume I, Introduction and Input Requirements," BRL Report No. 1802, July 1975 (Unclassified)...Kuehl, L. Bain and M. Reisinger, "The GIFT Code User Manual, Volume II, The Output Options," BRL Report ARBRL-TR-02189, September 1979...is generated from the GIFT code under option XSECT. This option produces plot files which define cross-sectional views of the COM-GEOM

  6. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
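
    For orientation only (this is not the NASA software described above): the same two-parameter maximum-likelihood fit can be sketched with SciPy for complete, uncensored data. The cycle counts below are invented for illustration, and the sketch ignores type I censoring.

```python
# Minimal sketch (not the NASA software): fit a two-parameter Weibull
# distribution to complete (uncensored) fatigue-life data by maximum
# likelihood using SciPy. Cycle counts are illustrative only.
import numpy as np
from scipy.stats import weibull_min

lives = np.array([1.2e6, 1.9e6, 2.4e6, 3.1e6, 3.8e6, 4.6e6, 5.9e6])  # cycles (hypothetical)

# floc=0 pins the location parameter at zero, giving the two-parameter form.
shape, loc, scale = weibull_min.fit(lives, floc=0)

print(f"Weibull slope (shape) beta = {shape:.2f}")
print(f"Characteristic life (scale) eta = {scale:.3g} cycles")
# Probability of failure by 2 million cycles under the fitted model:
print(f"F(2e6) = {weibull_min.cdf(2e6, shape, loc, scale):.3f}")
```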

  7. Digital computer programs for generating oblique orthographic projections and contour plots

    NASA Technical Reports Server (NTRS)

    Giles, G. L.

    1975-01-01

    User and programmer documentation is presented for two programs for automatic plotting of digital data. One of the programs generates oblique orthographic projections of three-dimensional numerical models and the other program generates contour plots of data distributed in an arbitrary planar region. A general description of the computational algorithms, user instructions, and complete listings of the programs are given. Several plots are included to illustrate various program options, and a single example is described to facilitate learning the use of the programs.

  8. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

  9. Math Description Engine Software Development Kit

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Smith, Stephanie L.; Dexter, Dan E.; Hodgson, Terry R.

    2010-01-01

    The Math Description Engine Software Development Kit (MDE SDK) can be used by software developers to make computer-rendered graphs more accessible to blind and visually-impaired users. The MDE SDK generates alternative graph descriptions in two forms: textual descriptions and non-verbal sound renderings, or sonification. It also enables display of an animated trace of a graph sonification on a visual graph component, with color and line-thickness options for users having low vision or color-related impairments. A set of accessible graphical user interface widgets is provided for operation by end users and for control of accessible graph displays. Version 1.0 of the MDE SDK generates text descriptions for 2D graphs commonly seen in math and science curriculum (and practice). The mathematically rich text descriptions can also serve as a virtual math and science assistant for blind and sighted users, making graphs more accessible for everyone. The MDE SDK has a simple application programming interface (API) that makes it easy for programmers and Web-site developers to make graphs accessible with just a few lines of code. The source code is written in Java for cross-platform compatibility and to take advantage of Java's built-in support for building accessible software application interfaces. Compiled-library and NASA Open Source versions are available with API documentation and Programmer's Guide at http://prime.jsc.nasa.gov.

  10. A Novel Recommendation System to Match College Events and Groups to Students

    NASA Astrophysics Data System (ADS)

    Qazanfari, K.; Youssef, A.; Keane, K.; Nelson, J.

    2017-10-01

    With the recent increase in data online, discovering meaningful opportunities can be time-consuming and complicated for many individuals. To overcome this data overload challenge, we present a novel text-content-based recommender system as a valuable tool to predict user interests. To that end, we develop a specific procedure to create user models and item feature-vectors, where items are described in free text. The user model is generated by soliciting from a user a few keywords and expanding those keywords into a list of weighted near-synonyms. The item feature-vectors are generated from the textual descriptions of the items, using modified tf-idf values of the users’ keywords and their near-synonyms. Once the users are modeled and the items are abstracted into feature vectors, the system returns the maximum-similarity items as recommendations to that user. Our experimental evaluation shows that our method of creating the user models and item feature-vectors resulted in higher precision and accuracy in comparison to well-known feature-vector-generating methods like Glove and Word2Vec. It also shows that stemming and the use of a modified version of tf-idf increase the accuracy and precision by 2% and 3%, respectively, compared to non-stemming and the standard tf-idf definition. Moreover, the evaluation results show that updating the user model from usage histories improves the precision and accuracy of the system. This recommender system has been developed as part of the Agnes application, which runs on iOS and Android platforms and is accessible through the Agnes website.
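
    As a rough illustration of the matching step described above (not the Agnes implementation itself): item descriptions can be vectorized with tf-idf and scored against a keyword-based user profile by cosine similarity. All item names and texts below are hypothetical.

```python
# Minimal sketch (not the Agnes system): match a keyword-based user model
# against free-text item descriptions via TF-IDF and cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

items = {                                # hypothetical event descriptions
    "robotics_club": "Weekly robotics club: build and program autonomous robots.",
    "jazz_night":    "Live jazz night with student bands and an open jam session.",
    "ml_seminar":    "Seminar on machine learning and data science careers.",
}

# User model: a few keywords expanded with near-synonyms.
user_profile = "machine learning data science artificial intelligence programming"

vectorizer = TfidfVectorizer(stop_words="english")
item_matrix = vectorizer.fit_transform(items.values())
user_vector = vectorizer.transform([user_profile])

scores = cosine_similarity(user_vector, item_matrix).ravel()
ranked = sorted(zip(items.keys(), scores), key=lambda kv: kv[1], reverse=True)
for name, score in ranked:
    print(f"{name:15s} similarity = {score:.3f}")
```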

  11. Identifying Frequent Users of an Urban Emergency Medical Service Using Descriptive Statistics and Regression Analyses.

    PubMed

    Norman, Chenelle; Mello, Michael; Choi, Bryan

    2016-01-01

    This retrospective cohort study provides a descriptive analysis of a population that frequently uses an urban emergency medical service (EMS) and identifies factors that contribute to use among all frequent users. For purposes of this study we divided frequent users into the following groups: low-frequent users (4 EMS transports in 2012), medium-frequent users (5 to 6 EMS transports in 2012), high-frequent users (7 to 10 EMS transports in 2012) and super-frequent users (11 or more EMS transports in 2012). Overall, we identified 539 individuals as frequent users. For all groups of EMS frequent users (i.e. low, medium, high and super), one or more hospital admissions, receiving a referral for follow-up care upon discharge, and having no insurance were found to be significantly associated with frequent EMS use (P<0.05). Within the diagnostic categories, 41.61% of super-frequent users had a diagnosis of "primarily substance abuse/misuse" and among low-frequent users a majority, 53.33%, were identified as having a "reoccurring (medical) diagnosis." Lastly, relative risk ratios for the highest group of users, super-frequent users, were 3.34 (95% CI [1.90-5.87]) for obtaining at least one referral for follow-up care, 13.67 (95% CI [5.60-33.34]) for having four or more hospital admissions and 5.95 (95% CI [1.80-19.63]) for having a diagnosis of primarily substance abuse/misuse. Findings from this study demonstrate that among low- and medium-frequent users a majority of patients are using EMS for reoccurring medical conditions. This could potentially be avoided with better care management. In addition, this study adds to the current literature that illustrates a strong correlation between substance abuse/misuse and high/super-frequent EMS use. For the subgroup analysis among individuals 65 years of age and older, we did not find any of the independent variables included in our model to be significantly associated with frequent EMS use.
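
    For readers unfamiliar with the reported measures, the following is a generic worked example of a risk ratio and its 95% confidence interval computed from a 2x2 table; the counts are hypothetical and this is not the study's multivariable regression model.

```python
# Generic worked example (hypothetical counts, not the study's regression
# model): risk ratio and 95% CI for an exposure/outcome 2x2 table.
import math

a, b = 40, 60    # exposed:   outcome yes / no
c, d = 20, 180   # unexposed: outcome yes / no

rr = (a / (a + b)) / (c / (c + d))
se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}  (95% CI {lo:.2f}-{hi:.2f})")
```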

  12. User's guide to data obtained by the Aerospace Corporation energetic particle spectrometer on ATS-6

    NASA Technical Reports Server (NTRS)

    Paulikas, G. A.; Hilton, H. H.

    1977-01-01

    Descriptions of the energetic particle detector are offered with calibration data, as part of a user's guide to the data obtained by ATS 6. Information on instrumental and operational anomalies and a description of the procedures used to reduce the data are also presented along with a description of the format of the data.

  13. Information transfer satellite concept study. Volume 4: computer manual

    NASA Technical Reports Server (NTRS)

    Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.

    1971-01-01

    The Satellite Telecommunications Analysis and Modeling Program (STAMP) provides the user with a flexible and comprehensive tool for the analysis of ITS system requirements. While obtaining minimum cost design points, the program enables the user to perform studies over a wide range of user requirements and parametric demands. The program utilizes a total system approach wherein the ground uplink and downlink, the spacecraft, and the launch vehicle are simultaneously synthesized. A steepest descent algorithm is employed to determine the minimum total system cost design subject to the fixed user requirements and imposed constraints. In the process of converging to the solution, the pertinent subsystem tradeoffs are resolved. This report documents STAMP through a technical analysis and a description of the principal techniques employed in the program.

  14. SABRINA: an interactive three-dimensional geometry-modeling program for MCNP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, J.T. III

    SABRINA is a fully interactive three-dimensional geometry-modeling program for MCNP, a Los Alamos Monte Carlo code for neutron and photon transport. In SABRINA, a user constructs either body geometry or surface geometry models and debugs spatial descriptions for the resulting objects. This enhanced capability significantly reduces effort in constructing and debugging complicated three-dimensional geometry models for Monte Carlo analysis. 2 refs., 33 figs.

  15. NASA AVOSS Fast-Time Wake Prediction Models: User's Guide

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew

    2014-01-01

    The National Aeronautics and Space Administration (NASA) is developing and testing fast-time wake transport and decay models to safely enhance the capacity of the National Airspace System (NAS). The fast-time wake models are empirical algorithms used for real-time predictions of wake transport and decay based on aircraft parameters and ambient weather conditions. The aircraft dependent parameters include the initial vortex descent velocity and the vortex pair separation distance. The atmospheric initial conditions include vertical profiles of temperature or potential temperature, eddy dissipation rate, and crosswind. The current distribution includes the latest versions of the APA (3.4) and the TDP (2.1) models. This User's Guide provides detailed information on the model inputs, file formats, and the model output. An example of a model run and a brief description of the Memphis 1995 Wake Vortex Dataset is also provided.

  16. A User's Guide for the Differential Reduced Ejector/Mixer Analysis "DREA" Program. 1.0

    NASA Technical Reports Server (NTRS)

    DeChant, Lawrence J.; Nadell, Shari-Beth

    1999-01-01

    A system of analytical and numerical two-dimensional mixer/ejector nozzle models that require minimal empirical input has been developed and programmed for use in conceptual and preliminary design. This report contains a user's guide describing the operation of the computer code, DREA (Differential Reduced Ejector/mixer Analysis), that contains these mathematical models. This program is currently being adopted by the Propulsion Systems Analysis Office at the NASA Glenn Research Center. A brief summary of the DREA method is provided, followed by detailed descriptions of the program input and output files. Sample cases demonstrating the application of the program are presented.

  17. Simplifying the Reuse and Interoperability of Geoscience Data Sets and Models with Semantic Metadata that is Human-Readable and Machine-actionable

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2017-12-01

    Standardized, deep descriptions of digital resources (e.g. data sets, computational models, software tools and publications) make it possible to develop user-friendly software systems that assist scientists with the discovery and appropriate use of these resources. Semantic metadata makes it possible for machines to take actions on behalf of humans, such as automatically identifying the resources needed to solve a given problem, retrieving them and then automatically connecting them (despite their heterogeneity) into a functioning workflow. Standardized model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. A carefully constructed, unambiguous, rules-based schema that addresses this problem, called the Geoscience Standard Names ontology, will be presented; it utilizes Semantic Web best practices and technologies. It has also been designed to work across science domains and to be readable by both humans and machines.

  18. The use of analytical models in human-computer interface design

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo

    1993-01-01

    Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.

  19. Mission Analysis Program for Solar Electric Propulsion (MAPSEP). Volume 1: Analytical manual for earth orbital MAPSEP

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An introduction to the MAPSEP organization and a detailed analytical description of all models and algorithms are given. These include trajectory and error covariance propagation methods, orbit determination processes, thrust modeling, and trajectory correction (guidance) schemes. Earth orbital MAPSEP contains the capability of analyzing almost any currently projected low thrust mission from low earth orbit to super synchronous altitudes. Furthermore, MAPSEP is sufficiently flexible to incorporate extended dynamic models, alternate mission strategies, and almost any other system requirement imposed by the user. As in the interplanetary version, earth orbital MAPSEP represents a trade-off between precision modeling and computational speed consistent with defining necessary system requirements. It can be used in feasibility studies as well as in flight operational support. Pertinent operational constraints are available both implicitly and explicitly. However, the reader should be warned that because of program complexity, MAPSEP is only as good as the user and will quickly succumb to faulty user inputs.

  20. Computer-program documentation of an interactive-accounting model to simulate streamflow, water quality, and water-supply operations in a river basin

    USGS Publications Warehouse

    Burns, A.W.

    1988-01-01

    This report describes an interactive-accounting model used to simulate streamflow, chemical-constituent concentrations and loads, and water-supply operations in a river basin. The model uses regression equations to compute flow from incremental (internode) drainage areas. Conservative chemical constituents (typically dissolved solids) also are computed from regression equations. Both flow and water quality loads are accumulated downstream. Optionally, the model simulates the water use and the simplified groundwater systems of a basin. Water users include agricultural, municipal, industrial, and in-stream users, and reservoir operators. Water users list their potential water sources, including direct diversions, groundwater pumpage, interbasin imports, or reservoir releases, in the order in which they will be used. Direct diversions conform to basinwide water law priorities. The model is interactive, and although the input data exist in files, the user can modify them interactively. A major feature of the model is its color-graphic-output options. This report includes a description of the model, organizational charts of subroutines, and examples of the graphics. Detailed format instructions for the input data, example files of input data, definitions of program variables, and a listing of the FORTRAN source code are attachments to the report. (USGS)

  1. Data model and relational database design for the New England Water-Use Data System (NEWUDS)

    USGS Publications Warehouse

    Tessler, Steven

    2001-01-01

    The New England Water-Use Data System (NEWUDS) is a database for the storage and retrieval of water-use data. NEWUDS can handle data covering many facets of water use, including (1) tracking various types of water-use activities (withdrawals, returns, transfers, distributions, consumptive-use, wastewater collection, and treatment); (2) the description, classification and location of places and organizations involved in water-use activities; (3) details about measured or estimated volumes of water associated with water-use activities; and (4) information about data sources and water resources associated with water use. In NEWUDS, each water transaction occurs unidirectionally between two site objects, and the sites and conveyances form a water network. The core entities in the NEWUDS model are site, conveyance, transaction/rate, location, and owner. Other important entities include water resources (used for withdrawals and returns), data sources, and aliases. Multiple water-exchange estimates can be stored for individual transactions based on different methods or data sources. Storage of user-defined details is accommodated for several of the main entities. Numerous tables containing classification terms facilitate detailed descriptions of data items and can be used for routine or custom data summarization. NEWUDS handles single-user and aggregate-user water-use data, can be used for large or small water-network projects, and is available as a stand-alone Microsoft Access database structure. Users can customize and extend the database, link it to other databases, or implement the design in other relational database applications.
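
    To make the entity descriptions above concrete, here is a minimal, hypothetical relational sketch (not the published NEWUDS schema) of sites linked by conveyances with per-transaction volume estimates, expressed as SQLite tables from Python.

```python
# Minimal sketch (not the published NEWUDS schema): illustrate the core
# entities described above -- sites connected by conveyances, with
# transactions recording water volumes -- as relational tables in SQLite.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE site (
    site_id     INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    site_type   TEXT            -- e.g. well, intake, treatment plant
);
CREATE TABLE conveyance (
    conveyance_id INTEGER PRIMARY KEY,
    from_site     INTEGER NOT NULL REFERENCES site(site_id),
    to_site       INTEGER NOT NULL REFERENCES site(site_id)
);
CREATE TABLE water_transaction (
    transaction_id INTEGER PRIMARY KEY,
    conveyance_id  INTEGER NOT NULL REFERENCES conveyance(conveyance_id),
    year           INTEGER NOT NULL,
    volume_mgd     REAL,           -- million gallons per day
    data_source    TEXT            -- provenance of the estimate
);
""")
print("tables:", [r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")])
```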

  2. GERTS GQ User's Manual.

    ERIC Educational Resources Information Center

    Akiba, Y.; And Others

    This user's manual for the simulation program Graphical Evaluation and Review Technique (GERT) GQ contains sections on nodes, branches, program input description and format, and program output, as well as examples. Also included is a programmer's manual which contains information on scheduling, subroutine descriptions, COMMON Variables, and…

  3. KABAM Version 1.0 User's Guide and Technical Documentation - Appendix F -Description of Equations Used to Calculate the BCF, BAF, BMF, and BSAF Values

    EPA Pesticide Factsheets

    Describes equations for bioconcentration, bioaccumulation, biomagnification and biota-sediment accumulation factors used in KABAM V1.0. KABAM is a simulation model used to predict pesticide concentrations in aquatic regions for use in exposure assessments.

  4. COMO: a numerical model for predicting furnace performance in axisymmetric geometries. Volume 1. Technical summary. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiveland, W.A.; Oberjohn, W.J.; Cornelius, D.K.

    1985-12-01

    This report summarizes the work conducted during a 30-month contract with the United States Department of Energy (DOE) Pittsburgh Energy Technology Center (PETC). The general objective is to develop and verify a computer code capable of modeling the major aspects of pulverized coal combustion. Achieving this objective will lead to design methods applicable to industrial and utility furnaces. The combustion model (COMO) is based mainly on an existing Babcock and Wilcox (B and W) computer program. The model consists of a number of relatively independent modules that represent the major processes involved in pulverized coal combustion: flow, heterogeneous and homogeneous chemical reaction, and heat transfer. As models are improved or as new ones are developed, this modular structure allows portions of the COMO model to be updated with minimal impact on the remainder of the program. The report consists of two volumes. This volume (Volume 1) contains a technical summary of the COMO model, results of predictions for gas phase combustion, pulverized coal combustion, and a detailed description of the COMO model. Volume 2 is the Users Guide for COMO and contains detailed instructions for preparing the input data and a description of the program output. Several example cases have been included to aid the user in usage of the computer program for pulverized coal applications. 66 refs., 41 figs., 21 tabs.

  5. What's all the talk about? Topic modelling in a mental health Internet support group.

    PubMed

    Carron-Arthur, Bradley; Reynolds, Julia; Bennett, Kylie; Bennett, Anthony; Griffiths, Kathleen M

    2016-10-28

    The majority of content in an Internet Support Group (ISG) is contributed by 1 % of the users ('super users'). Computational methods, such as topic modelling, can provide a large-scale quantitative objective description of this content. Such methods may provide a new perspective on the nature of engagement on ISGs including the role of super users and their possible effect on other users. A topic model was computed for all posts (N = 131,004) in the ISG BlueBoard using Latent Dirichlet Allocation. A model containing 25 topics was selected on the basis of intelligibility as determined by diagnostic metrics and qualitative investigation. This model yielded 21 substantive topics for further analysis. Two chi-square tests were conducted separately for each topic to ascertain: (i) if the odds of super users' and other users' posting differed for each topic; and (ii) if for super users the odds of posting differed depending on whether the response was to a super user or to another user. The 21 substantive topics covered a range of issues related to mental health and peer-support. There were significantly higher odds that super users wrote content on 13 topics, with the greatest effects being for Parenting Role (OR [95%CI] = 7.97 [7.85-8.10]), Co-created Fiction (4.22 [4.17-4.27]), Mental Illness (3.13 [3.11-3.16]) and Positive Change (2.82 [2.79-2.84]). There were significantly lower odds for super users on 7 topics, with the greatest effects being for the topics Depression (OR = 0.27 [0.27-0.28]), Medication (0.36 [0.36-0.37]), Therapy (0.55 [0.54-0.55]) and Anxiety (0.55 [0.55-0.55]). However, super users were significantly more likely to write content on 5 out of these 7 topics when responding to other users than when responding to fellow super users. The findings suggest that super users serve the role of emotionally supportive companions with a focus on topics broadly resembling the consumer/carer model of recovery. Other users engage in topics with a greater focus on experiential knowledge, disclosure and informational support, a pattern resembling the clinical symptom-focussed approach to recovery. However, super users modify their content in response to other users in a manner consistent with being 'active help providers'.
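
    As a minimal sketch of the technique named above (Latent Dirichlet Allocation), not the BlueBoard analysis itself: a topic model can be fitted to a handful of hypothetical posts with scikit-learn. The study used 25 topics; the toy example below uses 3.

```python
# Minimal sketch (not the BlueBoard analysis): fit an LDA topic model to a
# small collection of posts with scikit-learn; the texts are hypothetical.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = [
    "feeling anxious about medication side effects",
    "my therapist suggested a new coping strategy",
    "parenting while depressed is exhausting some days",
    "started new medication and the anxiety is easing",
    "sharing a small positive change from this week",
]

vec = CountVectorizer(stop_words="english")
dtm = vec.fit_transform(posts)                      # document-term matrix

lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(dtm)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = weights.argsort()[::-1][:4]               # four highest-weight terms
    print(f"topic {k}: " + ", ".join(terms[i] for i in top))
```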

  6. Sierra/SolidMechanics 4.48 User's Guide: Addendum for Shock Capabilities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose

    This is an addendum to the Sierra/SolidMechanics 4.48 User's Guide that documents additional capabilities available only in alternate versions of the Sierra/SolidMechanics (Sierra/SM) code. These alternate versions are enhanced to provide capabilities that are regulated under the U.S. Department of State's International Traffic in Arms Regulations (ITAR) export-control rules. The ITAR regulated codes are only distributed to entities that comply with the ITAR export-control requirements. The ITAR enhancements to Sierra/SM include material models with an energy-dependent pressure response (appropriate for very large deformations and strain rates) and capabilities for blast modeling. Since this is an addendum to the standard Sierra/SM user's guide, please refer to that document first for general descriptions of code capability and use.

  7. Technical report series on global modeling and data assimilation. Volume 1: Documentation of the Goddard Earth Observing System (GEOS) General Circulation Model, version 1

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); Takacs, Lawrence L.; Molod, Andrea; Wang, Tina

    1994-01-01

    This technical report documents Version 1 of the Goddard Earth Observing System (GEOS) General Circulation Model (GCM). The GEOS-1 GCM is being used by NASA's Data Assimilation Office (DAO) to produce multiyear data sets for climate research. This report provides a documentation of the model components used in the GEOS-1 GCM, a complete description of model diagnostics available, and a User's Guide to facilitate GEOS-1 GCM experiments.

  8. Catalog of Wargaming and Military Simulation Models.

    DTIC Science & Technology

    1982-05-01

    War Games and Simulations ... Functional Index ... Data Collection Sheet..."AGTM (An Air and Ground Theatre Model); User's Guide and Program Description," Jan 1974 (NU) TIME REQUIREMENTS: Collection of the data base can be...to exposures. Facilities can be provided in any series to collect and output data on any specific subject, appropriate to the level of the game.

  9. SMART (Shop floor Modeling, Analysis and Reporting Tool Project

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

    1999-01-01

    This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description is given in the full documentation provided to the NASA liaison. This documentation is also available on the A.R.I.S.E. Center web site, under a protected directory; only authorized users can gain access to this site.

  10. Langley Atmospheric Information Retrieval System (LAIRS): System description and user's guide

    NASA Technical Reports Server (NTRS)

    Boland, D. E., Jr.; Lee, T.

    1982-01-01

    This document presents the user's guide, system description, and mathematical specifications for the Langley Atmospheric Information Retrieval System (LAIRS). It also includes a description of an optimal procedure for operational use of LAIRS. The primary objective of the LAIRS Program is to make it possible to obtain accurate estimates of atmospheric pressure, density, temperature, and winds along Shuttle reentry trajectories for use in postflight data reduction.

  11. Discovering the influential users oriented to viral marketing based on online social networks

    NASA Astrophysics Data System (ADS)

    Zhu, Zhiguo

    2013-08-01

    The target of viral marketing on the platform of popular online social networks is to rapidly propagate marketing information at lower cost and increase sales, in which a key problem is how to precisely discover the most influential users in the process of information diffusion. A novel method is proposed in this paper for helping companies to identify such users as seeds to maximize information diffusion in viral marketing. Firstly, the user trust network oriented to viral marketing and users' combined interest degree in the network, including isolated users, are extensively defined. Next, we construct a model considering the time factor to simulate the process of information diffusion in viral marketing and propose a dynamic algorithm description. Finally, experiments are conducted with a real dataset extracted from the famous SNS website Epinions. The experimental results indicate that the proposed algorithm has better scalability and is less time-consuming. In our four sub-datasets, it also outperformed the classical method in both network coverage rate and time consumption.

  12. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed Central

    Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably regarding its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation. PMID:12463921

  13. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed

    Staccini, Pascal M; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably regarding its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation.

  14. User manual for SPLASH (Single Panel Lamp and Shroud Helper).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larsen, Marvin Elwood

    2006-02-01

    The radiant heat test facility develops test sets providing well-characterized thermal environments, often representing fires. Many of the components and procedures have become standardized to such an extent that the development of a specialized design tool to determine optimal configurations for radiant heat experiments was appropriate. SPLASH (Single Panel Lamp and Shroud Helper) is that tool. SPLASH is implemented as a user-friendly, Windows-based program that allows a designer to describe a test setup in terms of parameters such as number of lamps, power, position, and separation distance. This document is a user manual for that software. Any incidental descriptions of theory are only for the purpose of defining the model inputs. The theory for the underlying model is described in SAND2005-2947 (Ref. [1]). SPLASH provides a graphical user interface to define lamp panel and shroud designs parametrically, solves the resulting radiation enclosure problem for up to 2500 surfaces, and provides post-processing to facilitate understanding and documentation of analyzed designs.

  15. Engine structures modeling software system: Computer code. User's manual

    NASA Technical Reports Server (NTRS)

    1992-01-01

    ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.

  16. P-Care BPJS Acceptance Model in Primary Health Centers.

    PubMed

    Markam, Hosizah

    2017-01-01

    Electronic Medical Records (EMR) are increasingly adopted in healthcare facilities. Implementation failure of electronic information systems is known to be caused not only by the quality of technical aspects but also by user behavior, which is commonly assessed by applying the Technology Acceptance Model (TAM). This research aimed to analyze the acceptance model of p-care BPJS in primary health centers. A total sample of 30 p-care BPJS users was drawn by multistage random sampling from 30 participating primary health centers. Data analysis used both descriptive and inferential statistics. In the structural model phase, the analysis indicated that the p-care BPJS acceptance model in the primary health centers was formed by Perceived Ease of Use (PEOU) and Perceived Usefulness (PU) through Attitude towards use of p-care BPJS and Behavioral Intention to use p-care BPJS.

  17. Orbital Maneuvering Engine Feed System Coupled Stability Investigation, Computer User's Manual

    NASA Technical Reports Server (NTRS)

    Schuman, M. D.; Fertig, K. W.; Hunting, J. K.; Kahn, D. R.

    1975-01-01

    An operating manual for the feed system coupled stability model was given, in partial fulfillment of a program designed to develop, verify, and document a digital computer model that can be used to analyze and predict engine/feed system coupled instabilities in pressure-fed storable propellant propulsion systems over a frequency range of 10 to 1,000 Hz. The first section describes the analytical approach to modelling the feed system hydrodynamics, combustion dynamics, chamber dynamics, and overall engineering model structure, and presents the governing equations in each of the technical areas. This is followed by the program user's guide, which is a complete description of the structure and operation of the computerized model. Last, appendices provide an alphabetized FORTRAN symbol table, detailed program logic diagrams, computer code listings, and sample case input and output data listings.

  18. Data model and relational database design for the New Jersey Water-Transfer Data System (NJWaTr)

    USGS Publications Warehouse

    Tessler, Steven

    2003-01-01

    The New Jersey Water-Transfer Data System (NJWaTr) is a database design for the storage and retrieval of water-use data. NJWaTr can manage data encompassing many facets of water use, including (1) the tracking of various types of water-use activities (withdrawals, returns, transfers, distributions, consumptive-use, wastewater collection, and treatment); (2) the storage of descriptions, classifications and locations of places and organizations involved in water-use activities; (3) the storage of details about measured or estimated volumes of water associated with water-use activities; and (4) the storage of information about data sources and water resources associated with water use. In NJWaTr, each water transfer occurs unidirectionally between two site objects, and the sites and conveyances form a water network. The core entities in the NJWaTr model are site, conveyance, transfer/volume, location, and owner. Other important entities include water resource (used for withdrawals and returns), data source, permit, and alias. Multiple water-exchange estimates based on different methods or data sources can be stored for individual transfers. Storage of user-defined details is accommodated for several of the main entities. Many tables contain classification terms to facilitate the detailed description of data items and can be used for routine or custom data summarization. NJWaTr accommodates single-user and aggregate-user water-use data, can be used for large or small water-network projects, and is available as a stand-alone Microsoft Access database. Data stored in the NJWaTr structure can be retrieved in user-defined combinations to serve visualization and analytical applications. Users can customize and extend the database, link it to other databases, or implement the design in other relational database applications.

  19. Software Tool Integrating Data Flow Diagrams and Petri Nets

    NASA Technical Reports Server (NTRS)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.

  20. An R package for analyzing and modeling ranking data

    PubMed Central

    2013-01-01

    Background: In medical informatics, psychology, market research and many other fields, researchers often need to analyze and model ranking data. However, there is no statistical software that provides tools for the comprehensive analysis of ranking data. Here, we present pmr, an R package for analyzing and modeling ranking data with a bundle of tools. The pmr package enables descriptive statistics (mean rank, pairwise frequencies, and marginal matrix), Analytic Hierarchy Process models (with Saaty’s and Koczkodaj’s inconsistencies), probability models (Luce model, distance-based model, and rank-ordered logit model), and the visualization of ranking data with multidimensional preference analysis. Results: Examples of the use of package pmr are given using a real ranking dataset from medical informatics, in which 566 Hong Kong physicians ranked the top five incentives (1: competitive pressures; 2: increased savings; 3: government regulation; 4: improved efficiency; 5: improved quality care; 6: patient demand; 7: financial incentives) to the computerization of clinical practice. The mean rank showed that item 4 is the most preferred item and item 3 is the least preferred item, and a significant difference was found between physicians’ preferences with respect to their monthly income. A multidimensional preference analysis identified two dimensions that explain 42% of the total variance. The first can be interpreted as the overall preference of the seven items (labeled as “internal/external”), and the second dimension can be interpreted as their overall variance (labeled as “push/pull factors”). Various statistical models were fitted, and the best were found to be weighted distance-based models with Spearman’s footrule distance. Conclusions: In this paper, we presented the R package pmr, the first package for analyzing and modeling ranking data. The package provides insight to users through descriptive statistics of ranking data. Users can also visualize ranking data by applying a multidimensional preference analysis. Various probability models for ranking data are also included, allowing users to choose that which is most suitable to their specific situations. PMID:23672645

  1. An R package for analyzing and modeling ranking data.

    PubMed

    Lee, Paul H; Yu, Philip L H

    2013-05-14

    In medical informatics, psychology, market research and many other fields, researchers often need to analyze and model ranking data. However, there is no statistical software that provides tools for the comprehensive analysis of ranking data. Here, we present pmr, an R package for analyzing and modeling ranking data with a bundle of tools. The pmr package enables descriptive statistics (mean rank, pairwise frequencies, and marginal matrix), Analytic Hierarchy Process models (with Saaty's and Koczkodaj's inconsistencies), probability models (Luce model, distance-based model, and rank-ordered logit model), and the visualization of ranking data with multidimensional preference analysis. Examples of the use of package pmr are given using a real ranking dataset from medical informatics, in which 566 Hong Kong physicians ranked the top five incentives (1: competitive pressures; 2: increased savings; 3: government regulation; 4: improved efficiency; 5: improved quality care; 6: patient demand; 7: financial incentives) to the computerization of clinical practice. The mean rank showed that item 4 is the most preferred item and item 3 is the least preferred item, and a significant difference was found between physicians' preferences with respect to their monthly income. A multidimensional preference analysis identified two dimensions that explain 42% of the total variance. The first can be interpreted as the overall preference of the seven items (labeled as "internal/external"), and the second dimension can be interpreted as their overall variance (labeled as "push/pull factors"). Various statistical models were fitted, and the best were found to be weighted distance-based models with Spearman's footrule distance. In this paper, we presented the R package pmr, the first package for analyzing and modeling ranking data. The package provides insight to users through descriptive statistics of ranking data. Users can also visualize ranking data by applying a multidimensional preference analysis. Various probability models for ranking data are also included, allowing users to choose that which is most suitable to their specific situations.
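
    The descriptive statistics named in these abstracts (mean rank and pairwise frequencies) are easy to illustrate outside the pmr package; the following is a hypothetical Python sketch, not pmr itself.

```python
# Minimal sketch (not the pmr R package): compute mean ranks and pairwise
# "i ranked above j" frequencies for a tiny, hypothetical set of rankings.
import numpy as np

# Each row is one respondent's ranking: the rank assigned to items A, B, C, D.
ranks = np.array([
    [1, 2, 3, 4],
    [2, 1, 4, 3],
    [1, 3, 2, 4],
])
items = ["A", "B", "C", "D"]

print("mean rank:", dict(zip(items, ranks.mean(axis=0).round(2))))

n = len(items)
pairwise = np.zeros((n, n), dtype=int)
for row in ranks:
    for i in range(n):
        for j in range(n):
            if row[i] < row[j]:          # a smaller rank means preferred
                pairwise[i, j] += 1
print("times row-item ranked above column-item:\n", pairwise)
```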

  2. V and V of Lexical, Syntactic and Semantic Properties for Interactive Systems Through Model Checking of Formal Description of Dialog

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.; Martinie, Celia; Palanque, Philippe

    2013-01-01

    During early phases of the development of an interactive system, future system properties are identified (through interaction with end users in the brainstorming and prototyping phase of the application, or by other stakeholders) imposing requirements on the final system. They can be specific to the application under development or generic to all applications such as usability principles. Instances of specific properties include visibility of the aircraft altitude, speed… in the cockpit and the continuous possibility of disengaging the autopilot in whatever state the aircraft is. Instances of generic properties include availability of undo (for undoable functions) and availability of a progression bar for functions lasting more than four seconds. While behavioral models of interactive systems using formal description techniques provide complete and unambiguous descriptions of states and state changes, they do not provide explicit representation of the absence or presence of properties. Assessing that the system that has been built is the right system remains a challenge usually met through extensive use and acceptance tests. By the explicit representation of properties and the availability of tools to support checking these properties, it becomes possible to provide developers with means for systematic exploration of the behavioral models and assessment of the presence or absence of these properties. This paper proposes the synergistic use of two tools for checking both generic and specific properties of interactive applications: Petshop and Java PathFinder. Petshop is dedicated to the description of interactive system behavior. Java PathFinder is dedicated to the runtime verification of Java applications and, as an extension, to user interfaces. This approach is exemplified on a safety critical application in the area of interactive cockpits for large civil aircraft.

  3. Noise produced by turbulent flow into a rotor: Users manual for noise calculation

    NASA Technical Reports Server (NTRS)

    Amiet, R. K.; Egolf, C. G.; Simonich, J. C.

    1989-01-01

    A users manual for a computer program for the calculation of noise produced by turbulent flow into a helicopter rotor is presented. The inputs to the program are obtained from the atmospheric turbulence model and mean flow distortion calculation, described in another volume of this set of reports. Descriptions of the various program modules and subroutines, their function, programming structure, and the required input and output variables are included. This routine is incorporated as one module of NASA's ROTONET helicopter noise prediction program.

  4. Studying the HIT-Complexity Interchange.

    PubMed

    Kuziemsky, Craig E; Borycki, Elizabeth M; Kushniruk, Andre W

    2016-01-01

    The design and implementation of health information technology (HIT) is challenging, particularly when it is being introduced into complex settings. While complex adaptive systems (CASs) can be a valuable means of understanding relationships between users, HIT and tasks, much of the existing work using CASs is descriptive in nature. This paper addresses that issue by integrating a model for analyzing task complexity with approaches for HIT evaluation and systems analysis. The resulting framework classifies HIT-user tasks and issues as simple, complicated or complex, and provides insight on how to study them.

  5. The pEst version 2.1 user's manual

    NASA Technical Reports Server (NTRS)

    Murray, James E.; Maine, Richard E.

    1987-01-01

    This report is a user's manual for version 2.1 of pEst, a FORTRAN 77 computer program for interactive parameter estimation in nonlinear dynamic systems. The pEst program allows the user complete generality in defining the nonlinear equations of motion used in the analysis. The equations of motion are specified by a set of FORTRAN subroutines; a set of routines for a general aircraft model is supplied with the program and is described in the report. The report also briefly discusses the scope of the parameter estimation problem the program addresses. The report gives detailed explanations of the purpose and usage of all available program commands and a description of the computational algorithms used in the program.

  6. User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 1: General ADD code description

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.

    1982-01-01

    This User's Manual contains a complete description of the computer codes known as the AXISYMMETRIC DIFFUSER DUCT code or ADD code. It includes a list of references which describe the formulation of the ADD code and comparisons of calculation with experimental flows. The input/output and general use of the code is described in the first volume. The second volume contains a detailed description of the code including the global structure of the code, list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code which generates coordinate systems for arbitrary axisymmetric ducts.

  7. TogoTable: cross-database annotation system using the Resource Description Framework (RDF) data model.

    PubMed

    Kawano, Shin; Watanabe, Tsutomu; Mizuguchi, Sohei; Araki, Norie; Katayama, Toshiaki; Yamaguchi, Atsuko

    2014-07-01

    TogoTable (http://togotable.dbcls.jp/) is a web tool that adds user-specified annotations to a table that a user uploads. Annotations are drawn from several biological databases that use the Resource Description Framework (RDF) data model. TogoTable uses database identifiers (IDs) in the table as a query key for searching. RDF data, which form a network called Linked Open Data (LOD), can be searched from SPARQL endpoints using a SPARQL query language. Because TogoTable uses RDF, it can integrate annotations from not only the reference database to which the IDs originally belong, but also externally linked databases via the LOD network. For example, annotations in the Protein Data Bank can be retrieved using GeneID through links provided by the UniProt RDF. Because RDF has been standardized by the World Wide Web Consortium, any database with annotations based on the RDF data model can be easily incorporated into this tool. We believe that TogoTable is a valuable Web tool, particularly for experimental biologists who need to process huge amounts of data such as high-throughput experimental output. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
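
    As a generic illustration of querying RDF data through a SPARQL endpoint (not TogoTable's own code), the sketch below uses the third-party SPARQLWrapper package against the public UniProt endpoint; the query and property names are illustrative.

```python
# Minimal sketch (not TogoTable's implementation): query an RDF SPARQL
# endpoint for annotations. Requires the third-party package SPARQLWrapper
# (pip install sparqlwrapper); the query shown is illustrative only.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://sparql.uniprot.org/sparql")  # public UniProt endpoint
endpoint.setQuery("""
    PREFIX up: <http://purl.uniprot.org/core/>
    SELECT ?protein ?name WHERE {
        ?protein a up:Protein ;
                 up:mnemonic ?name .
    } LIMIT 5
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    print(row["protein"]["value"], row["name"]["value"])
```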

  8. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  9. SYSTID - A flexible tool for the analysis of communication systems.

    NASA Technical Reports Server (NTRS)

    Dawson, C. T.; Tranter, W. H.

    1972-01-01

    Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.

  10. Tree value system: description and assumptions.

    Treesearch

    D.G. Briggs

    1989-01-01

    TREEVAL is a microcomputer model that calculates tree or stand values and volumes based on product prices, manufacturing costs, and predicted product recovery. It was designed as an aid in evaluating management regimes. TREEVAL calculates values in either of two ways, one based on optimized tree bucking using dynamic programming and one simulating the results of user-...
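
    The optimized-bucking option mentioned above is, at heart, a dynamic-programming problem: choose log lengths that maximize total value over the stem. The sketch below illustrates that idea with invented log lengths and prices; it does not reproduce TREEVAL's product tables or recovery equations.

```python
# Illustrative sketch only: value-optimal bucking of a stem into logs using
# dynamic programming, in the spirit of TREEVAL's optimized-bucking option.
# Log lengths and prices are invented for the example.

def optimal_bucking(stem_length, products):
    """products: list of (log_length, value). Returns (best_value, list_of_cuts)."""
    best = [0.0] * (stem_length + 1)      # best value achievable with the first i feet
    choice = [None] * (stem_length + 1)   # last log chosen to reach length i
    for i in range(1, stem_length + 1):
        for length, value in products:
            if length <= i and best[i - length] + value > best[i]:
                best[i] = best[i - length] + value
                choice[i] = length
    cuts, i = [], stem_length
    while i > 0 and choice[i] is not None:
        cuts.append(choice[i])
        i -= choice[i]
    return best[stem_length], cuts

if __name__ == "__main__":
    # hypothetical products: (length in feet, value in dollars)
    print(optimal_bucking(34, [(8, 12.0), (16, 30.0), (20, 34.0)]))
```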

  11. 16 CFR 305.3 - Description of covered products.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... operation. Some models may require user intervention to initiate these different segments of the cycle after... inner reflective coating on the outer bulb to direct the light, an R, PAR, or similar bulb shapes... bulb to direct the light, an R, PAR, ER, BR, BPAR, or similar bulb shapes with E26 medium screw bases...

  12. The Chaos Theory of Careers: A User's Guide

    ERIC Educational Resources Information Center

    Bright, Jim E. H.; Pryor, Robert G. L.

    2005-01-01

    The purpose of this article is to set out the key elements of the Chaos Theory of Careers. The complexity of influences on career development presents a significant challenge to traditional predictive models of career counseling. Chaos theory can provide a more appropriate description of career behavior, and the theory can be applied with clients…

  13. Program documentation. Program description and user information for the hydraulics/auxiliary power unit (HYDRA) computer program. [for the space shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Redwine, W. J.

    1979-01-01

    A timeline containing altitude, control surface deflection rates and angles, hinge moment loads, thrust vector control gimbal rates, and main throttle settings is used to derive the model. The timeline is constructed from the output of one or more trajectory simulation programs.

  14. User's manual: Subsonic/supersonic advanced panel pilot code

    NASA Technical Reports Server (NTRS)

    Moran, J.; Tinoco, E. N.; Johnson, F. T.

    1978-01-01

    Sufficient instructions for running the subsonic/supersonic advanced panel pilot code were developed. This software was developed as a vehicle for numerical experimentation and it should not be construed to represent a finished production program. The pilot code is based on a higher order panel method using linearly varying source and quadratically varying doublet distributions for computing both linearized supersonic and subsonic flow over arbitrary wings and bodies. This user's manual contains complete input and output descriptions. A brief description of the method is given as well as practical instructions for proper configuration modeling. Computed results are also included to demonstrate some of the capabilities of the pilot code. The computer program is written in FORTRAN IV for the SCOPE 3.4.4 operations system of the Ames CDC 7600 computer. The program uses overlay structure and thirteen disk files, and it requires approximately 132000 (Octal) central memory words.

  15. Investigation of the Constitutive Model Used in Nonlinear, Incremental Structural Analyses.

    DTIC Science & Technology

    1998-06-01

    ... package, ABAQUS, was chosen for performing NISA studies in part because user-supplied subroutines could be used for constitutive relationships. ... loading and the shrinkage and thermally induced strains determined from control specimens. The majority of creep tests are uniaxial compressive tests ... Kennedy, and Perry (1970). Description of FE Model: The tests were simulated using the finite element (FE) program ABAQUS and the aging viscoelastic ...

  16. HyRAM V1.0 User Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Zumwalt, Hannah Ruth; Clark, Andrew Jordan

    2016-03-01

    Hydrogen Risk Assessment Models (HyRAM) is a prototype software toolkit that integrates data and methods relevant to assessing the safety of hydrogen fueling and storage infrastructure. The HyRAM toolkit integrates deterministic and probabilistic models for quantifying accident scenarios, predicting physical effects, and characterizing the impact of hydrogen hazards, including thermal effects from jet fires and thermal pressure effects from deflagration. HyRAM version 1.0 incorporates generic probabilities for equipment failures for nine types of components, and probabilistic models for the impact of heat flux on humans and structures, with computationally and experimentally validated models of various aspects of gaseous hydrogen release and flame physics. This document provides an example of how to use HyRAM to conduct analysis of a fueling facility. This document will guide users through the software and how to enter and edit certain inputs that are specific to the user-defined facility. Description of the methodology and models contained in HyRAM is provided in [1]. This User’s Guide is intended to capture the main features of HyRAM version 1.0 (any HyRAM version numbered as 1.0.X.XXX). This user guide was created with HyRAM 1.0.1.798. Due to ongoing software development activities, newer versions of HyRAM may have differences from this guide.

  17. Red Storm usage model :Version 1.12.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jefferson, Karen L.; Sturtevant, Judith E.

    Red Storm is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Sandia National Laboratories (SNL). The Red Storm Usage Model (RSUM) documents the capabilities and the environment provided for the FY05 Tri-Lab Level II Limited Availability Red Storm User Environment Milestone and the FY05 SNL Level II Limited Availability Red Storm Platform Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and SNL. Additionally, the Red Storm Usage Model maps the provided capabilities to the Tri-Lab ASC Computing Environment (ACE) requirements. The ACE requirements reflect the high performance computing requirements for the ASC community and have been updated in FY05 to reflect the community's needs. For each section of the RSUM, Appendix I maps the ACE requirements to the Limited Availability User Environment capabilities and includes a description of ACE requirements met and those requirements that are not met in that particular section. The Red Storm Usage Model, along with the ACE mappings, has been issued and vetted throughout the Tri-Lab community.

  18. Grid-coordinate generation program

    USGS Publications Warehouse

    Cosner, Oliver J.; Horwich, Esther

    1974-01-01

    This program description of the grid-coordinate generation program is written for computer users who are familiar with digital aquifer models. The program computes the coordinates for a variable grid used in the 'Pinder Model' (a finite-difference aquifer simulator), for input to the CalComp GPCP (general purpose contouring program). The program adjusts the y-value by a user-supplied constant in order to transpose the origin of the model grid from the upper left-hand corner to the lower left-hand corner of the grid. The user has the options of (1) choosing the boundaries of the plot; (2) adjusting the z-values (altitudes) by a constant; (3) deleting superfluous z-values; and (4) subtracting the simulated surfaces from each other to obtain the decline. Output of this program includes the fixed format CNTL data cards and the other data cards required for input to GPCP. The output from GPCP then is used to produce a potentiometric map or a decline map by means of the CalComp plotter.
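
    A minimal sketch of the coordinate adjustment described above follows: grid-node coordinates are built from column and row spacings, and the y-values are subtracted from a user-supplied constant to move the origin from the upper-left to the lower-left corner. The spacings, constant, and function name are invented for illustration; the original program read the variable grid from the model input.

```python
# Minimal sketch of the origin transposition described above. The column and
# row spacings below are invented; the original program read them from the
# aquifer model's variable grid.

def grid_coordinates(col_widths, row_heights, y_constant):
    """Return (x, y) node coordinates with y measured from the lower-left corner."""
    xs, x = [0.0], 0.0
    for w in col_widths:
        x += w
        xs.append(x)
    ys_top, y = [0.0], 0.0          # y measured downward from the upper-left corner
    for h in row_heights:
        y += h
        ys_top.append(y)
    # transpose the origin: y now grows upward from the lower-left corner
    ys = [y_constant - yt for yt in ys_top]
    return [(xc, yc) for yc in ys for xc in xs]

if __name__ == "__main__":
    print(grid_coordinates([100.0, 200.0, 200.0], [50.0, 50.0], y_constant=100.0))
```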

  19. TRAC-P1: an advanced best estimate computer program for PWR LOCA analysis. I. Methods, models, user information, and programming details

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-05-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced "best estimate" predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations.

  20. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    PubMed

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined.
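
    To make the list of what a SED-ML document specifies concrete, the sketch below represents those four parts (models, pre-simulation changes, simulation procedures, and outputs) as plain Python data. The field names and example values are illustrative assumptions only; they are not the SED-ML XML schema.

```python
# Schematic sketch only: the four kinds of information a SED-ML Level 1
# Version 1 document specifies, expressed as plain Python data structures.
# Field names and values are invented for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SedExperiment:
    models: List[dict] = field(default_factory=list)       # model sources to use
    changes: List[dict] = field(default_factory=list)      # modifications applied before simulating
    simulations: List[dict] = field(default_factory=list)  # e.g., uniform time course settings
    outputs: List[dict] = field(default_factory=list)      # plots/reports built from the results

experiment = SedExperiment(
    models=[{"id": "model1", "source": "urn:miriam:biomodels.db:BIOMD0000000012"}],
    changes=[{"model": "model1", "target": "parameter k1", "newValue": 0.5}],
    simulations=[{"id": "sim1", "type": "uniformTimeCourse",
                  "start": 0.0, "end": 100.0, "numberOfPoints": 1000}],
    outputs=[{"type": "plot2D", "x": "time", "y": "species concentrations"}],
)
print(experiment)
```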

  1. Aircraft/Air Traffic Management Functional Analysis Model. Version 2.0; User's Guide

    NASA Technical Reports Server (NTRS)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a guide for using the model in analysis. Those interested in making enhancements or modifications to the model should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Technical Description.

  2. Effect of descriptive information and experience on automation reliance.

    PubMed

    Yuviler-Gavish, Nirit; Gopher, Daniel

    2011-06-01

    The present research addresses the issue of reliance on decision support systems for the long-term (DSSLT), which help users develop decision-making strategies and long-term planning. It is argued that providing information about a system's future performance in an experiential manner, as compared with a descriptive manner, encourages users to increase their reliance level. Establishing appropriate reliance on DSSLT is contingent on the system developer's ability to provide users with information about the system's future performance. A sequence of three studies contrasts the effect on automation reliance of providing descriptive information versus experience for DSSLT with two different positive expected values of recommendations. Study 1 demonstrated that when automation reliance was determined solely on the basis of description, it was relatively low, but it increased significantly when a decision was made after experience with 50 training simulations. Participants were able to learn to increase their automation reliance levels when they encountered the same type of recommendation again. Study 2 showed that the absence of preliminary descriptive information did not affect the automation reliance levels obtained after experience. Study 3 demonstrated that participants were able to generalize their learning about increasing reliance levels to new recommendations. Using experience rather than description to give users information about future performance in DSSLT can help increase automation reliance levels. Implications for designing DSSLT and decision support systems in general are discussed.

  3. TRAP/SEE Code Users Manual for Predicting Trapped Radiation Environments

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    TRAP/SEE is a PC-based computer code with a user-friendly interface which predicts the ionizing radiation exposure of spacecraft having orbits in the Earth's trapped radiation belts. The code incorporates the standard AP8 and AE8 trapped proton and electron models but also allows application of an improved database interpolation method. The code treats low-Earth as well as highly-elliptical Earth orbits, taking into account trajectory perturbations due to gravitational forces from the Moon and Sun, atmospheric drag, and solar radiation pressure. Orbit-average spectra, peak spectra per orbit, and instantaneous spectra at points along the orbit trajectory are calculated. Described in this report are the features, models, model limitations and uncertainties, input and output descriptions, and example calculations and applications for the TRAP/SEE code.

  4. Tracking and data relay satellite system configuration and tradeoff study. Volume 5: TDRS spacecraft design, part 1

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A dual spin stabilized TDR spacecraft design is presented for low data rate (LDR) and medium data rate (MDR) user spacecraft telecommunication relay service. The relay satellite provides command and data return channels for unmanned users together with duplex voice and data communication channels for manned user spacecraft. TDRS/ground links are in the Ku band. Command links are provided at UHF for LDR users and S band for MDR users. Voice communication channels are provided at UHF/VHF for LDR users and at S band for MDR users. The spacecraft is designed for launch on the Delta 2914 with system deployment planned for 1978. This volume contains a description of the overall TDR spacecraft configuration, a detailed description of the spacecraft subsystems, a reliability analysis, and a product effectiveness plan.

  5. Long-term Science Data Curation Using a Digital Object Model and Open-Source Frameworks

    NASA Astrophysics Data System (ADS)

    Pan, J.; Lenhardt, W.; Wilson, B. E.; Palanisamy, G.; Cook, R. B.

    2010-12-01

    Scientific digital content, including Earth Science observations and model output, has become more heterogeneous in format and more distributed across the Internet. In addition, data and metadata are becoming necessarily linked internally and externally on the Web. As a result, such content has become more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, it is increasingly harder to deliver relevant metadata and data processing lineage information along with the actual content consistently. Readme files, data quality information, production provenance, and other descriptive metadata are often separated in the storage level as well as in the data search and retrieval interfaces available to a user. Critical archival metadata, such as auditing trails and integrity checks, are often even more difficult for users to access, if they exist at all. We investigate the use of several open-source software frameworks to address these challenges. We use Fedora Commons Framework and its digital object abstraction as the repository, Drupal CMS as the user-interface, and the Islandora module as the connector from Drupal to Fedora Repository. With the digital object model, metadata of data description and data provenance can be associated with data content in a formal manner, so are external references and other arbitrary auxiliary information. Changes are formally audited on an object, and digital contents are versioned and have checksums automatically computed. Further, relationships among objects are formally expressed with RDF triples. Data replication, recovery, metadata export are supported with standard protocols, such as OAI-PMH. We provide a tentative comparative analysis of the chosen software stack with the Open Archival Information System (OAIS) reference model, along with our initial results with the existing terrestrial ecology data collections at NASA’s ORNL Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC).
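
    The abstract above mentions metadata export over standard protocols such as OAI-PMH. The sketch below issues a standard OAI-PMH ListRecords request and pulls record identifiers out of the response. The repository URL is a placeholder; the verb, parameters, and namespace are part of the published OAI-PMH protocol.

```python
# Minimal sketch of harvesting metadata over OAI-PMH, one of the standard
# protocols mentioned above. The repository URL is hypothetical; "oai_dc"
# (Dublin Core) is the baseline metadata format OAI-PMH repositories support.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

BASE_URL = "https://example.org/oai"  # placeholder repository endpoint

def list_record_identifiers(base_url, metadata_prefix="oai_dc"):
    """Return the record identifiers from a ListRecords response."""
    params = urllib.parse.urlencode({"verb": "ListRecords",
                                     "metadataPrefix": metadata_prefix})
    with urllib.request.urlopen(f"{base_url}?{params}") as resp:
        tree = ET.parse(resp)
    oai_ns = "{http://www.openarchives.org/OAI/2.0/}"
    return [el.text for el in tree.iter(f"{oai_ns}identifier")]

if __name__ == "__main__":
    for ident in list_record_identifiers(BASE_URL):
        print(ident)
```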

  6. Non-RF Chain of Custody Item Monitor (CoCIM) User Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brotz, Jay Kristoffer; Wade, James Rokwel; Schwartz, Steven Robert

    This User Manual contains a description of the wired and infrared (IR) variants of the Chain of Custody Item Monitor (CoCIM), the Coordinator for reading stored messages, and the inspector Message Viewer user interface (UI) software, as well as instructions for use. This manual does not include descriptions or use instructions for the radio frequency (RF) variant of the CoCIM. The intended audience is planners and participants in treaty verification exercises where chain of custody (CoC) elements are required.

  7. AIDPRF/PRFAID user's manual

    NASA Technical Reports Server (NTRS)

    Buck, C. H.

    1975-01-01

    The program documentation for the PRF ARTWORK/AIDS conversion program, which serves as the interface between the outputs of the PRF ARTWORK and AIDS programs, was presented. The document has a two-fold purpose, the first of which is a description of the software design including flowcharts of the design at the functional level. The second purpose is to provide the user with a detailed description of the input parameters and formats necessary to execute the program and a description of the output produced when the program is executed.

  8. AITRAC: Augmented Interactive Transient Radiation Analysis by Computer. User's information manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-10-01

    AITRAC is a program designed for on-line, interactive, DC, and transient analysis of electronic circuits. The program solves linear and nonlinear simultaneous equations which characterize the mathematical models used to predict circuit response. The program features 100 external node--200 branch capability; conversational, free-format input language; built-in junction, FET, MOS, and switch models; sparse matrix algorithm with extended-precision H matrix and T vector calculations, for fast and accurate execution; linear transconductances: beta, GM, MU, ZM; accurate and fast radiation effects analysis; special interface for user-defined equations; selective control of multiple outputs; graphical outputs in wide and narrow formats; and on-line parameter modification capability. The user describes the problem by entering the circuit topology and part parameters. The program then automatically generates and solves the circuit equations, providing the user with printed or plotted output. The circuit topology and/or part values may then be changed by the user, and a new analysis, requested. Circuit descriptions may be saved on disk files for storage and later use. The program contains built-in standard models for resistors, voltage and current sources, capacitors, inductors including mutual couplings, switches, junction diodes and transistors, FETS, and MOS devices. Nonstandard models may be constructed from standard models or by using the special equations interface. Time functions may be described by straight-line segments or by sine, damped sine, and exponential functions. 42 figures, 1 table. (RWR)

  9. An XML Data Model for Inverted Image Indexing

    NASA Astrophysics Data System (ADS)

    So, Simon W.; Leung, Clement H. C.; Tse, Philip K. C.

    2003-01-01

    The Internet world makes increasing use of XML-based technologies. In multimedia data indexing and retrieval, the MPEG-7 standard for Multimedia Description Scheme is specified using XML. The flexibility of XML allows users to define other markup semantics for special contexts, construct data-centric XML documents, exchange standardized data between computer systems, and present data in different applications. In this paper, the Inverted Image Indexing paradigm is presented and modeled using XML Schema.
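
    As a rough illustration of the inverted-indexing idea behind the paper (queries go from descriptive term to images rather than from image to terms), the sketch below builds a plain in-memory inverted index. The annotations and image identifiers are invented; the paper's XML Schema representation is not reproduced here.

```python
# Illustrative sketch of inverted image indexing: map each descriptive term
# to the set of images it annotates, so retrieval goes term -> images.
from collections import defaultdict

def build_inverted_index(image_annotations):
    """image_annotations: {image_id: [terms]} -> {term: sorted list of image_ids}."""
    index = defaultdict(set)
    for image_id, terms in image_annotations.items():
        for term in terms:
            index[term].add(image_id)
    return {term: sorted(ids) for term, ids in index.items()}

annotations = {
    "img001": ["sunset", "beach", "palm tree"],
    "img002": ["beach", "volleyball"],
    "img003": ["sunset", "mountain"],
}
index = build_inverted_index(annotations)
print(index["sunset"])   # -> ['img001', 'img003']
```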

  10. PROGRAM VSAERO: A computer program for calculating the non-linear aerodynamic characteristics of arbitrary configurations: User's manual

    NASA Technical Reports Server (NTRS)

    Maskew, B.

    1982-01-01

    VSAERO is a computer program used to predict the nonlinear aerodynamic characteristics of arbitrary three-dimensional configurations in subsonic flow. Nonlinear effects of vortex separation and vortex surface interaction are treated in an iterative wake-shape calculation procedure, while the effects of viscosity are treated in an iterative loop coupling potential-flow and integral boundary-layer calculations. The program employs a surface singularity panel method using quadrilateral panels on which doublet and source singularities are distributed in a piecewise constant form. This user's manual provides a brief overview of the mathematical model, instructions for configuration modeling and a description of the input and output data. A listing of a sample case is included.

  11. The inner zone electron model AE-5

    NASA Technical Reports Server (NTRS)

    Teague, M. J.; Vette, J. I.

    1972-01-01

    A description is given of the work performed in the development of the inner radiation zone electron model, AE-5. A complete description of the omnidirectional flux model is given for energy thresholds E_T in the range 4.0 ≥ E_T (MeV) ≥ 0.04 and for L values in the range 2.8 ≥ L ≥ 1.2, for an epoch of October 1967. Confidence codes for certain regions of B-L space and certain energies are given based on data coverage and the assumptions made in the analysis. The electron model programs that can be supplied to a user are referred to. One of these, a program for accessing the model flux at arbitrary points in B-L space and arbitrary energies, includes the latest outer zone electron model and proton model. The model AE-5 is based on data from five satellites, OGO 1, OGO 3, 1963-38C, OV3-3, and Explorer 26, spanning the period December 1964 to December 1967.
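
    The flux-access program mentioned above evaluates the model at arbitrary points; a common way to do that is interpolation on the tabulated map. The sketch below interpolates a toy (L, energy) flux table. The grid and values are invented placeholders, not AE-5 data, and the real model is tabulated in B-L space with far more detail.

```python
# Minimal sketch, not the actual AE-5 access routine: interpolating an
# omnidirectional flux table defined on a (L, E) grid to an arbitrary point.
# The table below holds fake log10 flux values purely for illustration.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

L_grid = np.array([1.2, 1.6, 2.0, 2.4, 2.8])           # McIlwain L values
E_grid = np.array([0.04, 0.1, 0.5, 1.0, 4.0])          # energy thresholds, MeV
log_flux = np.random.default_rng(0).uniform(2, 8, (5, 5))  # fake log10 flux table

interp = RegularGridInterpolator((L_grid, E_grid), log_flux)

def flux(L, E_MeV):
    """Omnidirectional flux above threshold E at shell L (toy table)."""
    return 10.0 ** interp([[L, E_MeV]]).item()

print(flux(1.9, 0.25))
```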

  12. A Fractional Cartesian Composition Model for Semi-Spatial Comparative Visualization Design.

    PubMed

    Kolesar, Ivan; Bruckner, Stefan; Viola, Ivan; Hauser, Helwig

    2017-01-01

    The study of spatial data ensembles leads to substantial visualization challenges in a variety of applications. In this paper, we present a model for comparative visualization that supports the design of corresponding ensemble visualization solutions by partial automation. We focus on applications where the user is interested in preserving selected spatial characteristics of the data as much as possible, even when many ensemble members should be jointly studied using comparative visualization. In our model, we separate the design challenge into a minimal set of user-specified parameters and an optimization component for the automatic configuration of the remaining design variables. We provide an illustrated formal description of our model and exemplify our approach in the context of several application examples from different domains in order to demonstrate its generality within the class of comparative visualization problems for spatial data ensembles.

  13. User's Guide for Monthly Vector Wind Profile Model

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    1999-01-01

    The background, theoretical concepts, and methodology for construction of vector wind profiles based on a statistical model are presented. The derived monthly vector wind profiles are to be applied by the launch vehicle design community for establishing realistic estimates of critical vehicle design parameter dispersions related to wind profile dispersions. During initial studies a number of months are used to establish the model profiles that produce the largest monthly dispersions of ascent vehicle aerodynamic load indicators. The largest monthly dispersions for wind, which occur during the winter high-wind months, are used for establishing the design reference dispersions for the aerodynamic load indicators. This document includes a description of the computational process for the vector wind model including specification of input data, parameter settings, and output data formats. Sample output data listings are provided to aid the user in the verification of test output.

  14. Injector Design Tool Improvements: User's manual for FDNS V.4.5

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Shang, Huan-Min; Wei, Hong; Liu, Jiwen

    1998-01-01

    The major emphasis of the current effort is in the development and validation of an efficient parallel machine computational model, based on the FDNS code, to analyze the fluid dynamics of a wide variety of liquid jet configurations for general liquid rocket engine injection system applications. This model includes physical models for droplet atomization, breakup/coalescence, evaporation, turbulence mixing and gas-phase combustion. Benchmark validation cases for liquid rocket engine chamber combustion conditions will be performed for model validation purposes. Test cases may include shear coaxial, swirl coaxial and impinging injection systems with combinations of LOX/H2 or LOX/RP-1 propellant injector elements used in rocket engine designs. As a final goal of this project, a well-tested parallel CFD performance methodology, together with a user's operation description, will be reported in a final technical report at the end of the proposed research effort.

  15. The DART dispersion analysis research tool: A mechanistic model for predicting fission-product-induced swelling of aluminum dispersion fuels. User`s guide for mainframe, workstation, and personal computer applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rest, J.

    1995-08-01

    This report describes the primary physical models that form the basis of the DART mechanistic computer model for calculating fission-product-induced swelling of aluminum dispersion fuels; the calculated results are compared with test data. In addition, DART calculates irradiation-induced changes in the thermal conductivity of the dispersion fuel, as well as fuel restructuring due to aluminum fuel reaction, amorphization, and recrystallization. Input instructions for execution on mainframe, workstation, and personal computers are provided, as is a description of DART output. The theory of fission gas behavior and its effect on fuel swelling is discussed. The behavior of these fission products in both crystalline and amorphous fuel and in the presence of irradiation-induced recrystallization and crystalline-to-amorphous-phase change phenomena is presented, as are models for these irradiation-induced processes.

  16. A new bio-inspired, population-level approach to the socioeconomic evolution of dynamic spectrum access services

    NASA Astrophysics Data System (ADS)

    Horvath, Denis; Gazda, Juraj; Brutovsky, Branislav

    Evolutionary species and quasispecies models provide a universal and flexible basis for a large-scale description of the dynamics of evolutionary systems, which can be conceived as a constraint-satisfaction dynamics. They represent a general framework for designing and studying many novel, technologically contemporary models and their variants. Here, we apply the classical quasispecies concept to model the emerging dynamic spectrum access (DSA) markets. The theory describes the mechanisms of mimetic transfer, competitive interactions between socioeconomic strata of the end-users, their perception of utility, and inter-operator switching in the variable technological environments of the operators offering wireless spectrum services. The algorithmization and numerical modeling demonstrate the long-term evolutionary socioeconomic changes which reflect the end-user preferences and the results of the majorization of their irrational decisions, in the same manner as the prevailing tendencies which are embodied in the efficient market hypothesis.
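
    For readers unfamiliar with the underlying machinery, the sketch below integrates the classical quasispecies (replicator-mutator) equation that the paper builds on. The fitness values and mutation matrix are invented; the actual DSA market model layers operator switching and socioeconomic strata on top of this core.

```python
# Minimal sketch of classical quasispecies (replicator-mutator) dynamics:
#   dx_i/dt = sum_j f_j Q[j, i] x_j - phi(x) x_i,  with phi the mean fitness.
# Fitnesses and the mutation matrix below are invented for illustration.
import numpy as np

def quasispecies_step(x, f, Q, dt=0.01):
    """One Euler step; x: type frequencies (sum to 1), f: fitnesses,
    Q[j, i]: probability that type j replicates into type i (row-stochastic)."""
    phi = float(f @ x)                  # mean fitness keeps frequencies normalized
    dx = (f * x) @ Q - phi * x
    return x + dt * dx

f = np.array([1.0, 1.5, 2.0])           # fitness of each strategy/type
Q = np.array([[0.96, 0.02, 0.02],
              [0.02, 0.96, 0.02],
              [0.02, 0.02, 0.96]])       # mutation (switching) matrix
x = np.array([0.6, 0.3, 0.1])
for _ in range(5000):
    x = quasispecies_step(x, f, Q)
print(x)                                 # approaches the dominant quasispecies
```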

  17. The SBOL Stack: A Platform for Storing, Publishing, and Sharing Synthetic Biology Designs.

    PubMed

    Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Pocock, Matthew; Flanagan, Keith; Hallinan, Jennifer; Wipat, Anil

    2016-06-17

    Recently, synthetic biologists have developed the Synthetic Biology Open Language (SBOL), a data exchange standard for descriptions of genetic parts, devices, modules, and systems. The goals of this standard are to allow scientists to exchange designs of biological parts and systems, to facilitate the storage of genetic designs in repositories, and to facilitate the description of genetic designs in publications. In order to achieve these goals, the development of an infrastructure to store, retrieve, and exchange SBOL data is necessary. To address this problem, we have developed the SBOL Stack, a Resource Description Framework (RDF) database specifically designed for the storage, integration, and publication of SBOL data. This database allows users to define a library of synthetic parts and designs as a service, to share SBOL data with collaborators, and to store designs of biological systems locally. The database also allows external data sources to be integrated by mapping them to the SBOL data model. The SBOL Stack includes two Web interfaces: the SBOL Stack API and SynBioHub. While the former is designed for developers, the latter allows users to upload new SBOL biological designs, download SBOL documents, search by keyword, and visualize SBOL data. Since the SBOL Stack is based on semantic Web technology, the inherent distributed querying functionality of RDF databases can be used to allow different SBOL stack databases to be queried simultaneously, and therefore, data can be shared between different institutes, centers, or other users.
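
    A hypothetical sketch of a keyword search against a SynBioHub-style web interface is shown below. The endpoint path, query parameters, and response fields are assumptions made for illustration only; consult the SBOL Stack / SynBioHub documentation for the actual API.

```python
# Hypothetical sketch of a keyword search against a SynBioHub-style web API.
# The URL path, parameters, and JSON fields are assumptions, not the
# documented SBOL Stack interface.
import requests

BASE = "https://synbiohub.org"  # public instance; any SBOL Stack deployment could be substituted

def search_parts(keyword, limit=5):
    resp = requests.get(f"{BASE}/search/{keyword}",
                        params={"offset": 0, "limit": limit},
                        headers={"Accept": "application/json"},
                        timeout=30)
    resp.raise_for_status()
    hits = resp.json()  # assumes a JSON list of search hits
    return [(hit.get("displayId"), hit.get("uri")) for hit in hits]

if __name__ == "__main__":
    for display_id, uri in search_parts("promoter"):
        print(display_id, uri)
```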

  18. ETA-MACRO: A user's guide

    NASA Astrophysics Data System (ADS)

    Manne, A. S.

    1981-02-01

    The ETA-MACRO model is designed to estimate the extent of two way linkage between the energy sector and the balance of the economy. It represents a merger between ETA (a process analysis for energy technology assessment) together with a macroeconomic growth model providing for substitution between capital, labor, and energy inputs. The ETA-MACRO allows explicitly for: (1) energy economy interactions; (2) cost effective conservation; (3) interfuel substitution, and (4) new supply technologies, each with its own difficulties and uncertainties on dates and rates of introduction. This user's guide includes an overview of the model, an illustrative application to long term US energy projections, and technical descriptions of the macro and ETA submodels. It also includes an analysis of how market penetration rates may be related to the profitability of new technologies. Finally, the appendices provide a detailed guide to the computer implementation.

  19. archAR: an archaeological augmented reality experience

    NASA Astrophysics Data System (ADS)

    Wiley, Bridgette; Schulze, Jürgen P.

    2015-03-01

    We present an application for Android phones or tablets called "archAR" that uses augmented reality as an alternative, portable way of viewing archaeological information from UCSD's Levantine Archaeology Laboratory. archAR provides a unique experience of flying through an archaeological dig site in the Levantine area and exploring the artifacts uncovered there. Using a Google Nexus tablet and Qualcomm's Vuforia API, we use an image target as a map and overlay a three-dimensional model of the dig site onto it, augmenting reality such that we are able to interact with the plotted artifacts. The user can physically move the Android device around the image target and see the dig site model from any perspective. The user can also move the device closer to the model in order to "zoom" into the view of a particular section of the model and its associated artifacts. This is especially useful, as the dig site model and the collection of artifacts are very detailed. The artifacts are plotted as points, colored by type. The user can touch the virtual points to trigger a popup information window that contains details of the artifact, such as photographs, material descriptions, and more.

  20. Porphyry copper deposit model: Chapter B in Mineral deposit models for resource assessment

    USGS Publications Warehouse

    Ayuso, Robert A.; Barton, Mark D.; Blakely, Richard J.; Bodnar, Robert J.; Dilles, John H.; Gray, Floyd; Graybeal, Fred T.; Mars, John L.; McPhee, Darcy K.; Seal, Robert R.; Taylor, Ryan D.; Vikre, Peter G.; John, David A.

    2010-01-01

    This report contains a revised descriptive model of porphyry copper deposits (PCDs), the world's largest source (about 60 percent) and resource (about 65 percent) of copper and a major source of molybdenum, gold and silver. Despite relatively low grades (average 0.44 percent copper in 2008), PCDs have significant economic and societal impacts due to their large size (commonly hundreds of millions to billions of metric tons), long mine lives (decades), and high production rates (billions of kilograms of copper per year). The revised model describes the geotectonic setting of PCDs, and provides extensive regional- to deposit-scale descriptions and illustrations of geological, geochemical, geophysical, and geoenvironmental characteristics. Current genetic theories are reviewed and evaluated, knowledge gaps are identified, and a variety of exploration and assessment guides are presented. A summary is included for users seeking overviews of specific topics.

  1. Cost and Usage Study of the Educational Resources Information Center (ERIC) System. A Descriptive Summary.

    ERIC Educational Resources Information Center

    Heinmiller, Joseph L.

    Based on data gathered from a number of complementary sources, this study provides a detailed descriptive analysis of both the direct and indirect costs incurred by the Federal government in operating the ERIC system, and the user population and user demand for ERIC products and services. Data sources included a survey of ERIC's U.S. intermediate…

  2. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  3. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil,Benny Manuel; Ballance, Robert; Haskell, Karen

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  4. REXOR 2 rotorcraft simulation model. Volume 1: Engineering documentation

    NASA Technical Reports Server (NTRS)

    Reaser, J. S.; Kretsinger, P. H.

    1978-01-01

    A rotorcraft nonlinear simulation called REXOR II, divided into three volumes, is described. The first volume is a development of rotorcraft mechanics and aerodynamics. The second is a development and explanation of the computer code required to implement the equations of motion. The third volume is a user's manual, and contains a description of code input/output as well as operating instructions.

  5. Statistical Package User’s Guide.

    DTIC Science & Technology

    1980-08-01

    Contents fragments: C. STACH, Nonparametric Descriptive Statistics; D. CHIRA, Coefficient of Concordance. Test Data: This program was tested using data from John Neter and William Wasserman, Applied Linear Statistical Models: Regression... Input fragments: length of data file; new file name (not the same as the raw data file); printout as optioned for only. Comments: Ranked data are used for program CHIRA.

  6. Impact of Information and Communication Technology on Information Seeking Behavior of Users in Astronomy and Astrophysics Centers of India: A Survey

    NASA Astrophysics Data System (ADS)

    Sahu, H. K.; Singh, S. N.

    2010-10-01

    This study is based on a survey designed to determine the Information Seeking Behavior (ISB) of Astronomy and Astrophysics users in India. The main objective of the study is to determine the sources consulted and the general pattern of the information-gathering system of users and the impact of Information and Communication Technology (ICT) on the Astronomy and Astrophysics user's Information Seeking Behavior. It examines various Information and Communication Technology-based resources and methods of access and use. A descriptive, stratified-sample method has been used and data was collected using a questionnaire as the main tool. The response rate was 72%. Descriptive statistics were also employed and data have been presented in tables and graphs. The study is supported by earlier studies. It shows that Astronomy and Astrophysics users have developed a unique Information Seeking Behavior to carry out their education and research. The vast majority of respondents reported that more information is available from a variety of e-resources. Consequently, they are able to devote more time to seek out relevant information in the current Information and Communication Technology scenario. The study also indicates that respondents use a variety of information resources including e-resources for teaching and research. Books and online databases such as the NASA Astrophysics Data System (ADS) were considered more important as formal sources of information. E-mail and face-to-face communications are used extensively by users as informal sources of information. It also reveals that despite the presence of electronic sources, Astronomy and Astrophysics users are still using printed materials. This study should help to improve various Information and Communication Technology-based services. It also suggests that GOI should adopt Information and Communication Technology-based Information Centers and Libraries services and recommends a network-based model for Astronomy and Astrophysics users.

  7. Online Tools for Astronomy and Cosmochemistry

    NASA Technical Reports Server (NTRS)

    Meyer, B. S.

    2005-01-01

    Over the past year, the Webnucleo Group at Clemson University has been developing a web site with a number of interactive online tools for astronomy and cosmochemistry applications. The site uses SHP (Simplified Hypertext Preprocessor), which, because of its flexibility, allows us to embed almost any computer language into our web pages. For a description of SHP, please see http://www.joeldenny.com/ At our web site, an internet user may mine large and complex data sets, such as our stellar evolution models, and make graphs or tables of the results. The user may also run some of our detailed nuclear physics and astrophysics codes, such as our nuclear statistical equilibrium code, which is written in fortran and C. Again, the user may make graphs and tables and download the results.

  8. Updates to watershed modeling in the Potholes Reservoir basin, Washington-a supplement to Scientific Investigation Report 2009-5081

    USGS Publications Warehouse

    Mastin, Mark

    2012-01-01

    A previous collaborative effort between the U.S. Geological Survey and the Bureau of Reclamation resulted in a watershed model for four watersheds that discharge into Potholes Reservoir, Washington. Since the model was constructed, two new meteorological sites have been established that provide more reliable real-time information. The Bureau of Reclamation was interested in incorporating this new information into the existing watershed model developed in 2009, and adding measured snowpack information to update simulated results and to improve forecasts of runoff. This report includes descriptions of procedures to aid a user in making model runs, including a description of the Object User Interface for the watershed model with details on specific keystrokes to generate model runs for the contributing basins. A new real-time, data-gathering computer program automates the creation of the model input files and includes the new meteorological sites. The 2009 watershed model was updated with the new sites and validated by comparing simulated results to measured data. As in the previous study, the updated model (2012 model) does a poor job of simulating individual storms, but a reasonably good job of simulating seasonal runoff volumes. At three streamflow-gaging stations, the January 1 to June 30 retrospective forecasts of runoff volume for years 2010 and 2011 were within 40 percent of the measured runoff volume for five of the six comparisons, ranging from -39.4 to 60.3 percent difference. A procedure for collecting measured snowpack data and using the data in the watershed model for forecast model runs, based on the Ensemble Streamflow Prediction method, is described, with an example that uses 2004 snow-survey data.

  9. 75 FR 63253 - State-56, Network User Account Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ... DEPARTMENT OF STATE [Public Notice 7210] State-56, Network User Account Records SUMMARY: Notice is hereby given that the Department of State proposes to create a system of records, Network User Account... named ``Network User Account Records.'' It is also proposed that the new system description will be...

  10. User's manual for three dimensional FDTD version D code for scattering from frequency-dependent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code version D is a 3-D numerical electromagnetic scattering code based upon the finite difference time domain technique (FDTD). The manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction; description of the FDTD method; operation; resource requirements; version D code capabilities; a brief description of the default scattering geometry; a brief description of each subroutine; a description of the include file; a section briefly discussing Radar Cross Section computations; a section discussing some scattering results; a sample problem setup section; a new problem checklist; references and figure titles. The FDTD technique models transient electromagnetic scattering and interactions with objects of arbitrary shape and/or material composition. In the FDTD method, Maxwell's curl equations are discretized in time-space and all derivatives (temporal and spatial) are approximated by central differences.
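
    The central-difference discretization described above is easiest to see in one dimension. The sketch below is a toy free-space 1-D FDTD loop, not the Penn State 3-D scattering code; grid size, time step, and source were chosen only to illustrate the staggered E/H update.

```python
# Minimal 1-D sketch of the FDTD idea described above: Maxwell's curl
# equations discretized with central differences on a staggered grid.
# Free-space toy example; not the Penn State 3-D code.
import numpy as np

c0, dz = 3.0e8, 1.0e-3
dt = dz / (2 * c0)                      # satisfies the 1-D Courant stability condition
mu0, eps0 = 4e-7 * np.pi, 8.854e-12
nz, nt = 400, 800
Ex = np.zeros(nz)
Hy = np.zeros(nz)

for n in range(nt):
    # update H from the spatial derivative of E (central differences)
    Hy[:-1] += (dt / (mu0 * dz)) * (Ex[1:] - Ex[:-1])
    # update E from the spatial derivative of H
    Ex[1:] += (dt / (eps0 * dz)) * (Hy[1:] - Hy[:-1])
    # soft Gaussian source injected at the middle of the grid
    Ex[nz // 2] += np.exp(-((n - 60) / 20.0) ** 2)

print("peak |Ex| after", nt, "steps:", np.abs(Ex).max())
```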

  11. MAPA: Implementation of the Standard Interchange Format and use for analyzing lattices

    NASA Astrophysics Data System (ADS)

    Shasharina, Svetlana G.; Cary, John R.

    1997-05-01

    MAPA (Modular Accelerator Physics Analysis) is an object oriented application for accelerator design and analysis with a Motif based graphical user interface. MAPA has been ported to AIX, Linux, HPUX, Solaris, and IRIX. MAPA provides an intuitive environment for accelerator study and design. The user can bring up windows for fully nonlinear analysis of accelerator lattices in any number of dimensions. The current graphical analysis methods of Lifetime plots and Surfaces of Section have been used to analyze the improved lattice designs of Wan, Cary, and Shasharina (this conference). MAPA can now read and write Standard Interchange Format (MAD) accelerator description files and it has a general graphical user interface for adding, changing, and deleting elements. MAPA's consistency checks prevent deletion of used elements and prevent creation of recursive beam lines. Plans include development of a richer set of modeling tools and the ability to invoke existing modeling codes through the MAPA interface. MAPA will be demonstrated on a Pentium 150 laptop running Linux.

  12. [Assessment levels of the user's satisfaction in a private hospital].

    PubMed

    da Cruz, Wilma Batista Souza; Melleiro, Marta Maria

    2010-03-01

    The objective of this study was to analyze the satisfaction of the users of a private hospital with a number of attributes of the services in its units. This exploratory, descriptive study used a quantitative approach and was developed in a private hospital in the city of São Paulo. The sample consisted of 71 users, and data collection was performed from March to August 2007 using a scale derived from the Service Quality (SERVQUAL) evaluative model of Parasuraman et al. The level of overall satisfaction was around 95%. Assurance (96%) and reliability (96%) were considered the most important dimensions of quality, followed by empathy (95%), responsiveness (93%), and tangibility (88%). The medical and nursing staffs received high levels of satisfaction, and 91% of users expressed an intention to recommend the hospital. This research identified the attributes most important to user satisfaction and contributed to confirming or reshaping care and management processes.

  13. DYGABCD: A program for calculating linear A, B, C, and D matrices from a nonlinear dynamic engine simulation

    NASA Technical Reports Server (NTRS)

    Geyser, L. C.

    1978-01-01

    A digital computer program, DYGABCD, was developed that generates linearized, dynamic models of simulated turbofan and turbojet engines. DYGABCD is based on an earlier computer program, DYNGEN, that is capable of calculating simulated nonlinear steady-state and transient performance of one- and two-spool turbojet engines or two- and three-spool turbofan engines. Most control design techniques require linear system descriptions. For multiple-input/multiple-output systems such as turbine engines, state space matrix descriptions of the system are often desirable. DYGABCD computes the state space matrices commonly referred to as the A, B, C, and D matrices required for a linear system description. The report discusses the analytical approach and provides a user's manual, FORTRAN listings, and a sample case.
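
    One common way to obtain A, B, C, and D matrices from a nonlinear simulation is to perturb each state and input about an operating point and difference the results. The sketch below does this by central differences on an invented two-state toy model; it illustrates the concept only and is not DYNGEN or DYGABCD's actual procedure.

```python
# Illustrative sketch of numerical linearization: build state-space matrices
# A, B, C, D by perturbing states and inputs of a nonlinear model about an
# operating point. The toy model stands in for the engine simulation.
import numpy as np

def f(x, u):            # state derivatives dx/dt = f(x, u)  (toy nonlinear model)
    return np.array([-x[0] ** 2 + u[0], x[0] - 0.5 * x[1] + 2.0 * u[1]])

def g(x, u):            # outputs y = g(x, u)
    return np.array([x[0] + x[1], x[1] * u[0]])

def linearize(f, g, x0, u0, eps=1e-6):
    n, m, p = len(x0), len(u0), len(g(x0, u0))
    A = np.zeros((n, n)); B = np.zeros((n, m))
    C = np.zeros((p, n)); D = np.zeros((p, m))
    for j in range(n):                          # perturb each state
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
        C[:, j] = (g(x0 + dx, u0) - g(x0 - dx, u0)) / (2 * eps)
    for j in range(m):                          # perturb each input
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
        D[:, j] = (g(x0, u0 + du) - g(x0, u0 - du)) / (2 * eps)
    return A, B, C, D

A, B, C, D = linearize(f, g, x0=np.array([1.0, 0.5]), u0=np.array([1.0, 0.2]))
print(A, B, C, D, sep="\n")
```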

  14. Semantic message oriented middleware for publish/subscribe networks

    NASA Astrophysics Data System (ADS)

    Li, Han; Jiang, Guofei

    2004-09-01

    The publish/subscribe paradigm of Message Oriented Middleware provides a loosely coupled communication model between distributed applications. Traditional publish/subscribe middleware uses keywords to match advertisements and subscriptions and does not support deep semantic matching. To this end, we designed and implemented a Semantic Message Oriented Middleware system to provide such capabilities for semantic description and matching. We adopted the DARPA Agent Markup Language and Ontology Inference Layer, a formal knowledge representation language for expressing sophisticated classifications and enabling automated inference, as the topic description language in our middleware system. A simple description logic inference system was implemented to handle the matching process between the subscriptions of subscribers and the advertisements of publishers. Moreover our middleware system also has a security architecture to support secure communication and user privilege control.

  15. IWR-MAIN Water Use Forecasting System. Version 5.1. User’s Manual and System Description

    DTIC Science & Technology

    1987-12-01

    Contents and screen fragments: Crosschecks for Input Data; Organization of the IWR-MAIN System; Example of Econometric Demand Model; Example of Unit Use Coefficients... Unaccounted Loss and free service (entry does not affect default calculations); Conservation Data; City Name: Test City USA; F1-Help, F2-return to menu, F4-...; socioeconomic data. Internal Growth Models: The IWR-MAIN program contains a subroutine called GROWTH which uses econometric growth models based on ...

  16. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    PubMed Central

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined. PMID:22172142

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rybicki, E.F.; Luiskutty, C.T.; Sutrick, J.S.

    This User's Manual contains information for four fracture/proppant models. TUPROP1 contains a Geertsma and de Klerk type fracture model. The section of the program utilizing the proppant fracture geometry data from the pseudo three-dimensional highly elongated fracture model is called TUPROPC. The analogous proppant section of the program that was modified to accept fracture shape data from SA3DFRAC is called TUPROPS. TUPROPS also includes fracture closure. Finally there is the penny fracture and its proppant model, PENNPROP. In the first three chapters, the proppant sections are based on the same theory for determining the proppant distribution but have modifications to support variable height fractures and modifications to accept fracture geometry from three different fracture models. Thus, information about each proppant model in the User's Manual builds on information supplied in the previous chapter. The exception to the development of combined treatment models is the penny fracture and its proppant model. In this case, a completely new proppant model was developed. A description of how to use the combined treatment model for the penny fracture is contained in Chapter 4. 2 refs.

  18. Formal analysis and automatic generation of user interfaces: approach, methodology, and an algorithm.

    PubMed

    Heymann, Michael; Degani, Asaf

    2007-04-01

    We present a formal approach and methodology for the analysis and generation of user interfaces, with special emphasis on human-automation interaction. A conceptual approach for modeling, analyzing, and verifying the information content of user interfaces is discussed. The proposed methodology is based on two criteria: First, the interface must be correct--that is, given the interface indications and all related information (user manuals, training material, etc.), the user must be able to successfully perform the specified tasks. Second, the interface and related information must be succinct--that is, the amount of information (mode indications, mode buttons, parameter settings, etc.) presented to the user must be reduced (abstracted) to the minimum necessary. A step-by-step procedure for generating the information content of the interface that is both correct and succinct is presented and then explained and illustrated via two examples. Every user interface is an abstract description of the underlying system. The correspondence between the abstracted information presented to the user and the underlying behavior of a given machine can be analyzed and addressed formally. The procedure for generating the information content of user interfaces can be automated, and a software tool for its implementation has been developed. Potential application areas include adaptive interface systems and customized/personalized interfaces.

  19. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 2: Program descriptions

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The structure and functions of each reporting software program for the Software Engineering Laboratory data base are described. Baseline diagrams, module descriptions, and listings of program generation files are included.

  20. Towards health care process description framework: an XML DTD design.

    PubMed Central

    Staccini, P.; Joubert, M.; Quaranta, J. F.; Aymard, S.; Fieschi, D.; Fieschi, M.

    2001-01-01

    The development of health care and hospital information systems has to meet users' needs as well as requirements such as the tracking of all care activities and the support of quality improvement. The use of process-oriented analysis is of value to provide analysts with: (i) a systematic description of activities; (ii) the elicitation of the useful data to perform and record care tasks; (iii) the selection of relevant decision-making support. But paper-based tools are not a very suitable way to manage and share the documentation produced during this step. The purpose of this work is to propose a method to implement the results of process analysis according to XML techniques (eXtensible Markup Language). It is based on the IDEF0 activity modeling language (Integration DEfinition for Function modeling). A hierarchical description of a process and its components has been defined through a flat XML file with a grammar of proper metadata tags. Perspectives of this method are discussed. PMID:11825265

  1. Solid Modeling Aerospace Research Tool (SMART) user's guide, version 2.0

    NASA Technical Reports Server (NTRS)

    Mcmillin, Mark L.; Spangler, Jan L.; Dahmen, Stephen M.; Rehder, John J.

    1993-01-01

    The Solid Modeling Aerospace Research Tool (SMART) software package is used in the conceptual design of aerospace vehicles. It provides a highly interactive and dynamic capability for generating geometries with Bezier cubic patches. Features include automatic generation of commonly used aerospace constructs (e.g., wings and multilobed tanks); cross-section skinning; wireframe and shaded presentation; area, volume, inertia, and center-of-gravity calculations; and interfaces to various aerodynamic and structural analysis programs. A comprehensive description of SMART and how to use it is provided.

  2. A 3D Model Based Indoor Navigation System for Hubei Provincial Museum

    NASA Astrophysics Data System (ADS)

    Xu, W.; Kruminaite, M.; Onrust, B.; Liu, H.; Xiong, Q.; Zlatanova, S.

    2013-11-01

    3D models are more powerful than 2D maps for indoor navigation in a complicated space like the Hubei Provincial Museum because they can provide accurate descriptions of the locations of indoor objects (e.g., doors, windows, tables) and context information about these objects. In addition, according to the survey, the 3D model is the navigation environment preferred by users. Therefore a 3D model based indoor navigation system is developed for Hubei Provincial Museum to guide its visitors. The system consists of three layers: application, web service and navigation, which are built to support the localization, navigation and visualization functions of the system. There are three main strengths of this system: it stores all the data needed in one database and performs most calculations on the web server, which keeps the mobile client very lightweight; the network used for navigation is extracted semi-automatically and is renewable; and the graphical user interface (GUI), which is based on a game engine, visualizes the 3D model efficiently on a mobile display.

  3. DEVA: An extensible ontology-based annotation model for visual document collections

    NASA Astrophysics Data System (ADS)

    Jelmini, Carlo; Marchand-Maillet, Stephane

    2003-01-01

    The description of visual documents is a fundamental aspect of any efficient information management system, but the process of manually annotating large collections of documents is tedious and far from perfect. The need for a generic and extensible annotation model therefore arises. In this paper, we present DEVA, an open, generic and expressive multimedia annotation framework. DEVA is an extension of the Dublin Core specification. The model can represent the semantic content of any visual document. It is described in the ontology language DAML+OIL and can easily be extended with external specialized ontologies, adapting the vocabulary to the given application domain. In parallel, we present the Magritte annotation tool, an early prototype that validates the DEVA features. Magritte allows users to manually annotate image collections. It is designed with a modular and extensible architecture, which enables the user to dynamically adapt the user interface to specialized ontologies merged into DEVA.

  4. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3).

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; König, Matthias; Moraru, Ion; Nickerson, David; Le Novère, Nicolas; Olivier, Brett G; Sahle, Sven; Smith, Lucian; Waltemath, Dagmar

    2018-03-19

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.

  5. The computational structural mechanics testbed procedures manual

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1991-01-01

    The purpose of this manual is to document the standard high-level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure, including its function, commands, data interface, and use, is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis, and is intended to be used in conjunction with the CSM Testbed User's Manual and the CSM Testbed Data Library Description.

  6. Self-descriptions on LinkedIn: Recruitment or friendship identity?

    PubMed

    Garcia, Danilo; Cloninger, Kevin M; Granjard, Alexandre; Molander-Söderholm, Kristian; Amato, Clara; Sikström, Sverker

    2018-04-26

    We used quantitative semantics to find clusters of words in LinkedIn users' self-descriptions to an employer or a friend. Some of these clusters discriminated between worker and friend conditions (e.g., flexible vs. caring) and between LinkedIn users with high and low education (e.g., analytical vs. messy). © 2018 The Institute of Psychology, Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.
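
    The clustering step can be sketched generically: short self-description texts are turned into bag-of-words vectors and grouped with k-means. The example texts, the vectorizer, and the cluster count below are invented for illustration; the study's actual quantitative-semantics pipeline is not reproduced here.

      # Rough sketch of word-based clustering of self-descriptions.
      # Texts and cluster count are hypothetical placeholders.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.cluster import KMeans

      descriptions = [
          "flexible analytical problem solver",        # hypothetical "employer" texts
          "driven results oriented and analytical",
          "caring loyal friend who loves travel",      # hypothetical "friend" texts
          "warm caring and a little messy at home",
      ]

      X = CountVectorizer().fit_transform(descriptions)     # bag-of-words vectors
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      print(labels)   # e.g. [0 0 1 1] -- which descriptions group together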

  7. Gun Control for VBE-E: User Guide and Technical Description

    DTIC Science & Technology

    2006-11-01

    Gun Control for VBE-E: User Guide and Technical Description. Tania E. Wentzell, Defence R&D Canada – Atlantic, Technical Memorandum DRDC Atlantic TM 2006-245, November 2006. ...component of the distributed experimentation environment used by the Virtual Combat System (VCS) Group at Defence R&D Canada – Atlantic (DRDC Atlantic).

  8. Social Anxiety and Cannabis-Related Impairment: The Synergistic Influences of Peer and Parent Descriptive and Injunctive Normative Perceptions.

    PubMed

    Foster, Dawn W; Garey, Lorra; Buckner, Julia D; Zvolensky, Michael J

    2016-06-06

    Cannabis users, especially socially anxious cannabis users, are influenced by perceptions of others' use. The present study tested whether social anxiety interacted with perceptions about peer and parent beliefs to predict cannabis-related problems. Participants were 148 (36.5% female, 60.1% non-Hispanic Caucasian) current cannabis users aged 18-36 (M = 21.01, SD = 3.09) who completed measures of perceived descriptive and injunctive norms, social anxiety, and cannabis use behaviors. Hierarchical multiple regressions were employed to investigate the predictive value of the social anxiety X parent injunctive norms X peer norms interaction terms on cannabis use behaviors. Higher social anxiety was associated with more cannabis problems. A three-way interaction emerged between social anxiety, parent injunctive norms, and peer descriptive norms, with respect to cannabis problems. Social anxiety was positively related to more cannabis problems when parent injunctive norms were high (i.e., perceived approval) and peer descriptive norms were low. Results further showed that social anxiety was positively related to more cannabis problems regardless of parent injunctive norms. The present work suggests that it may be important to account for parent influences when addressing normative perceptions among young adult cannabis users. Additional research is needed to determine whether interventions incorporating feedback regarding parent norms impact cannabis use frequency and problems.
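
    The regression step described above can be sketched schematically with statsmodels: a formula using the '*' operator expands to all main effects plus the two- and three-way interaction terms. The variable names and the tiny data frame below are placeholders, not the study's data or analysis script.

      # Schematic three-way interaction regression (not the study's script).
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.DataFrame({   # hypothetical standardized scores
          "problems":   [2, 5, 1, 7, 3, 6, 4, 8, 2, 6, 3, 7],
          "soc_anx":    [-1.0, 1.2, -0.8, 1.5, 0.1, 0.9, -0.2, 1.8, -1.3, 0.7, -0.5, 1.4],
          "parent_inj": [0.5, 1.0, -1.2, 0.8, -0.3, 1.1, 0.0, 0.9, -0.7, 1.3, 0.2, 0.6],
          "peer_desc":  [-0.4, -1.0, 0.2, -1.3, 0.6, -0.9, 0.1, -1.5, 0.8, -0.6, 0.3, -1.1],
      })

      # '*' expands to all main effects and interactions, including the
      # three-way soc_anx:parent_inj:peer_desc product term.
      fit = smf.ols("problems ~ soc_anx * parent_inj * peer_desc", data=df).fit()
      print(fit.summary())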

  9. Understanding Calculus beyond Computations: A Descriptive Study of the Parallel Meanings and Expectations of Teachers and Users of Calculus

    ERIC Educational Resources Information Center

    Ferguson, Leann J.

    2012-01-01

    Calculus is an important tool for building mathematical models of the world around us and is thus used in a variety of disciplines, such as physics and engineering. These disciplines rely on calculus courses to provide the mathematical foundation needed for success in their courses. Unfortunately, due to the basal conceptions of what it means to…

  10. User's manual for the time-dependent INERTIA code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, A.W.; Bennett, R.B.

    1985-01-01

    The time-dependent INERTIA code is described. This code models the effects of neutral beam momentum input in tokamaks as predicted by the time-dependent formulation of the Stacey-Sigmar formalism. The operation and architecture of the code are described, as are the supplementary plotting and impurity line radiation routines. A short description of the steady-state version of the INERTIA code is also provided.

  11. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes, and many have now been ported to microcomputers. Graphic and animation capabilities were added to many of these languages to assist users in building models and evaluating the simulation results. With all these languages and added features, the user is still plagued with learning the simulation language. Furthermore, the time to construct and then to validate the simulation model is always greater than originally anticipated. One approach to minimize the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.

  12. CERENA: ChEmical REaction Network Analyzer--A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics.

    PubMed

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed, however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduced CERENA, a toolbox for the analysis of stochastic chemical kinetics using Approximations of the Chemical Master Equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/.
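
    CERENA itself is a MATLAB toolbox; purely to illustrate the "stochastic simulation algorithm" class of microscopic descriptions it implements, the following is a generic Gillespie direct-method sketch in Python for an arbitrary birth-death process. The rate constants and reaction system are invented and are not taken from CERENA.

      # Generic Gillespie (direct method) sketch for a birth-death process.
      # Not CERENA code; rates are arbitrary examples.
      import numpy as np

      rng = np.random.default_rng(0)

      def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=50.0):
          t, x = 0.0, x0
          times, states = [t], [x]
          while t < t_end:
              props = np.array([k_birth, k_death * x])   # reaction propensities
              a0 = props.sum()
              if a0 == 0:
                  break
              t += rng.exponential(1.0 / a0)              # time to next reaction
              if rng.random() < props[0] / a0:            # choose which reaction fires
                  x += 1                                   # birth
              else:
                  x -= 1                                   # death
              times.append(t)
              states.append(x)
          return np.array(times), np.array(states)

      t, x = gillespie_birth_death()
      print(x[-1])   # copy number at the end of one stochastic trajectory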

  13. CERENA: ChEmical REaction Network Analyzer—A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics

    PubMed Central

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J.; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed, however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduced CERENA, a toolbox for the analysis of stochastic chemical kinetics using Approximations of the Chemical Master Equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/. PMID:26807911

  14. A novel gene network inference algorithm using predictive minimum description length approach.

    PubMed

    Chaitankar, Vijender; Ghosh, Preetam; Perkins, Edward J; Gong, Ping; Deng, Youping; Zhang, Chaoyang

    2010-05-28

    Reverse engineering of gene regulatory networks using information theory models has received much attention due to its simplicity, low computational cost, and capability of inferring large networks. One of the major problems with information theory models is to determine the threshold which defines the regulatory relationships between genes. The minimum description length (MDL) principle has been implemented to overcome this problem. The description length of the MDL principle is the sum of model length and data encoding length. A user-specified fine-tuning parameter is used as a control mechanism between model and data encoding, but it is difficult to find the optimal parameter. In this work, we proposed a new inference algorithm which incorporated mutual information (MI), conditional mutual information (CMI) and the predictive minimum description length (PMDL) principle to infer gene regulatory networks from DNA microarray data. In this algorithm, the information theoretic quantities MI and CMI determine the regulatory relationships between genes and the PMDL principle method attempts to determine the best MI threshold without the need of a user-specified fine-tuning parameter. The performance of the proposed algorithm was evaluated using both synthetic time series data sets and a biological time series data set for the yeast Saccharomyces cerevisiae. The benchmark quantities precision and recall were used as performance measures. The results show that the proposed algorithm produced fewer false edges and significantly improved the precision, as compared to the existing algorithm. For further analysis, the performance of the algorithms was observed over different sizes of data. We have proposed a new algorithm that implements the PMDL principle for inferring gene regulatory networks from time series DNA microarray data and that eliminates the need of a fine-tuning parameter. The evaluation results obtained from both synthetic and actual biological data sets show that the PMDL principle is effective in determining the MI threshold and the developed algorithm improves the precision of gene regulatory network inference. Based on the sensitivity analysis of all tested cases, an optimal CMI threshold value has been identified. Finally, it was observed that the performance of the algorithms saturates at a certain threshold of data size.
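
    The information-theoretic building block can be sketched independently of the paper's algorithm: mutual information between two expression profiles estimated from a joint histogram. The example vectors, bin count, and correlation strength below are arbitrary, and the PMDL thresholding and CMI steps are not reproduced.

      # Histogram-based mutual information estimate between two expression profiles.
      # Data and bin count are arbitrary; this is not the paper's estimator.
      import numpy as np

      def mutual_information(x, y, bins=8):
          joint, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0                                   # avoid log(0)
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      rng = np.random.default_rng(1)
      g1 = rng.normal(size=200)
      g2 = 0.8 * g1 + 0.2 * rng.normal(size=200)         # correlated "target" gene
      g3 = rng.normal(size=200)                          # unrelated gene
      print(mutual_information(g1, g2), mutual_information(g1, g3))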

  15. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  16. Evaluating the effectiveness of clinical medical librarian programs: a systematic review of the literature*

    PubMed Central

    Wagner, Kay Cimpl; Byrd, Gary D.

    2004-01-01

    Objective: This study was undertaken to determine if a systematic review of the evidence from thirty years of literature evaluating clinical medical librarian (CML) programs could help clarify the effectiveness of this outreach service model. Methods: A descriptive review of the CML literature describes the general characteristics of these services as they have been implemented, primarily in teaching-hospital settings. Comprehensive searches for CML studies using quantitative or qualitative evaluation methods were conducted in the medical, allied health, librarianship, and social sciences literature. Findings: Thirty-five studies published between 1974 and 2001 met the review criteria. Most (30) evaluated single, active programs and used descriptive research methods (e.g., use statistics or surveys/questionnaires). A weighted average of 89% of users in twelve studies found CML services useful and of high quality, and 65% of users in another overlapping, but not identical, twelve studies said these services contributed to improved patient care. Conclusions: The total amount of research evidence for CML program effectiveness is not great and most of it is descriptive rather than comparative or analytically qualitative. Standards are needed to consistently evaluate CML or informationist programs in the future. A carefully structured multiprogram study including three to five of the best current programs is needed to define the true value of these services. PMID:14762460

  17. Propagation model for the Land Mobile Satellite channel in urban environments

    NASA Technical Reports Server (NTRS)

    Sforza, M.; Dibernardo, G.; Cioni, R.

    1993-01-01

    This paper presents the major characteristics of a simulation package capable of performing a complete narrow and wideband analysis of the mobile satellite communication channel in urban environments for any given orbital configuration. The wavelength-to-average urban geometrical dimension ratio has required the use of the Geometrical Theory of Diffraction (GTD). The model has been designed for the RF frequency range 1 up to 60 GHz and extended to include effects of non-perfect conductivity and surface roughness. Taking advantage of the inherent capabilities of such a high frequency method, we are able to provide a complete description of the electromagnetic field at the mobile terminal. Using the information made available at the ray-tracer and GTD solver outputs, the Land Mobile Satellite (LMS) urban model can also give a detailed description of the communication channel in terms of power delay profiles, Doppler spectra, channel scattering functions, and so forth. Statistical data, e.g. cumulative distribution functions, level crossing rates or distributions of fades, are also provided. The user can access the simulation tool through a user-friendly Design-CAD interface by means of which she can effectively design her own urban layout and consequently run all the envisaged routines. The software is optimized in its execution time so that numerous runs can be achieved in a short time.
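
    One of the channel descriptors named above, the power delay profile, reduces to simple textbook arithmetic once echo delays and powers are known. The numbers below are invented, not outputs of the LMS simulation package.

      # Mean excess delay and RMS delay spread from a (hypothetical) power delay profile.
      import numpy as np

      delays_ns = np.array([0.0, 120.0, 310.0, 640.0])     # hypothetical echo delays
      powers_db = np.array([0.0, -6.0, -11.0, -18.0])      # hypothetical echo powers

      p = 10 ** (powers_db / 10.0)                          # linear power
      mean_delay = np.sum(p * delays_ns) / np.sum(p)
      rms_spread = np.sqrt(np.sum(p * (delays_ns - mean_delay) ** 2) / np.sum(p))
      print(f"mean excess delay {mean_delay:.1f} ns, RMS delay spread {rms_spread:.1f} ns")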

  18. LTCP 2D Graphical User Interface. Application Description and User's Guide

    NASA Technical Reports Server (NTRS)

    Ball, Robert; Navaz, Homayun K.

    1996-01-01

    A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) 2 dimensional computational fluid dynamic code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.

  19. Factors influencing nurses' acceptance of hospital information systems in Iran: application of the Unified Theory of Acceptance and Use of Technology.

    PubMed

    Sharifian, Roxana; Askarian, Fatemeh; Nematolahi, Mohtaram; Farhadi, Payam

    User acceptance is a precondition for successful implementation of hospital information systems (HISs). Increasing investment in information technology by healthcare organisations internationally has made user acceptance an important issue in technology implementation and management. Despite the increased focus on hospital information systems, there continues to be user resistance. The present study aimed to investigate the factors affecting nurse-user acceptance of HISs, based on the Unified Theory of Acceptance and Use of Technology (UTAUT), in the Shiraz University of Medical Sciences teaching hospitals. A descriptive-analytical research design was employed to study nurses' adoption and use of HISs. Data collection was undertaken using a cross-sectional survey of nurses (n=303). The research model was examined using LISREL path confirmatory modeling. The results demonstrated that the nurses' behavioural intention (BI) to use hospital information systems was predicted by Performance Expectancy (PE) (β= 2.34, p<0.01), Effort Expectancy (EE) (β= 2.21, p<0.01), Social Influence (SI) (β= 2.63, p<0.01) and Facilitating Conditions (FC) (β= 2.84, p<0.01). The effects of these antecedents of BI explained 72.8% of the variance in nurses' intention to use hospital information systems (R2 = 0.728). Application of the research model suggested that nurses' acceptance of HISs was influenced by performance expectancy, effort expectancy, social influence and facilitating conditions, with performance expectancy having the strongest effect on user intention.

  20. C-statistic fitting routines: User's manual and reference guide

    NASA Technical Reports Server (NTRS)

    Nousek, John A.; Farwana, Vida

    1991-01-01

    The computer program discussed here can read several input files and provide a best-fit set of values for the functions provided by the user, using either the C-statistic or the chi-squared statistic method. The program consists of one main routine and several functions and subroutines. Detailed descriptions of each function and subroutine are presented. A brief description of the C-statistic and the reason for its application is also presented.
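
    As a rough illustration of the kind of fit such routines perform, the sketch below minimizes one common form of the Poisson-likelihood (Cash) statistic for binned counts with SciPy, using a single-parameter constant-rate model. The data and model are invented, and the program's own definition of the C-statistic should be taken from the manual itself.

      # Minimizing a Cash-type statistic, C = 2*sum(m - d*ln m), for a toy model.
      # Counts and the constant-rate model are hypothetical examples.
      import numpy as np
      from scipy.optimize import minimize

      counts = np.array([3, 5, 2, 4, 6, 3, 5, 4])             # hypothetical binned counts

      def cash(params):
          mu = np.full_like(counts, params[0], dtype=float)    # constant-rate model
          return 2.0 * np.sum(mu - counts * np.log(mu))

      res = minimize(cash, x0=[1.0], bounds=[(1e-6, None)])
      print(res.x[0])    # best-fit rate, close to the sample mean for this model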

  1. Knowledge acquisition and learning process description in context of e-learning

    NASA Astrophysics Data System (ADS)

    Kiselev, B. G.; Yakutenko, V. A.; Yuriev, M. A.

    2017-01-01

    This paper investigates the problem of the design of e-learning and MOOC systems. It describes instructional design-based approaches to e-learning systems design: IMS Learning Design, MISA and TELOS. To solve this problem we present the Knowledge Field of Educational Environment with Competence boundary conditions, an instructional engineering method for the design of self-learning systems. It is based on the simplified TELOS approach and enables a user to create their individual learning path by choosing prerequisite and target competencies. The paper provides the ontology model for the described instructional engineering method, real-life use cases and the classification of the presented model. The ontology model consists of 13 classes and 15 properties. Some of them are inherited from the Knowledge Field of Educational Environment and some are new and describe competence boundary conditions and knowledge validation objects. The ontology model uses logical constraints and is described using the OWL 2 standard. To give TELOS users a better understanding of our approach, we list the mapping between TELOS and KFEEC.

  2. Evaluation of the US Food and Drug Administration sentinel analysis tools in confirming previously observed drug-outcome associations: The case of clindamycin and Clostridium difficile infection.

    PubMed

    Carnahan, Ryan M; Kuntz, Jennifer L; Wang, Shirley V; Fuller, Candace; Gagne, Joshua J; Leonard, Charles E; Hennessy, Sean; Meyer, Tamra; Archdeacon, Patrick; Chen, Chih-Ying; Panozzo, Catherine A; Toh, Sengwee; Katcoff, Hannah; Woodworth, Tiffany; Iyer, Aarthi; Axtman, Sophia; Chrischilles, Elizabeth A

    2018-03-13

    The Food and Drug Administration's Sentinel System developed parameterized, reusable analytic programs for evaluation of medical product safety. Research on outpatient antibiotic exposures, and Clostridium difficile infection (CDI) with non-user reference groups led us to expect a higher rate of CDI among outpatient clindamycin users vs penicillin users. We evaluated the ability of the Cohort Identification and Descriptive Analysis and Propensity Score Matching tools to identify a higher rate of CDI among clindamycin users. We matched new users of outpatient dispensings of oral clindamycin or penicillin from 13 Data Partners 1:1 on propensity score and followed them for up to 60 days for development of CDI. We used Cox proportional hazards regression stratified by Data Partner and matched pair to compare CDI incidence. Propensity score models at 3 Data Partners had convergence warnings and a limited range of predicted values. We excluded these Data Partners despite adequate covariate balance after matching. From the 10 Data Partners where these models converged without warnings, we identified 807 919 new clindamycin users and 8 815 441 new penicillin users eligible for the analysis. The stratified analysis of 807 769 matched pairs included 840 events among clindamycin users and 290 among penicillin users (hazard ratio 2.90, 95% confidence interval 2.53, 3.31). This evaluation produced an expected result and identified several potential enhancements to the Propensity Score Matching tool. This study has important limitations. CDI risk may have been related to factors other than the inherent properties of the drugs, such as duration of use or subsequent exposures. Copyright © 2018 John Wiley & Sons, Ltd.
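
    The propensity-score matching step can be sketched generically: a logistic model of treatment assignment on covariates, followed by greedy 1:1 nearest-neighbor matching on the estimated score. Everything below is simulated and simplified; it is not the Sentinel Propensity Score Matching tool or its algorithm.

      # Schematic 1:1 propensity-score matching on simulated data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      n = 2000
      covariates = rng.normal(size=(n, 3))                # e.g. age, comorbidity, prior use
      logit = covariates @ np.array([0.5, -0.3, 0.2]) - 1.0
      treated = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # "clindamycin" vs "penicillin"

      ps = LogisticRegression().fit(covariates, treated).predict_proba(covariates)[:, 1]

      # Greedy nearest-neighbor matching of each treated subject to an unused control.
      treated_idx = np.where(treated == 1)[0]
      control_idx = list(np.where(treated == 0)[0])
      pairs = []
      for i in treated_idx:
          j = min(control_idx, key=lambda c: abs(ps[c] - ps[i]))
          pairs.append((i, j))
          control_idx.remove(j)
      print(len(pairs), "matched pairs")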

  3. User's manual for the CDC-1 digitizer controller

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferron, J.R.

    1994-09-01

    A detailed description of how to use the CDC-1 digitizer controller is given. The CDC-1 is used with the CAMAC format digitizer models in the TRAQ series (manufactured by DSP Technology Inc.), the DAD-1 data acquisition daughter board, and the Intel i860-based SuperCard-2 (manufactured by CSP Inc.) to form a high speed data acquisition and real time analysis system. Data can be transferred to the memory on the SuperCard-2 at a rate as high as 40 million 14-bit samples per second. Depending on the model of TRAQ digitizer in use, digitizing rates up to 3.33 MHz are supported (with eight data channels), or, for instance, at a sample rate of 100 kHz, 384 data channels can be acquired.

  4. Online shopping interface components: relative importance as peripheral and central cues.

    PubMed

    Warden, Clyde A; Wu, Wann-Yih; Tsai, Dungchun

    2006-06-01

    The Elaboration Likelihood Model (ELM) uses central (more thoughtful) and peripheral (less thoughtful) routes of persuasion to maximize communication effectiveness. This research implements ELM to investigate the relative importance of different aspects of the user experience in online shopping. Of all the issues surrounding online shopping, convenience, access to information, and trust were found to be the most important. These were implemented in an online conjoint shopping task. Respondents were found to use the central route of the ELM on marketing messages that involved issues of minimizing travel, information access, and assurances of system security. Users employed the peripheral ELM route when considering usability, price comparison, and personal information protection. A descriptive model of Web-based marketing components, their roles in the central and peripheral routes, and their relative importance to online consumer segments was developed.

  5. Time determination for spacecraft users of the Navstar Global Positioning System /GPS/

    NASA Technical Reports Server (NTRS)

    Grenchik, T. J.; Fang, B. T.

    1977-01-01

    Global Positioning System (GPS) navigation is performed by time measurements. A description is presented of a two-body model of spacecraft motion. Orbit determination is the process of inferring the position, velocity, and clock offset of the user from measurements made of the user motion in the Newtonian coordinate system. To illustrate the effect of clock errors and the accuracy with which the user spacecraft time and orbit may be determined, a low-earth-orbit spacecraft (Seasat) tracked by six Phase I GPS space vehicles is considered. The results indicate that, in the absence of unmodeled dynamic parameter errors, clock biases may be determined to the nanosecond level. There is, however, a high correlation between the clock bias and the uncertainty in the gravitational parameter GM, i.e., the product of the universal gravitational constant and the total mass of the earth. It is, therefore, not possible to determine the clock bias to better than 25 nanosecond accuracy in the presence of a gravitational error of one part per million.
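
    A toy numeric illustration of the time-measurement idea: with the user position held fixed and all other error sources ignored, each pseudorange exceeds the geometric range by the speed of light times the receiver clock offset, so the offset falls out of a simple average. The positions and the 25 ns bias below are arbitrary numbers, not a real GPS geometry.

      # Toy receiver clock-bias estimate from simulated pseudoranges.
      import numpy as np

      C = 299_792_458.0                                    # speed of light, m/s
      user = np.array([6_378_000.0, 0.0, 0.0])             # hypothetical user position (m)
      sats = np.array([[15_000_000.0, 10_000_000.0, 18_000_000.0],
                       [-12_000_000.0, 20_000_000.0, 9_000_000.0],
                       [5_000_000.0, -22_000_000.0, 14_000_000.0]])

      true_bias = 25e-9                                    # 25 ns receiver clock offset
      ranges = np.linalg.norm(sats - user, axis=1)
      pseudoranges = ranges + C * true_bias                # simulated measurements

      est_bias = np.mean(pseudoranges - ranges) / C        # average residual / c
      print(f"estimated clock bias: {est_bias * 1e9:.1f} ns")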

  6. Computer program BL2D for solving two-dimensional and axisymmetric boundary layers

    NASA Technical Reports Server (NTRS)

    Iyer, Venkit

    1995-01-01

    This report presents the formulation, validation, and user's manual for the computer program BL2D. The program is a fourth-order-accurate solution scheme for solving two-dimensional or axisymmetric boundary layers in speed regimes that range from low subsonic to hypersonic Mach numbers. A basic implementation of the transition zone and turbulence modeling is also included. The code is a result of many improvements made to the program VGBLP, which is described in NASA TM-83207 (February 1982), and can effectively supersede it. The code BL2D is designed to be modular, user-friendly, and portable to any machine with a standard fortran77 compiler. The report contains the new formulation adopted and the details of its implementation. Five validation cases are presented. A detailed user's manual with the input format description and instructions for running the code is included. Adequate information is presented in the report to enable the user to modify or customize the code for specific applications.

  7. Applications Technology Satellite and Communications Technology Satellite user experiments for 1967 - 1980 reference book, volume 1

    NASA Technical Reports Server (NTRS)

    Engler, N. A.; Nash, J. F.; Strange, J. D.

    1980-01-01

    A description of each of the satellites is given and a brief summary of each user experiment is presented. A Cross Index of User Experiments sorted by various parameters and a listing of keywords versus Experiment Number are presented.

  8. National Solar Radiation Database 1991-2010 Update: User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilcox, S. M.

    This user's manual provides information on the updated 1991-2010 National Solar Radiation Database. Included are data format descriptions, data sources, production processes, and information about data uncertainty.

  9. Modelling Psychological Needs for User-dependent Contextual Suggestion

    DTIC Science & Technology

    2014-11-01

    neighborhood, night club, park, place of worship, restaurant, RV park, shopping mall, stadium, synagogue, university and zoo. The search radius is set...them do not differ much from each other. For example, the desired description texts for zoos and aquariums may not differ significantly, and similarly...district, place of worship, library as another group, and park, zoo, aquarium, natural reserve as yet another group. We have manually distributed

  10. Arc program documentation

    NASA Technical Reports Server (NTRS)

    Mcmillan, J. D.

    1976-01-01

    A description of the input and output files and the data control cards for the altimeter residual computation (ARC) computer program is given. The program acts as the final altimeter preprocessor before the data is reformatted for external users. It calculates all parameters necessary for the computation of the altimeter observation residuals and the sea surface height. Mathematical models used for calculating tropospheric refraction, geoid height, tide height, ephemeris, and orbit geometry are described.

  11. MuSim, a Graphical User Interface for Multiple Simulation Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Thomas; Cummings, Mary Anne; Johnson, Rolland

    2016-06-01

    MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parametrized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Simulation codes: G4beamline, MAD-X, and MCNP; more coming. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all, and permitting the construction and exploration of models with very little overhead. For today's technology-driven students, graphical interfaces meet their expectations far better than text-based tools, and education in accelerator physics is one of our primary goals.

  12. Microcomputer pollution model for civilian airports and Air Force Bases. Model application and background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segal, H.M.

    1988-08-01

    This is one of three reports describing the Emissions and Dispersion Modeling System (EDMS). All reports use the same main title--A MICROCOMPUTER MODEL FOR CIVILIAN AIRPORTS AND AIR FORCE BASES--but different subtitles. The subtitles are: (1) USER'S GUIDE - ISSUE 2 (FAA-EE-88-3/ESL-TR-88-54); (2) MODEL DESCRIPTION (FAA-EE-88-4/ESL-TR-88-53); (3) MODEL APPLICATION AND BACKGROUND (FAA-EE-88-5/ESL-TR-88-55). The first and second reports above describe the EDMS model and provide instructions for its use. This is the third report. It consists of an accumulation of five key documents describing the development and use of the EDMS model. This report is prepared in accordance with discussions with the EPA and requirements outlined in the March 27, 1980 Federal Register for submitting air-quality models to the EPA. Contents: Model Development and Use - Its Chronology and Reports; Monitoring Concorde Emissions; The Influence of Aircraft Operations on Air Quality at Airports; Simplex A - A Simplified Atmospheric Dispersion Model for Airport Use (User's Guide); Microcomputer Graphics in Atmospheric Dispersion Modeling; Pollution from Motor Vehicles and Aircraft at Stapleton International Airport (Abbreviated Report).

  13. MODPATH-LGR; documentation of a computer program for particle tracking in shared-node locally refined grids by using MODFLOW-LGR

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.

    2011-01-01

    The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for downloading from the World Wide Web from a U.S. Geological Survey software repository. The repository is accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.

  14. Routine High-Resolution Forecasts/Analyses for the Pacific Disaster Center: User Manual

    NASA Technical Reports Server (NTRS)

    Roads, John; Han, J.; Chen, S.; Burgan, R.; Fujioka, F.; Stevens, D.; Funayama, D.; Chambers, C.; Bingaman, B.; McCord, C.

    2001-01-01

    Enclosed herein is our HWCMO user manual. This manual constitutes the final report for our NASA/PDC grant, NASA NAG5-8730, "Routine High Resolution Forecasts/Analysis for the Pacific Disaster Center". Since the beginning of the grant, we have routinely provided experimental high resolution forecasts from the RSM/MSM for the Hawaii Islands, while working to upgrade the system to include: (1) a more robust input of NCEP analyses directly from NCEP; (2) higher vertical resolution, with increased forecast accuracy; (3) faster delivery of forecast products and extension of initial 1-day forecasts to 2 days; (4) augmentation of our basic meteorological and simplified fire weather forecasts to fire danger and drought forecasts; (5) additional meteorological forecasts with an alternate mesoscale model (MM5); and (6) the feasibility of using our modeling system to work in higher-resolution domains and other regions. In this user manual, we provide a general overview of the operational system and the mesoscale models as well as more detailed descriptions of the models. A detailed description of daily operations and a cost analysis is also provided. Evaluations of the models are included, although it should be noted that model evaluation is a continuing process and, as potential problems are identified, these can be used as the basis for making model improvements. Finally, we include our previously submitted answers to particular PDC questions (Appendix V). All of our initially proposed objectives have basically been met. In fact, a number of useful applications (VOG, air pollution transport) are already utilizing our experimental output and we believe there are a number of other applications that could make use of our routine forecast/analysis products. Still, work remains to be done to further develop this experimental weather, climate, fire danger and drought prediction system. In short, we would like to be a part of a future PDC team, if at all possible, to further develop and apply the system for the Hawaiian and other Pacific Islands as well as the entire Pacific Basin.

  15. A Vertically Flow-Following, Icosahedral Grid Model for Medium-Range and Seasonal Prediction. Part 1: Model Description

    NASA Technical Reports Server (NTRS)

    Bleck, Rainer; Bao, Jian-Wen; Benjamin, Stanley G.; Brown, John M.; Fiorino, Michael; Henderson, Thomas B.; Lee, Jin-Luen; MacDonald, Alexander E.; Madden, Paul; Middlecoff, Jacques

    2015-01-01

    A hydrostatic global weather prediction model based on an icosahedral horizontal grid and a hybrid terrain-following/isentropic vertical coordinate is described. The model is an extension to three spatial dimensions of a previously developed, icosahedral, shallow-water model featuring user-selectable horizontal resolution and employing indirect addressing techniques. The vertical grid is adaptive to maximize the portion of the atmosphere mapped into the isentropic coordinate subdomain. The model, best described as a stacked shallow-water model, is being tested extensively on real-time medium-range forecasts to ready it for possible inclusion in operational multimodel ensembles for medium-range to seasonal prediction.

  16. Hierarchical video summarization based on context clustering

    NASA Astrophysics Data System (ADS)

    Tseng, Belle L.; Smith, John R.

    2003-11-01

    A personalized video summary is dynamically generated in our video personalization and summarization system based on user preference and usage environment. The three-tier personalization system adopts the server-middleware-client architecture in order to maintain, select, adapt, and deliver rich media content to the user. The server stores the content sources along with their corresponding MPEG-7 metadata descriptions. In this paper, the metadata includes visual semantic annotations and automatic speech transcriptions. Our personalization and summarization engine in the middleware selects the optimal set of desired video segments by matching shot annotations and sentence transcripts with user preferences. Besides finding the desired contents, the objective is to present a coherent summary. There are diverse methods for creating summaries, and we focus on the challenges of generating a hierarchical video summary based on context information. In our summarization algorithm, three inputs are used to generate the hierarchical video summary output. These inputs are (1) MPEG-7 metadata descriptions of the contents in the server, (2) user preference and usage environment declarations from the user client, and (3) context information including the MPEG-7 controlled term list and classification scheme. In a video sequence, descriptions and relevance scores are assigned to each shot. Based on these shot descriptions, context clustering is performed to group consecutive, similar shots into hierarchical scene representations. The context clustering is based on the available context information, and may be derived from domain knowledge or rules engines. Finally, the selection of structured video segments to generate the hierarchical summary efficiently balances scene representation and shot selection.
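
    The context-clustering step can be sketched with a toy rule: consecutive shots are merged into one scene whenever the cosine similarity of their description vectors stays above a threshold. The shot descriptions, bag-of-words representation, and threshold below are invented stand-ins for the MPEG-7 annotations and clustering rules used in the actual system.

      # Toy grouping of consecutive shots by description similarity.
      import numpy as np
      from sklearn.feature_extraction.text import CountVectorizer

      shots = ["anchor speaks in studio", "anchor speaks with guest in studio",
               "aerial view of flooded streets", "rescue boats on flooded streets",
               "weather map of the region"]

      X = CountVectorizer().fit_transform(shots).toarray().astype(float)
      X /= np.linalg.norm(X, axis=1, keepdims=True)         # unit-length vectors

      scenes, current = [], [0]
      for i in range(1, len(shots)):
          if float(X[i] @ X[i - 1]) >= 0.3:     # similar enough: same scene
              current.append(i)
          else:                                  # context change: start a new scene
              scenes.append(current)
              current = [i]
      scenes.append(current)
      print(scenes)    # e.g. [[0, 1], [2, 3], [4]]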

  17. A General Model for Performance Evaluation in DS-CDMA Systems with Variable Spreading Factors

    NASA Astrophysics Data System (ADS)

    Chiaraluce, Franco; Gambi, Ennio; Righi, Giorgia

    This paper extends previous analytical approaches for the study of CDMA systems to the relevant case of multipath environments where users can operate at different bit rates. This scenario is of interest for the Wideband CDMA strategy employed in UMTS, and the model permits the performance comparison of classic and more innovative spreading signals. The method is based on the characteristic function approach, which allows the various kinds of interference to be modeled accurately. Some numerical examples are given with reference to the ITU-R M.1225 Recommendations, but the analysis could be extended to different channel descriptions.

  18. Linking multiple biodiversity informatics platforms with Darwin Core Archives

    PubMed Central

    2014-01-01

    Abstract We describe an implementation of the Darwin Core Archive (DwC-A) standard that allows for the exchange of biodiversity information contained within the Scratchpads virtual research environment with external collaborators. Using this single archive file Scratchpad users can expose taxonomies, specimen records, species descriptions and a range of other data to a variety of third-party aggregators and tools (currently Encyclopedia of Life, eMonocot Portal, CartoDB, and the Common Data Model) for secondary use. This paper describes our technical approach to dynamically building and validating Darwin Core Archives for the 600+ Scratchpad user communities, which can be used to serve the diverse data needs of all of our content partners. PMID:24723785
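
    At its simplest, a Darwin Core Archive is a zip file containing one or more data tables plus a meta.xml descriptor that maps columns to Darwin Core term URIs. The sketch below writes such an archive with the Python standard library; the descriptor layout follows a common reading of the DwC text guidelines and the records are invented, so a production archive should still be validated against the standard (e.g. with GBIF tooling).

      # Sketch of packaging occurrence records as a Darwin Core Archive (zip + meta.xml).
      # Term URIs and descriptor attributes are assumptions based on the DwC text guidelines.
      import io, zipfile

      occurrences = (
          "occurrenceID,scientificName,country\n"
          "occ-1,Puma concolor,US\n"
          "occ-2,Quercus robur,GB\n"
      )

      meta_xml = """<?xml version="1.0" encoding="UTF-8"?>
      <archive xmlns="http://rs.tdwg.org/dwc/text/">
        <core rowType="http://rs.tdwg.org/dwc/terms/Occurrence" encoding="UTF-8"
              fieldsTerminatedBy="," linesTerminatedBy="\\n" ignoreHeaderLines="1">
          <files><location>occurrence.csv</location></files>
          <id index="0"/>
          <field index="1" term="http://rs.tdwg.org/dwc/terms/scientificName"/>
          <field index="2" term="http://rs.tdwg.org/dwc/terms/country"/>
        </core>
      </archive>
      """

      buf = io.BytesIO()
      with zipfile.ZipFile(buf, "w") as zf:
          zf.writestr("occurrence.csv", occurrences)
          zf.writestr("meta.xml", meta_xml)
      print(f"archive size: {len(buf.getvalue())} bytes")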

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christon, Mark A.; Bakosi, Jozsef; Lowrie, Robert B.

    Hydra-TH is a hybrid finite-element/finite-volume code built using the Hydra toolkit specifically to attack a broad class of incompressible, viscous fluid dynamics problems prevalent in the thermal-hydraulics community. The purpose of this manual is to provide sufficient information for an experienced analyst to use Hydra-TH effectively. The Hydra-TH User's Manual presents a brief overview of capabilities and visualization interfaces. The execution and restart models are described before turning to the detailed description of keyword input. Finally, a series of example problems is presented with sufficient data to permit the user to verify the local installation of Hydra-TH and to provide a convenient starting point for more detailed and complex analyses.

  20. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing the aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are also described, as well as user requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  1. A Modular Repository-based Infrastructure for Simulation Model Storage and Execution Support in the Context of In Silico Oncology and In Silico Medicine.

    PubMed

    Christodoulou, Nikolaos A; Tousert, Nikolaos E; Georgiadi, Eleni Ch; Argyri, Katerina D; Misichroni, Fay D; Stamatakos, Georgios S

    2016-01-01

    The plethora of available disease prediction models and the ongoing process of their application into clinical practice - following their clinical validation - have created new needs regarding their efficient handling and exploitation. Consolidation of software implementations, descriptive information, and supportive tools in a single place, offering persistent storage as well as proper management of execution results, is a priority, especially with respect to the needs of large healthcare providers. At the same time, modelers should be able to access these storage facilities under special rights, in order to upgrade and maintain their work. In addition, the end users should be provided with all the necessary interfaces for model execution and effortless result retrieval. We therefore propose a software infrastructure, based on a tool, model and data repository that handles the storage of models and pertinent execution-related data, along with functionalities for execution management, communication with third-party applications, user-friendly interfaces to access and use the infrastructure with minimal effort and basic security features.

  2. A Modular Repository-based Infrastructure for Simulation Model Storage and Execution Support in the Context of In Silico Oncology and In Silico Medicine

    PubMed Central

    Christodoulou, Nikolaos A.; Tousert, Nikolaos E.; Georgiadi, Eleni Ch.; Argyri, Katerina D.; Misichroni, Fay D.; Stamatakos, Georgios S.

    2016-01-01

    The plethora of available disease prediction models and the ongoing process of their application into clinical practice – following their clinical validation – have created new needs regarding their efficient handling and exploitation. Consolidation of software implementations, descriptive information, and supportive tools in a single place, offering persistent storage as well as proper management of execution results, is a priority, especially with respect to the needs of large healthcare providers. At the same time, modelers should be able to access these storage facilities under special rights, in order to upgrade and maintain their work. In addition, the end users should be provided with all the necessary interfaces for model execution and effortless result retrieval. We therefore propose a software infrastructure, based on a tool, model and data repository that handles the storage of models and pertinent execution-related data, along with functionalities for execution management, communication with third-party applications, user-friendly interfaces to access and use the infrastructure with minimal effort and basic security features. PMID:27812280

  3. A wideband channel model for land mobile satellite systems

    NASA Technical Reports Server (NTRS)

    Jahn, Axel; Buonomo, Sergio; Sforza, Mario; Lutz, Erich

    1995-01-01

    A wideband channel model for Land Mobile Satellite (LMS) services is presented which characterizes the time-varying transmission channel between a satellite and a mobile user terminal. The channel model's statistical parameters are the results of fitting procedures applied to measured data. The data used for fitting have a time resolution of 33 ns, corresponding to a bandwidth of 30 MHz. Thus, the model is capable of characterizing the channel behaviour for a wide range of services, e.g., voice transmission, digital audio broadcasting (DAB), and spread spectrum modulation schemes. The model is presented for different environments and scenarios. The model is derived for a quasi-mobile user with a hand-held terminal in two different environments: rural and urban. The parameters needed for the description are (a) the number of echoes, (b) the distribution of the echo power, and (c) the distribution of the echo delay. It is shown that the direct path follows a Rician distribution whereas the reflected paths are Rayleigh/lognormal distributed. The parameters are given for an elevation angle of 25 deg.
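
    A single synthetic realization of the kind of statistical description above can be drawn in a few lines: a Rician direct path plus a Poisson number of Rayleigh-faded echoes with random delays and exponentially decaying mean power. All parameter values below are invented placeholders, not the fitted model parameters reported for the measurement campaign.

      # One synthetic wideband channel realization: Rician direct path + Rayleigh echoes.
      # K-factor, echo counts, delay scale, and decay are arbitrary examples.
      import numpy as np

      rng = np.random.default_rng(3)

      k_factor_db = 10.0                                   # direct-path Rician K-factor
      n_echoes = rng.poisson(4)                            # (a) number of echoes
      delays_ns = np.sort(rng.exponential(200.0, n_echoes))          # (c) echo delays
      mean_pwr = 10 ** (-delays_ns / 1000.0)                          # (b) decaying echo power

      k = 10 ** (k_factor_db / 10.0)
      direct = np.sqrt(k / (k + 1)) + np.sqrt(1 / (k + 1)) * (
          rng.normal(scale=np.sqrt(0.5)) + 1j * rng.normal(scale=np.sqrt(0.5)))
      echoes = np.sqrt(mean_pwr / 2) * (rng.normal(size=n_echoes)
                                        + 1j * rng.normal(size=n_echoes))

      print("direct-path power:", abs(direct) ** 2)
      print("echo delays (ns):", np.round(delays_ns, 1))
      print("echo powers:", np.round(abs(echoes) ** 2, 3))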

  4. Configuration Analysis Tool (CAT). System Description and users guide (revision 1)

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.

    1982-01-01

    A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.

  5. User's guide and description of the streamline divergence computer program. [turbulent convective heat transfer

    NASA Technical Reports Server (NTRS)

    Sulyma, P. R.; Mcanally, J. V.

    1975-01-01

    The streamline divergence program was developed to demonstrate the capability to trace inviscid surface streamlines and to calculate outflow-corrected laminar and turbulent convective heating rates on surfaces subjected to exhaust plume impingement. The analytical techniques used in formulating this program are discussed. A brief description of the streamline divergence program is given along with a user's guide. The program input and output for a sample case are also presented.

  6. A three-dimensional, compressible, laminar boundary-layer method for general fuselages. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Wie, Yong-Sun

    1990-01-01

    This user's manual contains a complete description of the computer programs developed to calculate three-dimensional, compressible, laminar boundary layers for perfect gas flow on general fuselage shapes. These programs include the 3-D boundary layer program (3DBLC), the body-oriented coordinate program (BCC), and the streamline coordinate program (SCC). Subroutine description, input, output and sample case are discussed. The complete FORTRAN listings of the computer programs are given.

  7. Social anxiety and cannabis-related impairment: The synergistic influences of peer and parent descriptive and injunctive normative perceptions

    PubMed Central

    Foster, Dawn W.; Garey, Lorra; Buckner, Julia D.; Zvolensky, Michael J.

    2016-01-01

    Objectives Cannabis users, especially socially anxious cannabis users, are influenced by perceptions of others’ use. The present study tested whether social anxiety interacted with perceptions about peer and parent beliefs to predict cannabis-related problems. Methods Participants were 148 (36.5% female, 60.1% non-Hispanic Caucasian) current cannabis users aged 18–36 (M = 21.01, SD = 3.09) who completed measures of perceived descriptive and injunctive norms, social anxiety, and cannabis use behaviors. Hierarchical multiple regressions were employed to investigate the predictive value of the social anxiety × parent injunctive norms × peer norms interaction terms on cannabis use behaviors. Results Higher social anxiety was associated with more cannabis problems. A three-way interaction emerged between social anxiety, parent injunctive norms, and peer descriptive norms with respect to cannabis problems. Social anxiety was positively related to more cannabis problems when parent injunctive norms were high (i.e., perceived approval) and peer descriptive norms were low. Results further showed that social anxiety was positively related to more cannabis problems regardless of parent injunctive norms. Conclusions The present work suggests that it may be important to account for parent influences when addressing normative perceptions among young adult cannabis users. Additional research is needed to determine whether interventions incorporating feedback regarding parent norms impact cannabis use frequency and problems. PMID:27144526

  8. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  9. A rule-based approach to model checking of UML state machines

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper, a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.

  10. Crew appliance computer program manual, volume 1

    NASA Technical Reports Server (NTRS)

    Russell, D. J.

    1975-01-01

    Trade studies of numerous appliance concepts for advanced spacecraft galley, personal hygiene, housekeeping, and other areas were made to determine which best satisfy the space shuttle orbiter and modular space station mission requirements. Analytical models of selected appliance concepts not currently included in the G-189A Generalized Environmental/Thermal Control and Life Support Systems (ETCLSS) Computer Program subroutine library were developed. The new appliance subroutines are given along with complete analytical model descriptions, solution methods, user's input instructions, and validation run results. The appliance components modeled were integrated with G-189A ETCLSS models for shuttle orbiter and modular space station, and results from computer runs of these systems are presented.

  11. Enhanced analysis and users manual for radial-inflow turbine conceptual design code RTD

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1995-01-01

    Modeling enhancements made to a radial-inflow turbine conceptual design code are documented in this report. A stator-endwall clearance-flow model was added for use with pivoting vanes. The rotor calculations were modified to account for swept blades and splitter blades. Stator and rotor trailing-edge losses and a vaneless-space loss were added to the loss model. Changes were made to the disk-friction and rotor-clearance loss calculations. The loss model was then calibrated based on experimental turbine performance. A complete description of code input and output along with sample cases are included in the report.

  12. Towards Semantic Modelling of Business Processes for Networked Enterprises

    NASA Astrophysics Data System (ADS)

    Furdík, Karol; Mach, Marián; Sabol, Tomáš

    The paper presents an approach to the semantic modelling and annotation of business processes and information resources, as it was designed within the FP7 ICT EU project SPIKE to support creation and maintenance of short-term business alliances and networked enterprises. A methodology for the development of the resource ontology, as a shareable knowledge model for semantic description of business processes, is proposed. Systematically collected user requirements, conceptual models implied by the selected implementation platform as well as available ontology resources and standards are employed in the ontology creation. The process of semantic annotation is described and illustrated using an example taken from a real application case.

  13. Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon

    2013-01-01

    This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of no penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such software is unsuitable for use in shield design and preliminary analysis studies. The software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions and on accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks.
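
    A minimal sketch of how a penetration limit equation can feed a probability-of-no-penetration estimate is given below. The critical-diameter expression, its coefficients, and the flux value are placeholder assumptions for illustration; the actual ballistic limit equations, their validity ranges, and the BUMPERII methodology are those documented with the software itself.

        import math

        def critical_diameter_cm(velocity_km_s, wall_cm=0.2, standoff_cm=10.0):
            """Illustrative ballistic-limit form: projectile diameter that just defeats
            a two-wall (Whipple) shield. Coefficients are placeholders, not the
            equations implemented in the tool."""
            return 0.16 * wall_cm ** 0.5 * standoff_cm ** 0.25 * velocity_km_s ** (-2.0 / 3.0)

        def prob_no_penetration(flux_per_m2_yr, area_m2, years):
            """PNP when penetrating impacts are modeled as a Poisson process:
            N = expected number of penetrating impacts, PNP = exp(-N)."""
            return math.exp(-flux_per_m2_yr * area_m2 * years)

        d_crit = critical_diameter_cm(velocity_km_s=7.0)
        # Assumed cumulative flux of particles larger than d_crit (placeholder value)
        flux = 1.0e-5                      # penetrating impacts per m^2 per year
        pnp = prob_no_penetration(flux, area_m2=25.0, years=10.0)
        print(f"critical diameter ~ {d_crit:.3f} cm, PNP ~ {pnp:.4f}")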

  14. Transient Ejector Analysis (TEA) code user's guide

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.

    1993-01-01

    A FORTRAN computer program for the semi-analytic prediction of unsteady thrust-augmenting ejector performance has been developed, based on a theoretical analysis for ejectors. That analysis blends classic self-similar turbulent jet descriptions with control-volume mixing region elements. Division of the ejector into an inlet, diffuser, and mixing region allowed flexibility in the modeling of the physics for each region. In particular, the inlet and diffuser analyses are simplified by a quasi-steady analysis, justified by the assumption that pressure is the forcing function in those regions. Only the mixing region is assumed to be dominated by viscous effects. The present work provides an overview of the code structure, a description of the required input and output data file formats, and the results for a test case. Since there are limitations to the code for applications outside the bounds of the test case, the user should consider TEA as a research code (not as a production code), designed specifically as an implementation of the proposed ejector theory. Program error flags are discussed, and some diagnostic routines are presented.

  15. Computer program for nonlinear static stress analysis of shuttle thermal protection system: User's manual

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Wallas, M.

    1981-01-01

    User documentation is presented for a computer program which considers the nonlinear properties of the strain isolator pad (SIP) in the static stress analysis of the shuttle thermal protection system. This program is generalized to handle an arbitrary SIP footprint including cutouts for instrumentation and filler bar. Multiple SIP surfaces are defined to model tiles in unique locations such as leading edges, intersections, and penetrations. The nonlinearity of the SIP is characterized by experimental stress displacement data for both normal and shear behavior. Stresses in the SIP are calculated using a Newton iteration procedure to determine the six rigid body displacements of the tile which develop reaction forces in the SIP to equilibrate the externally applied loads. This user documentation gives an overview of the analysis capabilities, a detailed description of required input data and an example to illustrate use of the program.
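
    The equilibrium search described above can be pictured with the toy Newton iteration below: the tile's displacements are adjusted until the nonlinear SIP reaction balances the external load. For brevity the sketch uses two degrees of freedom (normal and shear) and an invented cubic stiffness curve; the actual program works with six rigid-body displacements and interpolates measured stress-displacement data over an arbitrary SIP footprint.

        import numpy as np

        def sip_reaction(u):
            """Toy nonlinear SIP reaction for a reduced 2-DOF tile (normal, shear).
            The stiffening curves are invented; the real code interpolates measured
            stress-displacement data over the SIP footprint."""
            un, us = u
            return np.array([5.0e4 * un + 2.0e6 * un ** 3,    # normal reaction [N]
                             2.0e4 * us + 5.0e5 * us ** 3])   # shear reaction [N]

        def solve_tile_equilibrium(f_ext, tol=1e-8, max_iter=50):
            """Newton iteration: find displacements u such that sip_reaction(u) = f_ext."""
            u = np.zeros(2)
            for _ in range(max_iter):
                residual = sip_reaction(u) - f_ext
                if np.linalg.norm(residual) < tol:
                    break
                # Jacobian by forward finite differences
                eps = 1e-6
                jac = np.empty((2, 2))
                for j in range(2):
                    du = np.zeros(2)
                    du[j] = eps
                    jac[:, j] = (sip_reaction(u + du) - sip_reaction(u)) / eps
                u = u - np.linalg.solve(jac, residual)
            return u

        u_eq = solve_tile_equilibrium(np.array([800.0, 150.0]))   # toy external loads [N]
        print("equilibrium displacements:", np.round(u_eq, 5))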

  16. User Acceptance of Internet Banking Service in Malaysia

    NASA Astrophysics Data System (ADS)

    Yenyuen, Yee; Yeow, P. H. P.

    The study is the first research in Malaysia that investigates user acceptance of Internet banking service (IBS) based on the Unified Theory of Acceptance and Use of Technology model (Venkatesh, Morris, Davis and Davis, 2003). Two hundred and eighty questionnaires were distributed and collected from two major cities, Kuala Lumpur and Melaka. Descriptive statistics were used to analyse the data. The results show that Malaysians have intentions of using IBS (mean rating of close to 4.00). Moreover, Malaysians recognize the benefits of IBS by giving a high mean rating (close to 4.00) to performance expectancy. However, they give relatively low mean ratings (close to 3.00) on other indicators of Behavioural Intention to Use IBS such as effort expectancy, social influence, facilitating conditions and perceived credibility. Recommendations were given to promote a safe, efficient and conducive environment for user adoption of Internet banking.

  17. Results of a transparent expert consultation on patient and public involvement in palliative care research.

    PubMed

    Daveson, Barbara A; de Wolf-Linder, Susanne; Witt, Jana; Newson, Kirstie; Morris, Carolyn; Higginson, Irene J; Evans, Catherine J

    2015-12-01

    Support and evidence for patient, unpaid caregiver and public involvement in research (user involvement) are growing. Consensus on how best to involve users in palliative care research is lacking. To determine an optimal user-involvement model for palliative care research. We hosted a consultation workshop using expert presentations, discussion and nominal group technique to generate recommendations and consensus on agreement of importance. A total of 35 users and 32 researchers were approached to attend the workshop, which included break-out groups and a ranking exercise. Descriptive statistical analysis to establish consensus and highlight divergence was applied. Qualitative analysis of discussions was completed to aid interpretation of findings. Participants involved in palliative care research were invited to a global research institute, UK. A total of 12 users and 5 researchers participated. Users wanted their involvement to be more visible, including during dissemination, with a greater emphasis on the difference their involvement makes. Researchers wanted to improve productivity, relevance and quality through involvement. Users and researchers agreed that an optimal model should consist of (a) early involvement to ensure meaningful involvement and impact and (b) diverse virtual and face-to-face involvement methods to ensure flexibility. For involvement in palliative care research to succeed, early and flexible involvement is required. Researchers should advertise opportunities for involvement and promote impact of involvement via dissemination plans. Users should prioritise adding value to research through enhancing productivity, quality and relevance. More research is needed not only to inform implementation and ensure effectiveness but also to investigate the cost-effectiveness of involvement in palliative care research. © The Author(s) 2015.

  18. Results of a transparent expert consultation on patient and public involvement in palliative care research

    PubMed Central

    Daveson, Barbara A; de Wolf-Linder, Susanne; Witt, Jana; Newson, Kirstie; Morris, Carolyn; Higginson, Irene J; Evans, Catherine J

    2015-01-01

    Background: Support and evidence for patient, unpaid caregiver and public involvement in research (user involvement) are growing. Consensus on how best to involve users in palliative care research is lacking. Aim: To determine an optimal user-involvement model for palliative care research. Design: We hosted a consultation workshop using expert presentations, discussion and nominal group technique to generate recommendations and consensus on agreement of importance. A total of 35 users and 32 researchers were approached to attend the workshop, which included break-out groups and a ranking exercise. Descriptive statistical analysis to establish consensus and highlight divergence was applied. Qualitative analysis of discussions was completed to aid interpretation of findings. Setting/participants: Participants involved in palliative care research were invited to a global research institute, UK. Results: A total of 12 users and 5 researchers participated. Users wanted their involvement to be more visible, including during dissemination, with a greater emphasis on the difference their involvement makes. Researchers wanted to improve productivity, relevance and quality through involvement. Users and researchers agreed that an optimal model should consist of (a) early involvement to ensure meaningful involvement and impact and (b) diverse virtual and face-to-face involvement methods to ensure flexibility. Conclusion: For involvement in palliative care research to succeed, early and flexible involvement is required. Researchers should advertise opportunities for involvement and promote impact of involvement via dissemination plans. Users should prioritise adding value to research through enhancing productivity, quality and relevance. More research is needed not only to inform implementation and ensure effectiveness but also to investigate the cost-effectiveness of involvement in palliative care research. PMID:25931336

  19. Slave finite element for non-linear analysis of engine structures. Volume 2: Programmer's manual and user's manual

    NASA Technical Reports Server (NTRS)

    Witkop, D. L.; Dale, B. J.; Gellin, S.

    1991-01-01

    The programming aspects of SFENES are described in the User's Manual. The information presented is provided for the installation programmer. It is sufficient to fully describe the general program logic and required peripheral storage. All element generated data is stored externally to reduce required memory allocation. A separate section is devoted to the description of these files thereby permitting the optimization of Input/Output (I/O) time through efficient buffer descriptions. Individual subroutine descriptions are presented along with the complete Fortran source listings. A short description of the major control, computation, and I/O phases is included to aid in obtaining an overall familiarity with the program's components. Finally, a discussion of the suggested overlay structure which allows the program to execute with a reasonable amount of memory allocation is presented.

  20. Agile Model Driven Development of Electronic Health Record-Based Specialty Population Registries.

    PubMed

    Kannan, Vaishnavi; Fish, Jason C; Willett, DuWayne L

    2016-02-01

    The transformation of the American healthcare payment system from fee-for-service to value-based care increasingly makes it valuable to develop patient registries for specialized populations, to better assess healthcare quality and costs. Recent widespread adoption of Electronic Health Records (EHRs) in the U.S. now makes possible construction of EHR-based specialty registry data collection tools and reports, previously unfeasible using manual chart abstraction. But the complexities of specialty registry EHR tools and measures, along with the variety of stakeholders involved, can result in misunderstood requirements and frequent product change requests, as users first experience the tools in their actual clinical workflows. Such requirements churn could easily stall progress in specialty registry rollout. Modeling a system's requirements and solution design can be a powerful way to remove ambiguities, facilitate shared understanding, and help evolve a design to meet newly-discovered needs. "Agile Modeling" retains these values while avoiding excessive unused up-front modeling in favor of iterative incremental modeling. Using Agile Modeling principles and practices, in calendar year 2015 one institution developed 58 EHR-based specialty registries, with 111 new data collection tools, supporting 134 clinical process and outcome measures, and enrolling over 16,000 patients. The subset of UML and non-UML models found most consistently useful in designing, building, and iteratively evolving EHR-based specialty registries included User Stories, Domain Models, Use Case Diagrams, Decision Trees, Graphical User Interface Storyboards, Use Case text descriptions, and Solution Class Diagrams.

  1. Intrinsic Radiation Source Generation with the ISC Package: Data Comparisons and Benchmarking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solomon, Clell J. Jr.

    The characterization of radioactive emissions from unstable isotopes (intrinsic radiation) is necessary for shielding and radiological-dose calculations involving radioactive materials. While most radiation transport codes, e.g., MCNP [X-5 Monte Carlo Team, 2003], provide the capability to input user-prescribed source definitions, such as radioactive emissions, they do not provide the capability to calculate the correct radioactive-source definition given the material compositions. Special modifications to MCNP have been developed in the past to allow the user to specify an intrinsic source, but these modifications have not been implemented in the primary source base [Estes et al., 1988]. To facilitate the description of the intrinsic radiation source from a material with a specific composition, the Intrinsic Source Constructor library (LIBISC) and MCNP Intrinsic Source Constructor (MISC) utility have been written. The combination of LIBISC and MISC will be herein referred to as the ISC package. LIBISC is a statically linkable C++ library that provides the necessary functionality to construct the intrinsic-radiation source generated by a material. Furthermore, LIBISC provides the ability to use different particle-emission databases, radioactive-decay databases, and natural-abundance databases, allowing the user flexibility in the specification of the source if one database is preferred over others. LIBISC also provides functionality for aging materials and producing a thick-target bremsstrahlung photon source approximation from the electron emissions. The MISC utility links to LIBISC and facilitates the description of intrinsic-radiation sources in a format directly usable with the MCNP transport code. Through a series of input keywords and arguments, the MISC user can specify the material, age the material if desired, and produce a source description of the radioactive emissions from the material in an MCNP-readable format. Further details of using the MISC utility can be obtained from the user guide [Solomon, 2012]. The remainder of this report presents a discussion of the databases available to LIBISC and MISC, a discussion of the models employed by LIBISC, a comparison of the thick-target bremsstrahlung model employed, a benchmark comparison to plutonium and depleted-uranium spheres, and a comparison of the available particle-emission databases.
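
    The core bookkeeping such a tool performs can be sketched as follows: convert each isotope's mass to an atom count, multiply by its decay constant to get an activity, and weight the isotope's photon emission lines by that activity to obtain a source spectrum. The two-nuclide composition and the emission-line data below are illustrative stand-ins for the particle-emission and decay databases the ISC package actually reads, and the printed output is not MCNP source-card syntax.

        import math

        AVOGADRO = 6.02214076e23
        SECONDS_PER_YEAR = 3.1557e7

        # Illustrative nuclear data: half-life [yr] and photon lines as (MeV, yield/decay).
        # The real ISC package pulls these values from user-selectable databases.
        NUCLIDE_DATA = {
            "Co60":  {"half_life_yr": 5.27, "molar_mass": 60.0,
                      "gammas": [(1.173, 1.0), (1.332, 1.0)]},
            "Cs137": {"half_life_yr": 30.1, "molar_mass": 137.0,
                      "gammas": [(0.662, 0.85)]},
        }

        def intrinsic_photon_source(grams_by_nuclide):
            """Return the photon emission rate per line (line energy -> photons/s)."""
            spectrum = {}
            for nuclide, grams in grams_by_nuclide.items():
                data = NUCLIDE_DATA[nuclide]
                atoms = grams / data["molar_mass"] * AVOGADRO
                decay_const = math.log(2.0) / (data["half_life_yr"] * SECONDS_PER_YEAR)
                activity = decay_const * atoms                       # decays per second
                for energy, yield_per_decay in data["gammas"]:
                    spectrum[energy] = spectrum.get(energy, 0.0) + activity * yield_per_decay
            return spectrum

        source = intrinsic_photon_source({"Co60": 1.0e-6, "Cs137": 2.0e-6})
        for energy, rate in sorted(source.items()):
            print(f"{energy:.3f} MeV : {rate:.3e} photons/s")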

  2. AGUIA: autonomous graphical user interface assembly for clinical trials semantic data services.

    PubMed

    Correa, Miria C; Deus, Helena F; Vasconcelos, Ana T; Hayashi, Yuki; Ajani, Jaffer A; Patnana, Srikrishna V; Almeida, Jonas S

    2010-10-26

    AGUIA is a front-end web application originally developed to manage clinical, demographic and biomolecular patient data collected during clinical trials at MD Anderson Cancer Center. The diversity of methods involved in patient screening and sample processing generates a variety of data types that require a resource-oriented architecture to capture the associations between the heterogeneous data elements. AGUIA uses a semantic web formalism, resource description framework (RDF), and a bottom-up design of knowledge bases that employ the S3DB tool as the starting point for the client's interface assembly. The data web service, S3DB, meets the necessary requirements of generating the RDF and of explicitly distinguishing the description of the domain from its instantiation, while allowing for continuous editing of both. Furthermore, it uses an HTTP-REST protocol, has a SPARQL endpoint, and has open source availability in the public domain, which facilitates the development and dissemination of this application. However, S3DB alone does not address the issue of representing content in a form that makes sense for domain experts. We identified an autonomous set of descriptors, the GBox, that provides user and domain specifications for the graphical user interface. This was achieved by identifying a formalism that makes use of an RDF schema to enable the automatic assembly of graphical user interfaces in a meaningful manner while using only resources native to the client web browser (JavaScript interpreter, document object model). We defined a generalized RDF model such that changes in the graphic descriptors are automatically and immediately (locally) reflected into the configuration of the client's interface application. The design patterns identified for the GBox benefit from and reflect the specific requirements of interacting with data generated by clinical trials, and they contain clues for a general purpose solution to the challenge of having interfaces automatically assembled for multiple and volatile views of a domain. By coding AGUIA in JavaScript, for which all browsers include a native interpreter, a solution was found that assembles interfaces that are meaningful to the particular user, and which are also ubiquitous and lightweight, allowing the computational load to be carried by the client's machine.
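
    The idea of driving interface assembly from RDF descriptors can be sketched with rdflib: a few triples describe a form widget (its label, datatype, and the domain property it binds to), and client code queries them to decide what to render. The gbox: vocabulary, property names, and URIs below are hypothetical illustrations, not the actual GBox schema or the S3DB API.

        from rdflib import Graph, Literal, Namespace, URIRef

        # Hypothetical descriptor vocabulary; the real GBox schema differs.
        GBOX = Namespace("http://example.org/gbox#")

        g = Graph()
        widget = URIRef("http://example.org/form#ageField")
        g.add((widget, GBOX.label, Literal("Patient age")))
        g.add((widget, GBOX.datatype, Literal("integer")))
        g.add((widget, GBOX.bindsTo, URIRef("http://example.org/trial#age")))

        # A client would query the descriptors and build matching interface elements.
        query = """
            PREFIX gbox: <http://example.org/gbox#>
            SELECT ?label ?datatype ?target WHERE {
                ?w gbox:label ?label ; gbox:datatype ?datatype ; gbox:bindsTo ?target .
            }
        """
        for label, datatype, target in g.query(query):
            print(f"render an <input type={datatype}> labelled '{label}' bound to {target}")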

  3. Factors of collaborative working: a framework for a collaboration model.

    PubMed

    Patel, Harshada; Pettitt, Michael; Wilson, John R

    2012-01-01

    The ability of organisations to support collaborative working environments is of increasing importance as they move towards more distributed ways of working. Despite the attention collaboration has received from a number of disparate fields, there is a lack of a unified understanding of the component factors of collaboration. As part of our work on a European Integrated Project, CoSpaces, collaboration and collaborative working and the factors which define it were examined through the literature and new empirical work with a number of partner user companies in the aerospace, automotive and construction sectors. This was to support development of a descriptive human factors model of collaboration - the CoSpaces Collaborative Working Model (CCWM). We identified seven main categories of factors involved in collaboration: Context, Support, Tasks, Interaction Processes, Teams, Individuals, and Overarching Factors, and summarised these in a framework which forms a basis for the model. We discuss supporting evidence for the factors which emerged from our fieldwork with user partners, and use of the model in activities such as collaboration readiness profiling. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick target experiments. The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their experiments, including the ability to model the beam line, the shielding of samples and sample holders, and the estimates of basic physical and biological outputs of the designed experiments. We present an overview of the GERM code GUI, as well as providing training applications.
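
    One of the biophysical quantities mentioned above, the Poisson distribution of ion hits per cell, can be written down directly: the mean number of traversals is the particle fluence times the cell's projected area, and the probability of k hits follows from the Poisson law. The fluence and nuclear-area values below are arbitrary illustrative numbers, not NSRL beam parameters.

        import math

        def poisson_hit_probabilities(fluence_per_um2, area_um2, k_max=5):
            """P(k ion traversals) for a cell of given projected area under a uniform
            fluence, assuming hits are Poisson distributed (as in the GERM description)."""
            mean_hits = fluence_per_um2 * area_um2
            return [math.exp(-mean_hits) * mean_hits ** k / math.factorial(k)
                    for k in range(k_max + 1)]

        # Illustrative values: fluence of 0.02 ions per square micron over a 100 um^2 nucleus
        for k, p in enumerate(poisson_hit_probabilities(fluence_per_um2=0.02, area_um2=100.0)):
            print(f"P({k} hits) = {p:.3f}")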

  5. Overview of the Graphical User Interface for the GERMcode (GCR Event-Based Risk Model)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERMcode calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERMcode also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERMcode for application to thick target experiments. The GERMcode provides scientists participating in NSRL experiments with the data needed for the interpretation of their experiments, including the ability to model the beam line, the shielding of samples and sample holders, and the estimates of basic physical and biological outputs of the designed experiments. We present an overview of the GERMcode GUI, as well as providing training applications.

  6. Introducing Online Bibliographic Service to its Users: The Online Presentation

    ERIC Educational Resources Information Center

    Crane, Nancy B.; Pilachowski, David M.

    1978-01-01

    A description of techniques for introducing online services to new user groups includes discussion of terms and their definitions, evolution of online searching, advantages and disadvantages of online searching, production of the data bases, search strategies, Boolean logic, costs and charges, "do's and don'ts," and a user search questionnaire. (J…

  7. A Comparison of Marijuana Users and Nonusers On A Number of Personality Variables

    ERIC Educational Resources Information Center

    Simon, William E.; And Others

    1974-01-01

    The present study compares marijuana users and nonusers in terms of (a) psychological needs, (b) self-descriptions, (c) self-esteem, (d) academic achievement, (e) ordinal position of birth, and (f) attitudes toward the legalization of various items. Some significant differences were reported between groups of users. (Author/PC)

  8. Nonlinear Curve-Fitting Program

    NASA Technical Reports Server (NTRS)

    Everhart, Joel L.; Badavi, Forooz F.

    1989-01-01

    Nonlinear optimization algorithm helps in finding best-fit curve. Nonlinear Curve Fitting Program, NLINEAR, interactive curve-fitting routine based on description of quadratic expansion of X² statistic. Utilizes nonlinear optimization algorithm calculating best statistically weighted values of parameters of fitting function and X² minimized. Provides user with such statistical information as goodness of fit and estimated values of parameters producing highest degree of correlation between experimental data and mathematical model. Written in FORTRAN 77.
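
    A compact modern analogue of this kind of statistically weighted nonlinear fit is shown below using SciPy: it minimizes the chi-square statistic for a user-supplied model function and reports best-fit parameters with estimated uncertainties. The exponential model and the synthetic data are illustrative only; NLINEAR itself is an interactive FORTRAN 77 routine, not a Python program.

        import numpy as np
        from scipy.optimize import curve_fit

        def model(x, amplitude, decay):
            """Fitting function: simple exponential decay (an illustrative choice)."""
            return amplitude * np.exp(-x / decay)

        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 10.0, 40)
        sigma = 0.05 * np.ones_like(x)                       # measurement uncertainties
        y = model(x, 2.0, 3.0) + rng.normal(0.0, sigma)      # synthetic data

        # Weighted least squares: minimizes chi^2 = sum(((y - model) / sigma)**2)
        popt, pcov = curve_fit(model, x, y, p0=[1.0, 1.0], sigma=sigma, absolute_sigma=True)
        perr = np.sqrt(np.diag(pcov))
        chi2 = np.sum(((y - model(x, *popt)) / sigma) ** 2)
        print("best-fit parameters:", popt, "+/-", perr, " chi^2 =", round(chi2, 1))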

  9. Study of a Tracking and Data Acquisition System (TDAS) in the 1990's

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Progress in concept definition studies, operational assessments, and technology demonstrations for the Tracking and Data Acquisition System (TDAS) is reported. The proposed TDAS will be the follow-on to the Tracking and Data Relay Satellite System and will function as a key element of the NASA End-to-End Data System, providing the tracking and data acquisition interface between user accessible data ports on Earth and the user's spaceborne equipment. Technical activities of the "spacecraft data system architecture' task and the "communication mission model' task are emphasized. The objective of the first task is to provide technology forecasts for sensor data handling, navigation and communication systems, and estimate corresponding costs. The second task is concerned with developing a parametric description of the required communication channels. Other tasks with significant activity include the "frequency plan and radio interference model' and the "Viterbi decoder/simulator study'.

  10. The Community WRF-Hydro Modeling System Version 4 Updates: Merging Toward Capabilities of the National Water Model

    NASA Astrophysics Data System (ADS)

    McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.

    2017-12-01

    The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters with a flexible and extensible capability for multi-scale, multi-physics hydrologic modeling that can be run independently of, or fully interactively with, the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution descriptions of terrestrial hydrologic processes such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro is an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live hands-on training sessions, an email listserv, and individual user support via email through a new help desk ticketing system. The WRF-Hydro modeling system and supporting tools, which now include re-gridding scripts and model calibration, have recently been updated to Version 4 and are merging toward the capabilities of the National Water Model.

  11. GAS eleven node thermal model (GEM)

    NASA Technical Reports Server (NTRS)

    Butler, Dan

    1988-01-01

    The Eleven Node Thermal Model (GEM) of the Get Away Special (GAS) container was originally developed based on the results of thermal tests of the GAS container. The model was then used in the thermal analysis and design of several NASA/GSFC GAS experiments, including the Flight Verification Payload, the Ultraviolet Experiment, and the Capillary Pumped Loop. The model description details the five cu ft container both with and without an insulated end cap. Mass specific heat values are also given so that transient analyses can be performed. A sample problem for each configuration is included as well so that GEM users can verify their computations. The model can be run on most personal computers with a thermal analyzer solution routine.
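
    The kind of lumped-node transient analysis a GEM user would run can be pictured as a small explicit time-marching loop: each node carries a mass-specific-heat product, conductors couple the nodes, and temperatures are advanced with forward Euler steps. The three-node network and every coefficient here are invented for illustration and are far simpler than GEM's eleven-node description of the GAS container.

        import numpy as np

        def march_temperatures(temps, capacitance, conductors, dt, steps, sink_temp):
            """Explicit (forward Euler) lumped-parameter thermal network.

            temps:        initial node temperatures [K]
            capacitance:  m*cp per node [J/K]
            conductors:   list of (i, j, G) couplings [W/K]; j == -1 couples node i to the sink
            """
            t = np.array(temps, dtype=float)
            cap = np.array(capacitance, dtype=float)
            for _ in range(steps):
                q = np.zeros_like(t)                    # net heat flow into each node [W]
                for i, j, g in conductors:
                    t_j = sink_temp if j == -1 else t[j]
                    q[i] += g * (t_j - t[i])
                    if j != -1:
                        q[j] += g * (t[i] - t_j)
                t += dt * q / cap
            return t

        # Illustrative 3-node network (experiment plate, container wall, end cap)
        final = march_temperatures(
            temps=[300.0, 290.0, 280.0],
            capacitance=[5000.0, 8000.0, 3000.0],
            conductors=[(0, 1, 0.8), (1, 2, 0.5), (2, -1, 0.3)],  # last couples to a space sink
            dt=10.0, steps=360, sink_temp=250.0)
        print("temperatures after one hour:", np.round(final, 2))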

  12. Similarity and Difference in Drug Addiction Process Between Heroin- and Methamphetamine-Dependent Users.

    PubMed

    Wang, Ziyun; Li, Wei-Xiu; Zhi-Min, Liu

    2017-03-21

    This study aimed to compare the drug addiction process between Chinese heroin- and methamphetamine (MA)-dependent users via a modified 4-stage addiction model (experimentation, occasional use, regular use, and compulsive use). A descriptive study was conducted among 683 eligible participants. In the statistical analysis, we selected 340 heroin- and 295 MA-dependent users without illicit drug use prior to onset of heroin or MA use. The addiction process of heroin-dependent users was shorter than that of MA-dependent users, with shorter transitions from the onset of drug use to the first drug craving (19.5 vs. 50.0 days), regular use (30.0 vs. 60.0 days), and compulsive use (50.0 vs. 85.0 days). However, no significant differences in the addiction process were observed in frequency of drug administration, except that heroin users reported more administrations of the drug (20.0 vs. 15.0) before progressing to the stage of compulsive drug use. A larger proportion of regular heroin users progressed to use illicit drugs recklessly than did MA users. Most heroin and MA users reported psychological dependence as their primary motivation for compulsive drug use, but more heroin users selected uncomfortable symptoms upon ceasing drug use as a further reason to continue. Our results suggest that typical heroin and MA users may experience a similar four-stage addiction process, but MA users might undergo a longer addiction process (in days). More research is necessary to further explore factors influencing the drug addiction process.

  13. TUNS user guide supplement: Data dictionary

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Provided is a data dictionary for the Technology Utilization Network System (TUNS) providing for each element name the long name, data type, data size, descriptive name and description, data of PRI clause, legal values, and location used.

  14. Belowground Carbon Cycling Processes at the Molecular Scale: An EMSL Science Theme Advisory Panel Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hess, Nancy J.; Brown, Gordon E.; Plata, Charity

    2014-02-21

    As part of the Belowground Carbon Cycling Processes at the Molecular Scale workshop, an EMSL Science Theme Advisory Panel meeting held in February 2013, attendees discussed critical biogeochemical processes that regulate carbon cycling in soil. The meeting attendees determined that as a national scientific user facility, EMSL can provide the tools and expertise needed to elucidate the molecular foundation that underlies mechanistic descriptions of biogeochemical processes that control carbon allocation and fluxes at the terrestrial/atmospheric interface in landscape and regional climate models. Consequently, the workshop's goal was to identify the science gaps that hinder either development of mechanistic description of critical processes or their accurate representation in climate models. In part, this report offers recommendations for future EMSL activities in this research area. The workshop was co-chaired by Dr. Nancy Hess (EMSL) and Dr. Gordon Brown (Stanford University).

  15. Improving Patient Experience and Primary Care Quality for Patients With Complex Chronic Disease Using the Electronic Patient-Reported Outcomes Tool: Adopting Qualitative Methods Into a User-Centered Design Approach.

    PubMed

    Steele Gray, Carolyn; Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl

    2016-02-18

    Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs) who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD. This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14), along with a PICK analysis (Possible, Implementable, to be Challenged, to be Killed), guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Interpretive descriptive methods allow for an understanding of user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers.

  16. Agile Model Driven Development of Electronic Health Record-Based Specialty Population Registries

    PubMed Central

    Kannan, Vaishnavi; Fish, Jason C.; Willett, DuWayne L.

    2018-01-01

    The transformation of the American healthcare payment system from fee-for-service to value-based care increasingly makes it valuable to develop patient registries for specialized populations, to better assess healthcare quality and costs. Recent widespread adoption of Electronic Health Records (EHRs) in the U.S. now makes possible construction of EHR-based specialty registry data collection tools and reports, previously unfeasible using manual chart abstraction. But the complexities of specialty registry EHR tools and measures, along with the variety of stakeholders involved, can result in misunderstood requirements and frequent product change requests, as users first experience the tools in their actual clinical workflows. Such requirements churn could easily stall progress in specialty registry rollout. Modeling a system’s requirements and solution design can be a powerful way to remove ambiguities, facilitate shared understanding, and help evolve a design to meet newly-discovered needs. “Agile Modeling” retains these values while avoiding excessive unused up-front modeling in favor of iterative incremental modeling. Using Agile Modeling principles and practices, in calendar year 2015 one institution developed 58 EHR-based specialty registries, with 111 new data collection tools, supporting 134 clinical process and outcome measures, and enrolling over 16,000 patients. The subset of UML and non-UML models found most consistently useful in designing, building, and iteratively evolving EHR-based specialty registries included User Stories, Domain Models, Use Case Diagrams, Decision Trees, Graphical User Interface Storyboards, Use Case text descriptions, and Solution Class Diagrams. PMID:29750222

  17. WMT: The CSDMS Web Modeling Tool

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can design a model from a set of components, edit component parameters, save models to a web-accessible server, share saved models with the community, submit runs to an HPC system, and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server where it is stored and from which a user can download it as a single compressed archive file.

  18. ATR National Scientific User Facility 2013 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ulrich, Julie A.; Robertson, Sarah

    2015-03-01

    This is the 2013 Annual Report for the Advanced Test Reactor National Scientific User Facility. This report includes information on university-run research projects along with a description of the program and the capabilities offered researchers.

  19. NASTRAN user's guide: Level 15

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The NASTRAN structural analysis system is presented. This user's guide is an essential addition to the original four NASTRAN manuals. Clear, brief descriptions of capabilities with example input are included, with references to the location of more complete information.

  20. Moving through MOOCs: Understanding the Progression of Users in Massive Open Online Courses

    ERIC Educational Resources Information Center

    Perna, Laura W.; Ruby, Alan; Boruch, Robert F.; Wang, Nicole; Scull, Janie; Ahmad, Seher; Evans, Chad

    2014-01-01

    This paper reports on the progress of users through 16 Coursera courses taught by University of Pennsylvania faculty for the first time between June 2012 and July 2013. Using descriptive analyses, this study advances knowledge by considering two definitions of massive open online course (MOOC) users (registrants and starters), comparing two…

  1. Assessment of the Adequacy of U.S.-Canadian Infrastructure to Accommodate Trade through Eastern Border Crossings. Appendix 1. Descriptive Profiles of Maine Frontier

    DOT National Transportation Integrated Search

    1999-06-08

    This document describes the process used in developing a list of rural Intelligent Transportation Systems (ITS) user needs. It gives information on a workshop focusing on rural ITS user needs, and it also presents a list of rural ITS user needs based...

  2. 15 CFR Supplement No. 8 to Part 748 - Information Required in Requests for Validated End-User (Veu) Authorization

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... overview of any business activity or corporate relationship that the entity has with either government or... structure, ownership and business of the prospective validated end-user. Include a description of the entity...-site reviews by U.S. Government officials to verify the end-user's compliance with the conditions of...

  3. Users' Satisfaction with Library Services: A Case Study of Delta State University Library

    ERIC Educational Resources Information Center

    Ikolo, Violet E.

    2015-01-01

    The study focused on users' satisfaction with library services at the Delta State University main Library, Abraka, Delta State. The objective was to find out if users are satisfied with the services, facilities, the library environment, information sources and staff of the library. Using the descriptive survey design, the population for the study…

  4. Model documentation report: Transportation sector model of the National Energy Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-03-01

    This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. This document serves three purposes. First, it is a reference document providing a detailed description of TRAN for model analysts, users, and the public. Second, this report meets the legal requirements of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, 57(b)(1)). Third, it permits continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements.

  5. Shuttle Communications and Tracking Systems Modeling and TDRSS Link Simulations Studies

    NASA Technical Reports Server (NTRS)

    Chie, C. M.; Dessouky, K.; Lindsey, W. C.; Tsang, C. S.; Su, Y. T.

    1985-01-01

    An analytical simulation package (LinCsim) which allows the analytical verification of data transmission performance through TDRSS satellites was modified. The work involved the modeling of the user transponder, TDRS, TDRS ground terminal, and link dynamics for forward and return links based on the TDRSS performance specifications (4) and the critical design reviews. The scope of this effort has recently been expanded to include the effects of radio frequency interference (RFI) on the bit error rate (BER) performance of the S-band return links. The RFI environment and the modified TDRSS satellite and ground station hardware are being modeled in accordance with their description in the applicable documents.
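
    A minimal sketch of the kind of bit-error-rate bookkeeping such a link simulation performs is shown below: the BER of an ideal BPSK return link is evaluated from Eb/N0, and pulsed RFI is approximated by letting a fraction of the bits see additional interference power. The two-state RFI approximation and all numerical values are illustrative assumptions, not the LinCsim models or TDRSS specification values.

        import math

        def bpsk_ber(ebn0_db):
            """Ideal coherent BPSK bit error rate for a given Eb/N0 in dB."""
            return 0.5 * math.erfc(math.sqrt(10 ** (ebn0_db / 10.0)))

        def ber_with_pulsed_rfi(ebn0_db, duty_cycle, interference_to_noise_db):
            """Two-state approximation: a fraction of bits sees noise plus RFI power."""
            eb_over_n0 = 10 ** (ebn0_db / 10.0)
            noise_scale = 1.0 + 10 ** (interference_to_noise_db / 10.0)   # (N0 + I0) / N0
            ber_clean = 0.5 * math.erfc(math.sqrt(eb_over_n0))
            ber_hit = 0.5 * math.erfc(math.sqrt(eb_over_n0 / noise_scale))
            return (1.0 - duty_cycle) * ber_clean + duty_cycle * ber_hit

        print(f"no RFI  : {bpsk_ber(9.6):.2e}")
        print(f"with RFI: {ber_with_pulsed_rfi(9.6, duty_cycle=0.05, interference_to_noise_db=6.0):.2e}")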

  6. Play-Personas: Behaviours and Belief Systems in User-Centred Game Design

    NASA Astrophysics Data System (ADS)

    Canossa, Alessandro; Drachen, Anders

    Game designers attempt to ignite affective, emotional responses from players by engineering game designs to incite definite user experiences. Theories of emotion state that definite emotional responses are individual, and caused by the individual interaction sequence or history. Engendering desired emotions in the audience of traditional audiovisual media is a considerable challenge; however, it is potentially even more difficult to achieve the same goal for the audience of interactive entertainment, because a substantial degree of control rests in the hands of the end user rather than the designer. This paper presents a possible solution to the challenge of integrating the user in the design of interactive entertainment such as computer games by employing the "persona" framework introduced by Alan Cooper. This approach is already in use in interaction design. The method can be improved by complementing the traditional narrative description of personas with quantitative, data-oriented models of predicted patterns of user behaviour for a specific computer game. Additionally, persona constructs can be applied both as design-oriented metaphors during the development of games and as analytical lenses applied to existing games, e.g. for evaluation of patterns of player behaviour.
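
    A minimal sketch of what a data-oriented persona could look like is given below: a narrative description is paired with quantitative behavioural expectations, and observed play metrics are scored against them. The attribute names, metrics, and the similarity measure are illustrative assumptions, not a formalism defined in the paper.

        from dataclasses import dataclass

        @dataclass
        class PlayPersona:
            """A persona combining a narrative sketch with predicted behaviour metrics."""
            name: str
            narrative: str
            expected_metrics: dict   # metric name -> expected normalized value in [0, 1]

            def match(self, observed: dict) -> float:
                """Similarity score: one minus the mean absolute deviation over shared metrics."""
                keys = self.expected_metrics.keys() & observed.keys()
                if not keys:
                    return 0.0
                deviation = sum(abs(self.expected_metrics[k] - observed[k]) for k in keys) / len(keys)
                return 1.0 - deviation

        explorer = PlayPersona(
            name="Explorer",
            narrative="Prefers uncovering the map and side content over direct progression.",
            expected_metrics={"area_coverage": 0.9, "combat_rate": 0.2, "quest_progress": 0.4})

        observed_session = {"area_coverage": 0.8, "combat_rate": 0.35, "quest_progress": 0.5}
        print(f"explorer match: {explorer.match(observed_session):.2f}")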

  7. Purple Computational Environment With Mappings to ACE Requirements for the General Availability User Environment Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barney, B; Shuler, J

    2006-08-21

    Purple is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Lawrence Livermore National Laboratory (LLNL). The Purple Computational Environment documents the capabilities and the environment provided for the FY06 LLNL Level 1 General Availability Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories, but also documents needs of the LLNL and Alliance users working in the unclassified environment. Additionally, the Purple Computational Environment maps the provided capabilities to the Trilab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the General Availability user environment capabilities of the ASC community. Appendix A lists these requirements and includes a description of ACE requirements met and those requirements that are not met for each section of this document. The Purple Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the Tri-lab community.

  8. CIFOG: Cosmological Ionization Fields frOm Galaxies

    NASA Astrophysics Data System (ADS)

    Hutter, Anne

    2018-03-01

    CIFOG is a versatile MPI-parallelised semi-numerical tool to perform simulations of the Epoch of Reionization. From a set of evolving cosmological gas density and ionizing emissivity fields, it computes the time- and spatially-dependent ionization of neutral hydrogen (HI), neutral helium (HeI), and singly ionized helium (HeII) in the intergalactic medium (IGM). The code accounts for HII, HeII, and HeIII recombinations, and provides different descriptions for the photoionization rate that are used to calculate the residual HI fraction in ionized regions. This tool has been designed to be coupled to semi-analytic galaxy formation models or hydrodynamical simulations. The modular fashion of the code allows the user to easily introduce new descriptions for recombinations and the photoionization rate.
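    As a schematic illustration only of the semi-numerical idea behind such tools, the short Python sketch below flags a grid cell as ionized when the cumulative number of ionizing photons exceeds the number of hydrogen atoms plus an assumed mean number of recombinations; the fields are random stand-ins, and the actual CIFOG algorithm (filtering scales, helium species, photoionization-rate models) is considerably more complete.

        # Schematic semi-numerical ionization criterion on a toy grid (illustrative only;
        # the fields are random stand-ins, not CIFOG inputs).
        import numpy as np

        rng = np.random.default_rng(1)
        n_h = rng.uniform(0.8, 1.2, size=(32, 32, 32))        # hydrogen atoms per cell (arbitrary units)
        n_photons = rng.uniform(0.0, 2.0, size=(32, 32, 32))  # cumulative ionizing photons per cell
        mean_recombinations = 0.2                              # assumed recombinations per hydrogen atom

        ionized = n_photons >= n_h * (1.0 + mean_recombinations)
        print("ionized volume fraction:", ionized.mean())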

  9. PredictABEL: an R package for the assessment of risk prediction models.

    PubMed

    Kundu, Suman; Aulchenko, Yurii S; van Duijn, Cornelia M; Janssens, A Cecile J W

    2011-04-01

    The rapid identification of genetic markers for multifactorial diseases from genome-wide association studies is fuelling interest in investigating the predictive ability and health care utility of genetic risk models. Various measures are available for the assessment of risk prediction models, each addressing a different aspect of performance and utility. We developed PredictABEL, an R package that covers the descriptive tables, measures, and figures used in the analysis of risk prediction studies, including measures of model fit, predictive ability, and clinical utility, as well as risk distributions, calibration plots, and receiver operating characteristic (ROC) plots. Tables and figures are saved as separate files in a user-specified format, including publication-quality EPS and TIFF formats. All figures are available in a ready-made layout, but they can be customized to the preferences of the user. The package has been developed for the analysis of genetic risk prediction studies, but can also be used for studies that include only non-genetic risk factors. PredictABEL is freely available at the websites of GenABEL ( http://www.genabel.org ) and CRAN ( http://cran.r-project.org/).
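    PredictABEL itself is an R package; purely as a hedged illustration of the kind of measures it reports (discrimination via the ROC curve and a crude calibration check by risk decile), the following Python sketch uses scikit-learn on invented data. Nothing here reproduces the PredictABEL API.

        # Illustrative risk-model assessment on invented data (not the PredictABEL API).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        x = rng.normal(size=(500, 3))                     # three risk factors
        logit = 0.8 * x[:, 0] - 0.5 * x[:, 1]             # true log-odds
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))     # observed outcomes

        model = LogisticRegression().fit(x, y)
        risk = model.predict_proba(x)[:, 1]               # predicted risks
        print("AUC:", roc_auc_score(y, risk))             # discrimination

        # crude calibration check: mean predicted vs. observed risk by decile
        deciles = np.digitize(risk, np.quantile(risk, np.linspace(0.1, 0.9, 9)))
        for d in range(10):
            sel = deciles == d
            if sel.any():
                print(d, round(risk[sel].mean(), 3), round(y[sel].mean(), 3))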

  10. Software Engineering Laboratory (SEL) Data Base Maintenance System (DBAM) user's guide and system description

    NASA Technical Reports Server (NTRS)

    Lo, P. S.; Card, D.

    1983-01-01

    The Software Engineering Laboratory (SEL) Data Base Maintenance System (DBAM) is explained. The various software facilities of the SEL, DBAM operating procedures, and DBAM system information are described. The relationships among DBAM components (baseline diagrams), component descriptions, overlay descriptions, indirect command file listings, file definitions, and sample data collection forms are provided.

  11. A reference model for scientific information interchange

    NASA Technical Reports Server (NTRS)

    Reich, Lou; Sawyer, Don; Davis, Randy

    1993-01-01

    This paper presents an overview of an Information Interchange Reference Model (IIRM) currently being developed by individuals participating in the Consultative Committee for Space Data Systems (CCSDS) Panel 2, the Planetary Data Systems (PDS), and the Committee on Earth Observing Satellites (CEOS). This is an ongoing research activity and is not an official position by these bodies. This reference model provides a framework for describing and assessing current and proposed methodologies for information interchange within and among the space agencies. It is hoped that this model will improve interoperability between the various methodologies. As such, this model attempts to address key information interchange issues as seen by the producers and users of space-related data and to put them into a coherent framework. Information is understood as the knowledge (e.g., the scientific content) represented by data. Therefore, concern is not primarily on mechanisms for transferring data from user to user (e.g., compact disk read-only memory (CD-ROM), wide-area networks, optical tape, and so forth) but on how information is encoded as data and how the information content is maintained with minimal loss or distortion during transmittal. The model assumes open systems, which means that the protocols or methods used should be fully described and the descriptions publicly available. Ideally these protocols are promoted by recognized standards organizations using processes that permit involvement by those most likely to be affected, thereby enhancing the protocol's stability and the likelihood of wide support.

  12. Enhanced capabilities and modified users manual for axial-flow compressor conceptual design code CSPAN

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.; Lavelle, Thomas M.

    1995-01-01

    Modifications made to the axial-flow compressor conceptual design code CSPAN are documented in this report. Endwall blockage and stall margin predictions were added, and the loss-coefficient model was upgraded. Default correlations for rotor and stator solidity and aspect-ratio inputs and for stator-exit tangential velocity inputs were included in the code, along with defaults for aerodynamic design limits. A complete description of input and output, along with sample cases, is included.

  13. An Infrared Spectral Radiance Code for the Auroral Thermosphere (AARC)

    DTIC Science & Technology

    1987-11-24

    Program Description and Usage: 3.1 Main Modules; 3.2 Input, Output, and Program Communication; 3.2.1 Input of User-Defined Program Control ... a test data set with which to compare the model predictions. Secondly, a number of theoretical papers are available describing some of the basic ... necessary since secondary electrons are a very important source of molecular nitrogen in vibrationally excited states [N2(v)], and the N2(v) controls

  14. The NorWeST summer stream temperature model and scenarios for the western U.S.: A crowd-sourced database and new geospatial tools foster a user community and predict broad climate warming of rivers and streams

    Treesearch

    Daniel J. Isaak; Seth J. Wenger; Erin E. Peterson; Jay M. Ver Hoef; David E. Nagel; Charles H. Luce; Steven W. Hostetler; Jason B. Dunham; Brett B. Roper; Sherry P. Wollrab; Gwynne L. Chandler; Dona L. Horan; Sharon Parkes-Payne

    2017-01-01

    Thermal regimes are fundamental determinants of aquatic ecosystems, which makes description and prediction of temperatures critical during a period of rapid global change. The advent of inexpensive temperature sensors dramatically increased monitoring in recent decades, and although most monitoring is done by individuals for agency-specific purposes, collectively these...

  15. COMBIC, Combined Obscuration Model for Battlefield Induced Contaminants: Volume 1-Technical Documentation and Users Guide

    DTIC Science & Technology

    2000-08-01

    WAVL record fields: WAVE1, WAVE2, MULDV. Name: WAVE1; Units: µm; Typical value: 1.06; Description: Wavelength used for ... the calculation. Alternatively, one can specify either frequency or wavenumber by using a FREQ or WVNUM record instead of WAVL. If WAVE2 is not ... specified, WAVE1 is the single wavelength used; if WAVE2 is specified, the modules will attempt to do their calculation for a range of wavelengths. There

  16. User Manual for the AZ-101 Data Acquisition System (AZ-101 DAS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BRAYTON, D.D.

    2000-02-17

    User manual for the TK AZ-101 Waste Retrieval System Data Acquisition System. The purpose of this document is to describe use of the AZ-101 Data Acquisition System (AZ-101 DAS). The AZ-101 DAS is provided to fulfill the requirements for data collection and monitoring as defined in Letters of Instruction (LOI) from Numatec Hanford Corporation (NHC) to Fluor Federal Services (FFS). For a complete description of the system, including design, please refer to the AZ-101 DAS System Description document, RPP-5572.

  17. Operational manual for two-dimensional transonic code TSFOIL

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.

    1978-01-01

    This code solves the two-dimensional, transonic, small-disturbance equations for flow past lifting airfoils in both free air and various wind-tunnel environments by using a variant of the finite-difference method. A description of the theoretical and numerical basis of the code is provided, together with complete operating instructions and sample cases for the general user. In addition, a programmer's manual is also presented to assist the user interested in modifying the code. Included in the programmer's manual are a dictionary of subroutine variables in common and a detailed description of each subroutine.

  18. SIG: a general-purpose signal processing program. User's manual. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lager, D.; Azevedo, S.

    1985-05-09

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time-domain and frequency-domain signals. The manual contains a complete description of the SIG program from the user's standpoint. A brief exercise in using SIG is shown. Complete descriptions are given of each command in the SIG core. General information about the SIG structure, command processor, and graphics options is provided. An example usage of SIG for solving a problem is developed, and error message formats are briefly discussed. (LEW)
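    Purely to illustrate the kind of time-domain/frequency-domain manipulation such a package performs (and not SIG's own command set), the Python sketch below low-pass filters a noisy signal by transforming to the frequency domain, zeroing high-frequency bins, and transforming back; the signal and cutoff are made up.

        # Illustrative time/frequency-domain manipulation (made-up signal; not SIG commands).
        import numpy as np

        fs = 1000.0                                    # sample rate, Hz
        t = np.arange(0, 1.0, 1.0 / fs)
        x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)   # 5 Hz tone plus noise

        X = np.fft.rfft(x)                             # to the frequency domain
        freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
        X[freqs > 20.0] = 0.0                          # crude low-pass at 20 Hz
        y = np.fft.irfft(X, n=x.size)                  # back to the time domain

        rms_error = np.sqrt(np.mean((y - np.sin(2 * np.pi * 5 * t)) ** 2))
        print("residual RMS after filtering:", rms_error)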

  19. Software Description for the O’Hare Runway Configuration Management System. Volume I. Technical Description,

    DTIC Science & Technology

    1982-10-01

    spent in preparing this document. EXECUTIVE SUMMARY: The O'Hare Runway Configuration Management System (CMS) is an interactive multi-user computer ... MITRE Washington's Computer Center. Currently, CMS is housed in an IBM 4341 computer with the VM/SP operating system. CMS employs IBM's Display ... O'Hare, it will operate on a dedicated minicomputer which permits multi-tasking (that is, multiple users

  20. User involvement in the implementation of clinical guidelines for common mental health disorders: a review and compilation of strategies and resources.

    PubMed

    Moreno, Eliana M; Moriana, Juan Antonio

    2016-08-09

    There is now broad consensus regarding the importance of involving users in the process of implementing guidelines. Few studies, however, have addressed this issue, let alone the implementation of guidelines for common mental health disorders. The aim of this study is to compile and describe implementation strategies and resources related to common clinical mental health disorders targeted at service users. The literature was reviewed and resources for the implementation of clinical guidelines were compiled using the PRISMA model. A mixed qualitative and quantitative analysis was performed based on a series of categories developed ad hoc. A total of 263 items were included in the preliminary analysis and 64 implementation resources aimed at users were analysed in depth. A wide variety of types, sources and formats were identified, including guides (40%), websites (29%), videos and leaflets, as well as instruments for the implementation of strategies regarding information and education (64%), self-care, or users' assessment of service quality. The results reveal the need to establish clear criteria for assessing the quality of implementation materials in general and standardising systems to classify user-targeted strategies. The compilation and description of key elements of strategies and resources for users can be of interest in designing materials and specific actions for this target audience, as well as improving the implementation of clinical guidelines.

  1. FLUSH: A tool for the design of slush hydrogen flow systems

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1990-01-01

    As part of the National Aerospace Plane Project an analytical model was developed to perform calculations for in-line transfer of solid-liquid mixtures of hydrogen. This code, called FLUSH, calculates pressure drop and solid fraction loss for the flow of slush hydrogen through pipe systems. The model solves the steady-state, one-dimensional equation of energy to obtain slush loss estimates. A description of the code is provided as well as a guide for users of the program. Preliminary results are also presented showing the anticipated degradation of slush hydrogen solid content for various piping systems.
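    As a much-simplified, single-phase illustration of the pipe pressure-drop part of such a calculation (FLUSH's slush-hydrogen models, which also track solid fraction loss, are more involved), the sketch below applies the Darcy-Weisbach relation segment by segment; all property values and the Blasius friction-factor correlation are placeholders.

        # Simplified single-phase pipe pressure-drop sketch (placeholder properties;
        # not the FLUSH slush-hydrogen models).
        import math

        rho = 80.0                   # kg/m^3, assumed fluid density
        mu = 1.3e-5                  # Pa*s, assumed viscosity
        d = 0.05                     # m, pipe diameter
        mdot = 0.5                   # kg/s, mass flow rate
        segments = [2.0, 3.0, 1.5]   # pipe segment lengths, m

        area = math.pi * d ** 2 / 4
        v = mdot / (rho * area)              # mean velocity
        re = rho * v * d / mu                # Reynolds number
        f = 0.316 * re ** -0.25              # Blasius friction factor (smooth pipe, turbulent)

        dp = sum(f * (length / d) * 0.5 * rho * v ** 2 for length in segments)
        print(f"Re = {re:.3e}, total pressure drop = {dp / 1000:.2f} kPa")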

  2. Capabilities overview of the MORET 5 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Cochet, B.; Jinaphanh, A.; Heulers, L.; Jacquet, O.

    2014-06-01

    The MORET code is a simulation tool that solves the transport equation for neutrons using the Monte Carlo method. It allows users to model complex three-dimensional geometrical configurations, describe the materials, and define their own tallies in order to analyse the results. The MORET code was initially designed to perform calculations for criticality safety assessments. New features have been introduced in the MORET 5 code to expand its use for reactor applications. This paper presents an overview of the MORET 5 code capabilities, going through the description of materials, the geometry modelling, the transport simulation, and the definition of the outputs.
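    As a toy, one-dimensional illustration of the Monte Carlo transport idea that MORET implements for full 3-D geometries, the sketch below estimates the fraction of neutrons transmitted through a slab by sampling exponential path lengths; the cross section, absorption probability, and slab thickness are invented for the example.

        # Toy 1-D Monte Carlo transmission estimate (invented data; MORET handles
        # full 3-D geometry, materials, and user-defined tallies).
        import math
        import random

        sigma_t = 0.5        # total macroscopic cross section, 1/cm (assumed)
        absorb_prob = 0.3    # probability that a collision is an absorption (assumed)
        thickness = 10.0     # slab thickness, cm
        n = 100_000

        transmitted = 0
        for _ in range(n):
            x = 0.0
            while True:
                x += -math.log(random.random()) / sigma_t   # sampled distance to next collision
                if x >= thickness:
                    transmitted += 1
                    break
                if random.random() < absorb_prob:           # absorbed inside the slab
                    break
                # otherwise scattered; kept moving forward for simplicity

        print("transmission fraction ~", transmitted / n)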

  3. System modelling for LISA Pathfinder

    NASA Astrophysics Data System (ADS)

    Diaz-Aguiló, Marc; Grynagier, Adrien; Rais, Boutheina

    LISA Pathfinder is the technology demonstrator for LISA, a space-borne gravitational waves observatory. The goal of the mission is to characterise the dynamics of the LISA Technology Package (LTP) to prove that on-board experimental conditions are compatible with the detection of gravitational waves. The LTP is a drag-free dynamics experiment which includes a control loop with sensors (interferometric and capacitive), actuators (capacitive actuators and thrusters), controlled disturbances (magnetic coils, heaters) and which is subject to various endogenous or exogenous noise sources such as infrared pressure or solar wind. The LTP experiment features new hardware which was never flown in space. The mission has a tight operation timeline as it is constrained to about 100 days. It is therefore vital to have efficient and precise means of investigation and diagnostics to be used during the on-orbit operations. These will be conducted using the LTP Data Analysis toolbox (LTPDA) which allows for simulation, parameter identification and various analyses (covariance analysis, state estimation) given an experimental model. The LTPDA toolbox therefore contains a series of models which are state-space representations of each component in the LTP. The State-Space Models (SSM) are objects of a state-space class within the LTPDA toolbox especially designed to address all the requirements of this tool. The user has access to a set of linear models which represent every satellite subsystem; the models are available in different forms representing 1D, 2D and 3D systems, each with settable symbolic and numeric parameters. To limit the possible errors, the models can be automatically linked to produce composite systems and closed-loops of the LTP. Finally, for the sake of completeness, accuracy and maintainability of the tool, the models contain all the physical information they mimic (i.e. variable units, description of parameters, description of inputs/outputs, etc). Models developed for this work include the fixed-point linearized equations of motion for the LTP and the linear models for sensors and actuators with their noise modelling blocks issued from the analysis of the individual actuators. The drag-free controller model includes the discrete delays expected in the hardware. In this work we briefly describe the software architecture, in order to concentrate then on the physical description of the models. This is supported by an overview of different user scenarios and some examples of model analysis that highlight the advantages of this high-level programming engineering toolbox for space mission data analysis and calibration.

  4. Data Link Test and Analysis System/TCAS Monitor User's Guide

    DOT National Transportation Integrated Search

    1991-02-01

    This document is a user's guide for the Data Link Test and Analysis System (DATAS) Traffic Alert and Collision Avoidance System (TCAS) monitor application. It provides a brief overall hardware description of DATAS configured as a TCAS Monitor, ...

  5. Computer Program for Calculation of Complex Chemical Equilibrium Compositions and Applications II. Users Manual and Program Description. 2; Users Manual and Program Description

    NASA Technical Reports Server (NTRS)

    McBride, Bonnie J.; Gordon, Sanford

    1996-01-01

    This users manual is the second part of a two-part report describing the NASA Lewis CEA (Chemical Equilibrium with Applications) program. The program obtains chemical equilibrium compositions of complex mixtures with applications to several types of problems. The topics presented in this manual are: (1) details for preparing input data sets; (2) a description of output tables for various types of problems; (3) the overall modular organization of the program with information on how to make modifications; (4) a description of the function of each subroutine; (5) error messages and their significance; and (6) a number of examples that illustrate various types of problems handled by CEA and that cover many of the options available in both input and output. Seven appendixes give information on the thermodynamic and thermal transport data used in CEA; some information on common variables used in or generated by the equilibrium module; and output tables for 14 example problems. The CEA program was written in ANSI standard FORTRAN 77. CEA should work on any system with sufficient storage. There are about 6300 lines in the source code, which uses about 225 kilobytes of memory. The compiled program takes about 975 kilobytes.

  6. Human factors aspects of control room design

    NASA Technical Reports Server (NTRS)

    Jenkins, J. P.

    1983-01-01

    A plan for the design and analysis of a multistation control room is reviewed. It is found that acceptance of the computer-based information system by the users in the control room is mandatory for mission and system success. Criteria to improve the computer/user interface include: match of system input/output with the user; reliability, compatibility, and maintainability; ease of learning with little training needed; a self-descriptive system; system under user control; transparent language, format, and organization; correspondence to user expectations; adaptability to user experience level; fault tolerance; dialog capability, with user communication needs reflected in flexibility, complexity, power, and information load; an integrated system; and documentation.

  7. An overview of reference user services during the ATDRSS (Advanced Tracking and Data Relay Satellite System) era

    NASA Technical Reports Server (NTRS)

    Weinberg, Aaron

    1989-01-01

    The Tracking and Data Relay Satellite System (TDRSS) is an integral part of the overall NASA Space Network (SN) that will continue to evolve into the 1990's. Projections for the first decade of the 21st century indicate the need for an SN evolution that must accommodate growth in the LEO user population and must further support the introduction of new/improved user services. A central ingredient of this evolution is an Advanced TDRSS (ATDRSS) follow-on to the current TDRSS that must initiate operations by the late 1990's in a manner that permits an orderly transition from the TDRSS to the ATDRSS era. An SN/ATDRSS architectural and operational concept that will satisfy the above goals is being developed. To date, an SN/ATDRSS baseline concept has been established that provides users with an end-to-end data transport (ENDAT) service. An expanded description of the baseline ENDAT concept, from the user perspective, is provided with special emphasis on the TDRSS/ATDRSS evolution. A high-level description of the end-to-end system that identifies the role of ATDRSS is presented; also included is a description of the baseline ATDRSS architecture and its relationship with the TDRSS 1996 baseline. Other key features of the ENDAT service are then expanded upon, including the multiple grades of service and the RF telecommunications/tracking services to be available. The ATDRSS service options are described.

  8. Software Surface Modeling and Grid Generation Steering Committee

    NASA Technical Reports Server (NTRS)

    Smith, Robert E. (Editor)

    1992-01-01

    It is a NASA objective to promote improvements in the capability and efficiency of computational fluid dynamics. Grid generation, the creation of a discrete representation of the solution domain, is an essential part of computational fluid dynamics. However, grid generation about complex boundaries requires sophisticated surface-model descriptions of the boundaries. The surface modeling and the associated computation of surface grids consume an extremely large percentage of the total time required for volume grid generation. Efficient and user friendly software systems for surface modeling and grid generation are critical for computational fluid dynamics to reach its potential. The papers presented here represent the state-of-the-art in software systems for surface modeling and grid generation. Several papers describe improved techniques for grid generation.

  9. Digital systems design language

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.

    1979-01-01

    The Digital Systems Design Language (DDL) is implemented on the SEL-32 computer systems. The details of the language, the translator, and the simulator programs are given. Several example descriptions and a tutorial on hardware description languages are provided to guide the user.

  10. Further Developments of BEM for Micro and Macromechanical Analyses of Composites: Boundary Element Software Technology-Composite User's Manual

    NASA Technical Reports Server (NTRS)

    Banerjee, P. K.; Henry, D. P.; Hopkins, D. A.; Goldberg, R. K.

    1997-01-01

    BEST-CMS (Boundary Element Solution Technology - Composite Modeling System) is an advanced engineering system for the micro-analysis of fiber composite structures. BEST-CMS is based upon the boundary element program BEST3D which was developed for NASA by Pratt and Whitney Aircraft and the State University of New York at Buffalo under contract NAS3-23697. BEST-CMS presently has the capabilities for elastostatic analysis, steady-state and transient heat transfer analysis, steady-state and transient concurrent thermoelastic analysis and elastoplastic and creep analysis. The fibers are assumed to be perfectly bonded to the composite matrix, or in the case of static or steady-state analysis, the fibers may be assumed to have spring connections, thermal resistance, and/or frictional sliding between the fibers and the composite matrix. The primary objective of this User's Manual is to provide an overview of all BEST-CMS capabilities, along with detailed descriptions of the input data requirements. A brief review of the theoretical background is presented for each analysis category. Then, Chapter 3 discusses the key aspects of the numerical implementation, while Chapter 4 provides a tutorial for the beginning BEST-CMS user. The heart of the manual, however, is in Chapter 5, where a complete description of all data input items is provided. Within this chapter, the individual entries are grouped on a functional basis for a more coherent presentation. Chapter 6 includes sample problems and should be of considerable assistance to the novice. Chapter 7 includes capsules of a number of fiber-composite analysis problems that have been solved using BEST-CMS. This chapter is primarily descriptive in nature and is intended merely to illustrate the level of analysis that is possible within the present BEST-CMS system. Chapter 8 contains a detailed description of the BEST-CMS Neutral File which is helpful in writing an interface between BEST- CMS and any graphic post-processor program. Finally, all pertinent references are listed in Chapter 9.

  11. PAN AIR: A computer program for predicting subsonic or supersonic linear potential flows about arbitrary configurations using a higher order panel method. Volume 2: User's manual (version 3.0)

    NASA Technical Reports Server (NTRS)

    Sidwell, Kenneth W.; Baruah, Pranab K.; Bussoletti, John E.; Medan, Richard T.; Conner, R. S.; Purdon, David J.

    1990-01-01

    A comprehensive description of user problem definition for the PAN AIR (Panel Aerodynamics) system is given. PAN AIR solves the 3-D linear integral equations of subsonic and supersonic flow. Influence coefficient methods are used which employ source and doublet panels as boundary surfaces. Both analysis and design boundary conditions can be used. This User's Manual describes the information needed to use the PAN AIR system. The structure and organization of PAN AIR are described, including the job control and module execution control languages for execution of the program system. The engineering input data are described, including the mathematical and physical modeling requirements. Version 3.0 of this manual strictly applies only to PAN AIR version 3.0. The major revisions include: (1) inputs and guidelines for the new FDP module (which calculates streamlines and offbody points); (2) nine new class 1 and class 2 boundary conditions to cover commonly used modeling practices, in particular the vorticity-matching Kutta condition; (3) use of the CRAY Solid-State Storage Device (SSD); and (4) incorporation of errata and typo corrections together with additional explanation and guidelines.

  12. Descriptions and Implementations of DL_F Notation: A Natural Chemical Expression System of Atom Types for Molecular Simulations.

    PubMed

    Yong, Chin W

    2016-08-22

    DL_F Notation is an easy-to-understand, standardized atom typing expression for molecular simulations covering a range of organic force field (FF) schemes such as OPLSAA, PCFF, and CVFF. It is implemented within DL_FIELD, a software program that facilitates the setting up of molecular FF models for the DL_POLY molecular dynamics simulation software. By making use of the Notation, a single core conversion module (the DL_F conversion Engine) implemented within DL_FIELD can be used to analyze a molecular structure and determine the atom types for a given FF scheme. Users only need to provide the molecular input structure in a simple xyz format, and DL_FIELD can produce the necessary force field file for DL_POLY automatically. Consistent with the development concept of DL_FIELD, which places emphasis on robustness and user friendliness, the Engine provides a single-step solution for setting up complex FF models. This allows users to switch seamlessly from one of the above-mentioned FF schemes to another while at the same time providing consistent atom typing expressed in a natural chemical sense.
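    The xyz structure format mentioned above is simple enough to show directly; as a hedged illustration (the water coordinates are just example data, and atom typing itself is done inside DL_FIELD), a minimal reader might look like this:

        # Minimal reader for the simple xyz structure format accepted by DL_FIELD
        # (example water coordinates; force-field assignment is done by DL_FIELD itself).
        from io import StringIO

        xyz_text = """3
        water molecule (example data)
        O   0.000   0.000   0.000
        H   0.757   0.586   0.000
        H  -0.757   0.586   0.000
        """

        def read_xyz(handle):
            natoms = int(handle.readline())
            comment = handle.readline().strip()
            atoms = []
            for _ in range(natoms):
                symbol, x, y, z = handle.readline().split()
                atoms.append((symbol, float(x), float(y), float(z)))
            return comment, atoms

        comment, atoms = read_xyz(StringIO(xyz_text))
        print(comment, atoms)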

  13. Improving Patient Experience and Primary Care Quality for Patients With Complex Chronic Disease Using the Electronic Patient-Reported Outcomes Tool: Adopting Qualitative Methods Into a User-Centered Design Approach

    PubMed Central

    Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl

    2016-01-01

    Background: Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs), who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD.

    Objective: This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers.

    Methods: Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool.

    Results: Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14), along with a PICK analysis (Possible, Implementable, to be Challenged, to be Killed), guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes.

    Conclusions: Interpretive descriptive methods allow for an understanding of the user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers. PMID:26892952

  14. Data link test and analysis system/TCAS monitor user's guide

    NASA Astrophysics Data System (ADS)

    Vandongen, John; Wapelhorst, Leo

    1991-02-01

    This document is a user's guide for the Data Link Test and Analysis System (DATAS) Traffic Alert and Collision Avoidance System (TCAS) monitor. It provides a brief overall hardware description of DATAS configured as a TCAS monitor, and the applications software.

  15. Patriot Script 1.0.13 User Guide for PEM 1.3.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cleland, Timothy James; Kubicek, Deborah Ann; Stroud, Phillip David

    2015-11-02

    This document provides an updated user guide for Patriot Script Version 1.0.13, for release with PEM 1.3.1 (LAUR-1422817) that adds description and instructions for the new excursion capability (see section 4.5.1).

  16. An improved resource management model based on MDS

    NASA Astrophysics Data System (ADS)

    Yuan, Man; Sun, Changying; Li, Pengfei; Sun, Yongdong; He, Rui

    2005-11-01

    GRID technology provides a convenient method for managing GRID resources through the Monitoring and Discovery Service (MDS) proposed by the Globus Alliance. In this GRID environment, all kinds of resources, such as computational resources, storage resources, and others, can be organized according to the MDS specifications. However, MDS is a theoretical framework, and in a small intranet with limited resources it has its limitations. Based on MDS, an improved lightweight method for managing corporate computational and storage resources in an intranet (IMDS) is proposed. Firstly, in MDS, all resource description information is stored in LDAP; although LDAP is a lightweight directory access protocol, in practice programmers rarely master how to access and store resource information in an LDAP store, which limits the adoption of MDS. In an intranet, this resource description information can instead be stored in an RDBMS, where programmers and users can access it with standard SQL. Secondly, in MDS, how to monitor the various resources in the GRID is not transparent to programmers and users, which limits its application scope; since resource monitoring based on SNMP is widely employed in intranets, an SNMP-based resource monitoring method is integrated into MDS. Finally, all kinds of resources in the intranet can be described in XML, their description information is stored in an RDBMS such as MySQL and retrieved by standard SQL, and dynamic resource information is sent to the resource store via SNMP. A prototype of the resource description and monitoring system is designed and implemented in an intranet.
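    A minimal sketch of the storage side of the idea described above: resource descriptions kept in a relational table and retrieved with standard SQL. SQLite stands in for MySQL here, and the table layout and rows are invented; the SNMP monitoring part is not shown.

        # Illustrative sketch: resource descriptions in an RDBMS, queried with standard SQL
        # (invented schema and rows; the paper uses MySQL plus SNMP for dynamic data).
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE resources (
                          name TEXT, kind TEXT, cpu_cores INTEGER, free_gb REAL)""")
        conn.executemany("INSERT INTO resources VALUES (?, ?, ?, ?)",
                         [("node01", "compute", 16, 120.0),
                          ("node02", "compute", 8, 64.0),
                          ("store01", "storage", 4, 2048.0)])

        # a query a scheduler or user might issue against the resource store
        for row in conn.execute("SELECT name, free_gb FROM resources "
                                "WHERE kind = 'compute' ORDER BY free_gb DESC"):
            print(row)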

  17. A metadata reporting framework for standardization and synthesis of ecohydrological field observations

    NASA Astrophysics Data System (ADS)

    Christianson, D. S.; Varadharajan, C.; Detto, M.; Faybishenko, B.; Gimenez, B.; Jardine, K.; Negron Juarez, R. I.; Pastorello, G.; Powell, T.; Warren, J.; Wolfe, B.; McDowell, N. G.; Kueppers, L. M.; Chambers, J.; Agarwal, D.

    2016-12-01

    The U.S. Department of Energy's (DOE) Next Generation Ecosystem Experiment (NGEE) Tropics project aims to develop a process-rich tropical forest ecosystem model that is parameterized and benchmarked by field observations. Thus, data synthesis, quality assurance and quality control (QA/QC), and data product generation of a diverse and complex set of ecohydrological observations, including sapflux, leaf surface temperature, soil water content, and leaf gas exchange from sites across the Tropics, are required to support model simulations. We have developed a metadata reporting framework, implemented in conjunction with the NGEE Tropics Data Archive tool, to enable cross-site and cross-method comparison, data interpretability, and QA/QC. We employed a modified User-Centered Design approach, which involved short development cycles based on user-identified needs, and iterative testing with data providers and users. The metadata reporting framework currently has been implemented for sensor-based observations and leverages several existing metadata protocols. The framework consists of templates that define a multi-scale measurement position hierarchy, descriptions of measurement settings, and details about data collection and data file organization. The framework also enables data providers to define data-access permission settings, provenance, and referencing to enable appropriate data usage, citation, and attribution. In addition to describing the metadata reporting framework, we discuss tradeoffs and impressions from both data providers and users during the development process, focusing on the scalability, usability, and efficiency of the framework.
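    As a rough, invented illustration of the kind of record such a framework collects (a multi-scale measurement-position hierarchy plus collection and access details for one sensor stream), a template entry might be organized like the dictionary below; the field names are hypothetical and are not the actual NGEE Tropics schema.

        # Hypothetical sensor-metadata record with a measurement-position hierarchy
        # (field names invented; not the actual NGEE Tropics reporting templates).
        import json

        record = {
            "variable": "sap_flux",
            "units": "g m-2 s-1",
            "position": {                       # multi-scale measurement position hierarchy
                "site": "example_site",
                "plot": "plot_03",
                "tree": "tree_117",
                "height_m": 1.3,
            },
            "collection": {
                "sensor_model": "example-sensor",
                "sampling_interval_s": 300,
                "file_organization": "one CSV file per sensor per month",
            },
            "access": {"permission": "project-only", "provenance": "raw, uncorrected"},
        }

        print(json.dumps(record, indent=2))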

  18. Digital systems design language. Design synthesis of digital systems

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.

    1979-01-01

    The Digital Systems Design Language (DDL) is implemented on the SEL-32 computer systems. The details of the language, translator and simulator programs are included. Several example descriptions and a tutorial on hardware description languages are provided, to guide the user.

  19. Design analysis and computer-aided performance evaluation of shuttle orbiter electrical power system. Volume 2: SYSTID user's guide

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The manual for the use of the computer program SYSTID under the Univac operating system is presented. The computer program is used in the simulation and evaluation of the space shuttle orbiter electric power supply. The models described in the handbook are those which were available in the original versions of SYSTID. The subjects discussed are: (1) program description, (2) input language, (3) node typing, (4) problem submission, and (5) basic and power system SYSTID libraries.

  20. Development of the TACOM (Tank Automotive Command) Thermal Imaging Model (TTIM). Volume 1. Technical Guide and User’s Manual.

    DTIC Science & Technology

    1984-12-01

    BLOCK DATA: Default values for variables input by menus. LIBR: Interface with frame I/O routines. SNSR: Interface with sensor routines. ATMOS: Interface with ... Routines included in the frame I/O interface (routine and description): LIBR: Selects options for input or output to a data library. FRREAD: Reads frame from file and/or ... "Layer", Journal of Applied Meteorology 20, pp. 242-249, March 1981. L.J. Harding, Numerical Analysis and Applications Software Abstracts, Computing

  1. Interactive Multiple-Representation Editing of Physically-based 3D Animation

    DTIC Science & Technology

    1994-05-29

    view similar to that of FrameMaker [46], but with a stronger model of structured editing, and a language window, which displays a formal description of ... used. This is partly a result of the power of a good direct manipulation editor -- most users of good editors like FrameMaker don't seem to miss having a ... refinement. In SIGGRAPH, pages 205-212, August 1988. [46] Frame Technology Inc. Using FrameMaker, 1990. [47] Bjorn N. Freeman-Benson and John Maloney

  2. Calibration and Validation of the Sage Software Cost/Schedule Estimating System to United States Air Force Databases

    DTIC Science & Technology

    1997-09-01

    factor values are identified. For SASET, revised cost estimating relationships are provided (Apgar et al., 1991). A 1991 AFIT thesis by Gerald Ourada ... description of the model is a paragraph directly quoted from the user's manual. This is not to imply that a lack of a thorough analysis indicates ... constraints imposed by the system. The effective technology rating is computed from the basic technology rating by the following equation (Apgar et al., 1991

  3. HOWLS LOCATER Computer Program: Description and User’s Guide

    DTIC Science & Technology

    1978-12-26

    December 1978. Prepared for the Defense Advanced Research Projects Agency under Electronic Systems Division Contract F19628-78-C-0002 by Lincoln ... Sensors; Battle Scenarios; Mathematical Guide to LOCATER; Earth Model; Coordinate Systems and Transformations; Earth ... on weapon location accuracy of radar system parameters, trajectory modeling and estimation algorithms, and environmental effects such as wind, tr

  4. E&V (Evaluation and Validation) Reference Manual, Version 1.1

    DTIC Science & Technology

    1988-10-20

    E&V. This model will allow the user to arrive at E&V techniques through many different paths, and provides a means to extract useful information ... electronically (preferred) to szymansk@ajpo.sei.cmu.edu or by regular mail to Mr. Raymond Szymanski, AFWAL/AAAF, Wright Patterson AFB, OH 45433-6543. E&V ... 1, 1-3 illustrate the types of information to be extracted from each document. Chapter 2 provides a more detailed description of the structure and

  5. Adherence and Persistence Among Statin Users Aged 65 Years and Over: A Systematic Review and Meta-analysis.

    PubMed

    Ofori-Asenso, Richard; Jakhu, Avtar; Zomer, Ella; Curtis, Andrea J; Korhonen, Maarit Jaana; Nelson, Mark; Gambhir, Manoj; Tonkin, Andrew; Liew, Danny; Zoungas, Sophia

    2018-05-09

    Older people (aged ≥ 65 years) have distinctive challenges with medication adherence. However, adherence and persistence patterns among older statin users have not been comprehensively reviewed. As part of a broader systematic review, we searched Medline, Embase, PsycINFO, CINAHL, Database of Abstracts of Reviews of Effects, CENTRAL, and the National Health Service Economic Evaluation Database through December 2016 for English articles reporting adherence and/or persistence among older statin users. Data were analyzed via descriptive methods and meta-analysis using random-effect modeling. Data from more than 3 million older statin users in 82 studies conducted in over 40 countries were analyzed. At 1-year follow-up, 59.7% (primary prevention 47.9%; secondary prevention 62.3%) of users were adherent (medication possession ratio [MPR] or proportion of days covered [PDC] ≥ 80%). For both primary and secondary prevention subjects, 1-year adherence was worse among individuals aged more than 75 years than those aged 65-75 years. At 3 and ≥10 years, 55.3% and 28.4% of users were adherent, respectively. The proportion of users persistent at 1-year was 76.7% (primary prevention 76.0%; secondary prevention 82.6%). Additionally, 68.1% and 61.2% of users were persistent at 2 and 4 years, respectively. Among new statin users, 48.2% were nonadherent and 23.9% discontinued within the first year. The proportion of statin users who were adherent based on self-report was 85.5%. There is poor short and long term adherence and persistence among older statin users. Strategies to improve adherence and reduce discontinuation are needed if the intended cardiovascular benefits of statin treatment are to be realized.
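    The adherence measure used throughout the review (MPR or PDC of at least 80%) is easy to make concrete: given dispensing dates and days supplied, the proportion of days covered over a follow-up window is the fraction of days on which medication was available. The sketch below uses invented dispensing records; real analyses also handle overlapping fills, stockpiling, hospitalizations, and censoring.

        # Sketch of a proportion-of-days-covered (PDC) calculation over one year
        # (invented dispensing records; simplified handling of overlaps and censoring).
        from datetime import date, timedelta

        followup_start = date(2016, 1, 1)
        followup_days = 365
        fills = [(date(2016, 1, 1), 30), (date(2016, 2, 5), 30),
                 (date(2016, 4, 1), 90), (date(2016, 8, 1), 90)]   # (dispense date, days supplied)

        covered = set()
        for dispensed, supply in fills:
            for i in range(supply):
                offset = (dispensed + timedelta(days=i) - followup_start).days
                if 0 <= offset < followup_days:
                    covered.add(offset)

        pdc = len(covered) / followup_days
        print(f"PDC = {pdc:.2f}; adherent at the 80% threshold? {pdc >= 0.80}")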

  6. Between-User Reliability of Tier 1 Exposure Assessment Tools Used Under REACH.

    PubMed

    Lamb, Judith; Galea, Karen S; Miller, Brian G; Hesse, Susanne; Van Tongeren, Martie

    2017-10-01

    When applying simple screening (Tier 1) tools to estimate exposure to chemicals in a given exposure situation under the Registration, Evaluation, Authorisation and restriction of CHemicals Regulation 2006 (REACH), users must select from several possible input parameters. Previous studies have suggested that results from exposure assessments using expert judgement and from the use of modelling tools can vary considerably between assessors. This study aimed to investigate the between-user reliability of Tier 1 tools. A remote-completion exercise and in person workshop were used to identify and evaluate tool parameters and factors such as user demographics that may be potentially associated with between-user variability. Participants (N = 146) generated dermal and inhalation exposure estimates (N = 4066) from specified workplace descriptions ('exposure situations') and Tier 1 tool combinations (N = 20). Interactions between users, tools, and situations were investigated and described. Systematic variation associated with individual users was minor compared with random between-user variation. Although variation was observed between choices made for the majority of input parameters, differing choices of Process Category ('PROC') code/activity descriptor and dustiness level impacted most on the resultant exposure estimates. Exposure estimates ranging over several orders of magnitude were generated for the same exposure situation by different tool users. Such unpredictable between-user variation will reduce consistency within REACH processes and could result in under-estimation or overestimation of exposure, risking worker ill-health or the implementation of unnecessary risk controls, respectively. Implementation of additional support and quality control systems for all tool users is needed to reduce between-assessor variation and so ensure both the protection of worker health and avoidance of unnecessary business risk management expenditure. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  7. VOLTTRON 3.0: User Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lutes, Robert G.; Haack, Jereme N.; Katipamula, Srinivas

    This document is a user guide for the deployment of the transactional network platform and agent/application development within VOLTTRON. The intent of this user guide is to provide a description of the functionality of the transactional network platform. This document describes how to deploy the platform, including installation, use, guidance, and limitations. It also describes how additional features can be added to enhance its current functionality.

  8. VOLTTRON 2.0: User Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lutes, Robert G.; Haack, Jereme N.; Katipamula, Srinivas

    This document is a user guide for the deployment of the transactional network platform and agent/application development within VOLTTRON. The intent of this user guide is to provide a description of the functionality of the transactional network platform. This document describes how to deploy the platform, including installation, use, guidance, and limitations. It also describes how additional features can be added to enhance its current functionality.

  9. SAM Photovoltaic Model Technical Reference 2016 Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilman, Paul; DiOrio, Nicholas A; Freeman, Janine M

    This manual describes the photovoltaic performance model in the System Advisor Model (SAM) software, Version 2016.3.14 Revision 4 (SSC Version 160). It is an update to the 2015 edition of the manual, which describes the photovoltaic model in SAM 2015.1.30 (SSC 41). This new edition includes corrections of errors in the 2015 edition and descriptions of new features introduced in SAM 2016.3.14, including:
      - 3D shade calculator
      - Battery storage model
      - DC power optimizer loss inputs
      - Snow loss model
      - Plane-of-array irradiance input from weather file option
      - Support for sub-hourly simulations
      - Self-shading works with all four subarrays, and uses the same algorithm for fixed arrays and one-axis tracking
      - Linear self-shading algorithm for thin-film modules
      - Loss percentages replace derate factors
    The photovoltaic performance model is one of the modules in the SAM Simulation Core (SSC), which is part of both SAM and the SAM SDK. SAM is a user-friendly desktop application for analysis of renewable energy projects. The SAM SDK (Software Development Kit) is for developers writing their own renewable energy analysis software based on SSC. This manual is written for users of both SAM and the SAM SDK wanting to learn more about the details of SAM's photovoltaic model.

  10. Flight dynamics analysis and simulation of heavy lift airships. Volume 3: User's manual

    NASA Technical Reports Server (NTRS)

    Emmen, R. D.; Tischler, M. B.

    1982-01-01

    The User's Manual provides the basic information necessary to run the programs. This includes descriptions of the various data files necessary for the program, the various outputs from the program and the options available to the user when executing the program. Additional data file information is contained in the three appendices to the manual. These appendices list all input variables and their permissible values, an example listing of these variables, and all output variables available to the user.

  11. Command & Control (C2) Systems Acquisition Study

    DTIC Science & Technology

    1982-09-01

    34: 0 The movement of substantial capability closer to individual users with significant improvements in the interface between the user and the system...description of the overall capability desired; (2) an archi- teLLural framework where evolution can occur with minimum subsequent redesign; and (3) a

  12. Variational Trajectory Optimization Tool Set: Technical description and user's manual

    NASA Technical Reports Server (NTRS)

    Bless, Robert R.; Queen, Eric M.; Cavanaugh, Michael D.; Wetzel, Todd A.; Moerder, Daniel D.

    1993-01-01

    The algorithms that comprise the Variational Trajectory Optimization Tool Set (VTOTS) package are briefly described. The VTOTS is a software package for solving nonlinear constrained optimal control problems from a wide range of engineering and scientific disciplines. The VTOTS package was specifically designed to minimize the amount of user programming; in fact, for problems that may be expressed in terms of analytical functions, the user needs only to define the problem in terms of symbolic variables. This version of the VTOTS does not support tabular data; thus, problems must be expressed in terms of analytical functions. The VTOTS package consists of two methods for solving nonlinear optimal control problems: a time-domain finite-element algorithm and a multiple shooting algorithm. These two algorithms, under the VTOTS package, may be run independently or jointly. The finite-element algorithm generates approximate solutions, whereas the shooting algorithm provides a more accurate solution to the optimization problem. A user's manual, some examples with results, and a brief description of the individual subroutines are included.
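    As a toy illustration of the shooting idea behind the second VTOTS algorithm (reduced here to single shooting on a simple two-point boundary-value problem, not an optimal control problem), the sketch below adjusts an unknown initial slope until the far boundary condition is met; the ODE and boundary values are made up.

        # Toy single-shooting solution of a two-point boundary-value problem
        # (illustrates the shooting idea only; VTOTS applies multiple shooting and
        #  finite elements to optimal-control necessary conditions).
        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import brentq

        # BVP: y'' = -y,  y(0) = 0,  y(pi/2) = 1   (exact solution: y = sin(t))
        def rhs(t, state):
            y, yp = state
            return [yp, -y]

        def end_error(slope):
            sol = solve_ivp(rhs, [0.0, np.pi / 2], [0.0, slope], rtol=1e-8)
            return sol.y[0, -1] - 1.0          # miss distance at the far boundary

        slope = brentq(end_error, 0.1, 5.0)    # adjust the unknown initial slope y'(0)
        print("recovered y'(0) =", slope)      # should be close to 1.0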

  13. AOIPS 3 User's guide. Volume 1: Overview and software utilization

    NASA Technical Reports Server (NTRS)

    Schotz, S. S.; Negri, A. J.; Robinson, W.

    1989-01-01

    This is Volume I of the Atmospheric and Oceanographic Information Processing System (AOIPS) User's Guide. AOIPS 3 is the version of the AOIPS software as of April 1989. The AOIPS software was developed jointly by the Goddard Space Flight Center and General Sciences Corporation. Volume 1 is intended to provide the user with an overall guide to the AOIPS system. It introduces the user to AOIPS system concepts, explains how programs are related and the necessary order of program execution, and provides brief descriptions derived from on-line help for every AOIPS program. It is intended to serve as a reference for information such as program function, input/output variable descriptions, program limitations, etc. AOIPS is an interactive meteorological processing system with capabilities to ingest and analyze the many types of meteorological data. AOIPS includes several applications in areas of relevance to meteorological research. AOIPS is partitioned into four applications components: satellite data analysis, radar data analysis, aircraft data analysis, and utilities.

  14. Computer programs to predict induced effects of jets exhausting into a crossflow

    NASA Technical Reports Server (NTRS)

    Perkins, S. C., Jr.; Mendenhall, M. R.

    1984-01-01

    This user's manual describes two computer programs developed to predict the induced effects of jets exhausting into a crossflow. Program JETPLT predicts pressures induced on an infinite flat plate by a jet exhausting at angles to the plate, and Program JETBOD, in conjunction with a panel code, predicts pressures induced on a body of revolution by a jet exhausting normal to the surface. Both codes use a potential model of the jet and adjacent surface with empirical corrections for the viscous or nonpotential effects. This program manual contains a description of the use of both programs, instructions for preparation of input, descriptions of the output, limitations of the codes, and sample cases. In addition, procedures to extend both codes to include additional empirical correlations are described.

  15. Space Trajectories Error Analysis (STEAP) Programs. Volume 1: Analytic manual, update

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Manual revisions are presented for the modified and expanded STEAP series. The STEAP 2 is composed of three independent but related programs: NOMAL for the generation of n-body nominal trajectories performing a number of deterministic guidance events; ERRAN for the linear error analysis and generalized covariance analysis along specific targeted trajectories; and SIMUL for testing the mathematical models used in the navigation and guidance process. The analytic manual provides general problem description, formulation, and solution and the detailed analysis of subroutines. The programmers' manual gives descriptions of the overall structure of the programs as well as the computational flow and analysis of the individual subroutines. The user's manual provides information on the input and output quantities of the programs. These are updates to N69-36472 and N69-36473.

  16. Flexible Description Language for HPC based Processing of Remote Sense Data

    NASA Astrophysics Data System (ADS)

    Nandra, Constantin; Gorgan, Dorian; Bacu, Victor

    2016-04-01

    When talking about Big Data, the most challenging aspect lies in processing them in order to gain new insight, find new patterns, and gain knowledge from them. This problem is likely most apparent in the case of Earth Observation (EO) data. With ever higher numbers of data sources and increasing data acquisition rates, dealing with EO data is indeed a challenge [1]. Geoscientists should address this challenge by using flexible and efficient tools and platforms. To address this trend, the BigEarth project [2] aims to combine the advantages of high performance computing solutions with flexible processing description methodologies in order to reduce both task execution times and task definition time and effort. As a component of the BigEarth platform, WorDeL (Workflow Description Language) [3] is intended to offer a flexible, compact, and modular approach to the task definition process. WorDeL, unlike other description alternatives such as Python or shell scripts, is oriented towards the description of topologies, using them as abstractions for the processing programs. This feature is intended to make it an attractive alternative for users lacking programming experience. By promoting modular designs, WorDeL not only makes the processing descriptions more user-readable and intuitive, but also helps organize the processing tasks into independent sub-tasks, which can be executed in parallel on multi-processor platforms in order to improve execution times. As a BigEarth platform [4] component, WorDeL represents the means by which the user interacts with the system, describing processing algorithms in terms of existing operators and workflows [5], which are ultimately translated into sets of executable commands. The WorDeL language has been designed to help in the definition of compute-intensive, batch tasks which can be distributed and executed on high-performance, cloud, or grid-based architectures in order to improve the processing time. Main references for further information: [1] Gorgan, D., "Flexible and Adaptive Processing of Earth Observation Data over High Performance Computation Architectures", International Conference and Exhibition Satellite 2015, August 17-19, Houston, Texas, USA. [2] BigEarth project - flexible processing of big Earth data over high performance computing architectures. http://cgis.utcluj.ro/bigearth, (2014). [3] Nandra, C., Gorgan, D., "Workflow Description Language for Defining Big Earth Data Processing Tasks", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE Press, pp. 461-468, (2015). [4] Bacu, V., Stefan, T., Gorgan, D., "Adaptive Processing of Earth Observation Data on Cloud Infrastructures Based on Workflow Description", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE Press, pp. 444-454, (2015). [5] Mihon, D., Bacu, V., Colceriu, V., Gorgan, D., "Modeling of Earth Observation Use Cases through the KEOPS System", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE Press, pp. 455-460, (2015).

  17. Knowledge representation to support reasoning based on multiple models

    NASA Technical Reports Server (NTRS)

    Gillam, April; Seidel, Jorge P.; Parker, Alice C.

    1990-01-01

    Model Based Reasoning is a powerful tool used to design and analyze systems, which are often composed of numerous interactive, interrelated subsystems. Models of the subsystems are written independently and may be used together while they are still under development. Thus the models are not static. They evolve as information becomes obsolete, as improved artifact descriptions are developed, and as system capabilities change. Researchers are using three methods to support knowledge/data base growth, to track the model evolution, and to handle knowledge from diverse domains. First, the representation methodology is based on having pools, or types, of knowledge from which each model is constructed. In addition information is explicit. This includes the interactions between components, the description of the artifact structure, and the constraints and limitations of the models. The third principle we have followed is the separation of the data and knowledge from the inferencing and equation solving mechanisms. This methodology is used in two distinct knowledge-based systems: one for the design of space systems and another for the synthesis of VLSI circuits. It has facilitated the growth and evolution of our models, made accountability of results explicit, and provided credibility for the user community. These capabilities have been implemented and are being used in actual design projects.

  18. Children of female sex workers and drug users: a review of vulnerability, resilience and family-centred models of care.

    PubMed

    Beard, Jennifer; Biemba, Godfrey; Brooks, Mohamad I; Costello, Jill; Ommerborn, Mark; Bresnahan, Megan; Flynn, David; Simon, Jonathon L

    2010-06-23

    Injection drug users and female sex workers are two of the populations most at risk for becoming infected with HIV in countries with concentrated epidemics. Many of the adults who fall into these categories are also parents, but little is known about the vulnerabilities faced by their children, their children's sources of resilience, or programmes providing services to these often fragile families. This review synthesizes evidence from disparate sources describing the vulnerabilities and resilience of the children of female sex workers and drug users, and documents some models of care that have been put in place to assist them. A large literature assessing the vulnerability and resilience of children of drug users and alcoholics in developed countries was found. Research on the situation of the children of sex workers is extremely limited. Children of drug users and sex workers can face unique risks, stigma and discrimination, but both child vulnerability and resilience are associated in the drug use literature with the physical and mental health of parents and family context. Family-centred interventions have been implemented in low- and middle-income contexts, but they tend to be small, piecemeal and struggling to meet demand; they are poorly documented, and most have not been formally evaluated. We present preliminary descriptive data from an organization working with pregnant and new mothers who are drug users in Ukraine and from an organization providing services to sex workers and their families in Zambia. Because parents' drug use or sex work is often illegal and hidden, identifying their children can be difficult and may increase children's vulnerability and marginalization. Researchers and service providers, therefore, need to proceed with caution when attempting to reach these populations, but documentation and evaluation of current programmes should be prioritized.

  19. Linear models for calculating digestible energy for sheep diets.

    PubMed

    Fonnesbeck, P V; Christiansen, M L; Harris, L E

    1981-05-01

    Equations for estimating the digestible energy (DE) content of sheep diets were generated from the chemical contents and a factorial description of diets fed to lambs in digestion trials. The diet factors were two forages (alfalfa and grass hay), harvested at three stages of maturity (late vegetative, early bloom and full bloom), fed in two ingredient combinations (all hay or a 50:50 hay and corn grain mixture) and prepared by two forage texture processes (coarsely chopped or finely chopped and pelleted). The 2 x 3 x 2 x 2 factorial arrangement produced 24 diet treatments. These were replicated twice, for a total of 48 lamb digestion trials. In model 1 regression equations, DE was calculated directly from chemical composition of the diet. In model 2, regression equations predicted the percentage of digested nutrient from the chemical contents of the diet and then DE of the diet was calculated as the sum of the gross energy of the digested organic components. Expanded forms of model 1 and model 2 were also developed that included diet factors as qualitative indicator variables to adjust the regression constant and regression coefficients for the diet description. The expanded forms of the equations accounted for significantly more variation in DE than did the simple models and more accurately estimated DE of the diet. Information provided by the diet description proved as useful as chemical analyses for the prediction of digestibility of nutrients. The statistics indicate that, with model 1, neutral detergent fiber and plant cell wall analyses provided as much information for the estimation of DE as did model 2 with the combined information from crude protein, available carbohydrate, total lipid, cellulose and hemicellulose. Regression equations are presented for estimating DE with the most currently analyzed organic components, including linear and curvilinear variables and diet factors that significantly reduce the standard error of the estimate. To estimate DE of a diet, the user selects the equation that makes the most effective use of the available chemical analysis information and diet description.
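
    A worked sketch of the "expanded model 1" idea, in which DE is regressed on chemical analyses plus 0/1 indicator variables for the diet description, is shown below. The compositions, DE values and fitted coefficients are synthetic placeholders, not the published data or equations.

      # Sketch of the "expanded model 1" idea: regress DE on chemical analyses
      # plus 0/1 indicator variables for the diet description. The numbers are
      # synthetic placeholders, not the published coefficients or data.
      import numpy as np

      # columns: intercept, NDF (%), crude protein (%), pelleted (0/1), grass hay (0/1)
      X = np.array([
          [1, 48.0, 17.5, 0, 0],
          [1, 52.0, 15.0, 1, 0],
          [1, 61.0, 10.5, 0, 1],
          [1, 57.0, 12.0, 1, 1],
          [1, 44.0, 19.0, 1, 0],
          [1, 65.0,  9.0, 0, 1],
      ])
      de = np.array([2.95, 2.88, 2.50, 2.62, 3.05, 2.40])   # Mcal/kg, synthetic

      beta, *_ = np.linalg.lstsq(X, de, rcond=None)          # least-squares fit
      print("coefficients:", np.round(beta, 3))

      new_diet = np.array([1, 50.0, 16.0, 1, 0])             # hypothetical diet
      print("predicted DE:", round(float(new_diet @ beta), 2), "Mcal/kg")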

  20. Space Ultrareliable Modular Computer (SUMC) instruction simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1972-01-01

    The design principles, description, functional operation, and recommended expansion and enhancements are presented for the Space Ultrareliable Modular Computer interpretive simulator. Included as appendices are the user's manual, program module descriptions, target instruction descriptions, simulator source program listing, and a sample program printout. In discussing the design and operation of the simulator, the key problems involving host computer independence and target computer architectural scope are brought into focus.

  1. The Software Ontology (SWO): a resource for reproducibility in biomedical data analysis, curation and digital preservation.

    PubMed

    Malone, James; Brown, Andy; Lister, Allyson L; Ison, Jon; Hull, Duncan; Parkinson, Helen; Stevens, Robert

    2014-01-01

    Biomedical ontologists to date have concentrated on ontological descriptions of biomedical entities such as gene products and their attributes, phenotypes and so on. Recently, effort has diversified to descriptions of the laboratory investigations by which these entities were produced. However, much biological insight is gained from the analysis of the data produced from these investigations, and there is a lack of adequate descriptions of the wide range of software that is central to bioinformatics. We need to describe how data are analyzed for discovery, audit trails, provenance and reproducibility. The Software Ontology (SWO) is a description of software used to store, manage and analyze data. Input to the SWO has come from beyond the life sciences, but its main focus is the life sciences. We used agile techniques to gather input for the SWO and keep engagement with our users. The result is an ontology that meets the needs of a broad range of users by describing software, its information processing tasks, data inputs and outputs, data formats, versions and so on. Recently, the SWO has incorporated EDAM, a vocabulary for describing data and related concepts in bioinformatics. The SWO is currently being used to describe software used in multiple biomedical applications. The SWO is another element of the biomedical ontology landscape that is necessary for the description of biomedical entities and how they were discovered. An ontology of software used to analyze data produced by investigations in the life sciences can be made in such a way that it covers the important features requested and prioritized by its users. The SWO thus fits into the landscape of biomedical ontologies and is produced using techniques designed to keep it in line with users' needs. The Software Ontology is available under an Apache 2.0 license at http://theswo.sourceforge.net/; the Software Ontology blog can be read at http://softwareontology.wordpress.com.

  2. Descriptive Question Answering with Answer Type Independent Features

    NASA Astrophysics Data System (ADS)

    Yoon, Yeo-Chan; Lee, Chang-Ki; Kim, Hyun-Ki; Jang, Myung-Gil; Ryu, Pum Mo; Park, So-Young

    In this paper, we present a supervised learning method to seek out answers to the most frequently asked descriptive questions: reason, method, and definition questions. Most of the previous systems for question answering focus on factoids, lists or definitional questions. However, descriptive questions such as reason questions and method questions are also frequently asked by users. We propose a system for these types of questions. The system conducts an answer search as follows. First, we analyze the user's question and extract search keywords and the expected answer type. Second, information retrieval results are obtained from an existing search engine such as Yahoo or Google. Finally, we rank the results to find snippets containing answers to the questions based on a ranking SVM algorithm. We also propose features to identify snippets containing answers for descriptive questions. The features are adaptable and thus are not dependent on answer type. Experimental results show that the proposed method and features are clearly effective for the task.
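
    The ranking step can be sketched with the standard pairwise reduction of a ranking SVM to binary classification on feature differences. The snippet features and values below are toy placeholders, and the fragment illustrates only the general technique, not the system or feature set proposed in the record.

      # Sketch of a pairwise ranking SVM for snippet ranking: train a linear SVM
      # on feature-vector differences between (answer-bearing, non-answer) snippet
      # pairs, then rank new snippets by the learned weight vector. Feature values
      # are toy placeholders, not the features proposed in the paper.
      import numpy as np
      from sklearn.svm import LinearSVC

      # toy snippet features: [keyword overlap, cue-phrase score, snippet length]
      answers     = np.array([[0.8, 0.9, 0.3], [0.7, 0.6, 0.5]])
      non_answers = np.array([[0.4, 0.1, 0.6], [0.2, 0.2, 0.9]])

      # pairwise transform: each (answer - non_answer) difference is a positive
      # example, the reversed difference is a negative example
      diffs = np.array([a - n for a in answers for n in non_answers])
      X = np.vstack([diffs, -diffs])
      y = np.array([1] * len(diffs) + [0] * len(diffs))

      ranker = LinearSVC(C=1.0).fit(X, y)

      candidates = np.array([[0.9, 0.8, 0.4], [0.3, 0.1, 0.7], [0.6, 0.5, 0.5]])
      scores = candidates @ ranker.coef_.ravel()             # higher = better
      print("ranking (best first):", np.argsort(-scores))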

  3. Barotropic Tidal Predictions and Validation in a Relocatable Modeling Environment. Revised

    NASA Technical Reports Server (NTRS)

    Mehra, Avichal; Passi, Ranjit; Kantha, Lakshmi; Payne, Steven; Brahmachari, Shuvobroto

    1998-01-01

    Under funding from the Office of Naval Research (ONR), the Mississippi State University Center for Air Sea Technology (CAST) has been working on developing a Relocatable Modeling Environment (RME) to provide a uniform and unbiased infrastructure for efficiently configuring numerical models in any geographic or oceanic region. Under Naval Oceanographic Office (NAVOCEANO) funding, the model was implemented and tested for NAVOCEANO use. With our current emphasis on ocean tidal modeling, CAST has adopted Colorado University's numerical ocean model, known as the CURReNTSS (Colorado University Rapidly Relocatable Nestable Storm Surge) Model, as the model of choice. During the RME development process, CURReNTSS has been relocated to several coastal oceanic regions, providing excellent results that demonstrate its veracity. This report documents the model validation results and provides a brief description of the Graphical User Interface.

  4. Business Model Evaluation for an Advanced Multimedia Service Portfolio

    NASA Astrophysics Data System (ADS)

    Pisciella, Paolo; Zoric, Josip; Gaivoronski, Alexei A.

    In this paper we analyze quantitatively a business model for the collaborative provision of an advanced mobile data service portfolio composed of three multimedia services: Video on Demand, Internet Protocol Television and User Generated Content. We provide a description of the provision system, considering the relation between technical aspects and business aspects for each agent providing the basic multimedia service. This techno-business analysis is then projected into a mathematical model dealing with the problem of defining incentives between the different agents involved in a collaborative service provision. Through the implementation of this model we aim at shaping the behaviour of each of the contributing agents by modifying the level of profitability that the Service Portfolio yields to each of them.

  5. Hyper-Fit: Fitting Linear Models to Multidimensional Data with Multivariate Gaussian Uncertainties

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Obreschkow, D.

    2015-09-01

    Astronomical data is often uncertain with errors that are heteroscedastic (different for each data point) and covariant between different dimensions. Assuming that a set of D-dimensional data points can be described by a (D - 1)-dimensional plane with intrinsic scatter, we derive the general likelihood function to be maximised to recover the best fitting model. Alongside the mathematical description, we also release the hyper-fit package for the R statistical language (http://github.com/asgr/hyper.fit) and a user-friendly web interface for online fitting (http://hyperfit.icrar.org). The hyper-fit package offers access to a large number of fitting routines, includes visualisation tools, and is fully documented in an extensive user manual. Most of the hyper-fit functionality is accessible via the web interface. In this paper, we include applications to toy examples and to real astronomical data from the literature: the mass-size, Tully-Fisher, Fundamental Plane, and mass-spin-morphology relations. In most cases, the hyper-fit solutions are in good agreement with published values, but uncover more information regarding the fitted model.
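
    A compact sketch of the likelihood described above, for the two-dimensional case (a line with unit normal n, offset c, intrinsic orthogonal scatter sigma, and a per-point covariance matrix C_i, giving a per-point variance n^T C_i n + sigma^2), can be minimized numerically as follows. The data are synthetic, and this is a reimplementation sketch of the stated likelihood, not the hyper.fit package itself.

      # Sketch of the likelihood described in the abstract for the 2-D case: a line
      # with unit normal n and offset c, intrinsic orthogonal scatter sigma, and a
      # full covariance matrix C_i per data point. Not the hyper.fit package; the
      # data values are synthetic.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(1)
      x = np.linspace(0, 10, 40)
      y = 2.0 * x + 1.0 + rng.normal(0, 1.0, x.size)          # synthetic relation
      pts = np.column_stack([x, y])
      covs = np.array([np.diag([0.2**2, 0.5**2])] * x.size)   # per-point covariance

      def neg_log_like(params):
          theta, c, log_sigma = params                        # theta: angle of normal
          n = np.array([np.cos(theta), np.sin(theta)])        # unit normal
          sigma2 = np.exp(2 * log_sigma)                      # intrinsic scatter^2
          resid = pts @ n - c                                 # orthogonal residuals
          var = np.einsum("i,kij,j->k", n, covs, n) + sigma2  # n^T C_i n + sigma^2
          return 0.5 * np.sum(resid**2 / var + np.log(2 * np.pi * var))

      best = minimize(neg_log_like, x0=[2.7, 0.5, 0.0], method="Nelder-Mead")
      theta, c, log_sigma = best.x
      print("slope:", round(-np.cos(theta) / np.sin(theta), 3),
            "intrinsic scatter:", round(np.exp(log_sigma), 3))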

  6. AGUIA: autonomous graphical user interface assembly for clinical trials semantic data services

    PubMed Central

    2010-01-01

    Background AGUIA is a front-end web application originally developed to manage clinical, demographic and biomolecular patient data collected during clinical trials at MD Anderson Cancer Center. The diversity of methods involved in patient screening and sample processing generates a variety of data types that require a resource-oriented architecture to capture the associations between the heterogeneous data elements. AGUIA uses a semantic web formalism, resource description framework (RDF), and a bottom-up design of knowledge bases that employ the S3DB tool as the starting point for the client's interface assembly. Methods The data web service, S3DB, meets the necessary requirements of generating the RDF and of explicitly distinguishing the description of the domain from its instantiation, while allowing for continuous editing of both. Furthermore, it uses an HTTP-REST protocol, has a SPARQL endpoint, and has open source availability in the public domain, which facilitates the development and dissemination of this application. However, S3DB alone does not address the issue of representing content in a form that makes sense for domain experts. Results We identified an autonomous set of descriptors, the GBox, that provides user and domain specifications for the graphical user interface. This was achieved by identifying a formalism that makes use of an RDF schema to enable the automatic assembly of graphical user interfaces in a meaningful manner while using only resources native to the client web browser (JavaScript interpreter, document object model). We defined a generalized RDF model such that changes in the graphic descriptors are automatically and immediately (locally) reflected into the configuration of the client's interface application. Conclusions The design patterns identified for the GBox benefit from and reflect the specific requirements of interacting with data generated by clinical trials, and they contain clues for a general purpose solution to the challenge of having interfaces automatically assembled for multiple and volatile views of a domain. By coding AGUIA in JavaScript, for which all browsers include a native interpreter, a solution was found that assembles interfaces that are meaningful to the particular user, and which are also ubiquitous and lightweight, allowing the computational load to be carried by the client's machine. PMID:20977768
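
    The core idea (interface descriptors expressed as RDF that a browser-side client can read to decide how to render each field) can be sketched with rdflib. The namespace and property names below are hypothetical illustrations and are not the actual GBox schema or the S3DB API.

      # Minimal sketch of interface descriptors expressed in RDF: a client could
      # query these triples to decide which widget to render for each field. The
      # namespace and property names are hypothetical, not the actual GBox schema
      # or the S3DB API.
      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import RDF, RDFS

      GBOX = Namespace("http://example.org/gbox#")            # hypothetical namespace
      g = Graph()
      g.bind("gbox", GBOX)

      field = URIRef("http://example.org/trial/ageAtDiagnosis")
      g.add((field, RDF.type, GBOX.Field))
      g.add((field, RDFS.label, Literal("Age at diagnosis")))
      g.add((field, GBOX.widget, Literal("slider")))          # rendering hint
      g.add((field, GBOX.minValue, Literal(0)))
      g.add((field, GBOX.maxValue, Literal(120)))

      print(g.serialize(format="turtle"))

    Because the widget choice is carried in the data rather than hard-coded in the client, a change to the descriptors is immediately reflected in the assembled interface, which is the property the record emphasizes.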

  7. ICADS: A cooperative decision making model with CLIPS experts

    NASA Technical Reports Server (NTRS)

    Pohl, Jens; Myers, Leonard

    1991-01-01

    A cooperative decision making model is described which is comprised of six concurrently executing domain experts coordinated by a blackboard control expert. The focus application field is architectural design, and the domain experts represent consultants in the area of daylighting, noise control, structural support, cost estimating, space planning, and climate responsiveness. Both the domain experts and the blackboard were implemented as production systems, using an enhanced version of the basic CLIPS package. Acting in unison as an Expert Design Advisor, the domain and control experts react to the evolving design solution progressively developed by the user in a 2-D CAD drawing environment. A Geometry Interpreter maps each drawing action taken by the user to real world objects, such as spaces, walls, windows, and doors. These objects, endowed with geometric and nongeometric attributes, are stored as frames in a semantic network. Object descriptions are derived partly from the geometry of the drawing environment and partly from knowledge bases containing prototypical, generalized information about the building type and site conditions under consideration.
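
    A minimal Python sketch of the blackboard pattern described here is given below: domain experts inspect a shared blackboard and post advice when relevant objects appear, under the direction of a simple control loop. The expert logic, object attributes and thresholds are invented for illustration; the actual system was built from CLIPS production rules.

      # Small sketch of the blackboard pattern the record describes: domain experts
      # watch a shared blackboard and post advice when relevant objects appear.
      # The expert logic and thresholds are invented for illustration; the real
      # system used CLIPS production rules.
      blackboard = {"objects": [], "advice": []}

      def daylighting_expert(bb):
          for obj in bb["objects"]:
              if obj["type"] == "window" and obj["area_m2"] < 1.0:
                  bb["advice"].append("daylighting: window in %s may be too small"
                                      % obj["space"])

      def acoustics_expert(bb):
          for obj in bb["objects"]:
              if obj["type"] == "wall" and obj.get("adjacent_to") == "mechanical room":
                  bb["advice"].append("noise: wall of %s needs acoustic insulation"
                                      % obj["space"])

      experts = [daylighting_expert, acoustics_expert]

      def control_loop(bb, new_objects):
          """Control expert: post new drawing objects, then let every expert react."""
          bb["objects"].extend(new_objects)
          for expert in experts:
              expert(bb)
          return bb["advice"]

      print(control_loop(blackboard, [
          {"type": "window", "space": "office 101", "area_m2": 0.8},
          {"type": "wall", "space": "office 101", "adjacent_to": "mechanical room"},
      ]))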

  8. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1994-01-01

    Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy to use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ and ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.
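
    Envision itself is an interactive X/Motif application, but the kind of self-describing netCDF metadata it manages can also be browsed programmatically, as the short sketch below shows using the netCDF4 Python library; the file name is a hypothetical placeholder.

      # Envision itself is an X/Motif application; this is only a sketch of browsing
      # the same kind of self-describing netCDF metadata from Python. The file name
      # is a hypothetical placeholder.
      from netCDF4 import Dataset

      with Dataset("model_output.nc") as ds:                  # hypothetical file
          print("global attributes:", {k: ds.getncattr(k) for k in ds.ncattrs()})
          for name, var in ds.variables.items():
              print(name, var.dimensions, var.shape,
                    getattr(var, "units", "no units attribute"))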

  9. Medication Reconciliation: Work Domain Ontology, prototype development, and a predictive model.

    PubMed

    Markowitz, Eliz; Bernstam, Elmer V; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R

    2011-01-01

    Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System's and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load.
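
    A Keystroke-Level Model estimate of the kind used in this comparison simply sums standard operator times over the sequence of physical and mental operators a task requires. The sketch below uses commonly cited average operator times and an invented operator sequence; neither the times nor the sequence comes from the study itself.

      # Sketch of a Keystroke-Level Model estimate: sum standard operator times over
      # the operator sequence a task requires. The operator times are commonly cited
      # averages and the sequence is invented; neither comes from the study itself.
      KLM_SECONDS = {
          "K": 0.28,   # keystroke (average typist)
          "P": 1.10,   # point with mouse
          "B": 0.10,   # mouse button press or release
          "H": 0.40,   # home hands between keyboard and mouse
          "M": 1.35,   # mental preparation
      }

      def task_time(operators):
          return sum(KLM_SECONDS[op] for op in operators)

      # hypothetical "confirm one reconciled medication" sequence:
      # think, point, click (press + release), think, then type five characters
      sequence = ["M", "P", "B", "B", "M"] + ["K"] * 5
      print("estimated time: %.2f s" % task_time(sequence))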

  10. Medication Reconciliation: Work Domain Ontology, Prototype Development, and a Predictive Model

    PubMed Central

    Markowitz, Eliz; Bernstam, Elmer V.; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R.

    2011-01-01

    Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System’s and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load. PMID:22195146

  11. iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations

    NASA Astrophysics Data System (ADS)

    Vanfretti, L.; Rabuzin, T.; Baudette, M.; Murad, M.

    The iTesla Power Systems Library (iPSL) is a Modelica package providing a set of power system components for phasor time-domain modeling and simulation. The Modelica language provides a systematic approach to develop models using a formal mathematical description, that uniquely specifies the physical behavior of a component or the entire system. Furthermore, the standardized specification of the Modelica language (Modelica Association [1]) enables unambiguous model exchange by allowing any Modelica-compliant tool to utilize the models for simulation and their analyses without the need of a specific model transformation tool. As the Modelica language is being developed with open specifications, any tool that implements these requirements can be utilized. This gives users the freedom of choosing an Integrated Development Environment (IDE) of their choice. Furthermore, any integration solver can be implemented within a Modelica tool to simulate Modelica models. Additionally, Modelica is an object-oriented language, enabling code factorization and model re-use to improve the readability of a library by structuring it with object-oriented hierarchy. The developed library is released under an open source license to enable a wider distribution and let the user customize it to their specific needs. This paper describes the iPSL and provides illustrative application examples.

  12. ULFEM time series analysis package

    USGS Publications Warehouse

    Karl, Susan M.; McPhee, Darcy K.; Glen, Jonathan M. G.; Klemperer, Simon L.

    2013-01-01

    This manual describes how to use the Ultra-Low-Frequency ElectroMagnetic (ULFEM) software package. Casual users can read the quick-start guide and will probably not need any more information than this. For users who may wish to modify the code, we provide further description of the routines.

  13. Policy options evaluation tool for managed lanes (POET-ML) users guide and methodology description : Federal Highway Administration HOV lane performance

    DOT National Transportation Integrated Search

    2008-12-01

    Users guide for a sketch planning tool for exploring policy alternatives. It is intended for an audience of transportation professionals responsible for planning, designing, funding, operating, enforcing, monitoring, and managing HOV and HOT lanes...

  14. Intelligent Data Reduction (IDARE)

    NASA Technical Reports Server (NTRS)

    Brady, D. Michael; Ford, Donnie R.

    1990-01-01

    A description of the Intelligent Data Reduction (IDARE) expert system and an IDARE user's manual are given. IDARE is a data reduction system with the addition of a user profile infrastructure. The system was tested on a nickel-cadmium battery testbed. Information is given on installing, loading, maintaining the IDARE system.

  15. Evaluation of a co-delivered training package for community mental health professionals on service user- and carer-involved care planning.

    PubMed

    Grundy, A C; Walker, L; Meade, O; Fraser, C; Cree, L; Bee, P; Lovell, K; Callaghan, P

    2017-08-01

    WHAT IS KNOWN ON THE SUBJECT?: There is consistent evidence that service users and carers feel marginalized in the process of mental health care planning. Mental health professionals have identified ongoing training needs in relation to involving service users and carers in care planning. There is limited research on the acceptability of training packages for mental health professionals which involve service users and carers as co-facilitators. WHAT DOES THIS PAPER ADD TO EXISTING KNOWLEDGE?: A co-produced and co-delivered training package on service user- and carer-involved care planning was acceptable to mental health professionals. Aspects of the training that were particularly valued were the co-production model, small group discussion and the opportunity for reflective practice. The organizational context of care planning may need more consideration in future training models. WHAT ARE THE IMPLICATIONS FOR PRACTICE?: Mental health nurses using co-production models of delivering training to other mental health professionals can be confident that such initiatives will be warmly welcomed, acceptable and engaging. On the basis of the results reported here, we encourage mental health nurses to use co-production approaches more often. Further research will show how clinically effective this training is in improving outcomes for service users and carers. Background There is limited evidence for the acceptability of training for mental health professionals on service user- and carer-involved care planning. Aim To investigate the acceptability of a co-delivered, two-day training intervention on service user- and carer-involved care planning. Methods Community mental health professionals were invited to complete the Training Acceptability Rating Scale post-training. Responses to the quantitative items were summarized using descriptive statistics (Miles, ), and qualitative responses were coded using content analysis (Weber, ). Results Of 350 trainees, 310 completed the questionnaire. The trainees rated the training favourably (median overall TARS scores = 56/63; median 'acceptability' score = 34/36; median 'perceived impact' score = 22/27). There were six qualitative themes: the value of the co-production model; time to reflect on practice; delivery preferences; comprehensiveness of content; need to consider organizational context; and emotional response. Discussion The training was found to be acceptable and comprehensive with participants valuing the co-production model. Individual differences were apparent in terms of delivery preferences and emotional reactions. There may be a need to further address the organizational context of care planning in future training. Implications for practice Mental health nurses should use co-production models of continuing professional development training that involve service users and carers as co-facilitators. © 2017 The Authors. Journal of Psychiatric and Mental Health Nursing Published by John Wiley & Sons Ltd.

  16. WEST-3 wind turbine simulator development

    NASA Technical Reports Server (NTRS)

    Hoffman, J. A.; Sridhar, S.

    1985-01-01

    The software developed for WEST-3, a new, all digital, and fully programmable wind turbine simulator, is given. The process of wind turbine simulation on WEST-3 is described in detail. The major steps are: the processing of the mathematical models, the preparation of the constant data, and the use of system software generated executable code for running on WEST-3. The mechanics of reformulation, normalization, and scaling of the mathematical models is discussed in detail, in particular the significance of reformulation, which leads to accurate simulations. Descriptions are given of the preprocessor computer programs which are used to prepare the constant data needed in the simulation. These programs, in addition to scaling and normalizing all the constants, relieve the user from having to generate a large number of constants used in the simulation. Also given are brief descriptions of the components of the WEST-3 system software: Translator, Assembler, Linker, and Loader. Also included are: details of the aeroelastic rotor analysis, which is the center of a wind turbine simulation model; an analysis of the gimbal subsystem; and listings of the variables, constants, and equations used in the simulation.

  17. Advanced transportation system studies. Alternate propulsion subsystem concepts: Propulsion database

    NASA Technical Reports Server (NTRS)

    Levack, Daniel

    1993-01-01

    The Advanced Transportation System Studies alternate propulsion subsystem concepts propulsion database interim report is presented. The objective of the database development task is to produce a propulsion database which is easy to use and modify while also being comprehensive in the level of detail available. The database is to be available on the Macintosh computer system. The task is to extend across all three years of the contract. Consequently, a significant fraction of the effort in this first year of the task was devoted to the development of the database structure to ensure a robust base for the following years' efforts. Nonetheless, significant point design propulsion system descriptions and parametric models were also produced. Each of the two propulsion databases, parametric propulsion database and propulsion system database, are described. The descriptions include a user's guide to each code, write-ups for models used, and sample output. The parametric database has models for LOX/H2 and LOX/RP liquid engines, solid rocket boosters using three different propellants, a hybrid rocket booster, and a NERVA derived nuclear thermal rocket engine.

  18. CSciBox: An Intelligent Assistant for Dating Ice and Sediment Cores

    NASA Astrophysics Data System (ADS)

    Finlinson, K.; Bradley, E.; White, J. W. C.; Anderson, K. A.; Marchitto, T. M., Jr.; de Vesine, L. R.; Jones, T. R.; Lindsay, C. M.; Israelsen, B.

    2015-12-01

    CSciBox is an integrated software system for the construction and evaluation of age models of paleo-environmental archives. It incorporates a number of data-processing and visualization facilities, ranging from simple interpolation to reservoir-age correction and 14C calibration via the Calib algorithm, as well as a number of firn and ice-flow models. It employs modern database technology to store paleoclimate proxy data and analysis results in an easily accessible and searchable form, and offers the user access to those data and computational elements via a modern graphical user interface (GUI). In the case of truly large data or computations, CSciBox is parallelizable across modern multi-core processors, or clusters, or even the cloud. The code is open source and freely available on github, as are one-click installers for various versions of Windows and Mac OSX. The system's architecture allows users to incorporate their own software in the form of computational components that can be built smoothly into CSciBox workflows, taking advantage of CSciBox's GUI, data importing facilities, and plotting capabilities. To date, BACON and StratiCounter have been integrated into CSciBox as embedded components. The user can manipulate and compose all of these tools and facilities as she sees fit. Alternatively, she can employ CSciBox's automated reasoning engine, which uses artificial intelligence techniques to explore the gamut of age models and cross-dating scenarios automatically. The automated reasoning engine captures the knowledge of expert geoscientists, and can output a description of its reasoning.
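
    The simplest element in the range of facilities mentioned above is interpolation between dated tie points; a minimal sketch of such an age-depth model is shown below with synthetic depths and ages. This is not the Calib, BACON or StratiCounter functionality embedded in CSciBox.

      # Sketch of the simplest kind of age model mentioned in the record: linear
      # interpolation of age between dated tie points in a core. Depths and ages
      # are synthetic; this is not the Calib, BACON, or StratiCounter functionality.
      import numpy as np

      tie_depth_cm = np.array([0.0, 55.0, 120.0, 240.0])       # dated horizons
      tie_age_yr   = np.array([-50.0, 410.0, 1220.0, 3050.0])  # years BP, synthetic

      sample_depths = np.array([10.0, 75.0, 150.0, 200.0])
      sample_ages = np.interp(sample_depths, tie_depth_cm, tie_age_yr)

      for d, a in zip(sample_depths, sample_ages):
          print("depth %6.1f cm  ->  age %7.1f yr BP" % (d, a))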

  19. Applications Technology Satellite and Communications Technology Satellite user experiments for 1967-1980 reference book. Volume 4: Abstracts

    NASA Technical Reports Server (NTRS)

    Engler, N. A.; Nash, J. F.; Strange, J. D.

    1980-01-01

    The important user experiments conducted during the fourteen year period from 1966 to 1980 are summarized. A description of each of the satellites and a brief summary of each user experiment is presented. A cross index of user experiments sorted by various parameters and a listing of keywords versus experiment number is included. The experiments are grouped by type of service offered; for example, education, health services, and data transmission. A bibliography of reports by accession number and by author is also presented. User viewpoints of the systems are presented.

  20. Systematic Assessment of the Impact of User Roles on Network Flow Patterns

    DTIC Science & Technology

    2017-09-01

    [Acronym-list fragment from the report front matter: ...Protocol; SNMP, Simple Network Management Protocol; SQL, Structured Query Language; SSH, Secure Shell; SYN, TCP Sync Flag; SVDD, Support Vector Data Description; SVM, ...] Abstract fragment: ...and evaluating users based on roles provide the best approach for defining normal digital behaviors? People are individuals, with different interests...activities on the network. We evaluate the assumption that users sharing similar roles exhibit similar network behaviors, and contrast the level of similarity...

  1. Cross-standard user description in mobile, medical oriented virtual collaborative environments

    NASA Astrophysics Data System (ADS)

    Ganji, Rama Rao; Mitrea, Mihai; Joveski, Bojan; Chammem, Afef

    2015-03-01

    By combining four different open standards belonging to ISO/IEC JTC1/SC29 WG11 (a.k.a. MPEG) and the W3C, this paper advances an architecture for mobile, medical-oriented virtual collaborative environments. The various users are represented according to MPEG-UD (MPEG User Description), while the security issues are dealt with by deploying the WebID principles. On the server side, irrespective of their elementary types (text, image, video, 3D, …), the medical data are aggregated into hierarchical, interactive multimedia scenes which are alternatively represented in the MPEG-4 BiFS or HTML5 standards. This way, each type of content can be optimally encoded according to its particular constraints (semantics, medical practice, network conditions, etc.). The mobile device only has to display the content (inside an MPEG player or an HTML5 browser) and capture the user interaction. The overall architecture is implemented and tested under the framework of the MEDUSA European project, in partnership with medical institutions. The testbed considers a server emulated by a PC and heterogeneous user devices (tablets, smartphones, laptops) running the iOS, Android and Windows operating systems. The connection between the users and the server is alternatively ensured by WiFi and 3G/4G networks.

  2. Users' Manual for ILSS (Revised ILSLOC) : Simulation for Derogation Effects on the Instrument Landing System

    DOT National Transportation Integrated Search

    1976-12-01

    The manual presents the complete ILSLOC computer program package. In addition to including a thorough description of the program itself and a commented listing, the manual contains a brief description of the ILS system and antenna patterns. To illust...

  3. Telecom Modeling with ChatterBell.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jrad, Ahmad M.; Kelic, Andjelka

    This document provides a description and user manual for the ChatterBell voice telecom modeling and simulation capability. The intended audience consists of network planners and practitioners who wish to use the tool to model a particular voice network and analyze its behavior under varying assumptions and possible failure conditions. ChatterBell is built on top of the N-SMART voice simulation and visualization suite that was developed through collaboration between Sandia National Laboratories and Bell Laboratories of Lucent Technologies. The new and improved modeling and simulation tool has been modified and modernized to incorporate the latest developments in the telecom world, including the widespread use of VoIP technology. In addition, ChatterBell provides new commands and modeling capabilities that were not available in the N-SMART application.

  4. Model documentation report: Residential sector demand module of the national energy modeling system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code. This reference document provides a detailed description for energy analysts, other users, and the public. The NEMS Residential Sector Demand Module is currently used for mid-term forecasting purposes and energy policy analysis over the forecast horizon of 1993 through 2020. The model generates forecasts of energy demand for the residential sector by service, fuel, and Census Division. Policy impacts resulting from new technologies, market incentives, and regulatory changes can be estimated using the module. 26 refs., 6 figs., 5 tabs.

  5. Pornography, sexual socialization, and satisfaction among young men.

    PubMed

    Stulhofer, Aleksandar; Busko, Vesna; Landripet, Ivan

    2010-02-01

    In spite of a growing presence of pornography in contemporary life, little is known about its potential effects on young people's sexual socialization and sexual satisfaction. In this article, we present a theoretical model of the effects of sexually explicit materials (SEM) mediated by sexual scripting and moderated by the type of SEM used. An on-line survey dataset that included 650 young Croatian men aged 18-25 years was used to explore empirically the model. Descriptive findings pointed to significant differences between mainstream and paraphilic SEM users in frequency of SEM use at the age of 14, current SEM use, frequency of masturbation, sexual boredom, acceptance of sex myths, and sexual compulsiveness. In testing the model, a novel instrument was used, the Sexual Scripts Overlap Scale, designed to measure the influence of SEM on sexual socialization. Structural equation analyses suggested that negative effects of early exposure to SEM on young men's sexual satisfaction, albeit small, could be stronger than positive effects. Both positive and negative effects-the latter being expressed through suppression of intimacy-were observed only among users of paraphilic SEM. No effect of early exposure to SEM was found among the mainstream SEM users. To counterbalance moral panic but also glamorization of pornography, sex education programs should incorporate contents that would increase media literacy and assist young people in critical interpretation of pornographic imagery.

  6. AOIPS 3 user's guide. Volume 2: Program descriptions

    NASA Technical Reports Server (NTRS)

    Schotz, Steve S.; Piper, Thomas S.; Negri, Andrew J.

    1990-01-01

    The Atmospheric and Oceanographic Information Processing System (AOIPS) 3 is the version of the AOIPS software as of April 1989. The AOIPS software was developed jointly by the Goddard Space Flight Center and General Sciences Corporation. A detailed description of every AOIPS program is presented. It is intended to serve as a reference for such items as program functionality, program operational instructions, and input/output variable descriptions. Program descriptions are derived from the on-line help information. Each program description is divided into two sections. The functional description section describes the purpose of the program and contains any pertinent operational information. The program description section lists the program variables as they appear on-line and describes them in detail.

  7. Use and Usefulness of Lower Limb Prostheses.

    ERIC Educational Resources Information Center

    Buijk, Catharina A.

    1988-01-01

    Adults (n=181) in the Netherlands were surveyed concerning their use of lower limb prostheses. Results are analyzed in terms of age and sex of users, reason for amputation, level of amputation, description of prosthesis, amount of time able to walk or stand, satisfaction with the prosthesis, and user recommendations. (JDD)

  8. Re-Imagining Archival Display: Creating User-Friendly Finding Aids

    ERIC Educational Resources Information Center

    Daines, J. Gordon, III; Nimer, Cory L.

    2011-01-01

    This article examines how finding aids are structured and delivered, considering alternative approaches. It suggests that single-level displays, those that present a single component of a multilevel description to users at a time, have the potential to transform the delivery and display of collection information while improving the user…

  9. End-User Use of Data Base Query Language: Pros and Cons.

    ERIC Educational Resources Information Center

    Nicholes, Walter

    1988-01-01

    Man-machine interface, the concept of a computer "query," a review of database technology, and a description of the use of query languages at Brigham Young University are discussed. The pros and cons of end-user use of database query languages are explored. (Author/MLW)

  10. Higher Education Finance Manual: Volume 2. Data Users' Guide.

    ERIC Educational Resources Information Center

    Collier, Douglas J.; Allen, Richard H.

    The second volume of the revised "Higher Education Finance Manual" (HEFM), this data users' guide is oriented to the nonaccountant and describes the kinds of information about postsecondary education that can be derived from institutional financial data. Contents include: a description of fund accounting for higher education, a…

  11. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 1: Introduction and user's guide

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Reporting software programs provide formatted listings and summary reports of the Software Engineering Laboratory (SEL) data base contents. The operating procedures and system information for 18 different reporting software programs are described. Sample output reports from each program are provided.

  12. The Concept of Collection from the User's Perspective

    ERIC Educational Resources Information Center

    Lee, Hur-Li

    2005-01-01

    This study explores the concept and functions of collection from the perspective of the user. In-depth interviews with ten professors from a social science discipline and a natural science department provided descriptions of their information seeking involving material sources and their perceptions of the library collection. Participants used the…

  13. SOL - SIZING AND OPTIMIZATION LANGUAGE COMPILER

    NASA Technical Reports Server (NTRS)

    Scotti, S. J.

    1994-01-01

    SOL is a computer language which is geared to solving design problems. SOL includes the mathematical modeling and logical capabilities of a computer language like FORTRAN but also includes the additional power of non-linear mathematical programming methods (i.e. numerical optimization) at the language level (as opposed to the subroutine level). The language-level use of optimization has several advantages over the traditional, subroutine-calling method of using an optimizer: first, the optimization problem is described in a concise and clear manner which closely parallels the mathematical description of optimization; second, a seamless interface is automatically established between the optimizer subroutines and the mathematical model of the system being optimized; third, the results of an optimization (objective, design variables, constraints, termination criteria, and some or all of the optimization history) are output in a form directly related to the optimization description; and finally, automatic error checking and recovery from an ill-defined system model or optimization description is facilitated by the language-level specification of the optimization problem. Thus, SOL enables rapid generation of models and solutions for optimum design problems with greater confidence that the problem is posed correctly. The SOL compiler takes SOL-language statements and generates the equivalent FORTRAN code and system calls. Because of this approach, the modeling capabilities of SOL are extended by the ability to incorporate existing FORTRAN code into a SOL program. In addition, SOL has a powerful MACRO capability. The MACRO capability of the SOL compiler effectively gives the user the ability to extend the SOL language and can be used to develop easy-to-use shorthand methods of generating complex models and solution strategies. The SOL compiler provides syntactic and semantic error-checking, error recovery, and detailed reports containing cross-references to show where each variable was used. The listings summarize all optimizations, listing the objective functions, design variables, and constraints. The compiler offers error-checking specific to optimization problems, so that simple mistakes will not cost hours of debugging time. The optimization engine used by and included with the SOL compiler is a version of Vanderplatt's ADS system (Version 1.1) modified specifically to work with the SOL compiler. SOL allows the use of the over 100 ADS optimization choices such as Sequential Quadratic Programming, Modified Feasible Directions, interior and exterior penalty function and variable metric methods. Default choices of the many control parameters of ADS are made for the user, however, the user can override any of the ADS control parameters desired for each individual optimization. The SOL language and compiler were developed with an advanced compiler-generation system to ensure correctness and simplify program maintenance. Thus, SOL's syntax was defined precisely by a LALR(1) grammar and the SOL compiler's parser was generated automatically from the LALR(1) grammar with a parser-generator. Hence unlike ad hoc, manually coded interfaces, the SOL compiler's lexical analysis insures that the SOL compiler recognizes all legal SOL programs, can recover from and correct for many errors and report the location of errors to the user. This version of the SOL compiler has been implemented on VAX/VMS computer systems and requires 204 KB of virtual memory to execute. 
    Since the SOL compiler produces FORTRAN code, it requires the VAX FORTRAN compiler to produce an executable program. The SOL compiler consists of 13,000 lines of Pascal code. It was developed in 1986 and last updated in 1988. The ADS and other utility subroutines amount to 14,000 lines of FORTRAN code and were also updated in 1988.
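
    For contrast with SOL's language-level optimization statements, the sketch below shows the conventional subroutine-calling style that the record compares it against: the objective, constraint and bounds are coded as functions and handed to an optimizer (here scipy.optimize.minimize). The toy sizing problem is invented and is unrelated to SOL or the ADS test cases.

      # Conventional subroutine-calling style of numerical optimization, for
      # contrast with SOL's language-level description: the model and constraints
      # are plain functions passed to an optimizer. The toy problem is invented.
      from scipy.optimize import minimize

      def mass(x):                      # objective: minimize cross-sectional area
          width, height = x
          return width * height

      def stress_margin(x):             # constraint: section modulus margin >= 0
          width, height = x
          return width * height**2 / 6.0 - 2.0

      result = minimize(
          mass,
          x0=[1.0, 1.0],
          method="SLSQP",
          bounds=[(0.1, 5.0), (0.1, 5.0)],
          constraints=[{"type": "ineq", "fun": stress_margin}],
      )
      print("optimum (width, height):", result.x, " mass:", round(result.fun, 3))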

  14. User's manual for ILSLOC : simulation for derogation effects on the localizer portion of the Instrument Landing System

    DOT National Transportation Integrated Search

    1973-08-01

    The manual presents the complete ILSLOC computer program package. In addition to including a thorough description of the program itself and a commented listing, the manual contains a brief description of the ILS system and antenna patterns. To illust...

  15. Descriptive and Functional Classifications of Drug Abusers

    ERIC Educational Resources Information Center

    Carlin, Albert S.; Stauss, Fred F.

    1977-01-01

    Polydrug (non-opiate-drug) abusers have previously been classified by a variety of typologies that can be characterized as either descriptive, functional, or a combination of both. This investigation proposes two objective scoring systems that classify polydrug users on a streetwise/straight dimension and on a self-medication/recreational-use…

  16. COM1/348: Design and Implementation of a Portal for the Market of the Medical Equipment (MEDICOM)

    PubMed Central

    Palamas, S; Vlachos, I; Panou-Diamandi, O; Marinos, G; Kalivas, D; Zeelenberg, C; Nimwegen, C; Koutsouris, D

    1999-01-01

    Introduction The MEDICOM system provides the electronic means for medical equipment manufacturers to communicate online with their customers supporting the Purchasing Process and the Post Market Surveillance. The MEDICOM service will be provided over the Internet by the MEDICOM Portal, and by a set of distributed subsystems dedicated to handle structured information related to medical devices. There are three kinds of these subsystems, the Hypermedia Medical Catalogue (HMC), Virtual Medical Exhibition (VME), which contains information in a form of Virtual Models, and the Post Market Surveillance system (PMS). The Universal Medical Devices Nomenclature System (UMDNS) is used to register all products. This work was partially funded by the ESPRIT Project 25289 (MEDICOM). Methods The Portal provides the end user interface operating as the MEDICOM Portal, acts as the yellow pages for finding both products and providers, providing links to the providers servers, implements the system management and supports the subsystem database compatibility. The Portal hosts a database system composed of two parts: (a) the Common Database, which describes a set of encoded parameters (like Supported Languages, Geographic Regions, UMDNS Codes, etc) common to all subsystems and (b) the Short Description Database, which contains summarised descriptions of medical devices, including a text description, the codes of the manufacturer, UMDNS code, attribute values and links to the corresponding HTML pages of the HMC, VME and PMS servers. The Portal provides the MEDICOM user interface including services like end user profiling and registration, end user query forms, creation and hosting of newsgroups, links to online libraries, end user subscription to manufacturers' mailing lists, online information for the MEDICOM system and special messages or advertisements from manufacturers. Results Platform independence and interoperability characterise the system design. A general purpose RDBMS is used for the implementation of the databases. The end user interface is implemented using HTML and Java applets, while the subsystem administration applications are developed using Java. The JDBC interface is used in order to provide database access to these applications. The communication between subsystems is implemented using CORBA objects and Java servlets are used in subsystem servers for the activation of remote operations. Discussion In the second half of 1999, the MEDICOM Project will enter the phase of evaluation and pilot operation. The benefits of the MEDICOM system are expected to be the establishment of a world wide accessible marketplace between providers and health care professionals. The latter will achieve the provision of up-to-date and high quality products information in an easy and friendly way, and the enhancement of the marketing procedures and after sales support efficiency.

  17. State criminal justice telecommunications (STACOM). Volume 4: Network design software user's guide

    NASA Technical Reports Server (NTRS)

    Lee, J. J.

    1977-01-01

    A user's guide to the network design program is presented. The program is written in FORTRAN V and implemented on a UNIVAC 1108 computer under the EXEC-8 operating system which enables the user to construct least-cost network topologies for criminal justice digital telecommunications networks. A complete description of program features, inputs, processing logic, and outputs is presented, and a sample run and a program listing are included.

  18. Public (Q)SAR Services, Integrated Modeling Environments, and Model Repositories on the Web: State of the Art and Perspectives for Future Development.

    PubMed

    Tetko, Igor V; Maran, Uko; Tropsha, Alexander

    2017-03-01

    Thousands of (Quantitative) Structure-Activity Relationships (Q)SAR models have been described in peer-reviewed publications; however, this way of sharing seldom makes models available for the use by the research community outside of the developer's laboratory. Conversely, on-line models allow broad dissemination and application representing the most effective way of sharing the scientific knowledge. Approaches for sharing and providing on-line access to models range from web services created by individual users and laboratories to integrated modeling environments and model repositories. This emerging transition from the descriptive and informative, but "static", and for the most part, non-executable print format to interactive, transparent and functional delivery of "living" models is expected to have a transformative effect on modern experimental research in areas of scientific and regulatory use of (Q)SAR models. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. FASTER 3: A generalized-geometry Monte Carlo computer program for the transport of neutrons and gamma rays. Volume 2: Users manual

    NASA Technical Reports Server (NTRS)

    Jordan, T. M.

    1970-01-01

    A description of the FASTER-III program for Monte Carlo calculation of photon and neutron transport in complex geometries is presented. Major revisions include the capability of calculating minimum weight shield configurations for primary and secondary radiation and optimal importance sampling parameters. The program description includes a users manual describing the preparation of input data cards, the printout from a sample problem including the data card images, definitions of Fortran variables, the program logic, and the control cards required to run on the IBM 7094, IBM 360, UNIVAC 1108 and CDC 6600 computers.

  20. Methodology for automating software systems

    NASA Technical Reports Server (NTRS)

    Moseley, Warren

    1990-01-01

    Applying ITS technology to the shuttle diagnostics would not require the rigor of the Petri net representation; however, in order to provide the animated simulation portion of the interface and to meet the demands placed on the system in supporting the training aspects, it is important to have a homogeneous and consistent underlying knowledge representation. By keeping the diagnostic rule base, the hardware description, the software description, user profiles, desired behavioral knowledge, and the user interface in the same notation, it is possible to reason about all of the properties of Petri nets on any selected portion of the simulation. This reasoning provides a foundation for the utilization of intelligent tutoring systems technology.

  1. Parachuting harnesses comparative evaluation on energy distribution grids.

    PubMed

    Hembecker, Paula Karina; Poletto, Angela Regina; Gontijo, Leila Amaral

    2012-01-01

    This research presents a comparative evaluation, from the electricians' point of view and in the light of usability and ergonomic principles, of three different parachuting-style harnesses used for work at heights in the energy industry; it is motivated mainly by the high number of injuries in the energy industry caused by falls from height. In line with its main objective, this field research is classified as an exploratory-descriptive, cross-sectional study and was developed in four steps. The results highlight the weakest points of these products, which were developed to ensure the safety of work at heights in the energy industry, and the main opportunities for improvement, according to the opinions of the users. The results also indicate that, regardless of the model, these devices have fit problems that prevent them from fully meeting the needs of electrical-sector users.

  2. Sierra Structural Dynamics Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reese, Garth M.

    Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of structural systems. This manual describes the theory behind many of the constructs in Sierra/SD. For a more detailed description of how to use Sierra/SD, we refer the reader to the Sierra/SD User's Notes. Many of the constructs in Sierra/SD are pulled directly from published material. Where possible, these materials are referenced herein. However, certain functions in Sierra/SD are specific to our implementation. We try to be far more complete in those areas. The theory manual was developed from several sources, including general notes, a programmer notes manual, the user's notes and, of course, the material in the open literature.

  3. Section 3. The SPARROW Surface Water-Quality Model: Theory, Application and User Documentation

    USGS Publications Warehouse

    Schwarz, G.E.; Hoos, A.B.; Alexander, R.B.; Smith, R.A.

    2006-01-01

    SPARROW (SPAtially Referenced Regressions On Watershed attributes) is a watershed modeling technique for relating water-quality measurements made at a network of monitoring stations to attributes of the watersheds containing the stations. The core of the model consists of a nonlinear regression equation describing the non-conservative transport of contaminants from point and diffuse sources on land to rivers and through the stream and river network. The model predicts contaminant flux, concentration, and yield in streams and has been used to evaluate alternative hypotheses about the important contaminant sources and watershed properties that control transport over large spatial scales. This report provides documentation for the SPARROW modeling technique and computer software to guide users in constructing and applying basic SPARROW models. The documentation gives details of the SPARROW software, including the input data and installation requirements, and guidance in the specification, calibration, and application of basic SPARROW models, as well as descriptions of the model output and its interpretation. The documentation is intended for both researchers and water-resource managers with interest in using the results of existing models and developing and applying new SPARROW models. The documentation of the model is presented in two parts. Part 1 provides a theoretical and practical introduction to SPARROW modeling techniques, which includes a discussion of the objectives, conceptual attributes, and model infrastructure of SPARROW. Part 1 also includes background on the commonly used model specifications and the methods for estimating and evaluating parameters, evaluating model fit, and generating water-quality predictions and measures of uncertainty. Part 2 provides a user's guide to SPARROW, which includes a discussion of the software architecture and details of the model input requirements and output files, graphs, and maps. The text documentation and computer software are available on the Web at http://usgs.er.gov/sparrow/sparrow-mod/.
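
    As a deliberately simplified sketch in the spirit of the model class described above (source inputs attenuated non-conservatively along the flow path and fit by nonlinear regression), the fragment below fits an exponential-attenuation load model to synthetic data with scipy. It is not the actual SPARROW specification; all variables, coefficients and values are invented.

      # Deliberately simplified sketch in the spirit of the model class described
      # here: station load modeled as source inputs attenuated exponentially with
      # travel time, fit by nonlinear least squares. NOT the actual SPARROW
      # specification; all numbers are synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(0)
      n = 60
      point_src   = rng.uniform(0, 5, n)       # point-source input (kg/day)
      diffuse_src = rng.uniform(0, 20, n)      # diffuse-source input (kg/day)
      travel_time = rng.uniform(0, 10, n)      # in-stream travel time (days)

      def predicted_load(X, beta_point, beta_diffuse, decay):
          point, diffuse, t = X
          return (beta_point * point + beta_diffuse * diffuse) * np.exp(-decay * t)

      true = (1.0, 0.3, 0.15)
      load = predicted_load((point_src, diffuse_src, travel_time), *true)
      load *= np.exp(rng.normal(0, 0.1, n))    # multiplicative observation error

      params, _ = curve_fit(predicted_load, (point_src, diffuse_src, travel_time),
                            load, p0=(0.5, 0.5, 0.05))
      print("fitted (beta_point, beta_diffuse, decay):", np.round(params, 3))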

  4. Users manual for Streamtube Curvature Analysis: Analytical method for predicting the pressure distribution about a nacelle at transonic speeds, volume 1

    NASA Technical Reports Server (NTRS)

    Keith, J. S.; Ferguson, D. R.; Heck, P. H.

    1972-01-01

    The computer program, Streamtube Curvature Analysis, is described for the engineering user and for the programmer. The user-oriented documentation includes a description of the mathematical governing equations, their use in the solution, and the method of solution. The general logical flow of the program is outlined and detailed instructions for program usage and operation are explained. General procedures for program use and the program capabilities and limitations are described. From the standpoint of the programmer, the overlay structure of the program is described. The various storage tables are defined and their uses explained. The input and output are discussed in detail. The program listing includes numerous comments so that the logical flow within the program is easily followed. A test case showing input data and output format is included as well as an error printout description.

  5. CSTEM User Manual

    NASA Technical Reports Server (NTRS)

    Hartle, M.; McKnight, R. L.

    2000-01-01

    This manual is a combination of a user manual, theory manual, and programmer manual. The reader is assumed to have some previous exposure to the finite element method. This manual is written with the idea that the CSTEM (Coupled Structural Thermal Electromagnetic-Computer Code) user needs to have a basic understanding of what the code is actually doing in order to properly use the code. For that reason, the underlying theory and methods used in the code are described to a basic level of detail. The manual gives an overview of the CSTEM code: how the code came into existence, a basic description of what the code does, and the order in which it happens (a flowchart). Appendices provide a listing and very brief description of every file used by the CSTEM code, including the type of file it is, what routine regularly accesses the file, and what routine opens the file, as well as special features included in CSTEM.

  6. A future Outlook: Web based Simulation of Hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Islam, A. S.; Piasecki, M.

    2003-12-01

    Despite recent advances in presenting simulation results as 3D graphs or animated contours, the modeling user community still faces some shortcomings when trying to move around and analyze data. Typical problems include the lack of common platforms with a standard vocabulary to exchange simulation results from different numerical models, insufficient descriptions of data (metadata), the lack of robust search and retrieval tools for data, and difficulties in reusing simulation domain knowledge. This research demonstrates how to create a shared simulation domain on the WWW and run a number of models through multi-user interfaces. First, metadata sets have been developed to describe hydrodynamic model data based on the geographic metadata standard (ISO 19115), which has been extended to satisfy the needs of the hydrodynamic modeling community. The Extensible Markup Language (XML) is used to publish this metadata through the Resource Description Framework (RDF). A specific domain ontology for Web Based Simulation (WBS) has been developed to explicitly define the vocabulary for the knowledge-based simulation system. Subsequently, this knowledge-based system is converted into an object model using the Meta Object Facility (MOF). The knowledge-based system acts as a meta-model for the object-oriented system, which aids in reusing the domain knowledge. Specific simulation software has been developed based on the object-oriented model. Finally, all model data are stored in an object-relational database; database back-ends help store, retrieve, and query information efficiently. This research uses open source software and technology such as Java Servlet and JSP, the Apache web server, the Tomcat Servlet Engine, PostgreSQL databases, the Protégé ontology editor, RDQL and RQL for querying RDF at the semantic level, and the Jena Java API for RDF. We also use international standards such as the ISO 19115 metadata standard, and specifications such as XML, RDF, OWL, XMI, and UML. The final web-based simulation product is deployed as Web Archive (WAR) files, which are platform and OS independent and can be used on Windows, UNIX, or Linux. Keywords: Apache, ISO 19115, Java Servlet, Jena, JSP, Metadata, MOF, Linux, Ontology, OWL, PostgreSQL, Protégé, RDF, RDQL, RQL, Tomcat, UML, UNIX, Windows, WAR, XML
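
    As a concrete sketch of the metadata approach described above, the snippet below uses the rdflib package to publish a small, ISO 19115-inspired description of a model run as RDF/XML. The namespace, property names, and resource URI are invented for the example and are not the extended schema or ontology developed in this work.

```python
# Illustrative sketch: publishing simple model-run metadata as RDF/XML with rdflib.
# The "hydro" namespace and its property names are hypothetical.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DC, RDF

HYDRO = Namespace("http://example.org/hydro-metadata#")      # hypothetical vocabulary

g = Graph()
g.bind("dc", DC)
g.bind("hydro", HYDRO)

run = URIRef("http://example.org/runs/estuary-2003-12")      # hypothetical resource URI
g.add((run, RDF.type, HYDRO.HydrodynamicSimulation))
g.add((run, DC.title, Literal("Estuary tidal circulation run")))
g.add((run, DC.creator, Literal("Example Modeling Group")))
g.add((run, HYDRO.gridResolution, Literal("100 m")))
g.add((run, HYDRO.boundaryForcing, Literal("M2 tidal constituent")))

# Serialize as RDF/XML so the description can be harvested, indexed, and queried.
print(g.serialize(format="xml"))
```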

  7. RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphreys, S.L.; Miller, L.A.; Monroe, D.K.

    1998-04-01

    This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.

  8. Radio data archiving system

    NASA Astrophysics Data System (ADS)

    Knapic, C.; Zanichelli, A.; Dovgan, E.; Nanni, M.; Stagni, M.; Righini, S.; Sponza, M.; Bedosti, F.; Orlati, A.; Smareglia, R.

    2016-07-01

    Radio astronomical data models are becoming very complex because of the huge range of instrumental configurations available with modern radio telescopes. What in the past were the last frontiers of data formats, in terms of efficiency and flexibility, are now evolving with new strategies and methodologies enabling the persistence of very complex, hierarchical, and multi-purpose information. Such an evolution of data models and data formats requires new data archiving techniques in order to guarantee data preservation, following the directives of the Open Archival Information System and the International Virtual Observatory Alliance for data sharing and publication. Currently, various formats (FITS, MBFITS, VLBI XML description files, and ancillary files) of data acquired with the Medicina and Noto Radio Telescopes can be stored and handled by a common Radio Archive, which is planned to be released to the (inter)national community by the end of 2016. This state-of-the-art archiving system for radio astronomical data aims to delegate as much as possible to the software the decisions of how and where the descriptors (metadata) are saved, while the users perform user-friendly queries that the web interface translates into complex interrogations of the database to retrieve data. In this way, the Archive is ready to be Virtual Observatory compliant and as user-friendly as possible.

  9. Update on Small Modular Reactors Dynamic System Modeling Tool: Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.

    Previous reports focused on the development of component and system models as well as end-to-end system models using Modelica and Dymola for two advanced reactor architectures: (1) Advanced Liquid Metal Reactor and (2) fluoride high-temperature reactor (FHR). The focus of this report is the release of the first beta version of the web-based application for model use and collaboration, as well as an update on the FHR model. The web-based application allows novice users to configure end-to-end system models from preconfigured choices to investigate the instrumentation and controls implications of these designs and allows for the collaborative development of individual component models that can be benchmarked against test systems for potential inclusion in the model library. A description of this application is provided along with examples of its use and a listing and discussion of all the models that currently exist in the library.

  10. Barotropic Tidal Predictions and Validation in a Relocatable Modeling Environment. Revised

    NASA Technical Reports Server (NTRS)

    Mehra, Avichal; Passi, Ranjit; Kantha, Lakshmi; Payne, Steven; Brahmachari, Shuvobroto

    1998-01-01

    Under funding from the Office of Naval Research (ONR) and the Naval Oceanographic Office (NAVOCEANO), the Mississippi State University Center for Air Sea Technology (CAST) has been working on developing a Relocatable Modeling Environment (RME) to provide a uniform and unbiased infrastructure for efficiently configuring numerical models in any geographic/oceanic region. Under NAVOCEANO funding, the model was implemented and tested for NAVOCEANO use. With our current emphasis on ocean tidal modeling, CAST has adopted the Colorado University numerical ocean model, known as the CURReNTSS (Colorado University Rapidly Relocatable Nestable Storm Surge) Model, as the model of choice. During the RME development process, CURReNTSS has been relocated to several coastal oceanic regions, providing excellent results that demonstrate its veracity. This report documents the model validation results and provides a brief description of the Graphical User Interface (GUI).

  11. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst case spectrum as a function of confidence level. The spectral representation that best fits these worst case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
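
    The fit-then-take-percentiles procedure sketched in this abstract can be illustrated roughly as follows. The two candidate spectral forms, the synthetic event spectra, and the use of scipy.optimize.curve_fit are assumptions made for the example; the actual models draw on a larger set of published spectral forms and a proper treatment of measurement uncertainty.

```python
# Rough sketch: fit each event's spectrum with several candidate forms, keep the
# best fit, then take the per-energy worst case across events at a chosen
# confidence level.  Candidate forms and data are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def power_law(E, A, gamma):
    return A * E ** (-gamma)

def exponential(E, A, E0):
    return A * np.exp(-E / E0)

candidates = [power_law, exponential]
energies = np.logspace(1, 2, 10)                 # MeV/nucleon grid (illustrative)

rng = np.random.default_rng(1)
events = [power_law(energies, rng.uniform(1e3, 1e5), rng.uniform(1.5, 3.0))
          * rng.lognormal(0.0, 0.2, energies.size) for _ in range(30)]

def best_fit(spectrum):
    """Evaluate every candidate form and keep the one with the smallest residual."""
    best = None
    for model in candidates:
        try:
            popt, _ = curve_fit(model, energies, spectrum,
                                p0=[spectrum[0], 2.0], maxfev=10000)
        except RuntimeError:
            continue
        resid = np.sum((model(energies, *popt) - spectrum) ** 2)
        if best is None or resid < best[0]:
            best = (resid, model(energies, *popt))
    return best[1]

fitted = np.array([best_fit(s) for s in events])

confidence = 0.90                                 # user-specified confidence level
worst_case = np.percentile(fitted, 100 * confidence, axis=0)
print(worst_case)
```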

  12. Analysis of a mammography teaching program based on an affordance design model.

    PubMed

    Luo, Ping; Eikman, Edward A; Kealy, William; Qian, Wei

    2006-12-01

    The wide use of computer technology in education, particularly in mammogram reading, calls for evaluation of e-learning. Existing media-comparison studies, learner attitude evaluations, and performance tests are problematic. Based on an affordance design model, this study examined an existing e-learning program on mammogram reading. The selection criteria included content relatedness, representativeness, e-learning orientation, image quality, program completeness, and accessibility. A case study was conducted to examine the affordance features, functions, and presentations of the selected software. Data collection and analysis methods included interviews, protocol-based document analysis, and usability tests and inspection; some descriptive statistics were also calculated. The examination identified that the educational software (PBE) provides a set of tools that the learner can use in the process of optimizing displays, scanning images, comparing different projections, marking regions of interest, constructing a descriptive report, assessing one's learning outcomes, and comparing one's decisions with the experts' decisions. Further, PBE provides resources for the learner to construct knowledge and skills, including a categorized image library, a term-searching function, and teaching links. Users found it easy to navigate and carry out tasks, and they reacted positively to PBE's navigation system, instructional aids, layout, pace and flow of information, graphics, and other presentation design. The software provides learners with cognitive tools that support their perceptual problem-solving processes and extend their capabilities. Learners can internalize the mental models of mammogram reading through multiple perceptual triangulations, sensitization to related features, semantic description of mammogram findings, and expert-guided semantic report construction. The design of these cognitive tools and of the software interface matches findings and principles in human learning and instructional design. Working with PBE's case-based simulations and categorized gallery, learners can enrich their experience and transfer it to their jobs.

  13. Global Document Delivery, User Studies, and Service Evaluation: The Gateway Experience

    ERIC Educational Resources Information Center

    Miller, Rush; Xu, Hong; Zou, Xiuying

    2008-01-01

    This study examines user and service data from 2002-2006 at the East Asian Gateway Service for Chinese and Korean Academic Journal Publications (Gateway Service), the University of Pittsburgh. Descriptive statistical analysis reveals that the Gateway Service has been consistently playing the leading role in global document delivery service as well…

  14. 76 FR 4750 - Survey of Information Sharing Practices With Affiliates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-26

    ... creditors or users of consumer reports. The OTS will use the Survey responses to prepare a report to Congress on the information sharing practices by financial institutions, creditors, or users of consumer... Numbers: N/A. Description: The OTS is required to submit a report to the Congress with any recommendations...

  15. Trends in OMR Techniques and Equipment.

    ERIC Educational Resources Information Center

    Ward, Obie; Poulos, Cynthia

    Various aspects of the Optical Mark Reader (OMR) used by the Atlanta Public School System are discussed. First considered are the required features of the OMR scanner. Following this, methods of motivating users to record data accurately are described. Finally, a description of how forms are designed for the convenience of users is provided. (PB)

  16. Week Long Topography Study of Young Adults Using Electronic Cigarettes in Their Natural Environment.

    PubMed

    Robinson, R J; Hensel, E C; Roundtree, K A; Difrancesco, A G; Nonnemaker, J M; Lee, Y O

    2016-01-01

    Results of an observational, descriptive study quantifying topography characteristics of twenty first generation electronic nicotine delivery system users in their natural environment for a one week observation period are presented. The study quantifies inter-participant variation in puffing topography between users and the intra-participant variation for each user observed during one week of use in their natural environment. Puff topography characteristics presented for each user include mean puff duration, flow rate and volume for each participant, along with descriptive statistics of each quantity. Exposure characteristics including the number of vaping sessions, total number of puffs and cumulative volume of aerosol generated from ENDS use (e-liquid aerosol) are reported for each participant for a one week exposure period and an effective daily average exposure. Significant inter-participant and intra-participant variation in puff topography was observed. The observed range of natural use environment characteristics is used to propose a set of topography protocols for use as command inputs to drive machine-puffed electronic nicotine delivery systems in a controlled laboratory environment.

  17. Week Long Topography Study of Young Adults Using Electronic Cigarettes in Their Natural Environment

    PubMed Central

    Roundtree, K. A.; Difrancesco, A. G.; Nonnemaker, J. M.; Lee, Y. O.

    2016-01-01

    Results of an observational, descriptive study quantifying topography characteristics of twenty first generation electronic nicotine delivery system users in their natural environment for a one week observation period are presented. The study quantifies inter-participant variation in puffing topography between users and the intra-participant variation for each user observed during one week of use in their natural environment. Puff topography characteristics presented for each user include mean puff duration, flow rate and volume for each participant, along with descriptive statistics of each quantity. Exposure characteristics including the number of vaping sessions, total number of puffs and cumulative volume of aerosol generated from ENDS use (e-liquid aerosol) are reported for each participant for a one week exposure period and an effective daily average exposure. Significant inter-participant and intra-participant variation in puff topography was observed. The observed range of natural use environment characteristics is used to propose a set of topography protocols for use as command inputs to drive machine-puffed electronic nicotine delivery systems in a controlled laboratory environment. PMID:27736944

  18. Video personalization for usage environment

    NASA Astrophysics Data System (ADS)

    Tseng, Belle L.; Lin, Ching-Yung; Smith, John R.

    2002-07-01

    A video personalization and summarization system is designed and implemented incorporating usage environment to dynamically generate a personalized video summary. The personalization system adopts the three-tier server-middleware-client architecture in order to select, adapt, and deliver rich media content to the user. The server stores the content sources along with their corresponding MPEG-7 metadata descriptions. Our semantic metadata is provided through the use of the VideoAnnEx MPEG-7 Video Annotation Tool. When the user initiates a request for content, the client communicates the MPEG-21 usage environment description along with the user query to the middleware. The middleware is powered by the personalization engine and the content adaptation engine. Our personalization engine includes the VideoSue Summarization on Usage Environment engine that selects the optimal set of desired contents according to user preferences. Afterwards, the adaptation engine performs the required transformations and compositions of the selected contents for the specific usage environment using our VideoEd Editing and Composition Tool. Finally, two personalization and summarization systems are demonstrated for the IBM Websphere Portal Server and for the pervasive PDA devices.

  19. BioUSeR: a semantic-based tool for retrieving Life Science web resources driven by text-rich user requirements

    PubMed Central

    2013-01-01

    Background Open metadata registries are a fundamental tool for researchers in the Life Sciences trying to locate resources. While most current registries assume that resources are annotated with well-structured metadata, evidence shows that most of the resource annotations simply consist of informal free text. This reality must be taken into account in order to develop effective techniques for resource discovery in Life Sciences. Results BioUSeR is a semantic-based tool aimed at retrieving Life Sciences resources described in free text. The retrieval process is driven by the user requirements, which consist of a target task and a set of facets of interest, both expressed in free text. BioUSeR is able to effectively exploit the available textual descriptions to find relevant resources by using semantic-aware techniques. Conclusions BioUSeR overcomes the limitations of the current registries thanks to: (i) rich specification of user information needs, (ii) use of semantics to manage textual descriptions, (iii) retrieval and ranking of resources based on user requirements. PMID:23635042

  20. User's manual for PRESTO: A computer code for the performance of regenerative steam turbine cycles

    NASA Technical Reports Server (NTRS)

    Fuller, L. C.; Stovall, T. K.

    1979-01-01

    Standard turbine cycles for baseload power plants and cycles with such additional features as process steam extraction and induction and feedwater heating by external heat sources may be modeled. Peaking and high back pressure cycles are also included. The code's methodology is to use the expansion line efficiencies, exhaust loss, leakages, mechanical losses, and generator losses to calculate the heat rate and generator output. A general description of the code is given as well as the instructions for input data preparation. Appended are two complete example cases.
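
    The bookkeeping summarized above can be caricatured in a few lines: the expansion-line efficiency fixes the actual enthalpy drop, mechanical and generator losses reduce shaft power to electrical output, and the heat rate is the cycle heat input per unit of generation. The numbers and the single-expansion simplification below are illustrative assumptions, not PRESTO's multi-stage regenerative cycle model.

```python
# Toy single-expansion illustration of the heat-rate bookkeeping described above.
# All values are invented; PRESTO models full regenerative cycles with extractions.

m_dot = 100.0                # steam flow, kg/s
h_inlet = 3400.0             # throttle enthalpy, kJ/kg
h_exit_isentropic = 2300.0   # enthalpy after an ideal (isentropic) expansion, kJ/kg
eta_expansion = 0.88         # expansion-line efficiency
q_in = 280_000.0             # cycle heat input, kW
mech_loss = 1_500.0          # mechanical losses, kW
gen_efficiency = 0.985       # generator efficiency

# Actual enthalpy drop is the ideal drop scaled by the expansion-line efficiency.
actual_drop = eta_expansion * (h_inlet - h_exit_isentropic)   # kJ/kg
shaft_power = m_dot * actual_drop - mech_loss                 # kW
generator_output = gen_efficiency * shaft_power               # kW (electrical)

# Net heat rate: heat input per unit of electrical generation.
heat_rate = q_in * 3600.0 / generator_output                  # kJ/kWh
print(f"generator output = {generator_output:.0f} kW, heat rate = {heat_rate:.0f} kJ/kWh")
```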

  1. Frequency Domain Identification Toolbox

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Juang, Jer-Nan; Chen, Chung-Wen

    1996-01-01

    This report documents software written in the MATLAB programming language for performing identification of systems from frequency response functions. MATLAB is a commercial software environment which allows easy manipulation of data matrices and provides other intrinsic matrix function capabilities. Algorithms programmed in this collection of subroutines have been documented elsewhere, but all references are provided in this document. A main feature of this software is the use of matrix fraction descriptions and system realization theory to identify state space models directly from test data. All subroutines have templates for the user to use as guidelines.
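
    The toolbox itself uses matrix fraction descriptions and system realization theory; as a much simpler illustration of the underlying idea of fitting a parametric model directly to frequency response data, the sketch below fits a low-order rational transfer function to a synthetic FRF by Levy-style linearized least squares. It is an assumption-laden stand-in, not the toolbox's algorithm.

```python
# Illustrative sketch (not the toolbox's algorithm): fit a rational model
# H(s) = (b0 + b1*s) / (1 + a1*s + a2*s^2) to frequency response data using
# Levy's linearized least squares.  A state-space realization could then be
# built from the fitted polynomial coefficients.
import numpy as np

# Synthetic FRF of a lightly damped second-order system ("test data").
wn, zeta = 10.0, 0.05
w = np.linspace(1.0, 30.0, 200)
s = 1j * w
H = wn**2 / (s**2 + 2 * zeta * wn * s + wn**2)

# Linearized residual: (b0 + b1*s) - H*(1 + a1*s + a2*s^2) = 0,
# rearranged as a linear system in [b0, b1, a1, a2] with right-hand side H.
A = np.column_stack([np.ones_like(s), s, -H * s, -H * s**2])
rhs = H

# Stack real and imaginary parts so the least-squares problem is real-valued.
A_ri = np.vstack([A.real, A.imag])
rhs_ri = np.concatenate([rhs.real, rhs.imag])
coeffs, *_ = np.linalg.lstsq(A_ri, rhs_ri, rcond=None)

b0, b1, a1, a2 = coeffs
print("fitted poles:", np.roots([a2, a1, 1.0]))   # roots of the fitted denominator
```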

  2. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding, the process of finding a position based on descriptive data such as an address or postal code, is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concept of proximity and nearness is not modelled appropriately, and these services search for an address only by address matching based on descriptive data. In addition, there are also some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system was designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, searching for places based on their location, non-point representation of results, and displaying search results based on their priority.
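
    A minimal sketch of the idea is given below: "nearness" to each reference place is expressed as a fuzzy membership over a grid, and the individual fuzzy distance maps are combined with a fuzzy overlay (here the minimum operator). The grid size, the linear membership shape, and the two reference places are assumptions for illustration, not the paper's implementation.

```python
# Illustrative fuzzy-nearness geocoding sketch: build a fuzzy distance map per
# descriptive clue, then overlay them to rank candidate locations.
import numpy as np

nx = ny = 100
xs, ys = np.meshgrid(np.arange(nx), np.arange(ny))

def nearness(px, py, full=5.0, zero=40.0):
    """Membership 1 within `full` cells of the place, falling linearly to 0 at `zero`."""
    d = np.hypot(xs - px, ys - py)
    return np.clip((zero - d) / (zero - full), 0.0, 1.0)

# Fuzzy distance maps for two clues, e.g. "near the station" and "near the park".
near_station = nearness(20, 30)
near_park = nearness(70, 60)

# Fuzzy overlay: a cell is a good candidate only if it is near *both* places.
candidates = np.minimum(near_station, near_park)

best = np.unravel_index(np.argmax(candidates), candidates.shape)
print("best grid cell (row, col):", best, "membership:", candidates[best])
```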

  3. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraph models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modeling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic.
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A minimum of 4Mb of free RAM is highly recommended. The UNIX version of FEAT includes both FEAT v3.6 for the Macintosh and XFEAT. XFEAT is written in C-language for Sun series workstations running SunOS, SGI workstations running IRIX, DECstations running ULTRIX, and Intergraph workstations running CLIX version 6. It requires the MIT X Window System, Version 11 Revision 4, with OSF/Motif 1.1.3, and 16Mb of RAM. The standard distribution medium for FEAT 3.6 (Macintosh version) is a set of three 3.5 inch Macintosh format diskettes. The standard distribution package for the UNIX version includes the three FEAT 3.6 Macintosh diskettes plus a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format which contains XFEAT. Alternate distribution media and formats for XFEAT are available upon request. FEAT has been under development since 1990. Both FEAT v3.6 for the Macintosh and XFEAT v3.5 were released in 1993.
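
    The forward half of the digraph propagation idea can be illustrated with a plain reachability computation: given a set of failure events, every node reachable from them in the digraph is flagged as affected. The component names and edges below are invented, and FEAT's actual preprocessing (transitive closure with "phantom bridge" gates for bi-directional queries) is considerably more involved than this sketch.

```python
# Illustrative failure propagation on a digraph model: nodes reachable from the
# selected failure events are marked as affected.  Names and edges are invented.
from collections import deque

digraph = {
    "power_bus":      ["pump_A", "pump_B", "avionics_fan"],
    "pump_A":         ["coolant_loop"],
    "pump_B":         ["coolant_loop"],
    "coolant_loop":   ["heat_exchanger"],
    "heat_exchanger": ["avionics_bay_temp"],
    "avionics_fan":   ["avionics_bay_temp"],
}

def propagate(failures):
    """Return every node affected by the given set of failure events."""
    affected = set(failures)
    frontier = deque(failures)
    while frontier:
        node = frontier.popleft()
        for downstream in digraph.get(node, []):
            if downstream not in affected:
                affected.add(downstream)
                frontier.append(downstream)
    return affected

# Example query: what happens if both pumps fail?
print(sorted(propagate({"pump_A", "pump_B"})))
```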

  4. What Does Music Sound Like for a Cochlear Implant User?

    PubMed

    Jiam, Nicole T; Caldwell, Meredith T; Limb, Charles J

    2017-09-01

    Cochlear implant research and product development over the past 40 years have been heavily focused on speech comprehension with little emphasis on music listening and enjoyment. The relatively little understanding of how music sounds in a cochlear implant user stands in stark contrast to the overall degree of importance the public places on music and quality of life. The purpose of this article is to describe what music sounds like to cochlear implant users, using a combination of existing research studies and listener descriptions. We examined the published literature on music perception in cochlear implant users, particularly postlingual cochlear implant users, with an emphasis on the primary elements of music and recorded music. Additionally, we administered an informal survey to cochlear implant users to gather first-hand descriptions of music listening experience and satisfaction from the cochlear implant population. Limitations in cochlear implant technology lead to a music listening experience that is significantly distorted compared with that of normal hearing listeners. On the basis of many studies and sources, we describe how music is frequently perceived as out-of-tune, dissonant, indistinct, emotionless, and weak in bass frequencies, especially for postlingual cochlear implant users, which may in part explain why music enjoyment and participation levels are lower after implantation. Additionally, cochlear implant users report difficulty in specific musical contexts based on factors including but not limited to genre, presence of lyrics, timbres (woodwinds, brass, instrument families), and complexity of the perceived music. Future research and cochlear implant development should target these areas as parameters for improvement in cochlear implant-mediated music perception.

  5. Successful communication does not drive language development: Evidence from adult homesign.

    PubMed

    Carrigan, Emily M; Coppola, Marie

    2017-01-01

    Constructivist accounts of language acquisition maintain that the language learner aims to match a target provided by mature users. Communicative problem solving in the context of social interaction and matching a linguistic target or model are presented as primary mechanisms driving the language development process. However, research on the development of homesign gesture systems by deaf individuals who have no access to a linguistic model suggests that aspects of language can develop even when typical input is unavailable. In four studies, we examined the role of communication in the genesis of homesign systems by assessing how well homesigners' family members comprehend homesign productions. In Study 1, homesigners' mothers showed poorer comprehension of homesign descriptions produced by their now-adult deaf child than of spoken Spanish descriptions of the same events produced by one of their adult hearing children. Study 2 found that the younger a family member was when they first interacted with their deaf relative, the better they understood the homesigner. Despite this, no family member comprehended homesign productions at levels that would be expected if family members co-generated homesign systems with their deaf relative via communicative interactions. Study 3 found that mothers' poor or incomplete comprehension of homesign was not a result of incomplete homesign descriptions. In Study 4 we demonstrated that Deaf native users of American Sign Language, who had no previous experience with the homesigners or their homesign systems, nevertheless comprehended homesign productions out of context better than the homesigners' mothers. This suggests that homesign has comprehensible structure, to which mothers and other family members are not fully sensitive. Taken together, these studies show that communicative problem solving is not responsible for the development of structure in homesign systems. The role of this mechanism must therefore be re-evaluated in constructivist theories of language development. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. 15 CFR Supplement No. 1 to Part 748 - BIS-748P, BIS-748P-A; Item Appendix, and BIS-748P-B; End-User Appendix; Multipurpose Application...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... “Tech. Specs.” box with an (X) if you are submitting descriptive literature, brochures, technical... or B. Be specific—vague descriptions such as “research”, “manufacturing”, or “scientific uses” are...

  7. 15 CFR Supplement No. 1 to Part 748 - BIS-748P, BIS-748P-A; Item Appendix, and BIS-748P-B; End-User Appendix; Multipurpose Application...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... security reasons. Mark the “Tech. Specs.” box with an (X) if you are submitting descriptive literature... descriptions such as “research”, “manufacturing”, or “scientific uses” are not acceptable. Block 22: For a...

  8. 15 CFR Supplement No. 1 to Part 748 - BIS-748P, BIS-748P-A; Item Appendix, and BIS-748P-B; End-User Appendix; Multipurpose Application...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... “Tech. Specs.” box with an (X) if you are submitting descriptive literature, brochures, technical... or B. Be specific—vague descriptions such as “research”, “manufacturing”, or “scientific uses” are...

  9. 15 CFR Supplement No. 1 to Part 748 - BIS-748P, BIS-748P-A; Item Appendix, and BIS-748P-B; End-User Appendix; Multipurpose Application...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... “Tech. Specs.” box with an (X) if you are submitting descriptive literature, brochures, technical... or B. Be specific—vague descriptions such as “research”, “manufacturing”, or “scientific uses” are...

  10. 15 CFR Supplement No. 1 to Part 748 - BIS-748P, BIS-748P-A; Item Appendix, and BIS-748P-B; End-User Appendix; Multipurpose Application...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... “Tech. Specs.” box with an (X) if you are submitting descriptive literature, brochures, technical... or B. Be specific—vague descriptions such as “research”, “manufacturing”, or “scientific uses” are...

  11. A SEASAT report. Volume 1: Program summary

    NASA Technical Reports Server (NTRS)

    Pounder, E. (Editor)

    1980-01-01

    The program background and experiment objectives are summarized, and a description of the organization and interfaces of the project is provided. The mission plan and history are also included, as are user activities and a brief description of the data system. A financial and manpower summary and preliminary results of the mission are also included.

  12. Data users note: Apollo 17 lunar photography

    NASA Technical Reports Server (NTRS)

    Cameron, W. S.; Doyle, F. J.; Levenson, L.; Michlovitz, K.

    1974-01-01

    The availability of Apollo 17 pictorial data is announced as an aid to the selection of the photographs for study. Brief descriptions are presented of the Apollo 17 flight, and the photographic equipment used during the flight. The following descriptions are also included: service module photography, command module photography, and lunar surface photography.

  13. Ground Software Maintenance Facility (GSMF) user's manual

    NASA Technical Reports Server (NTRS)

    Aquila, V.; Derrig, D.; Griffith, G.

    1986-01-01

    Instructions are provided so that the Ground Software Maintenance Facility (GSMF) system user can operate the GSMF in all modes. The GSMF provides the resources for the Automatic Test Equipment (ATE) computer program maintenance (GCOS and GOAL). Applicable reference documents are listed. An operational overview and descriptions of the modes in terms of operator interface, options, equipment, material utilization, and operational procedures are included. Test restart procedures are described. The GSMF documentation tree is presented, including the user manual.

  14. A computational system for aerodynamic design and analysis of supersonic aircraft. Part 2: User's manual

    NASA Technical Reports Server (NTRS)

    Middleton, W. D.; Lundry, J. L.; Coleman, R. G.

    1976-01-01

    An integrated system of computer programs was developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. This user's manual contains a description of the system, an explanation of its usage, the input definition, and example output.

  15. Applications For Real Time NOMADS At NCEP To Disseminate NOAA's Operational Model Data Base

    NASA Astrophysics Data System (ADS)

    Alpert, J. C.; Wang, J.; Rutledge, G.

    2007-05-01

    A wide range of environmental information, in digital form, with metadata descriptions and supporting infrastructure is contained in the NOAA Operational Modeling Archive Distribution System (NOMADS) and its Real Time (RT) project prototype at the National Centers for Environmental Prediction (NCEP). NOMADS is now delivering on its goal of a seamless framework, from archival to real-time data dissemination, for NOAA's operational model data holdings. A process is under way to make NOMADS part of NCEP's operational production of products. A goal is to foster collaborations among the research and education communities, value-added retailers, and public access for science and development. In the National Research Council's "Completing the Forecast", Recommendation 3.4 states: "NOMADS should be maintained and extended to include (a) long-term archives of the global and regional ensemble forecasting systems at their native resolution, and (b) re-forecast datasets to facilitate post-processing." As one of many participants in NOMADS, NCEP serves the operational model database using the data access protocol (OPeNDAP) and other services for participants to serve their data sets and users to obtain them. Using the NCEP global ensemble data as an example, we show an OPeNDAP (also known as DODS) client application that provides a request-and-fulfill mechanism for access to the complex ensemble matrix of holdings. As an example of the DAP service, we show a client application which accesses the Global or Regional Ensemble data set to produce user-selected weather-element event probabilities. The event probabilities are easily extended over model forecast time to show probability histograms defining the future trend of user-selected events. This approach ensures an efficient use of computer resources because users transmit only the data necessary for their tasks. Data sets are served by OPeNDAP, allowing commercial clients such as MATLAB or IDL as well as freeware clients such as GrADS to access the NCEP real-time database. We will demonstrate how users can use NOMADS services to repackage area subsets and select levels and variables that are sent to a user-selected FTP site. NOMADS can also display plots on demand for area subsets, selected levels, time series, and selected variables.
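
    The request-and-fulfill access pattern described above can be sketched with any OPeNDAP-aware client; the netCDF4 package, for instance, opens a DAP URL like a local file and transfers only the slices that are actually indexed. The URL and variable name below are placeholders, not a guaranteed live NOMADS endpoint.

```python
# Minimal OPeNDAP client sketch in the spirit described above: only the requested
# subset travels over the network.  URL and variable name are placeholders.
from netCDF4 import Dataset

url = "https://nomads.example.gov/dods/gens/gep_all_00z"   # hypothetical DAP endpoint
ds = Dataset(url)                                          # opens the remote dataset lazily

# Pull a small subset: first forecast time, all ensemble members, one grid box.
t2m = ds.variables["tmp2m"][0, :, 100:110, 200:210]

# A simple event probability across the ensemble dimension (axis 0 after slicing).
prob_above_freezing = (t2m > 273.15).mean(axis=0)
print(prob_above_freezing.shape)
ds.close()
```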

  16. Collaborative Information Retrieval Method among Personal Repositories

    NASA Astrophysics Data System (ADS)

    Kamei, Koji; Yukawa, Takashi; Yoshida, Sen; Kuwabara, Kazuhiro

    In this paper, we describe a collaborative information retrieval method among personal repositories and an implementation of the method on a personal agent framework. We propose a framework for personal agents that aims to enable the sharing and exchange of information resources that are distributed unevenly among individuals. The kernel of the personal agent framework is an RDF (Resource Description Framework)-based information repository for storing, retrieving and manipulating privately collected information, such as documents the user read and/or wrote, email he/she exchanged, web pages he/she browsed, etc. The repository also collects annotations to information resources that describe relationships among information resources and records of interaction between the user and information resources. Since the information resources in a personal repository and their structure are personalized, information retrieval from other users' repositories is an important application of the personal agent. A vector space model with a personalized concept-base is employed as the information retrieval mechanism in a personal repository. Since a personalized concept-base is constructed from information resources in a personal repository, it reflects its user's knowledge and interests. On the other hand, this leads to another problem when querying other users' personal repositories: simply transferring query requests does not provide desirable results. To solve this problem, we propose a query equalization scheme based on a relevance feedback method for collaborative information retrieval between personalized concept-bases. In this paper, we describe an implementation of the collaborative information retrieval method and its user interface on the personal agent framework.
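
    The vector space retrieval step named above, stripped of the personalized concept-base and of the query equalization scheme that the paper adds, can be sketched with plain term-frequency vectors and cosine similarity. The tiny "repository" and its document identifiers are invented for illustration.

```python
# Bare-bones vector space retrieval over a toy personal repository (illustrative).
import math
from collections import Counter

repository = {
    "mail-042": "meeting notes about the personal agent framework",
    "web-017":  "tutorial on resource description framework and rdf stores",
    "doc-003":  "draft paper on collaborative information retrieval",
}

def vectorize(text):
    """Term-frequency vector; a concept-base would map terms to concepts here."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

query = vectorize("retrieval from personal repositories")
ranked = sorted(repository,
                key=lambda doc_id: cosine(query, vectorize(repository[doc_id])),
                reverse=True)
print(ranked)
```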

  17. Non-integer viscoelastic constitutive law to model soft biological tissues to in-vivo indentation.

    PubMed

    Demirci, Nagehan; Tönük, Ergin

    2014-01-01

    During the last decades, derivatives and integrals of non-integer orders have been more commonly used for the description of the constitutive behavior of various viscoelastic materials, including soft biological tissues. Compared to integer order constitutive relations, non-integer order viscoelastic material models of soft biological tissues are capable of capturing a wider range of viscoelastic behavior obtained from experiments. Although integer order models may yield comparably accurate results, non-integer order material models have fewer parameters to be identified, in addition to describing an intermediate material that can monotonically and continuously be adjusted between an ideal elastic solid and an ideal viscous fluid. In this work, starting with some preliminaries on non-integer (fractional) calculus, the "spring-pot" (an intermediate mechanical element between a solid and a fluid) and the non-integer order three-element (Zener) solid model are introduced; finally, a user-defined large-strain non-integer order viscoelastic constitutive model was constructed to be used in finite element simulations. Using the constitutive equation developed, soft tissue material identification was performed by means of the inverse finite element method and in vivo indentation experiments. The results indicate that material coefficients obtained from relaxation experiments, when optimized with creep experimental data, could simulate relaxation, creep, and cyclic loading and unloading experiments accurately. Non-integer calculus viscoelastic constitutive models, which have a physical interpretation and model experimental data accurately, are a good alternative to classical phenomenological viscoelastic constitutive equations.
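
    In one common notation (given here as an illustration; it is not necessarily the exact form used in the paper), the spring-pot element and the fractional three-element (Zener) solid can be written as:

```latex
% Spring-pot: an element intermediate between a spring (alpha = 0) and a dashpot (alpha = 1)
\sigma(t) = E\,\tau^{\alpha}\,\frac{d^{\alpha}\varepsilon(t)}{dt^{\alpha}},
\qquad 0 \le \alpha \le 1 .

% One common form of the fractional (non-integer order) Zener solid
\sigma(t) + \tau^{\alpha}\,\frac{d^{\alpha}\sigma(t)}{dt^{\alpha}}
  = E_{0}\,\varepsilon(t) + E_{1}\,\tau^{\alpha}\,\frac{d^{\alpha}\varepsilon(t)}{dt^{\alpha}} .
```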

  18. Full-body gestures and movements recognition: user descriptive and unsupervised learning approaches in GDL classifier

    NASA Astrophysics Data System (ADS)

    Hachaj, Tomasz; Ogiela, Marek R.

    2014-09-01

    Gesture Description Language (GDL) is a classifier that enables syntactic description and real-time recognition of full-body gestures and movements. Gestures are described in a dedicated computer language named Gesture Description Language script (GDLs). In this paper we introduce new GDLs formalisms that enable recognition of selected classes of movement trajectories. The second novelty is a new unsupervised learning method with which it is possible to automatically generate GDLs descriptions. We have initially evaluated both proposed extensions of GDL and have obtained very promising results. Both the novel methodology and the evaluation results are described in this paper.

  19. The DREO Elint Browser Utility (DEBU) reference manual

    NASA Astrophysics Data System (ADS)

    Ford, Barbara; Jones, David

    1992-04-01

    An electronic intelligence (ELINT) database browsing tool called DEBU has been developed that allows databases such as ELP, Kilting, EWIR, and AFEWC to be reviewed and analyzed from a user-friendly environment on a personal computer. DEBU's basic function is to allow users to examine the contents of user-selected subfiles of user-selected emitters of user-selected databases. DEBU augments this functionality with support for selecting (filtering) and combining subsets of emitters by user-selected attributes such as name, parameter type, or parameter value. DEBU provides facilities for examining histograms and x-y plots of selected parameters, for doing ambiguity analysis and mode level analysis, and for generating and printing a variety of reports. A manual is provided for users of DEBU, including descriptions and illustrations of menus and windows.

  20. Users' acceptance and attitude in regarding electronic medical record at central polyclinic of oil industry in Isfahan, Iran.

    PubMed

    Tavakoli, Nahid; Shahin, Arash; Jahanbakhsh, Maryam; Mokhtari, Habibollah; Rafiei, Maryam

    2013-01-01

    Simultaneously with the rapid changes in technology and information systems, hospitals are interested in using them. One of the most common systems in hospitals is the electronic medical record (EMR), one of whose uses is providing better health care quality via health information technology. Prior to its use, effort should be put into identifying the factors affecting acceptance, attitude, and utilization of this technology. The current article aimed to study the factors affecting EMR acceptance, using the technology acceptance model (TAM), at the central polyclinic of the Oil Industry in Isfahan. This was a practical, descriptive, and regression study. The research population comprised all EMR users at the polyclinic of the Oil Industry in 2012, with simple random sampling of 62 users. The data collection tool was a researcher-made questionnaire based on TAM. The validity of the questionnaire was established through content validity and the views of health information technology experts, and its reliability by test-retest. The system users had a positive attitude toward using EMR (56.6%). However, users were not very satisfied with the external (38.14%) and behavioral (47.8%) factors affecting use of the system. Perceived ease of use (PEU) and perceived usefulness (PU) were at a good level. The relative lack of satisfaction with EMR use derives from factors such as appearance, screen, data and information quality, and terminology. This study suggests improving the system and the efficiency of the users by developing the software's external factors, so that PEU and users' attitudes change and move in a positive direction.

  1. Pragmatic service development and customisation with the CEDA OGC Web Services framework

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Stephens, Ag; Lowe, Dominic

    2010-05-01

    The CEDA OGC Web Services framework (COWS) emphasises rapid service development by providing a lightweight layer of OGC web service logic on top of Pylons, a mature web application framework for the Python language. This approach gives developers a flexible web service development environment without compromising access to the full range of web application tools and patterns: Model-View-Controller paradigm, XML templating, Object-Relational-Mapper integration and authentication/authorization. We have found this approach useful for exploring evolving standards and implementing protocol extensions to meet the requirements of operational deployments. This paper outlines how COWS is being used to implement customised WMS, WCS, WFS and WPS services in a variety of web applications from experimental prototypes to load-balanced cluster deployments serving 10-100 simultaneous users. In particular we will cover 1) The use of Climate Science Modeling Language (CSML) in complex-feature aware WMS, WCS and WFS services, 2) Extending WMS to support applications with features specific to earth system science and 3) A cluster-enabled Web Processing Service (WPS) supporting asynchronous data processing. The COWS WPS underpins all backend services in the UK Climate Projections User Interface where users can extract, plot and further process outputs from a multi-dimensional probabilistic climate model dataset. The COWS WPS supports cluster job execution, result caching, execution time estimation and user management. The COWS WMS and WCS components drive the project-specific NCEO and QESDI portals developed by the British Atmospheric Data Centre. These portals use CSML as a backend description format and implement features such as multiple WMS layer dimensions and climatology axes that are beyond the scope of general purpose GIS tools and yet vital for atmospheric science applications.

  2. MAC/GMC 4.0 User's Manual: Example Problem Manual. Volume 3

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2002-01-01

    This document is the third volume in the three volume set of User's Manuals for the Micromechanics Analysis Code with Generalized Method of Cells Version 4.0 (MAC/GMC 4.0). Volume 1 is the Theory Manual, Volume 2 is the Keywords Manual, and this document is the Example Problems Manual. MAC/GMC 4.0 is a composite material and laminate analysis software program developed at the NASA Glenn Research Center. It is based on the generalized method of cells (GMC) micromechanics theory, which provides access to the local stress and strain fields in the composite material. This access grants GMC the ability to accommodate arbitrary local models for inelastic material behavior and various types of damage and failure analysis. MAC/GMC 4.0 has been built around GMC to provide the theory with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material, have been automated in MAC/GMC 4.0. Finally, classical lamination theory has been implemented within MAC/GMC 4.0 wherein GMC is used to model the composite material response of each ply. Consequently, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. This volume provides in-depth descriptions of 43 example problems, which were specially designed to highlight many of the most important capabilities of the code. The actual input files associated with each example problem are distributed with the MAC/GMC 4.0 software; thus providing the user with a convenient starting point for their own specialized problems of interest.

  3. Optimisation of a Generic Ionic Model of Cardiac Myocyte Electrical Activity

    PubMed Central

    Guo, Tianruo; Al Abed, Amr; Lovell, Nigel H.; Dokos, Socrates

    2013-01-01

    A generic cardiomyocyte ionic model, whose complexity lies between a simple phenomenological formulation and a biophysically detailed ionic membrane current description, is presented. The model provides a user-defined number of ionic currents, employing two-gate Hodgkin-Huxley type kinetics. Its generic nature allows accurate reconstruction of action potential waveforms recorded experimentally from a range of cardiac myocytes. Using a multiobjective optimisation approach, the generic ionic model was optimised to accurately reproduce multiple action potential waveforms recorded from central and peripheral sinoatrial nodes and right atrial and left atrial myocytes from rabbit cardiac tissue preparations, under different electrical stimulus protocols and pharmacological conditions. When fitted simultaneously to multiple datasets, the time course of several physiologically realistic ionic currents could be reconstructed. Model behaviours tend to be well identified when extra experimental information is incorporated into the optimisation. PMID:23710254
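
    A single generic current with two-gate Hodgkin-Huxley kinetics, of the kind the model stacks a user-defined number of times, can be sketched as below. The sigmoidal steady-state form, the fixed time constants, the parameter values, and the forward Euler integration are assumptions made for illustration, not the paper's optimised formulation.

```python
# Sketch of one generic ionic current with two-gate Hodgkin-Huxley kinetics.
# All parameter values and the simple voltage-clamp protocol are illustrative.
import numpy as np

def steady_state(V, V_half, k):
    """Sigmoidal steady-state activation/inactivation curve."""
    return 1.0 / (1.0 + np.exp(-(V - V_half) / k))

def generic_current(V, m, h, g_max=1.0, E_rev=40.0):
    """I = g_max * m * h * (V - E_rev): one activation and one inactivation gate."""
    return g_max * m * h * (V - E_rev)

dt, t_end = 0.01, 50.0                        # ms
m, h = 0.0, 1.0
trace = []
for step in range(int(t_end / dt)):
    V = 0.0 if step * dt > 5.0 else -80.0     # crude voltage-clamp step at t = 5 ms
    m_inf = steady_state(V, V_half=-20.0, k=5.0)
    h_inf = steady_state(V, V_half=-60.0, k=-5.0)   # negative slope -> inactivation
    m += dt * (m_inf - m) / 1.0               # dm/dt = (m_inf - m) / tau_m, tau_m = 1 ms
    h += dt * (h_inf - h) / 10.0              # dh/dt = (h_inf - h) / tau_h, tau_h = 10 ms
    trace.append(generic_current(V, m, h))

print("peak inward current:", min(trace), "peak outward current:", max(trace))
```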

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mudholkar, Mihir; Ahmed, Shamin; Ericson, Milton Nance

    A compact model for SiC Power MOSFETs is presented. The model features a physical description of the channel current and internal capacitances and has been validated for dc, CV, and switching characteristics with measured data from a 1200-V, 20-A SiC power MOSFET in a temperature range of 25 degrees C to 225 degrees C. The peculiar variation of on-state resistance with temperature for SiC power MOSFETs has also been demonstrated through measurements and accounted for in the developed model. In order to improve the user experience with the model, a new datasheet-driven parameter extraction strategy has been presented which requires only data available in device datasheets, to enable quick parameter extraction for off-the-shelf devices. Excellent agreement is shown between measurement and simulation using the presented model over the entire temperature range.
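
    The report's model is a physical, validated description of a specific 1200-V device; purely to illustrate how temperature typically enters a compact channel-current model, the sketch below uses a textbook square-law current with simple temperature scalings of threshold voltage and transconductance. All values and scaling laws are generic assumptions, not the parameters of the presented model.

```python
# Textbook-style illustration of temperature dependence in a compact MOSFET
# channel-current model.  Values and scaling laws are generic assumptions.

def channel_current(vgs, vds, temp_c,
                    vth_25=4.0, dvth_dt=-0.008,    # threshold at 25 degC and its drift (V/degC)
                    k_25=1.2, mobility_exp=1.5):   # transconductance at 25 degC (A/V^2)
    """Square-law drain current with simple temperature scaling."""
    vth = vth_25 + dvth_dt * (temp_c - 25.0)
    k = k_25 * ((temp_c + 273.15) / 298.15) ** (-mobility_exp)   # mobility degradation
    vov = vgs - vth
    if vov <= 0.0:
        return 0.0                               # cut-off
    if vds < vov:
        return k * (vov * vds - 0.5 * vds ** 2)  # linear (ohmic) region
    return 0.5 * k * vov ** 2                    # saturation

for t in (25.0, 125.0, 225.0):
    print(f"{t:5.0f} degC: Id = {channel_current(vgs=15.0, vds=1.0, temp_c=t):5.2f} A")
```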

  5. Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC), version 4.0: User's manual

    NASA Technical Reports Server (NTRS)

    Whyte, Wayne A., Jr.; Heyward, Ann O.; Ponchak, Denise S.; Spence, Rodney L.; Zuzek, John E.

    1988-01-01

    The information in the NASARC (Version 4.0) Technical Manual (NASA-TM-101453) and NASARC (Version 4.0) User's Manual (NASA-TM-101454) relates to the state of Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software development through November 1, 1988. The Technical Manual describes the NASARC concept and the algorithms used to implement the concept. The User's Manual provides information on computer system considerations, installation instructions, description of input files, and program operation instructions. Significant revisions were incorporated in the Version 4.0 software over prior versions. These revisions have further enhanced the modeling capabilities of the NASARC procedure and provide improved arrangements of predetermined arcs within the geostationary orbit. Array dimensions within the software were structured to fit within the currently available 12-megabyte memory capacity of the International Frequency Registration Board (IFRB) computer facility. A piecewise approach to predetermined arc generation in NASARC (Version 4.0) allows worldwide planning problem scenarios to be accommodated within computer run time and memory constraints with enhanced likelihood and ease of solution.

  6. Numerical Arc Segmentation Algorithm for a Radio Conference-NASARC, Version 2.0: User's Manual

    NASA Technical Reports Server (NTRS)

    Whyte, Wayne A., Jr.; Heyward, Ann O.; Ponchak, Denise S.; Spence, Rodney L.; Zuzek, John E.

    1987-01-01

    The information contained in the NASARC (Version 2.0) Technical Manual (NASA TM-100160) and the NASARC (Version 2.0) User's Manual (NASA TM-100161) relates to the state of the Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software development through October 16, 1987. The technical manual describes the NASARC concept and the algorithms which are used to implement it. The User's Manual provides information on computer system considerations, installation instructions, description of input files, and program operation instructions. Significant revisions have been incorporated in the Version 2.0 software over prior versions. These revisions have enhanced the modeling capabilities of the NASARC procedure while greatly reducing the computer run time and memory requirements. Array dimensions within the software have been structured to fit into the currently available 6-megabyte memory capacity of the International Frequency Registration Board (IFRB) computer facility. A piecewise approach to predetermined arc generation in NASARC (Version 2.0) allows worldwide scenarios to be accommodated within these memory constraints while at the same time reducing computer run time.

  7. The EGS5 Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirayama, Hideo; Namito, Yoshihito; /KEK, Tsukuba

    2005-12-20

    In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application brings new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics, and required a more substantial overhaul than code patching. Moreover, the arcane MORTRAN3[48] computer language of EGS4 was highest on the complaint list of the users of EGS4. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process. However, some idea of the numbers may be gleaned from the number of EGS4 manuals that were produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler, yet include, as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report[126] (SLAC-265) and the old EGS3 report[61] (SLAC-210), in which all the details of the old physics (i.e., models which were carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary. With the release of the EGS4 version, a deliberate attempt was made to present example problems in order to help the user "get started", and we follow that spirit in this report. A series of elementary tutorial user codes are presented in Chapter 3, with more sophisticated sample user codes described in Chapter 4. Novice EGS users will find it helpful to read through the initial sections of the EGS5 User Manual (provided in Appendix B of this report), proceeding then to work through the tutorials in Chapter 3. The User Manuals and other materials found in the appendices contain detailed flow charts, variable lists, and subprogram descriptions of EGS5 and PEGS. Included are step-by-step instructions for developing basic EGS5 user codes and for accessing all of the physics options available in EGS5 and PEGS. Once acquainted with the basic structure of EGS5, users should find the appendices the most frequently consulted sections of this report.

  8. Descriptive modelling to predict deoxynivalenol in winter wheat in the Netherlands.

    PubMed

    Van Der Fels-Klerx, H J; Burgers, S L G E; Booij, C J H

    2010-05-01

    Predictions of deoxynivalenol (DON) content in wheat at harvest can be useful for decision-making by stakeholders of the wheat feed and food supply chain. The objective of the current research was to develop quantitative predictive models for DON in mature winter wheat in the Netherlands for two specific groups of end-users. One model was developed for use by farmers in underpinning Fusarium spp. disease management, specifically the application of fungicides around wheat flowering (model A). The second model was developed for industry and food safety authorities, and considered the entire wheat cultivation period (model B). Model development was based on observational data collected from 425 fields throughout the Netherlands between 2001 and 2008. For each field, agronomical information, climatic data and DON levels in mature wheat were collected. Using multiple regression analyses, the set of biologically relevant variables that provided the highest statistical performance was selected. The two final models include the following variables: region, wheat resistance level, spraying, flowering date, several climatic variables in the different stages of wheat growing, and length of the period between flowering and harvesting (model B only). The percentages of variance accounted for were 64.4% and 65.6% for models A and B, respectively. Model validation showed high correlation between the predicted and observed DON levels. The two models may be applied by various groups of end-users to reduce DON contamination in wheat-derived feed and food products and, ultimately, reduce animal and consumer health risks.
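    The record's modelling approach, multiple regression over agronomic and climatic predictors, can be sketched as follows; the predictor set, data, and coefficients are simulated stand-ins rather than the published models A and B.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical predictor matrix: columns standing in for a regional indicator,
    # wheat resistance score, fungicide spraying (0/1), flowering date, and a climate summary.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 5))
    y = 2.0 + X @ np.array([0.8, -0.5, -1.2, 0.3, 0.6]) + rng.normal(scale=0.5, size=60)

    model = sm.OLS(y, sm.add_constant(X)).fit()
    print(model.rsquared)   # analogous to the "percentage of variance accounted for"
    print(model.params)     # fitted coefficients for each candidate variable
    ```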

  9. Modeling antecedents of electronic medical record system implementation success in low-resource setting hospitals.

    PubMed

    Tilahun, Binyam; Fritz, Fleur

    2015-08-01

    With the increasing implementation of Electronic Medical Record Systems (EMR) in developing countries, there is a growing need to identify antecedents of EMR success to measure and predict the level of adoption before costly implementation. However, less evidence is available about EMR success in the context of low-resource setting implementations. Therefore, this study aims to fill this gap by examining the constructs and relationships of the widely used DeLone and McLean (D&M) information system success model to determine whether it can be applied to measure EMR success in those settings. A quantitative cross-sectional study design using self-administered questionnaires was used to collect data from 384 health professionals working in five governmental hospitals in Ethiopia. The hospitals have been using a comprehensive EMR system for three years. Descriptive and structural equation modeling methods were applied to describe and validate the extent of relationship of constructs and mediating effects. The findings of the structural equation modeling show that system quality has significant influence on EMR use (β = 0.32, P < 0.05) and user satisfaction (β = 0.53, P < 0.01); information quality has significant influence on EMR use (β = 0.44, P < 0.05) and user satisfaction (β = 0.48, P < 0.01); and service quality has strong significant influence on EMR use (β = 0.36, P < 0.05) and user satisfaction (β = 0.56, P < 0.01). User satisfaction has significant influence on EMR use (β = 0.41, P < 0.05) but the effect of EMR use on user satisfaction was not significant. Both EMR use and user satisfaction have significant influence on perceived net-benefit (β = 0.31, P < 0.01; β = 0.60, P < 0.01), respectively. Additionally, computer literacy was found to be a mediating factor in the relationship between service quality and EMR use (P < 0.05) as well as user satisfaction (P < 0.01). Among all the constructs, user satisfaction showed the strongest effect on perceived net-benefit of health professionals. EMR implementers and managers in developing countries are in urgent need of implementation models to design proper implementation strategies. In this study, the constructs and relationships depicted in the updated D&M model were found to be applicable to assess the success of EMR in low resource settings. Additionally, computer literacy was found to be a mediating factor in EMR use and user satisfaction of health professionals. Hence, EMR implementers and managers in those settings should give priority to improving hospital service quality, such as technical support and infrastructure; provide continuous basic computer training to health professionals; and pay attention to the system and information quality of the systems they want to implement.

  10. A Combinatorial Geometry Computer Description of the XR311 Vehicle

    DTIC Science & Technology

    1978-04-01

    cards or magnetic tape. The shot line output of the GRID subroutine of the GIFT code is also stored on magnetic tape for future vulnerability... descriptions as processed by the Geometric Information For Targets (GIFT) computer code. This report documents the COM-GEOM target description for all... 72, March 1974. L.W. Bains and M.J. Reisinger, "The GIFT Code User Manual, Vol. 1, Introduction and Input Requirements," Ballistic Research

  11. CCBuilder: an interactive web-based tool for building, designing and assessing coiled-coil protein assemblies.

    PubMed

    Wood, Christopher W; Bruning, Marc; Ibarra, Amaurys Á; Bartlett, Gail J; Thomson, Andrew R; Sessions, Richard B; Brady, R Leo; Woolfson, Derek N

    2014-11-01

    The ability to accurately model protein structures at the atomistic level underpins efforts to understand protein folding, to engineer natural proteins predictably and to design proteins de novo. Homology-based methods are well established and produce impressive results. However, these are limited to structures presented by and resolved for natural proteins. Addressing this problem more widely and deriving truly ab initio models requires mathematical descriptions for protein folds; the means to decorate these with natural, engineered or de novo sequences; and methods to score the resulting models. We present CCBuilder, a web-based application that tackles the problem for a defined but large class of protein structure, the α-helical coiled coils. CCBuilder generates coiled-coil backbones, builds side chains onto these frameworks and provides a range of metrics to measure the quality of the models. Its straightforward graphical user interface provides broad functionality that allows users to build and assess models, in which helix geometry, coiled-coil architecture and topology and protein sequence can be varied rapidly. We demonstrate the utility of CCBuilder by assembling models for 653 coiled-coil structures from the PDB, which cover >96% of the known coiled-coil types, and by generating models for rarer and de novo coiled-coil structures. CCBuilder is freely available, without registration, at http://coiledcoils.chm.bris.ac.uk/app/cc_builder/. © The Author 2014. Published by Oxford University Press.
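    For orientation only, the sketch below traces the superhelical axes of a two-chain coiled coil from a very simple parameterisation; it is not CCBuilder's backbone-building algorithm, and the radius, pitch, and rise values are generic textbook-style numbers rather than fitted parameters.

    ```python
    import numpy as np

    def coiled_coil_axis(n_res, r0=5.0, pitch=140.0, rise=1.495, handedness=-1.0):
        """Trace a left-handed superhelical axis (approximate, axis-only sketch).
        r0: superhelical radius (angstroms), pitch: superhelical pitch (angstroms),
        rise: rise per residue along the bundle axis. These are typical values for
        a dimeric coiled coil, not CCBuilder output."""
        z = np.arange(n_res) * rise
        phase = handedness * 2.0 * np.pi * z / pitch
        return np.stack([r0 * np.cos(phase), r0 * np.sin(phase), z], axis=1)

    axis_chain_a = coiled_coil_axis(28)
    # Second chain: same axis rotated 180 degrees about the bundle (z) axis
    rot_z_180 = np.array([[-1.0, 0.0, 0.0], [0.0, -1.0, 0.0], [0.0, 0.0, 1.0]])
    axis_chain_b = coiled_coil_axis(28) @ rot_z_180
    print(axis_chain_a.shape)  # (28, 3)
    ```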

  12. On the engineering design for systematic integration of agent-orientation in industrial automation.

    PubMed

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have a great potential for increasing autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematical integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Development of a realistic stress analysis for fatigue analysis of notched composite laminates

    NASA Technical Reports Server (NTRS)

    Humphreys, E. A.; Rosen, B. W.

    1979-01-01

    A finite element stress analysis which consists of a membrane and interlaminar shear spring analysis was developed. This approach was utilized in order to model physically realistic failure mechanisms while maintaining a high degree of computational economy. The accuracy of the stress analysis predictions is verified through comparisons with other solutions to the composite laminate edge effect problem. The stress analysis model was incorporated into an existing fatigue analysis methodology and the entire procedure computerized. A fatigue analysis is performed upon a square laminated composite plate with a circular central hole. A complete description and users guide for the computer code FLAC (Fatigue of Laminated Composites) is included as an appendix.

  14. User interface development and metadata considerations for the Atmospheric Radiation Measurement (ARM) archive

    NASA Technical Reports Server (NTRS)

    Singley, P. T.; Bell, J. D.; Daugherty, P. F.; Hubbs, C. A.; Tuggle, J. G.

    1993-01-01

    This paper will discuss user interface development and the structure and use of metadata for the Atmospheric Radiation Measurement (ARM) Archive. The ARM Archive, located at Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee, is the data repository for the U.S. Department of Energy's (DOE's) ARM Project. After a short description of the ARM Project and the ARM Archive's role, we will consider the philosophy and goals, constraints, and prototype implementation of the user interface for the archive. We will also describe the metadata that are stored at the archive and support the user interface.

  15. PEGASUS User's Guide. 5.1c

    NASA Technical Reports Server (NTRS)

    Suhs, Norman E.; Dietz, William E.; Rogers, Stuart E.; Nash, Steven M.; Onufer, Jeffrey T.

    2000-01-01

    PEGASUS 5.1 is the latest version of the PEGASUS series of mesh interpolation codes. It is a fully three-dimensional code. The main purpose for the development of this latest version was to significantly decrease the number of user inputs required and to allow for easier operation of the code. This guide is to be used with the user's manual for version 4 of PEGASUS. A basic description of methods used in both versions is described in the Version 4 manual. A complete list of all user inputs used in version 5.1 is given in this guide.

  16. Automated database design from natural language input

    NASA Technical Reports Server (NTRS)

    Gomez, Fernando; Segami, Carlos; Delaune, Carl

    1995-01-01

    Users and programmers of small systems typically do not have the skills needed to design a database schema from an English description of a problem. This paper describes a system that automatically designs databases for such small applications from English descriptions provided by end-users. Although the system has been motivated by the space applications at Kennedy Space Center, and portions of it have been designed with that idea in mind, it can be applied to different situations. The system consists of two major components: a natural language understander and a problem-solver. The paper describes briefly the knowledge representation structures constructed by the natural language understander, and, then, explains the problem-solver in detail.

  17. Design and implementation of a portal for the medical equipment market: MEDICOM.

    PubMed

    Palamas, S; Kalivas, D; Panou-Diamandi, O; Zeelenberg, C; van Nimwegen, C

    2001-01-01

    The MEDICOM (Medical Products Electronic Commerce) Portal provides the electronic means for medical-equipment manufacturers to communicate online with their customers while supporting the Purchasing Process and Post Market Surveillance. The Portal offers a powerful Internet-based search tool for finding medical products and manufacturers. Its main advantage is the fast, reliable and up-to-date retrieval of information while eliminating all unrelated content that a general-purpose search engine would retrieve. The Universal Medical Device Nomenclature System (UMDNS) registers all products. The Portal accepts end-user requests and generates a list of results containing text descriptions of devices, UMDNS attribute values, and links to manufacturer Web pages and online catalogues for access to more-detailed information. Device short descriptions are provided by the corresponding manufacturer. The Portal offers technical support for integration of the manufacturers' Web sites with itself. The network of the Portal and the connected manufacturers' sites is called the MEDICOM system. To establish an environment hosting all the interactions of consumers (health care organizations and professionals) and providers (manufacturers, distributors, and resellers of medical devices). The Portal provides the end-user interface, implements system management, and supports database compatibility. The Portal hosts information about the whole MEDICOM system (Common Database) and summarized descriptions of medical devices (Short Description Database); the manufacturers' servers present extended descriptions. The Portal provides end-user profiling and registration, an efficient product-searching mechanism, bulletin boards, links to on-line libraries and standards, on-line information for the MEDICOM system, and special messages or advertisements from manufacturers. Platform independence and interoperability characterize the system design. Relational Database Management Systems are used for the system's databases. The end-user interface is implemented using HTML, Javascript, Java applets, and XML documents. Communication between the Portal and the manufacturers' servers is implemented using a CORBA interface. Remote administration of the Portal is enabled by dynamically-generated HTML interfaces based on XML documents. A representative group of users evaluated the system. The aim of the evaluation was validation of the usability of all of MEDICOM's functionality. The evaluation procedure was based on ISO/IEC 9126 Information technology - Software product evaluation - Quality characteristics and guidelines for their use. The overall user evaluation of the MEDICOM system was very positive. The MEDICOM system was characterized as an innovative concept that brings significant added value to medical-equipment commerce. The eventual benefits of the MEDICOM system are (a) establishment of a worldwide-accessible marketplace between manufacturers and health care professionals that provides up-to-date and high-quality product information in an easy and friendly way and (b) enhancement of the efficiency of marketing procedures and after-sales support.

  18. Design and Implementation of a Portal for the Medical Equipment Market: MEDICOM

    PubMed Central

    Kalivas, Dimitris; Panou-Diamandi, Ourania; Zeelenberg, Cees; van Nimwegen, Chris

    2001-01-01

    Background The MEDICOM (Medical Products Electronic Commerce) Portal provides the electronic means for medical-equipment manufacturers to communicate online with their customers while supporting the Purchasing Process and Post Market Surveillance. The Portal offers a powerful Internet-based search tool for finding medical products and manufacturers. Its main advantage is the fast, reliable and up-to-date retrieval of information while eliminating all unrelated content that a general-purpose search engine would retrieve. The Universal Medical Device Nomenclature System (UMDNS) registers all products. The Portal accepts end-user requests and generates a list of results containing text descriptions of devices, UMDNS attribute values, and links to manufacturer Web pages and online catalogues for access to more-detailed information. Device short descriptions are provided by the corresponding manufacturer. The Portal offers technical support for integration of the manufacturers' Web sites with itself. The network of the Portal and the connected manufacturers' sites is called the MEDICOM system. Objective To establish an environment hosting all the interactions of consumers (health care organizations and professionals) and providers (manufacturers, distributors, and resellers of medical devices). Methods The Portal provides the end-user interface, implements system management, and supports database compatibility. The Portal hosts information about the whole MEDICOM system (Common Database) and summarized descriptions of medical devices (Short Description Database); the manufacturers' servers present extended descriptions. The Portal provides end-user profiling and registration, an efficient product-searching mechanism, bulletin boards, links to on-line libraries and standards, on-line information for the MEDICOM system, and special messages or advertisements from manufacturers. Platform independence and interoperability characterize the system design. Relational Database Management Systems are used for the system's databases. The end-user interface is implemented using HTML, Javascript, Java applets, and XML documents. Communication between the Portal and the manufacturers' servers is implemented using a CORBA interface. Remote administration of the Portal is enabled by dynamically-generated HTML interfaces based on XML documents. A representative group of users evaluated the system. The aim of the evaluation was validation of the usability of all of MEDICOM's functionality. The evaluation procedure was based on ISO/IEC 9126 Information technology - Software product evaluation - Quality characteristics and guidelines for their use. Results The overall user evaluation of the MEDICOM system was very positive. The MEDICOM system was characterized as an innovative concept that brings significant added value to medical-equipment commerce. Conclusions The eventual benefits of the MEDICOM system are (a) establishment of a worldwide-accessible marketplace between manufacturers and health care professionals that provides up-to-date and high-quality product information in an easy and friendly way and (b) enhancement of the efficiency of marketing procedures and after-sales support. PMID:11772547

  19. Census, CD-ROM, and You! New Horizons for Microcomputer Users of Census Bureau Data.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Washington, DC. Data User Services Div.

    This introductory guide to Census Bureau data that is currently available to microcomputer users on compact disc (CD-ROM) begins by explaining the types of information available, how CD-ROM works, and the hardware and software required to access the databases using a microcomputer. Descriptions of data currently available on CD-ROM include…

  20. Intelligence Community Forum

    DTIC Science & Technology

    2008-11-05

    Briefing-slide excerpt: EEG measures electrical activity in the brain and is a practical tool for applications such as real-time monitoring... Cognitive Systems Device Development & Processing Methods: brain activity can be monitored in real time in operational environments with EEG... biological and cognitive findings about the user can be used to customize the learning environment. Neurofeedback: present the user with real-time feedback

  1. Linking Audio and Visual Information while Navigating in a Virtual Reality Kiosk Display

    ERIC Educational Resources Information Center

    Sullivan, Briana; Ware, Colin; Plumlee, Matthew

    2006-01-01

    3D interactive virtual reality museum exhibits should be easy to use, entertaining, and informative. If the interface is intuitive, it will allow the user more time to learn the educational content of the exhibit. This research deals with interface issues concerning activating audio descriptions of images in such exhibits while the user is…

  2. Exploring the Integration of Technology into Jewish Education: Multi-User Virtual Environments and Supplementary School Settings

    ERIC Educational Resources Information Center

    Sohn, Johannah Eve

    2014-01-01

    This descriptive case study explores the implementation of a multi-user virtual environment (MUVE) in a Jewish supplemental school setting. The research was conducted to present the recollections and reflections of three constituent populations of a new technology exploring constructivist education in the context of supplemental and online…

  3. 49 CFR Appendix A to Part 611 - Description of Measures Used for Project Evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... alternative, travelers projected to shift to transit because of the new start project, and non-transit users... measure will be based on a multi-modal measure of perceived travel times faced by all users of the..., and level of commitment of each proposed source of local match, including inter-governmental grants...

  4. LANDSAT-4 World Reference System (WRS) users guide

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A functional description of the new LANDSAT-4 World Reference System (WRS) with an overview of the main orbital parameters and instrument coverages is presented to provide the data user with the primary information required to understand LANDSAT-4 orbital characteristics, to effectively use the WRS indexing scheme, and to request specific geographic coverage on the desired observation dates.

  5. Assessment of Users Information Needs and Satisfaction in Selected Seminary Libraries in Oyo State, Nigeria

    ERIC Educational Resources Information Center

    Adekunjo, Olalekan Abraham; Adepoju, Samuel Olusegun; Adeola, Anuoluwapo Odebunmi

    2015-01-01

    The study assessed users' information needs and satisfaction in selected seminary libraries in Oyo State, Nigeria. This paper employed a descriptive survey research design of the ex post facto type, with a sample size of three hundred (300) participants selected from six seminaries located in Ibadan, Oyo and Ogbomoso, all in Oyo…

  6. Conceptual Representation of Actions in Sign Language

    ERIC Educational Resources Information Center

    Dobel, Christian; Enriquez-Geppert, Stefanie; Hummert, Marja; Zwitserlood, Pienie; Bolte, Jens

    2011-01-01

    The idea that knowledge of events entails a universal spatial component, that is conceiving agents left of patients, was put to test by investigating native users of German sign language and native users of spoken German. Participants heard or saw event descriptions and had to illustrate the meaning of these events by means of drawing or arranging…

  7. 47 CFR 79.101 - Closed caption decoder requirements for analog television receivers.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) BROADCAST RADIO SERVICES CLOSED CAPTIONING AND VIDEO DESCRIPTION OF VIDEO PROGRAMMING § 79.101 Closed... user selects. The TV Mode of operation allows the video to be viewed in its original form. The Caption... from the video over which they are placed. In addition, the user must have the capability to select a...

  8. Creating Accessible Science Museums with User-Activated Environmental Audio Beacons (Ping!)

    ERIC Educational Resources Information Center

    Landau, Steven; Wiener, William; Naghshineh, Koorosh; Giusti, Ellen

    2005-01-01

    In 2003, Touch Graphics Company carried out research on a new invention that promises to improve accessibility to science museums for visitors who are visually impaired. The system, nicknamed Ping!, allows users to navigate an exhibit area, listen to audio descriptions, and interact with exhibits using a cell phone-based interface. The system…

  9. SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)

    1994-01-01

    The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches which too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user with a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically-consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapo-transpiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.
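    The unit-aware behaviour described for SIGMA (automatic unit conversion and rejection of physically inconsistent equations) can be mimicked in a small Python sketch using the pint library; pint is only an illustrative stand-in here, since SIGMA itself is not Python-based, and the quantities shown are made up.

    ```python
    import pint

    ureg = pint.UnitRegistry()
    Q_ = ureg.Quantity

    # Hypothetical model quantities with units, echoing SIGMA's unit-aware equations
    par = Q_(1200.0, "micromole / meter**2 / second")    # photosynthetically active radiation
    leaf_area = Q_(2.5, "meter**2")
    uptake = (par * leaf_area).to("millimole / second")  # automatic unit conversion
    print(uptake)

    # Dimensional consistency check: combining incompatible quantities raises an error
    try:
        bad = par + leaf_area
    except pint.DimensionalityError as err:
        print("rejected physically-inconsistent expression:", err)
    ```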

  10. Drug use, mental health and problems related to crime and violence: cross-sectional study1

    PubMed Central

    Claro, Heloísa Garcia; de Oliveira, Márcia Aparecida Ferreira; Bourdreaux, Janet Titus; Fernandes, Ivan Filipe de Almeida Lopes; Pinho, Paula Hayasi; Tarifa, Rosana Ribeiro

    2015-01-01

    Objective: to investigate the correlation between disorders related to the use of alcohol and other drugs and symptoms of mental disorders, problems related to crime and violence and to age and gender. Methods: cross-sectional descriptive study carried out with 128 users of a Psychosocial Care Center for Alcohol and other Drugs, in the city of São Paulo, interviewed by means of the instrument entitled Global Appraisal of Individual Needs - Short Screener. Univariate and multiple linear regression models were used to verify the correlation between the variables. Results: using univariate regression models, internalizing and externalizing symptoms and problems related to crime/violence proved significant and were included in the multiple model, in which only the internalizing symptoms and problems related to crime and violence remained significant. Conclusions: there is a correlation between the severity of problems related to alcohol use and severity of mental health symptoms and crime and violence in the study sample. The results emphasize the need for an interdisciplinary and intersectional character of attention to users of alcohol and other drugs, since they live in a socially vulnerable environment. PMID:26626010

  11. Geant4-DNA example applications for track structure simulations in liquid water: a report from the Geant4-DNA Project.

    PubMed

    Incerti, S; Kyriakou, I; Bernal, M A; Bordage, M C; Francis, Z; Guatelli, S; Ivanchenko, V; Karamitros, M; Lampe, N; Lee, S B; Meylan, S; Min, C H; Shin, W G; Nieminen, P; Sakata, D; Tang, N; Villagrasa, C; Tran, H; Brown, J M C

    2018-06-14

    This Special Report presents a description of Geant4-DNA user applications dedicated to the simulation of track structures (TS) in liquid water and associated physical quantities (e.g. range, stopping power, mean free path…). These example applications are included in the Geant4 Monte Carlo toolkit and are available in open access. Each application is described and comparisons to recent international recommendations are shown (e.g. ICRU, MIRD), when available. The influence of physics models available in Geant4-DNA for the simulation of electron interactions in liquid water is discussed. Thanks to these applications, the authors show that the most recent sets of physics models available in Geant4-DNA (the so-called "option 4" and "option 6" sets) enable more accurate simulation of stopping powers, dose point kernels and W-values in liquid water than the default set of models ("option 2") initially provided in Geant4-DNA. They also serve as reference applications for Geant4-DNA users interested in TS simulations. This article is protected by copyright. All rights reserved.

  12. Diversifying customer review rankings.

    PubMed

    Krestel, Ralf; Dokoohaki, Nima

    2015-06-01

    E-commerce Web sites owe much of their popularity to consumer reviews accompanying product descriptions. On-line customers spend hours and hours going through heaps of textual reviews to decide which products to buy. At the same time, each popular product has thousands of user-generated reviews, making it impossible for a buyer to read everything. Current approaches to display reviews to users or recommend an individual review for a product are based on the recency or helpfulness of each review. In this paper, we present a framework to rank product reviews by optimizing the coverage of the ranking with respect to sentiment or aspects, or by summarizing all reviews with the top-K reviews in the ranking. To accomplish this, we make use of the assigned star rating for a product as an indicator for a review's sentiment polarity and compare bag-of-words (language model) with topic models (latent Dirichlet allocation) as a means of representing aspects. Our evaluation on manually annotated review data from a commercial review Web site demonstrates the effectiveness of our approach, outperforming plain recency ranking by 30% and obtaining best results by combining language and topic model representations. Copyright © 2015 Elsevier Ltd. All rights reserved.
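    A minimal sketch of the coverage idea, assuming scikit-learn for the topic model: reviews are embedded as LDA topic mixtures and a greedy top-K selection picks reviews that add the most not-yet-covered topic mass. The review texts and the exact objective are illustrative, not the paper's.

    ```python
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    reviews = [
        "battery life is great but the screen scratches easily",
        "fast shipping, screen quality is excellent",
        "battery died after a week, support was unhelpful",
        "love the camera, battery could be better",
    ]

    # Represent aspects with an LDA topic model over the review texts
    counts = CountVectorizer(stop_words="english").fit_transform(reviews)
    doc_topic = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(counts)

    def rank_by_coverage(doc_topic, k=2):
        """Greedy top-K selection maximising coverage of topics not yet represented."""
        covered = np.zeros(doc_topic.shape[1])
        order = []
        for _ in range(k):
            gains = (np.maximum(doc_topic, covered) - covered).sum(axis=1)
            gains[order] = -np.inf            # do not pick the same review twice
            best = int(np.argmax(gains))
            order.append(best)
            covered = np.maximum(covered, doc_topic[best])
        return order

    print(rank_by_coverage(doc_topic))
    ```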

  13. Who benefits from free institutional delivery? evidence from a cross sectional survey of North Central and Southwestern Nigeria.

    PubMed

    Ajayi, Anthony I; Akpan, Wilson

    2017-09-02

    The reasons for low utilisation of maternal health services in settings where the user-fee removal policy has been implemented continue to generate scholarly debates. Evidence of whether user-fee removal benefits the poor women in underserved settings is scanty and inconsistent. This article examines use of maternal health care services in the context of free maternal healthcare and profiles the beneficiaries of user-fee removal. The study adopted a descriptive design. A three-stage cluster sampling method was used to select a representative sample of 1227 women who gave birth between 2011 and 2015. Questionnaires were administered using a face-to-face interview approach and data generated were analysed using descriptive and inferential statistics. The analysis shows that the use of maternal healthcare services has improved considerably in North Central and Southwestern Nigeria. While socioeconomic and geographical inequality in the use of maternal healthcare services appear to be disappearing in Southwestern Nigeria, it appears to be widening in North Central Nigeria. The findings indicate that 33.6% of women reported to have benefitted from the free child-delivery programme; however, substantial variation exists across the two regions. The proportion of beneficiaries of user-fee removal policy was highest in urban areas (35.9%), among women belonging to the middle income category (38.3%), among women who gave birth in primary health centres (63.1%) and among women who resided in communities where there was availability of health facilities (37.2%). The study concludes that low coverage of the free maternal health programme, especially among women of low socioeconomic status residing in underserved settings is among the reasons for persistent poor maternal health outcomes in the context of free maternal healthcare. A model towards improving maternal health in underserved settings, especially in North Central Nigeria, would entail provisioning of health facilities as well as focusing on implementing equitable maternal health policies.

  14. How Effective is Routing for Wireless Networking

    DTIC Science & Technology

    2016-03-05

    Routing (LAR) [31]. The basic mechanism of how link-based routing schemes operate is as follows: a user broadcasts a control message (called a "hello")... to all of its neighbors. If a series of hello messages are exchanged between two users, a link is considered to exist between them. Routes are then... description of ETX is as follows. For a given window of time, the number of hello packets that a user receives from a neighbor is counted. A cost is then
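    A common formulation of the ETX link cost sketched in the excerpt computes delivery ratios from hello packets counted over a measurement window. The small example below follows that standard formulation (ETX = 1/(df*dr)) and is not taken from the report itself.

    ```python
    def etx(hello_received_fwd, hello_sent_fwd, hello_received_rev, hello_sent_rev):
        """Expected transmission count over a link, from hello-packet delivery ratios
        measured in a window (standard ETX formulation; the cited report's exact
        cost definition may differ)."""
        df = hello_received_fwd / hello_sent_fwd   # forward delivery ratio
        dr = hello_received_rev / hello_sent_rev   # reverse delivery ratio
        if df == 0 or dr == 0:
            return float("inf")                    # link effectively unusable
        return 1.0 / (df * dr)

    print(etx(8, 10, 9, 10))   # ~1.39 expected transmissions per delivered packet
    ```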

  15. Distributed Common Ground System - Army (DCGS-A) Increment 1 Release 2 Follow-on Operational Test and Evaluation (FOT and E) Report

    DTIC Science & Technology

    2016-01-01

    Moving Target Indicator, Unmanned Aircraft System (UAS), Rivet Joint, U-2, and ground signals intelligence (PROPHET). At the BCT, Ranger Regiment and... metadata catalog managed by the DIB management office (outside of the DCGS-A system). Metadata is a searchable description of data, and users across... challenge for users. The system required reboots about every 20 hours for users who had heavy workloads such as the fire support analysts and data

  16. Computer program system for dynamic simulation and stability analysis of passive and actively controlled spacecraft. Volume 1. Theory

    NASA Technical Reports Server (NTRS)

    Bodley, C. S.; Devers, D. A.; Park, C. A.

    1975-01-01

    A theoretical development and associated digital computer program system is presented. The dynamic system (spacecraft) is modeled as an assembly of rigid and/or flexible bodies not necessarily in a topological tree configuration. The computer program system may be used to investigate total system dynamic characteristics including interaction effects between rigid and/or flexible bodies, control systems, and a wide range of environmental loadings. Additionally, the program system may be used for design of attitude control systems and for evaluation of total dynamic system performance including time domain response and frequency domain stability analyses. Volume 1 presents the theoretical developments including a description of the physical system, the equations of dynamic equilibrium, discussion of kinematics and system topology, a complete treatment of momentum wheel coupling, and a discussion of gravity gradient and environmental effects. Volume 2 is a program users' guide and includes a description of the overall digital program code, individual subroutines and a description of required program input and generated program output. Volume 3 presents the results of selected demonstration problems that illustrate all program system capabilities.

  17. Web accessibility support for visually impaired users using link content analysis.

    PubMed

    Iwata, Hajime; Kobayashi, Naofumi; Tachibana, Kenji; Shirogane, Junko; Fukazawa, Yoshiaki

    2013-12-01

    Web pages are used for a variety of purposes. End users must understand dynamically changing content and sequentially follow page links to find desired material, requiring significant time and effort. However, for visually impaired users using screen readers, it can be difficult to find links to web pages when link text and alternative text descriptions are inappropriate. Our method supports the discovery of content by analyzing 8 categories of link types, and allows visually impaired users to be aware of the content represented by links in advance. This facilitates end users' access to necessary information on web pages. Our method of classifying web page links is therefore effective as a means of evaluating accessibility.

  18. JPL Energy Consumption Program (ECP) documentation: A computer model simulating heating, cooling and energy loads in buildings. [low cost solar array efficiency

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Chai, V. W.; Lascu, D.; Urbenajo, R.; Wong, P.

    1978-01-01

    The engineering manual provides complete companion documentation of the structure of the main program and subroutines, the preparation of input data, the interpretation of output results, access and use of the program, and the detailed description of all the analytic, logical expressions and flow charts used in computations and program structure. A numerical example is provided and solved completely to show the sequence of computations followed. The program is carefully structured to reduce both the user's time and costs without sacrificing accuracy. The user would expect a cost of CPU time of approximately $5.00 per building zone excluding printing costs. The accuracy, on the other hand, measured by deviation of simulated consumption from watt-hour meter readings, was found by many simulation tests not to exceed a +/- 10 percent margin.

  19. nuMap: A Web Platform for Accurate Prediction of Nucleosome Positioning

    PubMed Central

    Alharbi, Bader A.; Alshammari, Thamir H.; Felton, Nathan L.; Zhurkin, Victor B.; Cui, Feng

    2014-01-01

    Nucleosome positioning is critical for gene expression and of major biological interest. The high cost of experimentally mapping nucleosomal arrangement signifies the need for computational approaches to predict nucleosome positions at high resolution. Here, we present a web-based application to fulfill this need by implementing two models, YR and W/S schemes, for the translational and rotational positioning of nucleosomes, respectively. Our methods are based on sequence-dependent anisotropic bending that dictates how DNA is wrapped around a histone octamer. This application allows users to specify a number of options such as schemes and parameters for threading calculation and provides multiple layout formats. The nuMap is implemented in Java/Perl/MySQL and is freely available for public use at http://numap.rit.edu. The user manual, implementation notes, description of the methodology and examples are available at the site. PMID:25220945
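    As a toy illustration of sequence-dependent rotational positioning (not the YR or W/S schemes used by nuMap), the sketch below scores how coherently flexible dinucleotides recur at roughly the DNA helical repeat; the scoring function and parameters are invented for illustration only.

    ```python
    import numpy as np

    def periodic_dinucleotide_score(seq, dinucs=("AA", "TT", "AT", "TA"), period=10.4):
        """Illustrative rotational-positioning score: reward occurrences of flexible
        dinucleotides that recur in phase with an ~10.4 bp helical repeat.
        This is a stand-in sketch, not the nuMap algorithm."""
        positions = [i for i in range(len(seq) - 1) if seq[i:i + 2] in dinucs]
        if not positions:
            return 0.0
        phases = 2.0 * np.pi * np.array(positions) / period
        # Coherent phasing of dinucleotides yields a large resultant vector length
        return float(np.abs(np.exp(1j * phases).sum()) / len(positions))

    print(periodic_dinucleotide_score("AAATTTCCGGGAATTCCGGAAATT" * 3))
    ```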

  20. nuMap: a web platform for accurate prediction of nucleosome positioning.

    PubMed

    Alharbi, Bader A; Alshammari, Thamir H; Felton, Nathan L; Zhurkin, Victor B; Cui, Feng

    2014-10-01

    Nucleosome positioning is critical for gene expression and of major biological interest. The high cost of experimentally mapping nucleosomal arrangement signifies the need for computational approaches to predict nucleosome positions at high resolution. Here, we present a web-based application to fulfill this need by implementing two models, YR and W/S schemes, for the translational and rotational positioning of nucleosomes, respectively. Our methods are based on sequence-dependent anisotropic bending that dictates how DNA is wrapped around a histone octamer. This application allows users to specify a number of options such as schemes and parameters for threading calculation and provides multiple layout formats. The nuMap is implemented in Java/Perl/MySQL and is freely available for public use at http://numap.rit.edu. The user manual, implementation notes, description of the methodology and examples are available at the site. Copyright © 2014 The Authors. Production and hosting by Elsevier Ltd.. All rights reserved.

  1. Implementing real-time robotic systems using CHIMERA II

    NASA Technical Reports Server (NTRS)

    Stewart, David B.; Schmitz, Donald E.; Khosla, Pradeep K.

    1990-01-01

    A description is given of the CHIMERA II programming environment and operating system, which was developed for implementing real-time robotic systems. Sensor-based robotic systems contain both general- and special-purpose hardware, and thus the development of applications tends to be a time-consuming task. The CHIMERA II environment is designed to reduce the development time by providing a convenient software interface between the hardware and the user. CHIMERA II supports flexible hardware configurations which are based on one or more VME-backplanes. All communication across multiple processors is transparent to the user through an extensive set of interprocessor communication primitives. CHIMERA II also provides a high-performance real-time kernel which supports both deadline and highest-priority-first scheduling. The flexibility of CHIMERA II allows hierarchical models for robot control, such as NASREM, to be implemented with minimal programming time and effort.

  2. Operational Based Vision Assessment Automated Vision Test Collection User Guide

    DTIC Science & Technology

    2017-05-15

    repeatability to support correlation analysis. The AVT research grade tests also support interservice, international, industry, and academic partnerships... software, provides information concerning various menu options and operation of the test, and provides a brief description of each of the automated vision... (table-of-contents excerpt: Section 7.0, OBVA Vision Test Descriptions)

  3. An Assessment System for Competence Based Education: The Educational Development, Dissemination, and Evaluation Training Program.

    ERIC Educational Resources Information Center

    Hood, Paul D.; Blackwell, Laird

    This manual provides a description of the development and a guide to the use of the assessment resources developed in connection with the Far West Development, Dissemination, and Evaluation (DD&E) Functional Competence Training Program. The document concentrates on a user-oriented description of the content, validation, and use of the final…

  4. Caesy: A software tool for computer-aided engineering

    NASA Technical Reports Server (NTRS)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  5. Sigma 2 Graphic Display Software Program Description

    NASA Technical Reports Server (NTRS)

    Johnson, B. T.

    1973-01-01

    A general purpose, user oriented graphic support package was implemented. A comprehensive description of the two software components comprising this package is given: Display Librarian and Display Controller. These programs have been implemented in FORTRAN on the XDS Sigma 2 Computer Facility. This facility consists of an XDS Sigma 2 general purpose computer coupled to a Computek Display Terminal.

  6. Aero/fluids database system

    NASA Technical Reports Server (NTRS)

    Reardon, John E.; Violett, Duane L., Jr.

    1991-01-01

    The AFAS Database System was developed to provide the basic structure of a comprehensive database system for the Marshall Space Flight Center (MSFC) Structures and Dynamics Laboratory Aerophysics Division. The system is intended to handle all of the Aerophysics Division Test Facilities as well as data from other sources. The system was written for the DEC VAX family of computers in FORTRAN-77 and utilizes the VMS indexed file system and screen management routines. Various aspects of the system are covered, including a description of the user interface, lists of all code structure elements, descriptions of the file structures, a description of the security system operation, a detailed description of the data retrieval tasks, a description of the session log, and a description of the archival system.

  7. Development of BEM for ceramic composites

    NASA Technical Reports Server (NTRS)

    Henry, D. P.; Banerjee, P. K.; Dargush, G. F.; Hopkins, D. A.; Goldberg, R. K.

    1993-01-01

    BEST-CMS (boundary element solution technology - composite modeling system) is an advanced engineering system for the micro-analysis of fiber composite structures. BEST-CMS is based upon the boundary element program BEST3D which was developed for NASA by Pratt and Whitney Aircraft and the State University of New York at Buffalo under contract NAS3-23697. BEST-CMS presently has the capabilities for elastostatic analysis, steady-state and transient heat transfer analysis, steady-state and transient concurrent thermoelastic analysis, and elastoplastic and creep analysis. The fibers are assumed to be perfectly bonded to the composite matrix, or in the case of static or steady-state analysis, the fibers may be assumed to have spring connections, thermal resistance, and/or frictional sliding between the fibers and the composite matrix. The primary objective of this user's manual is to provide an overview of all BEST-CMS capabilities, along with detailed descriptions of the input data requirements. In the next chapter, a brief review of the theoretical background is presented for each analysis category. Then, chapter three discusses the key aspects of the numerical implementation, while chapter four provides a tutorial for the beginning BEST-CMS user. The heart of the manual, however, is in chapter five, where a complete description of all data input items is provided. Within this chapter, the individual entries are grouped on a functional basis for a more coherent presentation. Chapter six includes sample problems and should be of considerable assistance to the novice. Chapter seven includes capsules of a number of fiber-composite analysis problems that have been solved using BEST-CMS. This chapter is primarily descriptive in nature and is intended merely to illustrate the level of analysis that is possible within the present BEST-CMS system. Chapter eight contains a detailed description of the BEST-CMS Neutral File, which is helpful in writing an interface between BEST-CMS and any graphic post-processor program. Finally, all pertinent references are listed in chapter nine.

  8. Track structure modeling in liquid water: A review of the Geant4-DNA very low energy extension of the Geant4 Monte Carlo simulation toolkit.

    PubMed

    Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S

    2015-12-01

    Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes the description of new physical models for the description of electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  9. Selfish routing equilibrium in stochastic traffic network: A probability-dominant description.

    PubMed

    Zhang, Wenyi; He, Zhengbing; Guan, Wei; Ma, Rui

    2017-01-01

    This paper suggests a probability-dominant user equilibrium (PdUE) model to describe the selfish routing equilibrium in a stochastic traffic network. At PdUE, travel demands are only assigned to the most dominant routes in the same origin-destination pair. A probability-dominant rerouting dynamic model is proposed to explain the behavioral mechanism of PdUE. To facilitate applications, the logit formula of PdUE is developed, for which a well-designed route set is not indispensable and the equivalent variational inequality formulation is simple. Two routing strategies, i.e., the probability-dominant strategy (PDS) and the dominant probability strategy (DPS), are discussed through a hypothetical experiment. It is found that, whether out of insurance or striving for perfection, PDS is a better choice than DPS. For more general cases, the conducted numerical tests lead to the same conclusion. These imply that PdUE (rather than the conventional stochastic user equilibrium) is a desirable selfish routing equilibrium for a stochastic network, given that the probability distributions of travel time are available to travelers.
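    The logit formula referred to above has the familiar form P_k = exp(-θ c_k) / Σ_j exp(-θ c_j); a minimal sketch with an assumed dispersion parameter θ is given below. It only illustrates the logit share computation, not the dominance-based assignment itself.

    ```python
    import numpy as np

    def logit_route_shares(route_costs, theta=0.5):
        """Logit choice probabilities over routes in one OD pair:
        P_k = exp(-theta * c_k) / sum_j exp(-theta * c_j).
        theta is an assumed dispersion parameter, not a value from the paper."""
        c = np.asarray(route_costs, dtype=float)
        u = np.exp(-theta * (c - c.min()))    # shift for numerical stability
        return u / u.sum()

    print(logit_route_shares([20.0, 22.0, 30.0]))  # most demand on the cheapest route
    ```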

  10. Selfish routing equilibrium in stochastic traffic network: A probability-dominant description

    PubMed Central

    Zhang, Wenyi; Guan, Wei; Ma, Rui

    2017-01-01

    This paper suggests a probability-dominant user equilibrium (PdUE) model to describe the selfish routing equilibrium in a stochastic traffic network. At PdUE, travel demands are only assigned to the most dominant routes in the same origin-destination pair. A probability-dominant rerouting dynamic model is proposed to explain the behavioral mechanism of PdUE. To facilitate applications, the logit formula of PdUE is developed, for which a well-designed route set is not indispensable and the equivalent variational inequality formulation is simple. Two routing strategies, i.e., the probability-dominant strategy (PDS) and the dominant probability strategy (DPS), are discussed through a hypothetical experiment. It is found that, whether out of insurance or striving for perfection, PDS is a better choice than DPS. For more general cases, the conducted numerical tests lead to the same conclusion. These imply that PdUE (rather than the conventional stochastic user equilibrium) is a desirable selfish routing equilibrium for a stochastic network, given that the probability distributions of travel time are available to travelers. PMID:28829834

  11. Baseline Motivation Type as a Predictor of Dropout in a Healthy Eating Text Messaging Program.

    PubMed

    Coa, Kisha; Patrick, Heather

    2016-09-29

    Growing evidence suggests that text messaging programs are effective in facilitating health behavior change. However, high dropout rates limit the potential effectiveness of these programs. This paper describes patterns of early dropout in the HealthyYou text (HYTxt) program, with a focus on the impact of baseline motivation quality on dropout, as characterized by Self-Determination Theory (SDT). This analysis included 193 users of HYTxt, a diet and physical activity text messaging intervention developed by the US National Cancer Institute. Descriptive statistics were computed, and logistic regression models were run to examine the association between baseline motivation type and early program dropout. Overall, 43.0% (83/193) of users dropped out of the program; of these, 65.1% (54/83; 28.0% of all users) did so within the first 2 weeks. Users with higher autonomous motivation had significantly lower odds of dropping out within the first 2 weeks. A one unit increase in autonomous motivation was associated with lower odds (odds ratio 0.44, 95% CI 0.24-0.81) of early dropout, which persisted after adjusting for level of controlled motivation. Applying SDT-based strategies to enhance autonomous motivation might reduce early dropout rates, which can improve program exposure and effectiveness.
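    The reported association (odds ratio 0.44 per one-unit increase in autonomous motivation) comes from a logistic regression of early dropout on baseline motivation; a hedged sketch of that kind of model, fit on simulated rather than HYTxt data, is shown below.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical data: baseline autonomous-motivation score and early-dropout flag
    rng = np.random.default_rng(1)
    autonomous = rng.uniform(1, 7, size=193)
    p_drop = 1 / (1 + np.exp(-(1.5 - 0.8 * autonomous)))   # higher motivation -> lower dropout
    dropped_early = rng.binomial(1, p_drop)

    fit = sm.Logit(dropped_early, sm.add_constant(autonomous)).fit(disp=False)
    print(np.exp(fit.params[1]))   # odds ratio per one-unit increase in autonomous motivation
    ```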

  12. CHRONOS architecture: Experiences with an open-source services-oriented architecture for geoinformatics

    USGS Publications Warehouse

    Fils, D.; Cervato, C.; Reed, J.; Diver, P.; Tang, X.; Bohling, G.; Greer, D.

    2009-01-01

    CHRONOS's purpose is to transform Earth history research by seamlessly integrating stratigraphic databases and tools into a virtual on-line stratigraphic record. In this paper, we describe the various components of CHRONOS's distributed data system, including the encoding of semantic and descriptive data into a service-based architecture. We give examples of how we have integrated well-tested resources available from the open-source and geoinformatic communities, like the GeoSciML schema and the simple knowledge organization system (SKOS), into the services-oriented architecture to encode timescale and phylogenetic synonymy data. We also describe ongoing efforts to use geospatially enhanced data syndication and to include semantic information informally by embedding it directly into the XHTML Document Object Model (DOM). The XHTML DOM allows machine-discoverable descriptive data such as licensing and citation information to be incorporated directly into data sets retrieved by users. © 2008 Elsevier Ltd. All rights reserved.
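
    The following sketch (our own illustration, not CHRONOS code) shows one way licensing and citation metadata can be embedded directly into an XHTML fragment so that it travels with the data in the DOM. The RDFa-style attribute names and the identifiers used are hypothetical.

        import xml.etree.ElementTree as ET

        XHTML_NS = "http://www.w3.org/1999/xhtml"
        ET.register_namespace("", XHTML_NS)

        def wrap_dataset(rows, license_url, citation):
            """Return an XHTML <div> whose DOM carries machine-readable
            license and citation metadata alongside the visible table.
            (Attribute vocabulary is illustrative, RDFa-style.)"""
            div = ET.Element(f"{{{XHTML_NS}}}div",
                             attrib={"about": citation, "rel": "license",
                                     "resource": license_url})
            table = ET.SubElement(div, f"{{{XHTML_NS}}}table")
            for row in rows:
                tr = ET.SubElement(table, f"{{{XHTML_NS}}}tr")
                for cell in row:
                    ET.SubElement(tr, f"{{{XHTML_NS}}}td").text = str(cell)
            return div

        div = wrap_dataset([("stage", "base age (Ma)"), ("Maastrichtian", 72.1)],
                           license_url="https://creativecommons.org/licenses/by/4.0/",
                           citation="doi:10.1016/example")
        print(ET.tostring(div, encoding="unicode"))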

  13. SED-ED, a workflow editor for computational biology experiments written in SED-ML.

    PubMed

    Adams, Richard R

    2012-04-15

    The simulation experiment description markup language (SED-ML) is a new community data standard to encode computational biology experiments in a computer-readable XML format. Its widespread adoption will require the development of software support to work with SED-ML files. Here, we describe a software tool, SED-ED, to view, edit, validate and annotate SED-ML documents while shielding end-users from the underlying XML representation. SED-ED supports modellers who wish to create, understand and further develop a simulation description provided in SED-ML format. SED-ED is available as a standalone Java application, as an Eclipse plug-in and as an SBSI (www.sbsi.ed.ac.uk) plug-in, all under an MIT open-source license. Source code is at https://sed-ed-sedmleditor.googlecode.com/svn. The application itself is available from https://sourceforge.net/projects/jlibsedml/files/SED-ED/.

  14. Earth and environmental science in the 1980's: Part 1: Environmental data systems, supercomputer facilities and networks

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Overview descriptions of on-line environmental data systems, supercomputer facilities, and networks are presented. Each description addresses the concepts of content, capability, and user access relevant to the point of view of potential utilization by the Earth and environmental science community. The information on similar systems or facilities is presented in parallel fashion to encourage and facilitate intercomparison. In addition, summary sheets are given for each description, and a summary table precedes each section.

  15. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). This manual provides a description of the code and corresponding results for the default scattering problem. In addition to that description, the manual covers the operation, resource requirements, and Version A code capabilities, and provides a description of each subroutine, a brief discussion of the radar cross section computations, and a discussion of the scattering results.
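
    To illustrate the FDTD technique itself (not the Version A code, which is a full three-dimensional dielectric scattering solver), here is a minimal one-dimensional leapfrog update loop in free space with normalized units; all parameters are our own choices.

        import numpy as np

        nz, nt = 200, 400
        ez = np.zeros(nz)      # electric field on integer grid points
        hy = np.zeros(nz - 1)  # magnetic field, staggered half a cell

        for n in range(nt):
            # Update H from the curl of E (Yee leapfrog, Courant number 0.5).
            hy += 0.5 * (ez[1:] - ez[:-1])
            # Update E from the curl of H (interior nodes only; the ends stay
            # zero, i.e. simple perfectly conducting boundaries).
            ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])
            # Soft source: a Gaussian pulse injected near the left boundary.
            ez[20] += np.exp(-((n - 30.0) / 10.0) ** 2)

        print("peak |Ez| after propagation:", np.abs(ez).max())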

  16. A data model for environmental scientists

    NASA Astrophysics Data System (ADS)

    Kapeljushnik, O.; Beran, B.; Valentine, D.; van Ingen, C.; Zaslavsky, I.; Whitenack, T.

    2008-12-01

    Environmental science encompasses a wide range of disciplines from water chemistry to microbiology, ecology and atmospheric sciences. Studies often require working across disciplines which differ in their ways of describing and storing data, such that it is not possible to devise a monolithic one-size-fits-all data solution. Based on our experiences with the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Observations Data Model and the Berkeley Water Center FLUXNET carbon-climate work, and by examining standards like EPA's Water Quality Exchange (WQX), we have developed a flexible data model that allows extensions without the need to alter the schema, so that scientists can define custom metadata elements to describe their data, including observations and analysis methods as well as sensors and geographical features. The data model supports various types of observations, including fixed point and moving sensors, bottled samples, rasters from remote sensors and models, and categorical descriptions (e.g. taxonomy), by employing user-defined types when necessary. It leverages the ADO.NET Entity Framework to provide semantic data models for differing disciplines while maintaining a common schema below the entity layer. This abstraction layer simplifies data retrieval and manipulation by hiding the logic and complexity of the relational schema from users, and thus allows programmers and scientists to deal directly with objects such as observations, sensors, watersheds, river reaches, channel cross-sections, laboratory analysis methods and samples, as opposed to table joins, columns and rows.
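
    A toy sketch of the general idea (ours, not the CUAHSI or FLUXNET schema): a fixed core observation record carries an open key-value extension block, so each discipline can attach its own descriptors without altering the schema. All field names and example values are hypothetical.

        from dataclasses import dataclass, field
        from datetime import datetime
        from typing import Any, Dict

        @dataclass
        class Observation:
            """Fixed core fields plus an open extension block for
            discipline-specific metadata (method, taxon, sensor settings...)."""
            variable: str
            value: float
            unit: str
            timestamp: datetime
            feature: str                      # e.g. a site, reach, or cross-section id
            extensions: Dict[str, Any] = field(default_factory=dict)

        # A water-chemistry sample and a flux-tower reading share one schema;
        # discipline-specific detail lives in `extensions`.
        sample = Observation("nitrate", 1.8, "mg/L", datetime(2008, 6, 1, 9, 30),
                             feature="river-reach-42",
                             extensions={"lab_method": "ion chromatography",
                                         "bottle_id": "B-117"})
        flux = Observation("NEE", -4.2, "umol m-2 s-1", datetime(2008, 6, 1, 9, 30),
                           feature="tower-US-xyz",
                           extensions={"gap_filled": False, "sensor": "open-path IRGA"})
        print(sample.extensions["lab_method"], flux.extensions["sensor"])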

  17. The Third Annual NASA Science Internet User Working Group Conference

    NASA Technical Reports Server (NTRS)

    Lev, Brian S. (Editor); Gary, J. Patrick (Editor)

    1993-01-01

    The NASA Science Internet (NSI) User Support Office (USO) sponsored the Third Annual NSI User Working Group (NSIUWG) Conference March 30 through April 3, 1992, in Greenbelt, MD. Approximately 130 NSI users attended to learn more about the NSI, hear from projects which use NSI, and receive updates about new networking technologies and services. This report contains material relevant to the conference: copies of the agenda, meeting summaries, presentations, and descriptions of exhibitors. Plenary sessions featured a variety of speakers, including NSI project management, scientists, NSI user project managers whose projects and applications effectively use NSI, and notable citizens of the larger Internet community. The conference also included exhibits of advanced networking applications; tutorials on internetworking, computer security, and networking technologies; and user subgroup meetings on the future direction of the conference, networking, and user services and applications.

  18. Measurement of user performance and attitudes assists the initial design of a computer user display and orientation method.

    PubMed

    Chase, C R; Ashikaga, T; Mazuzan, J E

    1994-07-01

    The objective of our study was to assess the acceptability of a proposed user interface to a visually interfaced computer-assisted anesthesia record (VISI-CAARE) before application programming was begun. The user interface was defined as the user display and its user orientation methods. We designed methods to measure user performance and attitude toward two different anesthesia record procedures: (1) the traditional pen and paper anesthetic record procedure of our hospital, and (2) VISI-CAARE. Performance measurements included the reaction speed (identifying the type and time of an event) and completion speed (describing the event). Performance also included accuracy of the recorded time of the event and accuracy of the description. User attitude was measured by (1) the physician's rating on a scale of 0 to 9 of the potential usefulness of computers in anesthesia care; (2) willingness to use the future application in the clinical environment; and (3) user suggestions for change. These measurements were used in a randomized trial of 21 physicians, of which data from 20 were available. After exposure to VISI-CAARE, the experimental subjects' rating of computer usefulness in anesthesia care improved significantly (4.2 +/- 1.1 to 7.6 +/- 1.5, p = 0.0001), as did the controls' (5.2 +/- 2.6 to 8 +/- 1.5, p = 0.0019). All the volunteers were willing to try the proposed prototype clinically when it was ready. VISI-CAARE exposure was associated with faster and more accurate reaction to events than the traditional pen and paper method, and slower but more accurate description of events, in an artificial mock setting. VISI-CAARE 1.1 demonstrated significant improvements in both reaction speed and completion speed over VISI-CAARE 1.0, after changes were made to the user display and orientation methods. With graphic user interface prototyping environments, one can obtain preliminary user attitude and performance data even before application programming is begun. This may be helpful in revising initial display and orientation methods, while obtaining user interest and commitment before actual programming and clinical testing.

  19. Software for predictive microbiology and risk assessment: a description and comparison of tools presented at the ICPMF8 Software Fair.

    PubMed

    Tenenhaus-Aziza, Fanny; Ellouze, Mariem

    2015-02-01

    The 8th International Conference on Predictive Modelling in Food was held in Paris, France in September 2013. One of the major topics of this conference was the transfer of knowledge and tools between academics and stakeholders of the food sector. During the conference, a "Software Fair" was held to provide information and demonstrations of predictive microbiology and risk assessment software. This article presents an overall description of the 16 software tools demonstrated at the session and provides a comparison based on several criteria such as the modeling approach, the different modules available (e.g. databases, predictors, fitting tools, risk assessment tools), the studied environmental factors (temperature, pH, aw, etc.), the type of media (broth or food) and the number and type of the provided micro-organisms (pathogens and spoilers). The present study is a guide to help users select the software tools which are most suitable to their specific needs, before they test and explore the tool(s) in more depth. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. A Revised Thermosphere for the Mars Global Reference Atmospheric Model (Mars-GRAM Version 3.4)

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Johnson, D. L.; James, B. F.

    1996-01-01

    This report describes the newly-revised model thermosphere for the Mars Global Reference Atmospheric Model (Mars-GRAM, Version 3.4). It also provides descriptions of other changes made to the program since publication of the programmer's guide for Mars-GRAM Version 3.34. The original Mars-GRAM model thermosphere was based on the global-mean model of Stewart. The revised thermosphere is based largely on parameterizations derived from output data from the three-dimensional Mars Thermospheric Global Circulation Model (MTGCM). The new thermospheric model includes revised dependence on the 10.7 cm solar flux for the global means of exospheric temperature, temperature of the base of the thermosphere, and scale height for the thermospheric temperature variations, as well as revised dependence on orbital position for global mean height of the base of the thermosphere. Other features of the new thermospheric model are: (1) realistic variations of temperature and density with latitude and time of day, (2) more realistic wind magnitudes, based on improved estimates of horizontal pressure gradients, and (3) allowance for user-input adjustments to the model values for mean exospheric temperature and for height and temperature at the base of the thermosphere. Other new features of Mars-GRAM 3.4 include: (1) allowance for user-input values of climatic adjustment factors for temperature profiles from the surface to 75 km, and (2) a revised method for computing the sub-solar longitude position in the 'ORBIT' subroutine.
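
    Purely as an illustration of the kind of parameterization described (a solar-flux dependence of the global-mean exospheric temperature plus a user-input adjustment), the sketch below uses a linear form with made-up coefficients; the actual Mars-GRAM 3.4 functional forms and coefficients differ and are given in the report.

        def exospheric_temperature(f107, user_adjustment=0.0,
                                   t_base=170.0, slope=0.5):
            """Illustrative parameterization only: global-mean exospheric
            temperature (K) as a linear function of the 10.7 cm solar flux
            index, plus a user-input additive adjustment. The coefficients
            here are invented and are NOT the Mars-GRAM 3.4 values."""
            return t_base + slope * f107 + user_adjustment

        # Low vs. high solar activity, with and without a user adjustment.
        for f107 in (70.0, 200.0):
            print(f107, exospheric_temperature(f107),
                  exospheric_temperature(f107, user_adjustment=15.0))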
