Science.gov

Sample records for model theory user

  1. The Sandia GeoModel : theory and user's guide.

    SciTech Connect

    Brannon, Rebecca Moss; Fossum, Arlo Frederick

    2004-08-01

    The mathematical and physical foundations and domain of applicability of Sandia's GeoModel are presented along with descriptions of the source code and user instructions. The model is designed for use in conventional finite element architectures, and (to date) it has been installed in five host codes without requiring customization of the model subroutines for any of these installations. Although developed for application to geological materials, the GeoModel actually applies to a much broader class of materials, including rock-like engineered materials (such as concretes and ceramics) and even metals when simplified parameters are used. Nonlinear elasticity is supported through an empirically fitted function that has been found to be well suited to a wide variety of materials. Fundamentally, the GeoModel is a generalized plasticity model. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response, including microcrack growth and pore collapse. The GeoModel supports deformation-induced anisotropy in a limited capacity through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). Aside from kinematic hardening, however, the governing equations are otherwise isotropic. The GeoModel is a genuine unification and generalization of simpler models. It can employ up to 40 material input and control parameters in the rare case when all features are used; simpler idealizations (such as linear elasticity, Von Mises yield, or Mohr-Coulomb failure) can be replicated by simply using fewer parameters. For high-strain-rate applications, the GeoModel supports rate dependence through an overstress model.
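The generalized yield surface with kinematic hardening described above can be illustrated with a minimal one-dimensional sketch. The function names, parameter values, and the uniaxial idealization are invented for illustration; the actual GeoModel operates on full stress tensors with far richer fitting functions.

```python
# Hypothetical 1D sketch of a generalized-plasticity yield check with
# kinematic hardening: the yield surface is shifted by a backstress
# (modeling Bauschinger effects) before the elastic check is made.
# Names and values are illustrative, not taken from the GeoModel code.

def yield_function(sigma, backstress, sigma_y):
    """Shifted yield function: negative inside the elastic domain,
    zero on the yield surface, positive outside it."""
    return abs(sigma - backstress) - sigma_y

def is_elastic(sigma, backstress, sigma_y):
    """True if the stress state lies inside the translated yield surface."""
    return yield_function(sigma, backstress, sigma_y) <= 0.0
```

With a yield strength of 100 and a backstress of 20, a tensile stress of 110 is still elastic, while a compressive stress of -90 is not: translating the surface raises the apparent strength in one direction and lowers it in the other, which is exactly the Bauschinger effect kinematic hardening captures.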

  2. User Acceptance of Information Technology: Theories and Models.

    ERIC Educational Resources Information Center

    Dillon, Andrew; Morris, Michael G.

    1996-01-01

    Reviews literature in user acceptance and resistance to information technology design and implementation. Examines innovation diffusion, technology design and implementation, human-computer interaction, and information systems. Concentrates on the determinants of user acceptance and resistance and emphasizes how researchers and developers can…

  3. A Comprehensive and Systematic Model of User Evaluation of Web Search Engines: I. Theory and Background.

    ERIC Educational Resources Information Center

    Su, Louise T.

    2003-01-01

    Reports on a project that proposes and tests a comprehensive and systematic model of user evaluation of Web search engines. This article describes the model, including a set of criteria and measures and a method for implementation. A literature review portrays settings for developing the model and places applications of the model in contemporary…

  4. Repository Integration Program: RIP performance assessment and strategy evaluation model theory manual and user's guide

    SciTech Connect

    1995-11-01

    This report describes the theory and capabilities of RIP (Repository Integration Program). RIP is a powerful and flexible computational tool for carrying out probabilistic integrated total system performance assessments for geologic repositories. The primary purpose of RIP is to provide a management tool for guiding system design and site characterization. In addition, the performance assessment model (and the process of eliciting model input) can act as a mechanism for integrating the large amount of available information into a meaningful whole (in a sense, allowing one to keep the "big picture" and the ultimate aims of the project clearly in focus). Such an integration is useful both for project managers and project scientists. RIP is based on a "top down" approach to performance assessment that concentrates on the integration of the entire system and utilizes relatively high-level descriptive models and parameters. The key point in the application of such a "top down" approach is that the simplified models and associated high-level parameters must incorporate an accurate representation of their uncertainty. RIP is designed in a very flexible manner such that details can be readily added to various components of the model without modifying the computer code. Uncertainty is also handled in a very flexible manner, and both parameter and model (process) uncertainty can be explicitly considered. Uncertainty is propagated through the integrated PA model using an enhanced Monte Carlo method. RIP must rely heavily on subjective assessment (expert opinion) for much of its input. The process of eliciting the high-level input parameters required for RIP is critical to its successful application. As a result, in order for any project to successfully apply a tool such as RIP, an enormous amount of communication and cooperation must exist between the data collectors, the process modelers, and the performance assessment modelers.
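The core idea of propagating parameter uncertainty through an integrated model with Monte Carlo sampling can be sketched as follows. The toy model, distributions, and parameter names here are invented for illustration and are not taken from RIP.

```python
# Minimal sketch of Monte Carlo uncertainty propagation: sample the
# uncertain high-level parameters, run each sample through the model,
# and summarize the resulting output distribution.
import random

def travel_time(length, velocity, retardation):
    """Toy high-level model: contaminant travel time through a barrier."""
    return length * retardation / velocity

def median_travel_time(n, seed=0):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        velocity = rng.lognormvariate(0.0, 0.5)  # uncertain input
        retardation = rng.uniform(1.0, 10.0)     # uncertain input
        samples.append(travel_time(100.0, velocity, retardation))
    samples.sort()
    return samples[len(samples) // 2]  # median as one summary statistic
```

The point of such a "top down" calculation is that the spread of the sampled outputs, not just a single best estimate, is carried forward into the performance assessment.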

  5. WASP4, a hydrodynamic and water-quality model - model theory, user's manual, and programmer's guide

    SciTech Connect

    Ambrose, R.B.; Wool, T.A.; Connolly, J.P.; Schanz, R.W.

    1988-01-01

    The Water Quality Analysis Simulation Program Version 4 (WASP4) is a dynamic compartment-modeling system that can be used to analyze a variety of water-quality problems in a diverse set of water bodies. WASP4 simulates the transport and transformation of conventional and toxic pollutants in the water column and benthos of ponds, streams, lakes, reservoirs, rivers, estuaries, and coastal waters. The WASP4 modeling system covers four major subjects: hydrodynamics, conservative mass transport, eutrophication-dissolved oxygen kinetics, and toxic chemical-sediment dynamics. It consists of two stand-alone computer programs, DYNHYD4 and WASP4, that can be run in conjunction or separately. The hydrodynamic program, DYNHYD4, simulates the movement of water, and the water-quality program, WASP4, simulates the movement and interaction of pollutants within the water. The latter program is supplied with two kinetic submodels that address the two major classes of water-quality problems: conventional pollution (dissolved oxygen, biochemical oxygen demand, nutrients, and eutrophication) and toxic pollution (organic chemicals, heavy metals, and sediment). Substituting one submodel or the other yields the models EUTRO4 and TOXI4, respectively.
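The compartment-modeling idea behind a WASP-style simulation can be sketched with a toy two-box mass balance. The coefficients, names, and explicit time stepping are invented for the example; WASP4's actual kinetics and transport terms are far more detailed.

```python
# Illustrative two-compartment pollutant mass balance: exchange flow
# between two well-mixed boxes plus first-order decay, advanced one
# explicit Euler step. Not the WASP4 formulation, just the idea.

def step(c1, c2, dt, q_exchange, v1, v2, k_decay):
    """Advance concentrations c1, c2 (mass/volume) in two compartments
    of volume v1, v2 by one time step dt."""
    flux = q_exchange * (c1 - c2)          # mass/time from box 1 to box 2
    dc1 = (-flux / v1 - k_decay * c1) * dt
    dc2 = (flux / v2 - k_decay * c2) * dt
    return c1 + dc1, c2 + dc2
```

With decay switched off, total mass (v1*c1 + v2*c2) is conserved by the exchange term, which is a useful sanity check for any compartment model.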

  6. WASP4, A HYDRODYNAMIC AND WATER QUALITY MODEL - MODEL THEORY, USER'S MANUAL, AND PROGRAMMER'S GUIDE

    EPA Science Inventory

    The Water Quality Analysis Simulation Program Version 4 (WASP4) is a dynamic compartment modeling system that can be used to analyze a variety of water quality problems in a diverse set of water bodies. WASP4 simulates the transport and transformation of conventional and toxic po...

  7. KAYENTA: Theory and User's Guide

    SciTech Connect

    Brannon, Rebecca Moss; Fuller, Timothy Jesse; Strack, Otto Eric; Fossum, Arlo Frederick; Sanchez, Jason James

    2015-02-01

    The physical foundations and domain of applicability of the Kayenta constitutive model are presented along with descriptions of the source code and user instructions. Kayenta, which is an outgrowth of the Sandia GeoModel, includes features and fitting functions appropriate to a broad class of materials including rocks, rock-like engineered materials (such as concretes and ceramics), and metals. Fundamentally, Kayenta is a computational framework for generalized plasticity models. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response (including microcrack growth and pore collapse) that can result in non-recovered strain upon removal of loads on a material element. Kayenta supports optional anisotropic elasticity associated with joint sets, as well as optional deformation-induced anisotropy through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). The governing equations are otherwise isotropic. Because Kayenta is a unification and generalization of simpler models, it can be run using as few as 2 parameters (for linear elasticity) to as many as 40 material and control parameters in the exceptionally rare case when all features are used. For high-strain-rate applications, Kayenta supports rate dependence through an overstress model. Isotropic damage is modeled through loss of stiffness and strength.

  8. KAYENTA : theory and user's guide.

    SciTech Connect

    Brannon, Rebecca Moss; Fossum, Arlo Frederick; Strack, Otto Eric

    2009-03-01

    The physical foundations and domain of applicability of the Kayenta constitutive model are presented along with descriptions of the source code and user instructions. Kayenta, which is an outgrowth of the Sandia GeoModel, includes features and fitting functions appropriate to a broad class of materials including rocks, rock-like engineered materials (such as concretes and ceramics), and metals. Fundamentally, Kayenta is a computational framework for generalized plasticity models. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response including microcrack growth and pore collapse. Kayenta supports optional anisotropic elasticity associated with ubiquitous joint sets. Kayenta supports optional deformation-induced anisotropy through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). The governing equations are otherwise isotropic. Because Kayenta is a unification and generalization of simpler models, it can be run using as few as 2 parameters (for linear elasticity) to as many as 40 material and control parameters in the exceptionally rare case when all features are used. For high-strain-rate applications, Kayenta supports rate dependence through an overstress model. Isotropic damage is modeled through loss of stiffness and strength.

  9. Outside users payload model

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The outside users payload model, which continues a series of documents and replaces and supersedes the July 1984 edition, is presented. The time period covered by this model is 1985 through 2000. The following sections are included: (1) definition of the scope of the model; (2) discussion of the methodology used; (3) overview of total demand; (4) summary of the estimated market segmentation by launch vehicle; (5) summary of the estimated market segmentation by user type; (6) details of the STS market forecast; (7) summary of transponder trends; (8) model overview by mission category; and (9) detailed mission models. All known non-NASA, non-DOD reimbursable payloads forecast to be flown by non-Soviet-bloc countries are included in this model, with the exception of Spacelab payloads and small self-contained payloads. Certain DOD-sponsored or cosponsored payloads are included if they are reimbursable launches.

  10. ACIRF user's guide: Theory and examples

    NASA Astrophysics Data System (ADS)

    Dana, Roger A.

    1989-12-01

    Design and evaluation of radio frequency systems that must operate through ionospheric disturbances resulting from high-altitude nuclear detonations require an accurate channel model. This model must include the effects of high-gain antennas that may be used to receive the signals. Such a model can then be used to construct realizations of the received signal for use in digital simulations of trans-ionospheric links or in hardware channel simulators. The FORTRAN channel model ACIRF (Antenna Channel Impulse Response Function) generates random realizations of the impulse response function at the outputs of multiple antennas. This user's guide describes ACIRF version 2.0, which supports antennas with arbitrary beamwidths, pointing angles, and relative positions. The channel model is valid under strong scattering conditions when Rayleigh fading statistics apply. Both frozen-in and turbulent models for the temporal fluctuations are included in this version of ACIRF. The theory of the channel model is described and several examples are given.
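Under the strong-scattering (Rayleigh fading) conditions mentioned above, channel impulse-response samples are zero-mean complex Gaussian variates. A generic sketch of generating such realizations follows; this illustrates the assumed statistics only, not ACIRF's antenna-correlation algorithm.

```python
# Generate Rayleigh-fading channel samples as zero-mean complex
# Gaussians; the envelope |h| is then Rayleigh distributed and the
# mean power E[|h|^2] equals mean_power.
import random, math

def rayleigh_taps(n, mean_power=1.0, seed=1):
    rng = random.Random(seed)
    sigma = math.sqrt(mean_power / 2.0)   # std dev per real/imag component
    return [complex(rng.gauss(0.0, sigma), rng.gauss(0.0, sigma))
            for _ in range(n)]
```

Averaging |h|^2 over many samples should recover the specified mean power, which is the basic check one would run before feeding such realizations into a link simulation.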

  11. WASP3 (WATER QUALITY ANALYSIS PROGRAM), A HYDRODYNAMIC AND WATER QUALITY MODEL - MODEL THEORY, USER'S MANUAL, AND PROGRAMMER'S GUIDE

    EPA Science Inventory

    The Water Quality Analysis Simulation Program--3 (WASP3) is a dynamic compartment modeling system that can be used to analyze a variety of water quality problems in a diverse set of water bodies. WASP3 simulates the transport and transformation of conventional and toxic pollutant...

  12. GWSCREEN: A semi-analytical model for assessment of the groundwater pathway from surface or buried contamination: Version 2.0 theory and user's manual

    SciTech Connect

    Rood, A.S.

    1993-06-01

    GWSCREEN was developed for assessment of the groundwater pathway from leaching of radioactive and non-radioactive substances from surface or buried sources. The code was designed for implementation in the Track I and Track II assessments of CERCLA (Comprehensive Environmental Response, Compensation and Liability Act) sites identified as low-probability hazards at the Idaho National Engineering Laboratory (DOE, 1992). The code calculates the limiting soil concentration such that, after leaching and transport to the aquifer, regulatory contaminant levels in groundwater are not exceeded. The code uses a mass conservation approach to model three processes: contaminant release from a source volume, contaminant transport in the unsaturated zone, and contaminant transport in the saturated zone. The source model considers the sorptive properties and solubility of the contaminant. Transport in the unsaturated zone is described by a plug flow model. Transport in the saturated zone is calculated with a semi-analytical solution to the advection dispersion equation in groundwater. In Version 2.0, GWSCREEN has incorporated an additional source model to calculate the impacts to groundwater resulting from releases to percolation ponds. In addition, transport of radioactive progeny has also been incorporated. GWSCREEN has shown comparable results when compared against other codes using similar algorithms and techniques. This code was designed for assessment and screening of the groundwater pathway when field data are limited. It was not intended to be a predictive tool.
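The two screening quantities described above, a plug-flow travel time through the unsaturated zone and a limiting soil concentration tied back to a groundwater limit, can be sketched in toy form. The formulas and names below are simplified stand-ins, not GWSCREEN's actual equations (which include sorption, solubility, and dispersion).

```python
# Toy GWSCREEN-style screening arithmetic, for illustration only.

def plug_flow_travel_time(depth_to_aquifer, percolation_rate, water_content):
    """Plug-flow travel time (yr) for water moving through the
    unsaturated zone: pore-water volume per unit area / recharge rate."""
    return depth_to_aquifer * water_content / percolation_rate

def limiting_soil_concentration(mcl, dilution_factor):
    """Soil concentration whose leachate, after dilution in the
    aquifer, just meets the regulatory groundwater limit (MCL)."""
    return mcl * dilution_factor
```

For example, a 10 m unsaturated zone at 0.3 volumetric water content with 0.1 m/yr of percolation gives a 30-year plug-flow travel time, and the larger the dilution in the aquifer, the larger the allowable soil concentration.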

  13. On the estimability of parameters in undifferenced, uncombined GNSS network and PPP-RTK user models by means of S-system theory

    NASA Astrophysics Data System (ADS)

    Odijk, Dennis; Zhang, Baocheng; Khodabandeh, Amir; Odolinski, Robert; Teunissen, Peter J. G.

    2016-01-01

    The concept of integer ambiguity resolution-enabled Precise Point Positioning (PPP-RTK) relies on appropriate network information for the parameters that are common between the single-receiver user that applies this information and the network that provides it. Most of the current methods for PPP-RTK are based on forming the ionosphere-free combination using dual-frequency Global Navigation Satellite System (GNSS) observations. These methods are therefore restrictive in light of the development of new multi-frequency GNSS constellations, as well as from the point of view that the PPP-RTK user requires ionospheric corrections to obtain integer ambiguity resolution results based on short observation time spans. The method for PPP-RTK that is presented in this article does not have the above limitations, as it is based on the undifferenced, uncombined GNSS observation equations, thereby keeping all parameters in the model. Working with the undifferenced observation equations implies that the models are rank-deficient; not all parameters are unbiasedly estimable, but only combinations of them. By application of S-system theory the model is made of full rank by constraining a minimum set of parameters, or S-basis. The choice of this S-basis determines the estimability and the interpretation of the parameters that are transmitted to the PPP-RTK users. As this choice is not unique, one has to be very careful when comparing network solutions in different S-systems; in that case the S-transformation, which is provided by the S-system method, should be used to make the comparison. Knowing the estimability and interpretation of the parameters estimated by the network is shown to be crucial for a correct interpretation of the estimable PPP-RTK user parameters, among others the essential ambiguity parameters, which have the integer property that follows clearly from the interpretation of satellite phase biases from the network. The flexibility of the S-system method is…
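The estimability issue described above can be shown with the smallest possible example: if every observation sees only the sum of a receiver clock c and a bias b, then only c + b is estimable. Fixing b to a chosen value (the S-basis) restores full rank, and the resulting "clock" estimate inherits the interpretation c + b_true - b_fixed. The model and numbers below are invented to illustrate the idea, not taken from the article.

```python
# Tiny S-basis illustration: observations y_i = c + b + noise are
# rank-deficient in (c, b); constraining b makes the model full rank.

def estimate(observations, b_fixed=0.0):
    """Least-squares estimate of the clock after fixing the bias to
    b_fixed (the S-basis choice)."""
    mean_y = sum(observations) / len(observations)
    return mean_y - b_fixed
```

With noiseless observations generated from c = 5 and b = 2, fixing b to 0 returns 7 (the estimable combination c + b), while fixing b to its true value returns the true clock 5: the numerical value of the estimate depends entirely on the S-basis, exactly as the abstract warns.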

  14. HTGR Cost Model Users' Manual

    SciTech Connect

    A.M. Gandrik

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Cost Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Cost Model calculates an estimate of the capital costs, annual operating and maintenance costs, and decommissioning costs for a high-temperature gas-cooled reactor. The user can generate these costs for multiple reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first-of-a-kind, or nth-of-a-kind project phases; for a single or four-pack configuration; and for a reactor size of 350 or 600 MWt. This users' manual contains the mathematical models and operating instructions for the HTGR Cost Model. Instructions, screenshots, and examples are provided to guide the user through the HTGR Cost Model. The model was designed for users who are familiar with the HTGR design and Excel. Modification of the HTGR Cost Model should only be performed by users familiar with Excel and Visual Basic.
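The distinction between first-of-a-kind and nth-of-a-kind cost phases is commonly handled with a power-law learning curve (Wright's law). The sketch below shows that generic idea only; it is an assumption for illustration, not the HTGR Cost Model's actual equations.

```python
# Generic Wright's-law learning curve: each doubling of the cumulative
# unit count multiplies unit cost by the learning rate.
import math

def nth_of_a_kind_cost(foak_cost, n, learning_rate=0.9):
    """Capital cost of the nth unit given the first-of-a-kind cost."""
    b = math.log(learning_rate, 2)   # learning exponent (negative)
    return foak_cost * n ** b
```

With a 0.9 learning rate, the second unit costs 90% of the first and the fourth costs 81%, which is the kind of phase-dependent scaling a cost model applies between demonstration, first-of-a-kind, and nth-of-a-kind estimates.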

  15. FRAC-UNIX theory and user's manual

    SciTech Connect

    Clemo, T.M.; Miller, J.D.; Hull, L.C.; Magnuson, S.O.

    1990-05-01

    The FRAC-UNIX computer code provides a two-dimensional simulation of saturated flow and transport in fractured porous media. The code incorporates a dual-permeability approach in which the rock matrix is modeled as rectangular cells and the fractures are represented as discrete elements on the edges and diagonals of the matrix cells. A single head distribution drives otherwise independent flows in the matrix and in the fractures. Steady-state or transient flow of a single-phase fluid may be simulated. Solute or heat transport is simulated by moving imaginary marker particles in the velocity field established by the flow model, under the additional influence of dispersive and diffusive processes. Sparse-matrix techniques are utilized along with a specially developed user interface. The code is installed on a CRAY XMP24 computer using the UNICOS operating system. The initial version of this code, entitled FRACSL, incorporated the same flow and transport models but used a commercial software package for the numerics and user interface. This report describes the theoretical basis, approach, and implementation incorporated in the code; the mechanics of operating the code; several sample problems; and the integration of code development with physical modeling and field testing. The code is fully functional for most purposes, as shown by the results of an extensive code verification effort. Work remaining consists of refining and adding capabilities needed for several of the code verification problems; relatively simple modifications to extend its application and improve its ease of use; and an improvement in the treatment of fracture junctions and correction of an error in calculating buoyancy and concentration for diagonal fractures on a rectangular grid. 42 refs., 28 figs., 5 tabs.
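The marker-particle transport scheme described above, advection by the flow field plus a random dispersive step, is classic random-walk particle tracking. A minimal 1D sketch follows; the names and parameter values are illustrative, not FRAC-UNIX's implementation.

```python
# 1D random-walk particle tracking: each marker particle is advected
# by the local velocity and perturbed by a Gaussian step whose variance
# matches the dispersion coefficient D = dispersivity * |velocity|.
import random

def move_particles(xs, velocity, dispersivity, dt, seed=0):
    rng = random.Random(seed)
    step_std = (2.0 * dispersivity * abs(velocity) * dt) ** 0.5
    return [x + velocity * dt + rng.gauss(0.0, step_std) for x in xs]
```

Averaged over many particles, the plume centroid moves at the advection velocity while its spread grows with the dispersive step, which is how a particle method reproduces the advection-dispersion equation.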

  16. Cognitive Behavioral Theories Used to Explain Injection Risk Behavior among Injection Drug Users: A Review and Suggestions for the Integration of Cognitive and Environmental Models

    ERIC Educational Resources Information Center

    Wagner, Karla Dawn; Unger, Jennifer B.; Bluthenthal, Ricky N.; Andreeva, Valentina A.; Pentz, Mary Ann

    2010-01-01

    Injection drug users (IDUs) are at risk for HIV and viral hepatitis, and risky injection behavior persists despite decades of intervention. Cognitive behavioral theories (CBTs) are commonly used to help understand risky injection behavior. The authors review findings from CBT-based studies of injection risk behavior among IDUs. An extensive…

  17. UNSAT-H Version 3.0: Unsaturated Soil Water and Heat Flow Model Theory, User Manual, and Examples

    SciTech Connect

    MJ Fayer

    2000-06-12

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. During the last 4 years, the UNSAT-H model received support from the Immobilized Waste Program (IWP) of the Hanford Site's River Protection Project. This program is designing and assessing the performance of on-site disposal facilities to receive radioactive wastes that are currently stored in single- and double-shell tanks at the Hanford Site (LMHC 1999). The IWP is interested in estimates of recharge rates for current conditions and long-term scenarios involving the vadose zone disposal of tank wastes. Simulation modeling with UNSAT-H is one of the methods being used to provide those estimates (e.g., Rockhold et al. 1995; Fayer et al. 1999). To achieve the above goals for assessing water dynamics and estimating recharge rates, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow as one-dimensional processes. The UNSAT-H model simulates liquid water flow using Richards' equation (Richards 1931), water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements.
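The one-dimensional discretization idea behind a Richards-equation solver can be shown with a drastically simplified stand-in: an explicit diffusion step with constant diffusivity and gravity ignored. This is an illustration of the finite-difference pattern only, not UNSAT-H's head-based, variably nonlinear formulation.

```python
# Explicit 1D diffusion step for volumetric water content theta on a
# uniform grid with no-flux (closed-column) boundaries. Stable only
# when d * dt / dx**2 <= 0.5.

def diffuse(theta, d, dx, dt):
    new = theta[:]
    for i in range(len(theta)):
        left = theta[i - 1] if i > 0 else theta[i]    # no-flux boundary
        right = theta[i + 1] if i < len(theta) - 1 else theta[i]
        new[i] = theta[i] + d * dt / dx ** 2 * (left - 2 * theta[i] + right)
    return new
```

With no-flux boundaries the total water in the column is conserved to machine precision, the same kind of mass/energy balance check that UNSAT-H Version 3.0 added as a feature.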

  18. UNSAT-H Version 3.0:Unsaturated Soil Water and Heat Flow Model Theory, User Manual, and Examples

    SciTech Connect

    Fayer, Michael J

    2000-06-15

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. To achieve these goals, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow. The UNSAT-H model simulates liquid water flow using the Richards equation, water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements. This report includes eight example problems. The first four are verification tests of UNSAT-H capabilities; the second four demonstrate real-world situations.

  19. UNSAT-H Version 3.0:Unsaturated Soil Water and Heat Flow Model: Theory, User Manual, and Examples

    SciTech Connect

    Fayer, Michael J.

    2000-06-15

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. To achieve these goals, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow. The UNSAT-H model simulates liquid water flow using the Richards equation, water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements. This report includes eight example problems. The first four are verification tests of UNSAT-H capabilities; the second four demonstrate real-world situations.

  20. Cohesive Zone Model User Element

    Energy Science and Technology Software Center (ESTSC)

    2007-04-17

    Cohesive Zone Model User Element (CZM UEL) is an implementation of a Cohesive Zone Model as an element for use in finite element simulations. CZM UEL computes a nodal force vector and stiffness matrix from a vector of nodal displacements. It is designed for structural analysts using finite element software to predict crack initiation, crack propagation, and the effect of a crack on the rest of a structure.
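The traction-separation law at the heart of a cohesive zone element can be sketched with the common bilinear form: linear ramp to a peak strength, then linear softening to zero at full separation. This is a generic textbook law chosen for illustration; the specific law, parameters, and the force/stiffness assembly in CZM UEL are not reproduced here.

```python
# Bilinear cohesive traction-separation law (1D opening mode).
# delta0: opening at peak strength; delta_f: opening at full failure;
# t_max: peak cohesive strength.

def traction(delta, delta0, delta_f, t_max):
    if delta <= 0.0:
        return 0.0                                    # closed crack
    if delta < delta0:
        return t_max * delta / delta0                 # elastic ramp
    if delta < delta_f:
        return t_max * (delta_f - delta) / (delta_f - delta0)  # softening
    return 0.0                                        # fully cracked
```

A user element evaluates such a law at each integration point to build the nodal force vector, and differentiates it with respect to the opening to build the tangent stiffness matrix.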

  1. User's guide for the Simplified Risk Model (SRM)

    SciTech Connect

    Peatross, R.G.; Eide, S.A.

    1996-10-01

    SRM can be used to quickly compare relative values relating to risk for many environmental management activities or alternatives at US DOE sites. The purpose of this guide is to provide the user with the essential values and decision points for each model variable. The numerical results are useful for ranking and screening purposes and should not be compared directly against absolute risk results such as those in CERCLA baseline risk assessments or Safety Analysis Reports. Implementing the SRM entails performing several preliminary steps, selecting values of the risk elements, calculating the risk equations, and checking the results. SRM considers two types of waste management states: inactive (rest) and active (transition). SRM considers risk from exposures to radionuclides and hazardous chemicals, as well as industrial hazards; however, this user's guide does not cover risk from industrial hazards (Section 10 of Eide et al. (1996) must be consulted).
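The ranking-and-screening use of a simplified risk model can be sketched with a toy multiplicative score. The risk elements, their names, and the product form below are invented for illustration; the actual SRM equations and element values are defined in the guide itself.

```python
# Toy relative-risk scoring for ranking alternatives. The scores are
# dimensionless and meaningful only relative to each other, mirroring
# the caution that SRM results are for ranking, not absolute risk.

def relative_risk(source_term, release_fraction, exposure_factor):
    return source_term * release_fraction * exposure_factor

def rank(activities):
    """Sort a {name: (source, release, exposure)} dict by descending score."""
    return sorted(activities,
                  key=lambda name: relative_risk(*activities[name]),
                  reverse=True)
```

Because only the ordering matters, any monotone rescaling of the element values leaves the screening conclusions unchanged, which is what makes such a model robust for comparing alternatives despite coarse inputs.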

  2. EFDC1D - A ONE DIMENSIONAL HYDRODYNAMIC AND SEDIMENT TRANSPORT MODEL FOR RIVER AND STREAM NETWORKS: MODEL THEORY AND USERS GUIDE

    EPA Science Inventory

    This technical report describes the new one-dimensional (1D) hydrodynamic and sediment transport model EFDC1D, which can be applied to stream networks. The model code and two sample data sets are included on the distribution CD. EFDC1D can simulate bi-directional unstea...

  3. User's manual for GILDA: An infinite lattice diffusion theory calculation

    SciTech Connect

    Le, T.T.

    1991-11-01

    GILDA is a static two-dimensional diffusion theory code that performs either buckling (B^2) or k-effective (k_eff) calculations for an infinite hexagonal lattice which is constructed by repeating identical seven-cell zones (one cell is one or seven identical homogenized hexes). GILDA was written by J. W. Stewart in 1973. This user's manual is intended to provide all of the information necessary to set up and execute a GILDA calculation and to interpret the output results. It is assumed that the user is familiar with the computer (VAX/VMS or IBM/MVS) and the JOSHUA system database on which the code is implemented. Users who are not familiar with the JOSHUA database are advised to consult additional references to understand the structure of JOSHUA records and data sets before turning to section 4 of this manual. Sections 2 and 3 of this manual serve as a theory document in which the basic diffusion theory and the numerical approximations behind the code are described. Section 4 describes the functions of the program's subroutines. Section 5 describes the input data and walks the user through setting up a problem. Section 6 describes the output results and the error messages which may be encountered during execution. Users who only wish to learn how to run the code without understanding the theory can start from section 4 and use sections 2 and 3 as references. Finally, the VAX/VMS and the IBM execution command files together with sample input records are provided in the appendices at the end of this manual.
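A k-effective calculation of the kind GILDA performs is an eigenvalue problem, typically solved by power iteration on the flux. The sketch below shows generic power iteration on a tiny invented two-node operator; the operator and numbers are illustrative, not GILDA's diffusion matrices.

```python
# Generic power iteration: find the dominant eigenvalue (k) and
# eigenvector (flux shape) of a linear operator given as a function.

def power_iteration(apply_op, flux, iters=200):
    k = 1.0
    for _ in range(iters):
        new = apply_op(flux)
        # Rayleigh-quotient estimate of the eigenvalue
        k = sum(n * f for n, f in zip(new, flux)) / sum(f * f for f in flux)
        norm = max(abs(x) for x in new)
        flux = [x / norm for x in new]   # renormalize the flux shape
    return k, flux
```

For a symmetric test operator with eigenvalues 3 and 1, the iteration converges to k = 3 with a flat flux shape; in a reactor code the operator combines fission production and diffusion/absorption losses, and k is the multiplication factor.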

  4. GWSCREEN: A semi-analytical model for assessment of the groundwater pathway from surface or buried contamination. Theory and user's manual, Version 2.0: Revision 2

    SciTech Connect

    Rood, A.S.

    1994-06-01

    Multimedia exposure assessment of hazardous chemicals and radionuclides requires that all pathways of exposure be investigated. The GWSCREEN model was designed to perform initial screening calculations for groundwater pathway impacts resulting from the leaching of surficial and buried contamination at CERCLA sites identified as low probability hazard at the INEL. In Version 2.0, an additional model was added to calculate impacts to groundwater from the operation of a percolation pond. The model was designed to make best use of the data that would potentially be available. These data include the area and depth of contamination, sorptive properties and solubility limit of the contaminant, depth to aquifer, and the physical properties of the aquifer (porosity, velocity, and dispersivity). For the pond model, data on effluent flow rates and operation time are required. Model output includes the limiting soil concentration such that, after leaching and transport to the aquifer, regulatory contaminant levels in groundwater are not exceeded. Also, groundwater concentration as a function of time may be calculated. The model considers only drinking water consumption and does not include the transfer of contamination to food products due to irrigation with contaminated water. Radiological dose, carcinogenic risk, and the hazard quotient are calculated for the peak time using the user-defined input mass (or activity). Appendices contain sample problems and the source code listing.

  5. The User-Oriented Evaluator's Role in Formulating a Program Theory: Using a Theory-Driven Approach

    ERIC Educational Resources Information Center

    Christie, Christina A.; Alkin, Marvin C.

    2003-01-01

    Program theory plays a prominent role in many evaluations, not only in theory-driven evaluations. This paper presents a case study of the process of developing and refining a program's theory within a user-oriented evaluation. In user-oriented (or utilization-focused) evaluations, primary users can play a role in defining their own program theory.…

  6. Cognitive Behavioral Theories Used to Explain Injection Risk Behavior Among Injection Drug Users: A Review and Suggestions for the Integration of Cognitive and Environmental Models

    PubMed Central

    Wagner, Karla D.; Unger, Jennifer B.; Bluthenthal, Ricky N.; Andreeva, Valentina A.; Pentz, Mary Ann

    2011-01-01

    Injection drug users (IDUs) are at risk for HIV and viral hepatitis, and risky injection behavior persists despite decades of intervention. Cognitive behavioral theories (CBT) are commonly used to help understand risky injection behavior. We review findings from CBT-based studies of injection risk behavior among IDUs. An extensive literature search was conducted in Spring 2007. In total 33 studies were reviewed—26 epidemiological and 7 intervention studies. Findings suggest that some theoretical constructs have received fairly consistent support (e.g., self-efficacy, social norms), while others have yielded inconsistent or null results (e.g., perceived susceptibility, knowledge, behavioral intentions, perceived barriers, perceived benefits, response efficacy, perceived severity). We offer some possible explanations for these inconsistent findings, including differences in theoretical constructs and measures across studies and a need to examine the environmental structures that influence risky behaviors. Greater integration of CBT with a risk environment perspective may yield more conclusive findings and more effective interventions in the future. PMID:20705809

  7. User's manual for GILDA: An infinite lattice diffusion theory calculation

    SciTech Connect

    Le, T.T.

    1991-11-01

    GILDA is a static two-dimensional diffusion theory code that performs either buckling (B^2) or k-effective (k_eff) calculations for an infinite hexagonal lattice which is constructed by repeating identical seven-cell zones (one cell is one or seven identical homogenized hexes). GILDA was written by J. W. Stewart in 1973. This user's manual is intended to provide all of the information necessary to set up and execute a GILDA calculation and to interpret the output results. It is assumed that the user is familiar with the computer (VAX/VMS or IBM/MVS) and the JOSHUA system database on which the code is implemented. Users who are not familiar with the JOSHUA database are advised to consult additional references to understand the structure of JOSHUA records and data sets before turning to section 4 of this manual. Sections 2 and 3 of this manual serve as a theory document in which the basic diffusion theory and the numerical approximations behind the code are described. Section 4 describes the functions of the program's subroutines. Section 5 describes the input data and shows the user how to set up a problem. Section 6 describes the output results and the error messages which may be encountered during execution. Users who only wish to learn how to run the code without understanding the theory can start from section 4 and use sections 2 and 3 as references. Finally, the VAX/VMS and the IBM execution command files together with sample input records are provided in the appendices at the end of this manual.
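The buckling/k-effective relation that codes like GILDA solve in multigroup hexagonal geometry reduces, in one-group theory, to a one-line formula. The cross sections below are invented example values, not lattice data.

```python
# One-group illustration of the k-effective vs. buckling relation that
# infinite-lattice diffusion codes generalize; cross sections are made up.

def k_inf(nu_sigma_f, sigma_a):
    """Infinite-medium multiplication factor (no leakage)."""
    return nu_sigma_f / sigma_a

def k_eff(nu_sigma_f, sigma_a, D, B2):
    """One-group k_eff with geometric buckling B2; leakage enters as D * B2."""
    return nu_sigma_f / (sigma_a + D * B2)

k0 = k_inf(0.0085, 0.0080)              # supercritical without leakage
k1 = k_eff(0.0085, 0.0080, 1.2, 1e-4)   # leakage lowers the multiplication
print(round(k0, 4), round(k1, 4))       # → 1.0625 1.0468
```

A buckling search simply inverts the second relation: find the B2 that makes k_eff equal one.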

  8. Predicting Facebook users' online privacy protection: risk, trust, norm focus theory, and the theory of planned behavior.

    PubMed

    Saeri, Alexander K; Ogilvie, Claudette; La Macchia, Stephen T; Smith, Joanne R; Louis, Winnifred R

    2014-01-01

    The present research adopts an extended theory of planned behavior model that includes descriptive norms, risk, and trust to investigate online privacy protection in Facebook users. Facebook users (N = 119) completed a questionnaire assessing their attitude, subjective injunctive norm, subjective descriptive norm, perceived behavioral control, implicit perceived risk, trust of other Facebook users, and intentions toward protecting their privacy online. Behavior was measured indirectly 2 weeks after the study. The data show partial support for the theory of planned behavior and strong support for the independence of subjective injunctive and descriptive norms. Risk also uniquely predicted intentions over and above the theory of planned behavior, but there were no unique effects of trust on intentions, nor of risk or trust on behavior. Implications are discussed. PMID:25154118

  9. The Chaos Theory of Careers: A User's Guide

    ERIC Educational Resources Information Center

    Bright, Jim E. H.; Pryor, Robert G. L.

    2005-01-01

    The purpose of this article is to set out the key elements of the Chaos Theory of Careers. The complexity of influences on career development presents a significant challenge to traditional predictive models of career counseling. Chaos theory can provide a more appropriate description of career behavior, and the theory can be applied with clients…

  10. Information filtering via collaborative user clustering modeling

    NASA Astrophysics Data System (ADS)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Yu, Lu; Liu, Chuang; Liu, Hao; Yan, Xiao-Yong

    2014-02-01

    The past few years have witnessed the great success of recommender systems, which help users find personalized items amid the flood of information. One of the most widely applied recommendation methods is Matrix Factorization (MF). However, most research on this topic has focused on mining the direct relationships between users and items. In this paper, we optimize standard MF by integrating a user clustering regularization term. Our model considers not only the user-item rating information but also the user clustering information. In addition, we compare the proposed model with three other typical methods: User-Mean (UM), Item-Mean (IM), and standard MF. Experimental results on two real-world datasets, MovieLens 1M and MovieLens 100k, show that our method outperforms the other three methods in recommendation accuracy.
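The idea of adding a user-clustering regularization term to MF can be sketched as gradient descent on the usual squared-error objective plus a penalty pulling each user factor toward its cluster centroid. The objective, cluster assignment, and hyperparameters below are assumptions for illustration, not the paper's exact model.

```python
import numpy as np

# Toy MF with a user-clustering regularization term (full-batch gradient
# descent); ratings, clusters, and hyperparameters are invented.
rng = np.random.default_rng(0)
R = np.array([[5.0, 3.0, 0.0],
              [4.0, 0.0, 1.0],
              [1.0, 1.0, 5.0]])            # toy ratings, 0 = unobserved
mask = R > 0
k, lam, beta, lr = 2, 0.02, 0.05, 0.01
P = rng.normal(scale=0.1, size=(3, k))     # user factors
Q = rng.normal(scale=0.1, size=(3, k))     # item factors
cluster = np.array([0, 0, 1])              # assumed user cluster labels

for _ in range(5000):
    E = mask * (R - P @ Q.T)               # residuals on observed ratings only
    C = np.vstack([P[cluster == c].mean(axis=0) for c in (0, 1)])
    # gradient steps: data fit + ridge term + pull toward the cluster centroid
    P += lr * (E @ Q - lam * P - beta * (P - C[cluster]))
    Q += lr * (E.T @ P - lam * Q)

rmse = np.sqrt((E ** 2).sum() / mask.sum())
print(rmse < 0.5)
```

The extra `beta` term is what distinguishes this from standard MF: users in the same cluster are encouraged to share similar latent factors even where their ratings do not overlap.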

  11. ChISELS 1.0: theory and user manual: a theoretical modeler of deposition and etch processes in microsystems fabrication.

    SciTech Connect

    Plimpton, Steven James; Schmidt, Rodney Cannon; Ho, Pauline; Musson, Lawrence Cale

    2006-09-01

    Chemically Induced Surface Evolution with Level-Sets--ChISELS--is a parallel code for modeling 2D and 3D material depositions and etches at feature scales on patterned wafers at low pressures. Designed for efficient use on a variety of computer architectures ranging from single-processor workstations to advanced massively parallel computers running MPI, ChISELS is a platform on which to build and improve upon previous feature-scale modeling tools while taking advantage of the most recent advances in load balancing and scalable solution algorithms. Evolving interfaces are represented using the level-set method and the evolution equations are time-integrated using a Semi-Lagrangian approach [1]. The computational meshes used are quad-trees (2D) and oct-trees (3D), constructed such that grid refinement is localized to regions near the surface interfaces. As the interface evolves, the mesh is dynamically reconstructed as needed for the grid to remain fine only around the interface. For parallel computation, a domain decomposition scheme with dynamic load balancing is used to distribute the computational work across processors. A ballistic transport model is employed to solve for the fluxes incident on each of the surface elements. Surface chemistry is computed by either coupling to the CHEMKIN software [2] or by providing user-defined subroutines. This report describes the theoretical underpinnings, methods, and practical use instructions of the ChISELS 1.0 computer code.
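The level-set representation mentioned above evolves an implicit surface phi(x, y) under the equation phi_t + F |grad phi| = 0, where F is the deposition (or etch) speed. The sketch below uses a first-order explicit scheme on a uniform grid, not ChISELS' Semi-Lagrangian integrator on adaptive trees; the grid, speed, and time step are invented.

```python
import numpy as np

# Toy level-set sketch: uniform deposition on a circular seed via
# phi_t + F * |grad phi| = 0 (first-order explicit update).
n, F, dt = 64, 1.0, 0.005
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, x)
phi = np.hypot(X - 0.5, Y - 0.5) - 0.2     # signed distance to a circle, r = 0.2

for _ in range(20):
    gx, gy = np.gradient(phi, dx)
    phi -= dt * F * np.hypot(gx, gy)       # outward motion along the normal

# for uniform speed the phi = 0 interface grows from r = 0.2 to about
# r = 0.2 + F * 20 * dt = 0.3
print(round(0.2 + F * 20 * dt, 1))
```

The appeal of the level-set form is visible even here: the interface never has to be parameterized, so topology changes (merging fronts in a trench, for example) come for free.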

  12. Theory Modeling and Simulation

    SciTech Connect

    Shlachter, Jack

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  13. Anaerobic digestion analysis model: User's manual

    SciTech Connect

    Ruth, M.; Landucci, R.

    1994-08-01

    The Anaerobic Digestion Analysis Model (ADAM) has been developed to assist investigators in performing preliminary economic analyses of anaerobic digestion processes. The model, which runs under Microsoft Excel™, is capable of estimating the economic performance of several different waste digestion process configurations that are defined by the user through a series of option selections. The model can be used to predict required feedstock tipping fees, product selling prices, utility rates, and raw material unit costs. The model is intended to be used as a tool to perform preliminary economic estimates that could be used to carry out simple screening analyses. The model's current parameters are based on engineering judgments and are not reflective of any existing process; therefore, they should be carefully evaluated and modified if necessary to reflect the process under consideration. The accuracy and level of uncertainty of the estimated capital investment and operating costs are dependent on the accuracy and level of uncertainty of the model's input parameters. The underlying methodology is capable of producing results accurate to within ±30% of actual costs.

  14. XRLSim model specifications and user interfaces

    SciTech Connect

    Young, K.D.; Breitfeller, E.; Woodruff, J.P.

    1989-12-01

    The two chapters in this manual document the engineering development leading to modification of XRLSim -- an Ada-based computer program developed to provide a realistic simulation of an x-ray laser weapon platform. Complete documentation of the FY88 effort to develop XRLSim was published in April 1989, as UCID-21736: XRLSIM Model Specifications and User Interfaces, by L. C. Ng, D. T. Gavel, R. M. Shectman, P. L. Sholl, and J. P. Woodruff. The FY89 effort has been primarily to enhance the x-ray laser weapon-platform model fidelity. Chapter 1 of this manual details enhancements made to XRLSim model specifications during FY89. Chapter 2 provides the user with changes in user interfaces brought about by these enhancements. This chapter is offered as a series of deletions, replacements, and insertions to the original document to enable XRLSim users to implement enhancements developed during FY89.

  15. HTGR Application Economic Model Users' Manual

    SciTech Connect

    A.M. Gandrik

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Application Economic Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Application Economic Model calculates either the required selling price of power and/or heat for a given internal rate of return (IRR) or the IRR for power and/or heat being sold at the market price. The user can generate these economic results for a range of reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first of a kind, or nth of a kind project phases; for up to 16 reactor modules; and for module ratings of 200, 350, or 600 MWt. This user's manual contains the mathematical models and operating instructions for the HTGR Application Economic Model. Instructions, screenshots, and examples are provided to guide the user through the HTGR Application Economic Model. This model was designed for users who are familiar with the HTGR design, Excel, and engineering economics. Modification of the HTGR Application Economic Model should only be performed by users familiar with the HTGR and its applications, Excel, and Visual Basic.
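The required-price-for-a-given-IRR calculation described above is, at its core, a root-finding problem: choose the revenue stream that makes the net present value zero at the target rate. The sketch below solves the flat-revenue case by bisection; the capital cost, plant life, and target IRR are hypothetical figures, not HTGR data.

```python
# Illustrative price-vs-IRR calculation: find the flat annual revenue that
# yields a target IRR for an assumed capital cost and plant life.

def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def required_revenue(capex, life, target_irr, lo=0.0, hi=1e9, tol=1e-6):
    """Bisect for the annual revenue making NPV zero at the target IRR."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(target_irr, [-capex] + [mid] * life) > 0:
            hi = mid          # revenue too high: NPV positive at the target rate
        else:
            lo = mid
    return lo

rev = required_revenue(capex=1000.0, life=20, target_irr=0.10)
print(round(rev, 2))  # → 117.46 (matches the closed-form annuity payment)
```

The inverse question (IRR at a market price) is the same bisection with the roles of rate and revenue swapped.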

  16. The Modular Modeling System (MMS): User's Manual

    USGS Publications Warehouse

    Leavesley, G.H.; Restrepo, P.J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.

    1996-01-01

    The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.

  17. Parallel community climate model: Description and user`s guide

    SciTech Connect

    Drake, J.B.; Flanery, R.E.; Semeraro, B.D.; Worley, P.H.

    1996-07-15

    This report gives an overview of a parallel version of the NCAR Community Climate Model, CCM2, implemented for MIMD massively parallel computers using a message-passing programming paradigm. The parallel implementation was developed on an Intel iPSC/860 with 128 processors and on the Intel Delta with 512 processors, and the initial target platform for the production version of the code is the Intel Paragon with 2048 processors. Because the implementation uses standard, portable message-passing libraries, the code has been easily ported to other multiprocessors supporting a message-passing programming paradigm. The parallelization strategy used is to decompose the problem domain into geographical patches and assign each processor the computation associated with a distinct subset of the patches. With this decomposition, the physics calculations involve only grid points and data local to a processor and are performed in parallel. Using parallel algorithms developed for the semi-Lagrangian transport, the fast Fourier transform and the Legendre transform, both physics and dynamics are computed in parallel with minimal data movement and modest change to the original CCM2 source code. Sequential or parallel history tapes are written and input files (in history tape format) are read sequentially by the parallel code to promote compatibility with production use of the model on other computer systems. A validation exercise has been performed with the parallel code and is detailed along with some performance numbers on the Intel Paragon and the IBM SP2. A discussion of reproducibility of results is included. A user's guide for the PCCM2 version 2.1 on the various parallel machines completes the report. Procedures for compilation, setup and execution are given. A discussion of code internals is included for those who may wish to modify and use the program in their own research.
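The patch decomposition described above can be sketched in a few lines: split the latitude-longitude grid into rectangular patches, assign each patch an owning processor, and let each processor compute physics only on the grid points it owns. The grid dimensions and processor mesh below are arbitrary illustrations, not PCCM2's actual layout.

```python
# Sketch of geographical patch decomposition: patch -> owner mapping plus
# each patch's local index bounds; counts are invented for illustration.
nlat, nlon, p_rows, p_cols = 64, 128, 4, 8   # grid size and processor mesh
patches = [(i, j) for i in range(p_rows) for j in range(p_cols)]
owner = {patch: idx for idx, patch in enumerate(patches)}  # patch -> processor

def local_bounds(patch):
    """Half-open index ranges of the grid points this patch owns."""
    i, j = patch
    return (i * nlat // p_rows, (i + 1) * nlat // p_rows,
            j * nlon // p_cols, (j + 1) * nlon // p_cols)

print(len(owner), local_bounds((3, 7)))
```

The point of the decomposition is that column physics needs no communication at all; only the spectral transforms and semi-Lagrangian transport require exchanging data between patch owners.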

  18. CONSTRUCTION OF EDUCATIONAL THEORY MODELS.

    ERIC Educational Resources Information Center

    MACCIA, ELIZABETH S.; AND OTHERS

    THIS STUDY DELINEATED MODELS WHICH HAVE POTENTIAL USE IN GENERATING EDUCATIONAL THEORY. A THEORY MODELS METHOD WAS FORMULATED. BY SELECTING AND ORDERING CONCEPTS FROM OTHER DISCIPLINES, THE INVESTIGATORS FORMULATED SEVEN THEORY MODELS. THE FINAL STEP OF DEVISING EDUCATIONAL THEORY FROM THE THEORY MODELS WAS PERFORMED ONLY TO THE EXTENT REQUIRED TO…

  19. GEOS-5 Chemistry Transport Model User's Guide

    NASA Technical Reports Server (NTRS)

    Kouatchou, J.; Molod, A.; Nielsen, J. E.; Auer, B.; Putman, W.; Clune, T.

    2015-01-01

    The Goddard Earth Observing System version 5 (GEOS-5) General Circulation Model (GCM) makes use of the Earth System Modeling Framework (ESMF) to enable model configurations with many functions. One of the options of the GEOS-5 GCM is the GEOS-5 Chemistry Transport Model (GEOS-5 CTM), which is an offline simulation of chemistry and constituent transport driven by a specified meteorology and other model output fields. This document describes the basic components of the GEOS-5 CTM and serves as a user's guide on how to obtain and run simulations on the NCCS Discover platform. In addition, we provide information on how to change the model configuration input files to meet users' needs.

  20. User behavioral model in hypertext environment

    NASA Astrophysics Data System (ADS)

    Moskvin, Oleksii M.; Sailarbek, Saltanat; Gromaszek, Konrad

    2015-12-01

    Users traversing Internet resources play an important role, yet in practice their activities are rarely considered by resource owners when adjusting and optimizing hypertext structure. An optimal hypertext structure allows users to locate pages of interest, the goals of their informational search, more quickly. This paper presents a model that analyzes the behavior of the user audience to infer their goals within a particular hypertext segment and finds optimal routes for reaching those goals in terms of route length and informational value. A potential application of the proposed model is in systems that evaluate hypertext networks and optimize their referential structure for faster information retrieval.

  1. Wake Vortex Inverse Model User's Guide

    NASA Technical Reports Server (NTRS)

    Lai, David; Delisi, Donald

    2008-01-01

    NorthWest Research Associates (NWRA) has developed an inverse model for inverting landing aircraft vortex data. The data used for the inversion are the time evolution of the lateral transport position and vertical position of both the port and starboard vortices. The inverse model performs iterative forward model runs using various estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Forward model predictions of lateral transport and altitude are then compared with the observed data. Differences between the data and model predictions guide the choice of vortex parameter values, crosswind profile and circulation evolution in the next iteration. Iterations are performed until a user-defined criterion is satisfied. Currently, the inverse model is set to stop when the improvement in the rms deviation between the data and model predictions is less than 1 percent for two consecutive iterations. The forward model used in this inverse model is a modified version of the Shear-APA model. A detailed description of this forward model, the inverse model, and its validation are presented in a different report (Lai, Mellman, Robins, and Delisi, 2007). This document is a User's Guide for the Wake Vortex Inverse Model. Section 2 presents an overview of the inverse model program. Execution of the inverse model is described in Section 3. When executing the inverse model, a user is requested to provide the name of an input file which contains the inverse model parameters, the various datasets, and directories needed for the inversion. A detailed description of the list of parameters in the inversion input file is presented in Section 4. A user has an option to save the inversion results of each lidar track in a mat-file (a condensed data file in Matlab format). These saved mat-files can be used for post-inversion analysis. A description of the contents of the saved files is given in Section 5. 
An example of an inversion input
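The iterate-forward-model-until-the-rms-improvement-drops-below-1% loop described above can be sketched with a toy forward model standing in for Shear-APA. The exponential "data", parameters, and local search rule below are all invented; only the stopping criterion (improvement under 1% for two consecutive iterations) follows the text.

```python
import numpy as np

# Toy inverse-model loop: adjust one forward-model parameter until the rms
# deviation stops improving by more than 1% for two consecutive iterations.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 50)
data = 100.0 * np.exp(-0.3 * t) + rng.normal(scale=0.5, size=t.size)

def forward(decay):
    return 100.0 * np.exp(-decay * t)      # stand-in forward model

def rms(decay):
    return np.sqrt(np.mean((data - forward(decay)) ** 2))

decay, step, prev, streak = 0.1, 0.05, np.inf, 0
while streak < 2:                           # stop: <1% improvement twice in a row
    best = min([decay - step, decay, decay + step], key=rms)
    if best == decay:
        step /= 2.0                         # no move: refine the local search
    improvement = 1.0 if not np.isfinite(prev) else (prev - rms(best)) / prev
    streak = streak + 1 if improvement < 0.01 else 0
    prev, decay = rms(best), best

print(abs(decay - 0.3) < 0.05)
```

The real inverse model varies several vortex parameters, the crosswind profile, and the circulation history at once, but the control flow is the same: forward run, compare rms, update, repeat.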

  2. User's appraisal of yield model evaluation criteria

    NASA Technical Reports Server (NTRS)

    Warren, F. B. (Principal Investigator)

    1982-01-01

    The five major potential USDA users of AgRISTAR crop yield forecast models rated the Yield Model Development (YMD) project Test and Evaluation Criteria by the importance placed on them. These users agreed that the "TIMELINESS" and "RELIABILITY" of the forecast yields would be of major importance in determining if a proposed yield model was worthy of adoption. Although there was considerable difference of opinion as to the relative importance of the other criteria, "COST", "OBJECTIVITY", "ADEQUACY", and "MEASURES OF ACCURACY" generally were felt to be more important than "SIMPLICITY" and "CONSISTENCY WITH SCIENTIFIC KNOWLEDGE". However, some of the comments which accompanied the ratings did indicate that several of the definitions and descriptions of the criteria were confusing.

  3. Modeling a Theory-Based Approach to Examine the Influence of Neurocognitive Impairment on HIV Risk Reduction Behaviors Among Drug Users in Treatment.

    PubMed

    Huedo-Medina, Tania B; Shrestha, Roman; Copenhaver, Michael

    2016-08-01

    Although it is well established that people who use drugs (PWUDs) are characterized by significant neurocognitive impairment (NCI), there has been no examination of how NCI may impede one's ability to accrue the expected HIV prevention benefits stemming from an otherwise efficacious intervention. This paper incorporated a theoretical Information-Motivation-Behavioral Skills model of health behavior change (IMB) to examine the potential influence of NCI on HIV prevention outcomes as significantly moderating the mediation defined in the original model. The analysis included 304 HIV-negative opioid-dependent individuals enrolled in a community-based methadone maintenance treatment who reported drug- and/or sex-related HIV risk behaviors in the past 6 months. Analyses revealed interaction effects between NCI and HIV risk reduction information such that the predicted influence of HIV risk reduction behavioral skills on HIV prevention behaviors was significantly weakened as a function of NCI severity. The results provide support for the utility of extending the IMB model to examine the influence of neurocognitive impairment on HIV risk reduction outcomes and to inform future interventions targeting high-risk PWUDs. PMID:27052845

  4. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    USGS Publications Warehouse

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.

  5. Theory and modeling group

    NASA Technical Reports Server (NTRS)

    Holman, Gordon D.

    1989-01-01

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  6. Theory and modeling group

    NASA Astrophysics Data System (ADS)

    Holman, Gordon D.

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  7. Theory of Chemical Modeling

    NASA Astrophysics Data System (ADS)

    Kühn, Michael

    In order to deal with the complexity of natural systems, simplified models are employed to illustrate the principal and regulatory factors controlling a chemical system. Following the aphorism of Albert Einstein that everything should be made as simple as possible, but not simpler, models need not be completely realistic to be useful (Stumm and Morgan 1996), but need to strike a successful balance between realism and practicality. Properly constructed, a model is neither so simplified that it is unrealistic nor so detailed that it cannot be readily evaluated and applied to the problem of interest (Bethke 1996). The results of a model have to be at least partially observable or experimentally verifiable (Zhu and Anderson 2002). Geochemical modeling theories are presented here in a sequence of increasing complexity, from geochemical equilibrium models to kinetic, reaction path, and finally coupled transport and reaction models. The description is far from complete but provides what is needed to set up reactive transport models of hydrothermal systems, as done in subsequent chapters. Extensive reviews of geochemical models in general can be found in the literature (Appelo and Postma 1999, Bethke 1996, Melchior and Bassett 1990, Nordstrom and Ball 1984, Paschke and van der Heijde 1996).
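The simplest member of that sequence, the equilibrium model, can be illustrated with mineral solubility from a solubility product. The Ksp below is an approximate literature value for calcite at 25 °C, and activity corrections are deliberately ignored in this sketch.

```python
import math

# Minimal geochemical equilibrium model: calcite solubility from Ksp,
# assuming congruent dissolution ([Ca2+] = [CO3 2-]) and unit activity
# coefficients. Ksp is an approximate literature value.
ksp = 10.0 ** -8.48          # [Ca2+][CO3 2-] at equilibrium, calcite, ~25 C
ca = math.sqrt(ksp)          # mol/L of dissolved Ca2+
print(f"{ca:.2e}")           # → 5.75e-05
```

Kinetic, reaction path, and coupled transport models build on exactly this kind of mass-action relation, adding rate laws, progress variables, and advection-dispersion terms respectively.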

  8. CME Theory and Models

    NASA Astrophysics Data System (ADS)

    Forbes, T. G.; Linker, J. A.; Chen, J.; Cid, C.; Kóta, J.; Lee, M. A.; Mann, G.; Mikić, Z.; Potgieter, M. S.; Schmidt, J. M.; Siscoe, G. L.; Vainio, R.; Antiochos, S. K.; Riley, P.

    This chapter provides an overview of current efforts in the theory and modeling of CMEs. Five key areas are discussed: (1) CME initiation; (2) CME evolution and propagation; (3) the structure of interplanetary CMEs derived from flux rope modeling; (4) CME shock formation in the inner corona; and (5) particle acceleration and transport at CME driven shocks. In the section on CME initiation three contemporary models are highlighted. Two of these focus on how energy stored in the coronal magnetic field can be released violently to drive CMEs. The third model assumes that CMEs can be directly driven by currents from below the photosphere. CMEs evolve considerably as they expand from the magnetically dominated lower corona into the advectively dominated solar wind. The section on evolution and propagation presents two approaches to the problem. One is primarily analytical and focuses on the key physical processes involved. The other is primarily numerical and illustrates the complexity of possible interactions between the CME and the ambient medium. The section on flux rope fitting reviews the accuracy and reliability of various methods. The section on shock formation considers the effect of the rapid decrease in the magnetic field and plasma density with height. Finally, in the section on particle acceleration and transport, some recent developments in the theory of diffusive particle acceleration at CME shocks are discussed. These include efforts to combine self-consistently the process of particle acceleration in the vicinity of the shock with the subsequent escape and transport of particles to distant regions.

  9. CME Theory and Models

    NASA Astrophysics Data System (ADS)

    Forbes, T. G.; Linker, J. A.; Chen, J.; Cid, C.; Kóta, J.; Lee, M. A.; Mann, G.; Mikić, Z.; Potgieter, M. S.; Schmidt, J. M.; Siscoe, G. L.; Vainio, R.; Antiochos, S. K.; Riley, P.

    2006-03-01

    This chapter provides an overview of current efforts in the theory and modeling of CMEs. Five key areas are discussed: (1) CME initiation; (2) CME evolution and propagation; (3) the structure of interplanetary CMEs derived from flux rope modeling; (4) CME shock formation in the inner corona; and (5) particle acceleration and transport at CME driven shocks. In the section on CME initiation three contemporary models are highlighted. Two of these focus on how energy stored in the coronal magnetic field can be released violently to drive CMEs. The third model assumes that CMEs can be directly driven by currents from below the photosphere. CMEs evolve considerably as they expand from the magnetically dominated lower corona into the advectively dominated solar wind. The section on evolution and propagation presents two approaches to the problem. One is primarily analytical and focuses on the key physical processes involved. The other is primarily numerical and illustrates the complexity of possible interactions between the CME and the ambient medium. The section on flux rope fitting reviews the accuracy and reliability of various methods. The section on shock formation considers the effect of the rapid decrease in the magnetic field and plasma density with height. Finally, in the section on particle acceleration and transport, some recent developments in the theory of diffusive particle acceleration at CME shocks are discussed. These include efforts to combine self-consistently the process of particle acceleration in the vicinity of the shock with the subsequent escape and transport of particles to distant regions.

  10. The NATA code; theory and analysis. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Bade, W. L.; Yos, J. M.

    1975-01-01

    The NATA code is a computer program for calculating quasi-one-dimensional gas flow in axisymmetric nozzles and rectangular channels, primarily to describe conditions in electric arc-heated wind tunnels. The program provides solutions based on frozen chemistry, chemical equilibrium, and nonequilibrium flow with finite reaction rates. The shear and heat flux on the nozzle wall are calculated and boundary layer displacement effects on the inviscid flow are taken into account. The program contains compiled-in thermochemical, chemical kinetic and transport cross section data for high-temperature air, CO2-N2-Ar mixtures, helium, and argon. It calculates stagnation conditions on axisymmetric or two-dimensional models and conditions on the flat surface of a blunt wedge. Included in the report are: definitions of the inputs and outputs; precoded data on gas models, reactions, thermodynamic and transport properties of species, and nozzle geometries; explanations of diagnostic outputs and code abort conditions; test problems; and a user's manual for an auxiliary program (NOZFIT) used to set up analytical curve fits to nozzle profiles.
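In the frozen-chemistry, inviscid limit, the quasi-one-dimensional nozzle flow that codes like NATA solve reduces to the classical isentropic area-Mach relation. The gamma and Mach values below are textbook examples, not NATA inputs.

```python
# Isentropic quasi-1D area-Mach relation, the frozen-flow limit of a
# nozzle-flow calculation; gamma = 1.4 is an example (calorically perfect air).

def area_ratio(M, gamma=1.4):
    """A/A* (local area over throat area) at Mach number M."""
    term = (2.0 / (gamma + 1.0)) * (1.0 + (gamma - 1.0) / 2.0 * M ** 2)
    return (1.0 / M) * term ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))

print(f"{area_ratio(2.0):.4f}")  # → 1.6875: Mach 2 needs a diverging section
```

The full code couples this gas dynamics to finite-rate chemistry and boundary-layer corrections, which is where the tabulated thermochemical and transport data come in.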

  11. Stimulation model for lenticular sands: Volume 2, Users manual

    SciTech Connect

    Rybicki, E.F.; Luiskutty, C.T.; Sutrick, J.S.; Palmer, I.D.; Shah, G.H.; Tomutsa, L.

    1987-07-01

    This User's Manual contains information for four fracture/proppant models. TUPROP1 contains a Geertsma and de Klerk type fracture model. The section of the program utilizing the proppant fracture geometry data from the pseudo three-dimensional highly elongated fracture model is called TUPROPC. The analogous proppant section of the program that was modified to accept fracture shape data from SA3DFRAC is called TUPROPS. TUPROPS also includes fracture closure. Finally there is the penny fracture and its proppant model, PENNPROP. In the first three chapters, the proppant sections are based on the same theory for determining the proppant distribution but have modifications to support variable height fractures and modifications to accept fracture geometry from three different fracture models. Thus, information about each proppant model in the User's Manual builds on information supplied in the previous chapter. The exception to the development of combined treatment models is the penny fracture and its proppant model. In this case, a completely new proppant model was developed. A description of how to use the combined treatment model for the penny fracture is contained in Chapter 4. 2 refs.

  12. Modeling users' activity on Twitter networks: validation of Dunbar's number

    NASA Astrophysics Data System (ADS)

    Goncalves, Bruno; Perra, Nicola; Vespignani, Alessandro

    2012-02-01

    Microblogging and mobile devices appear to augment human social capabilities, raising the question of whether they remove cognitive or biological constraints on human communication. In this paper we analyze a dataset of Twitter conversations collected across six months involving 1.7 million individuals and test the theoretical cognitive limit on the number of stable social relationships known as Dunbar's number. We find that the data are in agreement with Dunbar's result; users can entertain a maximum of 100-200 stable relationships. Thus, the ``economy of attention'' is limited in the online world by cognitive and biological constraints, as predicted by Dunbar's theory. We propose a simple model for users' behavior, including finite priority queuing and limited time resources, that reproduces the observed social behavior.
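
    The finite-priority-queue idea in this abstract can be illustrated with a toy agent simulation (a sketch under assumed dynamics, not the authors' actual model): the agent has finite attention, allocates messages preferentially to its strongest ties, and can therefore maintain only a bounded number of relationships.

```python
import random
from collections import Counter

def simulate_agent(n_events=10000, capacity=150, new_contact_p=0.1, seed=42):
    """Toy finite-capacity communication model: the agent keeps at most
    `capacity` ties and preferentially messages its strongest ones."""
    rng = random.Random(seed)
    strengths = Counter()          # contact id -> interaction count (tie strength)
    next_id = 0
    for _ in range(n_events):
        if not strengths or rng.random() < new_contact_p:
            target = next_id       # start a new relationship
            next_id += 1
        else:
            contacts = list(strengths)
            weights = [strengths[c] for c in contacts]
            target = rng.choices(contacts, weights=weights)[0]  # attention ~ tie strength
        strengths[target] += 1
        if len(strengths) > capacity:                # finite cognitive capacity:
            weakest = min(strengths, key=strengths.get)
            del strengths[weakest]                   # the weakest tie decays away

    return strengths

ties = simulate_agent()
print(len(ties))   # bounded by the capacity parameter
```

However many people the agent encounters, the number of maintained relationships saturates at the capacity, which is the qualitative behavior the paper attributes to Dunbar-like constraints.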

  13. Cognitive Modeling of Video Game Player User Experience

    NASA Technical Reports Server (NTRS)

    Bohil, Corey J.; Biocca, Frank A.

    2010-01-01

    This paper argues for the use of cognitive modeling to gain a detailed and dynamic look into user experience during game play. Applying cognitive models to game play data can help researchers understand a player's attentional focus, memory status, learning state, and decision strategies (among other things) as these cognitive processes occur throughout game play. This is in stark contrast to the common approach of trying to assess the long-term impact of games on cognitive functioning after game play has ended. We describe what cognitive models are, what they can be used for, and how game researchers could benefit by adopting these methods. We also provide details of a single model - based on decision field theory - that has been successfully applied to data sets from memory, perception, and decision making experiments, and has recently found application in real world scenarios. We examine possibilities for applying this model to game-play data.

  14. Composing user models through logic analysis.

    PubMed

    Bergeron, B P; Shiffman, R N; Rouse, R L; Greenes, R A

    1991-01-01

    The evaluation of tutorial strategies, interface designs, and courseware content is an area of active research in the medical education community. Many of the evaluation techniques that have been developed (e.g., program instrumentation) commonly produce data that are difficult to decipher or to interpret effectively. We have explored the use of decision tables to automatically simplify and categorize data for the composition of user models--descriptions of students' learning styles and preferences. An approach to user modeling that is based on decision tables has numerous advantages compared with traditional manual techniques or methods that rely on rule-based expert systems or neural networks. Decision tables provide a mechanism whereby overwhelming quantities of data can be condensed into an easily interpreted and manipulated form. Compared with conventional rule-based expert systems, decision tables are more amenable to modification. Unlike classification systems based on neural networks, the entries in decision tables are readily available for inspection and manipulation. Decision tables, as descriptions of observed behavior, also provide automatic checks for ambiguity in the tracking data. PMID:1807690
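
    A minimal sketch of the decision-table idea (the condition names and style categories below are invented for illustration) shows why such tables are easy to inspect and modify compared with neural-network weights or large rule bases:

```python
# Hypothetical decision table mapping observed tracking behaviour to a
# learning-style category. Every rule is a plain, inspectable table entry.
RULES = {
    # (asked_for_hints, replayed_sections, answered_quickly) -> learning style
    (True,  True,  False): "reflective",
    (True,  False, False): "guided",
    (False, True,  True):  "explorer",
    (False, False, True):  "impulsive",
}

def classify(asked_for_hints, replayed_sections, answered_quickly):
    """Look the observed behaviour up in the decision table; condition
    combinations not covered by any rule are flagged rather than guessed."""
    return RULES.get((asked_for_hints, replayed_sections, answered_quickly),
                     "unclassified")

print(classify(True, True, False))    # -> reflective
print(classify(False, False, False))  # -> unclassified
```

Modifying the user model amounts to editing table rows, and because the conditions are explicit, gaps and ambiguities in the tracking data can be detected mechanically.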

  15. ModelMate - A graphical user interface for model analysis

    USGS Publications Warehouse

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  16. Predicting radiative heat transfer in thermochemical nonequilibrium flow fields. Theory and user's manual for the LORAN code

    NASA Technical Reports Server (NTRS)

    Chambers, Lin Hartung

    1994-01-01

    The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.

  17. The capillary hysteresis model HYSTR: User's guide

    SciTech Connect

    Niemi, A.; Bodvarsson, G.S.

    1991-11-01

    The potential disposal of nuclear waste in the unsaturated zone at Yucca Mountain, Nevada, has generated increased interest in the study of fluid flow through unsaturated media. In the near future, large-scale field tests will be conducted at the Yucca Mountain site, and work is now being done to design and analyze these tests. As part of these efforts a capillary hysteresis model has been developed. A computer program to calculate the hysteretic relationship between capillary pressure (φ) and liquid saturation (S1) has been written that is designed to be easily incorporated into any numerical unsaturated flow simulator that computes capillary pressure as a function of liquid saturation. This report gives a detailed description of the model along with information on how it can be interfaced with a transport code. Although the model was developed specifically for calculations related to nuclear waste disposal, it should be applicable to any capillary hysteresis problem for which the secondary and higher order scanning curves can be approximated from the first order scanning curves. HYSTR is a set of subroutines to calculate capillary pressure for a given liquid saturation under hysteretic conditions.

  18. Designing with users to meet people needs: a teaching model.

    PubMed

    Anselmi, Laura; Canina, Marita; Coccioni, Elisabetta

    2012-01-01

    In a context of great transformation of the whole company-product-market system, design becomes an interpreter of society and a strategic key point for production realities. Design must assume an ergonomic approach and a methodology oriented to product innovation in which people are the main focus of the system. Today there is a visible need for a methodological approach able to include the context of use by employing users' "creative skills". In this scenario, a design educational model based only on knowledge does not seem sufficient; the traditional "deductive" method does not meet the needs of new productive assets, hence the urgency to experiment with the "inductive" method and to develop an approach in which knowing and knowing how, theory and practice, act synergistically. The aim is to teach a method able to help a young designer understand people's needs and desires, considering both the concrete/cognitive level and the emotional level. The paper presents, through some case studies, an educational model developed by combining theoretical/conceptual and practical/applicatory aspects with user experiential aspects. The proposed approach to design enables students to investigate users' needs and desires and helps them propose innovative ideas and projects that better fit today's market realities. PMID:22316848

  19. Videogrammetric Model Deformation Measurement System User's Manual

    NASA Technical Reports Server (NTRS)

    Dismond, Harriett R.

    2002-01-01

    The purpose of this manual is to provide the user of the NASA VMD system, running the MDef software, Version 1.10, all information required to operate the system. The NASA Videogrammetric Model Deformation system consists of an automated videogrammetric technique used to measure the change in wing twist and bending under aerodynamic load in a wind tunnel. The basic instrumentation consists of a single CCD video camera and a frame grabber interfaced to a computer. The technique is based upon a single-view photogrammetric determination of two-dimensional coordinates of wing targets with a fixed (and known) third coordinate, namely the span-wise location. The major consideration in the development of the measurement system was that productivity must not be appreciably reduced.
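
    The single-view trick described above, recovering two coordinates from one camera when the third (span-wise) coordinate is known, reduces to a small linear solve of the collinearity equations. The camera matrix and target point below are invented purely to check the sketch:

```python
import numpy as np

def recover_xz(P, uv, Y):
    """Given a 3x4 camera matrix P, a measured image point (u, v), and the
    known span-wise coordinate Y, solve the two collinearity equations
    (P[i] - val * P[2]) . [X, Y, Z, 1] = 0 for the unknowns X and Z."""
    rows, rhs = [], []
    for coeff, val in ((P[0], uv[0]), (P[1], uv[1])):
        a = coeff - val * P[2]                 # one linear equation in (X, Z)
        rows.append([a[0], a[2]])
        rhs.append(-(a[1] * Y + a[3]))
    return np.linalg.solve(np.array(rows), np.array(rhs))

# Synthetic check with an assumed camera (identity intrinsics, small offset):
P = np.hstack([np.eye(3), np.array([[0.1], [0.2], [2.0]])])
target = np.array([0.3, 0.5, 0.7, 1.0])        # true (X, Y, Z, 1) of a wing target
proj = P @ target
uv = proj[:2] / proj[2]                        # simulated image coordinates
print(recover_xz(P, uv, Y=0.5))                # X ~ 0.3, Z ~ 0.7
```

With the span-wise Y fixed, the two image equations exactly determine the remaining two coordinates, which is why a single camera suffices for this measurement.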

  20. Multiple Concentric Cylinder Model (MCCM) user's guide

    NASA Technical Reports Server (NTRS)

    Williams, Todd O.; Pindera, Marek-Jerzy

    1994-01-01

    A user's guide for the computer program mccm.f is presented. The program is based on a recently developed solution methodology for the inelastic response of an arbitrarily layered, concentric cylinder assemblage under thermomechanical loading which is used to model the axisymmetric behavior of unidirectional metal matrix composites in the presence of various microstructural details. These details include the layered morphology of certain types of ceramic fibers, as well as multiple fiber/matrix interfacial layers recently proposed as a means of reducing fabrication-induced, and in-service, residual stress. The computer code allows efficient characterization and evaluation of new fibers and/or new coating systems on existing fibers with a minimum of effort, taking into account inelastic and temperature-dependent properties and different morphologies of the fiber and the interfacial region. It also facilitates efficient design of engineered interfaces for unidirectional metal matrix composites.

  1. E_I model user's guide

    SciTech Connect

    Engelmeyer, D.

    1994-02-14

    The E_I model and this program were developed to assist the Office of Munitions (OM) in planning and coordination of conventional munitions programs at the macro level. OM's primary responsibilities include (1) development of an overall munitions acquisition strategy and (2) oversight of all DoD programs for conventional munitions Research and Development (R&D) and Procurement, as well as existing munitions inventories. In this role, OM faces the challenge of integrating Service budgets and priorities within the framework of overall DoD policy and objectives. OM must have a firm, quantitative basis for making acquisition decisions and stockpile disposition recommendations. To do this, OM needs a rigorous but simple means for conducting top-level analyses of the overall conventional munitions program. This analysis must be founded on a consistent, quantitative process that allows for an assessment of the existing program, as well as the capability to compare and contrast alternative resource allocation recommendations. The E_I model provides a means for quickly conducting a multitude of assessments across target classes, contingency areas, and different planning scenarios. It is neither data intensive nor is it a complex combat simulation. Its goal is to allow for rapid tradeoff analyses of competing munitions alternatives, relative to acquisition and stockpile mix.

  2. Theory, Modeling and Simulation Annual Report 2000

    SciTech Connect

    Dixon, David A; Garrett, Bruce C; Straatsma, TP; Jones, Donald R; Studham, Scott; Harrison, Robert J; Nichols, Jeffrey A

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  4. SubDyn User's Guide and Theory Manual

    SciTech Connect

    Damiani, Rick; Jonkman, Jason; Hayman, Greg

    2015-09-01

    SubDyn is a time-domain structural-dynamics module for multimember fixed-bottom substructures created by the National Renewable Energy Laboratory (NREL) through U.S. Department of Energy Wind and Water Power Program support. The module has been coupled into the FAST aero-hydro-servo-elastic computer-aided engineering (CAE) tool. Substructure types supported by SubDyn include monopiles, tripods, jackets, and other lattice-type substructures common for offshore wind installations in shallow and transitional water depths. SubDyn can also be used to model lattice support structures for land-based wind turbines. This document is organized as follows. Section 1 details how to obtain the SubDyn and FAST software archives and run both the stand-alone SubDyn or SubDyn coupled to FAST. Section 2 describes the SubDyn input files. Section 3 discusses the output files generated by SubDyn; these include echo files, a summary file, and the results file. Section 4 provides modeling guidance when using SubDyn. The SubDyn theory is covered in Section 5. Section 6 outlines future work, and Section 7 contains a list of references. Example input files are shown in Appendixes A and B. A summary of available output channels is found in Appendix C. Instructions for compiling the stand-alone SubDyn program are detailed in Appendix D. Appendix E tracks the major changes we have made to SubDyn for each public release.

  5. The Personalized Information Retrieval Model Based on User Interest

    NASA Astrophysics Data System (ADS)

    Gong, Songjie

    Personalized information retrieval systems can help customers gain orientation amid information overload by determining which items are relevant to their interests. One type of information retrieval is content-based filtering. In content-based filtering, items contain words in natural language. Meanings of words in natural language are often ambiguous, and the problem of word-meaning disambiguation is often reduced to determining the semantic similarity of words. In this paper, the architecture of personalized information retrieval based on user interest is presented. The architecture includes a user interface model, a user interest model, an interest detection model, and an update model. It establishes a user model for personalized information retrieval based on a user-interest keyword list held on the client side, which can supply a personalized information retrieval service to the user through the communication and collaboration of all modules of the architecture.
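
    A minimal sketch of the content-based-filtering step (the keyword list and items below are invented for illustration), using bag-of-words cosine similarity as a simple proxy for the semantic similarity the abstract mentions:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Bag-of-words cosine similarity, a common first approximation to
    semantic similarity in content-based filtering."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Rank candidate items against the user-interest keyword list:
interests = "neural networks deep learning"
items = ["deep learning for vision", "medieval history survey"]
ranked = sorted(items, key=lambda d: cosine_similarity(interests, d), reverse=True)
print(ranked[0])   # -> deep learning for vision
```

A real system would replace raw word overlap with a disambiguated semantic measure, but the ranking pipeline (score each item against the interest profile, sort) stays the same.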

  6. USER'S GUIDE FOR THE PHOTOCHEMICAL BOX MODEL (PBM)

    EPA Science Inventory

    The User's Guide for the Photochemical Box Model (PBM) attempts to describe the structure and operation of the model and its preprocessors as well as provide the potential user with guidance in setting up input data. The PBM is a simple stationary single-cell model with a variabl...

  7. Evaluation Theory, Models, and Applications

    ERIC Educational Resources Information Center

    Stufflebeam, Daniel L.; Shinkfield, Anthony J.

    2007-01-01

    "Evaluation Theory, Models, and Applications" is designed for evaluators and students who need to develop a commanding knowledge of the evaluation field: its history, theory and standards, models and approaches, procedures, and inclusion of personnel as well as program evaluation. This important book shows how to choose from a growing array of…

  8. Towards a Ubiquitous User Model for Profile Sharing and Reuse

    PubMed Central

    de Lourdes Martinez-Villaseñor, Maria; Gonzalez-Mendoza, Miguel; Hernandez-Gress, Neil

    2012-01-01

    People interact with systems and applications through several devices and are willing to share information about preferences, interests and characteristics. Social networking profiles, data from advanced sensors attached to personal gadgets, and semantic web technologies such as FOAF and microformats are valuable sources of personal information that could provide a fair understanding of the user, but profile information is scattered over different user models. Some researchers in the ubiquitous user modeling community envision the need to share user model information from heterogeneous sources. In this paper, we address the syntactic and semantic heterogeneity of user models in order to enable user modeling interoperability. We present a dynamic user profile structure based on the Simple Knowledge Organization System (SKOS) to provide knowledge representation for a ubiquitous user model. We propose a two-tier matching strategy for concept schema alignment to enable user modeling interoperability. Our proposal is demonstrated in the application scenario of sharing and reusing data in order to deal with overweight and obesity. PMID:23201995

  9. Macro System Model (MSM) User Guide, Version 1.3

    SciTech Connect

    Ruth, M.; Diakov, V.; Sa, T.; Goldsby, M.

    2011-09-01

    This user guide describes the macro system model (MSM). The MSM has been designed to allow users to analyze the financial, environmental, transitional, geographical, and R&D issues associated with the transition to a hydrogen economy. Basic end users can use the MSM to answer cross-cutting questions that were previously difficult to answer in a consistent and timely manner due to various assumptions and methodologies among different models.

  10. JEDI Marine and Hydrokinetic Model: User Reference Guide

    SciTech Connect

    Goldberg, M.; Previsic, M.

    2011-04-01

    The Jobs and Economic Development Impact Model (JEDI) for Marine and Hydrokinetics (MHK) is a user-friendly spreadsheet-based tool designed to demonstrate the economic impacts associated with developing and operating MHK power systems in the United States. The JEDI MHK User Reference Guide was developed to assist users in using and understanding the model. This guide provides information on the model's underlying methodology, as well as the sources and parameters used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted.

  11. Flexible dynamic models for user interfaces

    NASA Astrophysics Data System (ADS)

    Vogelsang, Holger; Brinkschulte, Uwe; Siormanolakis, Marios

    1997-04-01

    This paper describes an approach for a platform- and implementation-independent design of user interfaces using the UIMS idea. It is a result of a detailed examination of object-oriented techniques for program specification and implementation. This analysis leads to a description of the requirements for man-machine interaction from the software developer's point of view. The final user of the whole system, on the other hand, has a different view of it: he needs metaphors from his own world to fulfill his tasks. It is the job of the user interface designer to bring these views together. The approach described in this paper helps bring both kinds of developers together, using a well-defined interface with minimal communication overhead.

  12. The 3DGRAPE book: Theory, users' manual, examples

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.

    1989-01-01

    A users' manual for a new three-dimensional grid generator called 3DGRAPE is presented. The program, written in FORTRAN, is capable of making zonal (blocked) computational grids in or about almost any shape. Grids are generated by the solution of Poisson's differential equations in three dimensions. The program automatically finds its own values for inhomogeneous terms which give near-orthogonality and controlled grid cell height at boundaries. Grids generated by 3DGRAPE have been applied to both viscous and inviscid aerodynamic problems, and to problems in other fluid-dynamic areas. The smoothness for which elliptic methods are known is seen here, including smoothness across zonal boundaries. An introduction giving the history, motivation, capabilities, and philosophy of 3DGRAPE is presented first. Then follows a chapter on the program itself. The input is then described in detail. A chapter on reading the output and debugging follows. Three examples are then described, including sample input data and plots of output. Last is a chapter on the theoretical development of the method.
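
    The elliptic idea behind 3DGRAPE can be illustrated in two dimensions. This is a much-simplified sketch, homogeneous Laplace equations relaxed by point-Jacobi iteration on an invented boundary, whereas 3DGRAPE solves 3D Poisson equations with automatically determined inhomogeneous (forcing) terms:

```python
import numpy as np

n = 21
x, y = np.zeros((n, n)), np.zeros((n, n))
s = np.linspace(0.0, 1.0, n)

# Fixed boundaries of a single zone: a unit square with a bowed bottom edge.
x[0, :],  y[0, :]  = s, 0.25 * np.sin(np.pi * s)    # bottom (curved wall)
x[-1, :], y[-1, :] = s, 1.0                         # top
x[:, 0],  y[:, 0]  = 0.0, np.linspace(0.0, 1.0, n)  # left
x[:, -1], y[:, -1] = 1.0, np.linspace(0.0, 1.0, n)  # right

# Point-Jacobi relaxation of Laplace's equation for the interior grid nodes;
# the converged grid is smooth, the hallmark of elliptic generators.
for _ in range(2000):
    x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] + x[1:-1, 2:] + x[1:-1, :-2])
    y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] + y[1:-1, 2:] + y[1:-1, :-2])

print(x[n // 2, n // 2], y[n // 2, n // 2])   # a smooth interior grid point
```

The Poisson forcing terms that 3DGRAPE finds automatically would enter as source terms on the right-hand side of the relaxation, pulling grid lines toward orthogonality and a prescribed cell height at the boundaries.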

  13. Modeling User Behavior and Attention in Search

    ERIC Educational Resources Information Center

    Huang, Jeff

    2013-01-01

    In Web search, query and click log data are easy to collect but they fail to capture user behaviors that do not lead to clicks. As search engines reach the limits inherent in click data and are hungry for more data in a competitive environment, mining cursor movements, hovering, and scrolling becomes important. This dissertation investigates how…

  14. A Computational Theory of Modelling

    NASA Astrophysics Data System (ADS)

    Rossberg, Axel G.

    2003-04-01

    A metatheory is developed which characterizes the relationship between a modelled system, which complies with some ``basic theory'', and a model, which does not, and yet reproduces important aspects of the modelled system. A model is represented by an (in a certain sense, s.b.) optimal algorithm which generates data that describe the model's state or evolution complying with a ``reduced theory''. Theories are represented by classes of (in a similar sense, s.b.) optimal algorithms that test if their input data comply with the theory. The metatheory does not prescribe the formalisms (data structure, language) to be used for the description of states or evolutions. Transitions to other formalisms and loss of accuracy, common to theory reduction, are explicitly accounted for. The basic assumption of the theory is that resources such as the code length (~ programming time) and the computation time for modelling and testing are costly, but the relative cost of each resource is unknown. Thus, if there is an algorithm a for which there is no other algorithm b solving the same problem but using less of each resource, then a is considered optimal. For tests (theories), the set X of wrongly admitted inputs is treated as another resource. It is assumed that X1 is cheaper than X2 when X1 ⊂ X2 (X1 ≠ X2). Depending on the problem, the algorithmic complexity of a reduced theory can be smaller or larger than that of the basic theory. The theory might help to distinguish actual properties of complex systems from mere mental constructs. An application to complex spatio-temporal patterns is discussed.

  15. Users matter : multi-agent systems model of high performance computing cluster users.

    SciTech Connect

    North, M. J.; Hood, C. S.; Decision and Information Sciences; IIT

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.

  16. Users/consumers differences regarding ergonomics and design theory and practice.

    PubMed

    Dejean, Pierre-Henri; Wagstaff, Peter

    2012-01-01

    This paper presents the concept of direct and indirect users, a key issue for cooperation among the ergonomists, designers and managers involved in a sustainable approach to design. What issues for Ergonomics and Design are raised by this concept? User/consumer differences should be approached taking into account Ergonomics and Design theory and practice. What dialogue and tools could help the ergonomist/designer/manager respond to all the requirements of the future clients of the product? PMID:22317276

  17. Characterizing Drug Non-Users as Distinctive in Prevention Messages: Implications of Optimal Distinctiveness Theory

    PubMed Central

    Comello, Maria Leonora G.

    2011-01-01

    Optimal Distinctiveness Theory posits that highly valued groups are those that can simultaneously satisfy needs to belong and to be different. The success of drug-prevention messages with a social-identity theme should therefore depend on the extent to which the group is portrayed as capable of meeting these needs. Specifically, messages that portray non-users as a large and undifferentiated majority may not be as successful as messages that emphasize uniqueness of non-users. This prediction was examined using marijuana prevention messages that depicted non-users as a distinctive or a majority group. Distinctiveness characterization lowered behavioral willingness to use marijuana among non-users (Experiment 1) and served as a source of identity threat (contingent on gender) among users (Experiment 2). PMID:21409672

  18. Treatment motivation in drug users: a theory-based analysis.

    PubMed

    Longshore, Douglas; Teruya, Cheryl

    2006-02-01

    Motivation for drug use treatment is widely regarded as crucial to a client's engagement in treatment and success in quitting drug use. Motivation is typically measured with items reflecting high treatment readiness (e.g., perceived need for treatment and commitment to participate) and low treatment resistance (e.g., skepticism regarding benefits of treatment). Building upon reactance theory and the psychotherapeutic construct of resistance, we conceptualized these two aspects of treatment motivation - readiness and resistance - as distinct constructs and examined their predictive power in a sample of 1295 drug-using offenders referred to treatment while on probation. The sample was 60.7% African Americans, 33.5% non-Hispanic Whites, and 21.2% women; their ages ranged from 16 to 63 years old. Interviews occurred at treatment entry and 6 months later. Readiness (but not resistance) predicted treatment retention during the 6-month period. Resistance (but not readiness) predicted drug use, especially among offenders for whom the treatment referral was coercive. These findings suggest that readiness and resistance should both be assessed among clients entering treatment, especially when the referral is coercive. Intake and counseling protocols should address readiness and resistance separately. PMID:16051447

  19. Artificial intelligence techniques for modeling database user behavior

    NASA Technical Reports Server (NTRS)

    Tanner, Steve; Graves, Sara J.

    1990-01-01

    The design and development of the adaptive modeling system is described. This system models how a user accesses a relational database management system in order to improve its performance by discovering use access patterns. In the current system, these patterns are used to improve the user interface and may be used to speed data retrieval, support query optimization and support a more flexible data representation. The system models both syntactic and semantic information about the user's access and employs both procedural and rule-based logic to manipulate the model.

  20. Jieke theory and logistic model

    SciTech Connect

    Cao, H.; Feng, G.

    1996-06-01

    The concept of a shell, or jieke (in Chinese), is introduced first; a jieke is a sort of system boundary. Building on jieke theory, a new logistic model that takes account of the switch effect of the jieke is suggested. The model is analyzed and a nonlinear mapping of the model is constructed. The results show that the behavior of the switch logistic model differs greatly from that of the original logistic model.
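
    One way to illustrate a "switch effect" on the logistic model (a hypothetical formulation for illustration; the paper's actual equations are not given in the abstract) is to let the growth parameter flip when the state crosses a boundary:

```python
def switch_logistic(x0, r_low, r_high, threshold, n):
    """Logistic map whose growth parameter switches between two values when
    the state crosses a threshold -- a crude stand-in for a boundary
    ('jieke') switch effect."""
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        r = r_high if x < threshold else r_low   # crossing the boundary flips the regime
        xs.append(r * x * (1.0 - x))
    return xs

traj = switch_logistic(0.2, r_low=2.5, r_high=3.9, threshold=0.5, n=100)
print(min(traj), max(traj))   # the trajectory stays inside the unit interval
```

Even this toy version behaves quite differently from the plain logistic map x -> r x (1 - x) with a single fixed r, which is the qualitative point the abstract makes.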

  1. A Driving Behaviour Model of Electrical Wheelchair Users

    PubMed Central

    Hamam, Y.; Djouani, K.; Daachi, B.; Steyn, N.

    2016-01-01

    In spite of the availability of powered wheelchairs, some users still experience steering challenges and manoeuvring difficulties that limit their capacity to navigate effectively. For such users, steering support and assistive systems may be very necessary, and for the assistance to be appreciated, the assistive control must adapt to the user's steering behaviour. This paper contributes to wheelchair steering improvement by modelling the steering behaviour of powered wheelchair users, for integration into the control system. More precisely, the modelling is based on the improved Directed Potential Field (DPF) method for trajectory planning. The method has facilitated the formulation of a simple behaviour model that is also linear in parameters. To obtain the steering data for parameter identification, seven individuals participated in driving the wheelchair in different virtual worlds on the augmented platform. The obtained data facilitated the estimation of user parameters, using the ordinary least squares method, with satisfactory regression analysis results. PMID:27148362
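
    Because the behaviour model is linear in its parameters, the identification step the abstract describes can be sketched with ordinary least squares on synthetic data (the feature names, parameter values, and noise level below are invented for illustration):

```python
import numpy as np

# Synthetic stand-in for logged driving data: y = X @ theta + noise.
rng = np.random.default_rng(0)
theta_true = np.array([0.8, -0.3, 0.1])            # hypothetical user parameters
X = rng.normal(size=(200, 3))                      # e.g. heading error, obstacle proximity, speed
y = X @ theta_true + 0.01 * rng.normal(size=200)   # observed steering command

# Ordinary least squares: theta_hat = argmin ||X @ theta - y||^2
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(theta_hat)   # close to theta_true
```

With the user parameters estimated this way, the fitted model can be evaluated against held-out driving data before being wired into the assistive controller.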

  2. Quantify uncertain emergency search techniques (QUEST) -- Theory and user's guide

    SciTech Connect

    Johnson, M.M.; Goldsby, M.E.; Plantenga, T.D.; Porter, T.L.; West, T.H.; Wilcox, W.B.; Hensley, W.K.

    1998-01-01

    As recent world events show, criminal and terrorist access to nuclear materials is a growing national concern. The national laboratories are taking the lead in developing technologies to counter these potential threats to the national security. Sandia National Laboratories, with support from Pacific Northwest National Laboratory and the Bechtel Nevada Remote Sensing Laboratory, has developed QUEST (a model to Quantify Uncertain Emergency Search Techniques) to enhance the performance of organizations in the search for lost or stolen nuclear material. In addition, QUEST supports a wide range of other applications, such as environmental monitoring, nuclear facilities inspections, and searcher training. QUEST simulates the search for nuclear materials and calculates detector response for various source types and locations. The probability of detecting a radioactive source during a search is a function of many different variables, including source type, search location and structure geometry (including shielding), search dynamics (path and speed), and detector type and size. Through calculation of dynamic detector response, QUEST makes possible quantitative comparisons of various sensor technologies and search patterns. The QUEST model can be used as a tool to examine the impact of new detector technologies, explore alternative search concepts, and provide interactive search/inspector training.

  3. How Homeless Sector Workers Deal with the Death of Service Users: A Grounded Theory Study

    ERIC Educational Resources Information Center

    Lakeman, Richard

    2011-01-01

    Homeless sector workers often encounter the deaths of service users. A modified grounded theory methodology project was used to explore how workers make sense of, respond to, and cope with sudden death. In-depth interviews were undertaken with 16 paid homeless sector workers who had experienced the death of someone with whom they worked.…

  4. Randomized Item Response Theory Models

    ERIC Educational Resources Information Center

    Fox, Jean-Paul

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by an item response theory (IRT) model. The RR…
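    As a sketch of how the two pieces fit together, here is a forced-response RR design layered on a Rasch (one-parameter IRT) model. The design probabilities are illustrative assumptions, not the paper's specification.

```python
import math

def rasch(theta, b):
    # IRT (Rasch) probability that the true latent response is "yes"
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def rr_yes_prob(theta, b, p_truth=0.8, p_forced_yes=0.5):
    # Forced-response RR design (illustrative parameters): with probability
    # p_truth the respondent answers truthfully; otherwise a randomizing
    # device forces a "yes" with probability p_forced_yes.
    return p_truth * rasch(theta, b) + (1.0 - p_truth) * p_forced_yes

# The randomizer masks individual answers but compresses the observed
# probabilities toward the forced-response rate.
print(round(rr_yes_prob(2.0, 0.0), 3), round(rasch(2.0, 0.0), 3))
```

    Estimating the IRT parameters then requires inverting this known masking, which is what motivates modeling the true-response probability inside the RR design.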

  5. Involving service users in interprofessional education narrowing the gap between theory and practice.

    PubMed

    Cooper, Helen; Spencer-Dawe, Eileen

    2006-12-01

    Calls for greater collaboration between professionals in health and social care have led to pressures to move toward interprofessional education (IPE) at both pre- and post-registration levels. Whilst this move has evolved out of "common sense" demands, such a multiple systems approach to education does not fit easily into existing traditional educational frameworks and there is, as yet, no proven theoretical framework to guide its development. A research study of an IPE intervention at the University of Liverpool in the UK drew on complexity theory to conceptualize the intervention and to evaluate its impact on a group of approximately 500 students studying physiotherapy, medicine, occupational therapy, nursing and social work. The intervention blended a multidisciplinary (non-interactive) plenary with self-directed e-learning and a series of interdisciplinary (interactive) workshops. Two evaluations took place: the first when the workshops were facilitated by trained practitioners; the second when the practitioners co-facilitated with trained service users. This paper reports findings from the second evaluation which focused on narrowing the gap between theory and practice. A multi-stakeholder evaluation was used including: students' reflective narratives, a focus group with practitioners and individual semi-structured interviews with service users. Findings showed that service users can make an important contribution to IPE for health and social care students in the early stages of their training. By exposure to a service user perspective, first year students can begin to learn and apply the principles of team work, to place the service user at the centre of the care process, to make connections between theory and "real life" experiences, and to narrow the gap between theory and practice. Findings also revealed benefits for facilitators and service users. PMID:17095439

  6. FEM3C, An improved three-dimensional heavy-gas dispersion model: User's manual

    SciTech Connect

    Chan, S.T.

    1994-03-01

    FEM3C is another upgraded version of FEM3 (a three-dimensional Finite Element Model), which was developed primarily for simulating the atmospheric dispersion of heavier-than-air gas (or heavy gas) releases, based on solving the fully three-dimensional, time-dependent conservation equations of mass, momentum, energy, and species of an inert gas or a pollutant in the form of vapor/droplets. A generalized anelastic approximation, together with the ideal gas law for the density of the gas/air mixture, is invoked to preclude sound waves and allow large density variations in both space and time. The numerical algorithm utilizes a modified Galerkin finite element method to spatially discretize the time-dependent conservation equations of mass, momentum, energy, and species. A consistent pressure Poisson equation is formed and solved separately from the time-dependent equations, which are sequentially solved and integrated in time via a modified forward Euler method. The model can handle instantaneous, finite-duration, and continuous releases. It is also capable of treating terrain and obstructions. Besides a K-theory model using similarity functions, an advanced turbulence model based on solving the k-ε transport equations is available as well. Options for solving the Boussinesq equations are also embedded in the code. In this report, an overview of the model is given, user's guides for using the model are provided, and example problems are presented to illustrate the usage of the model.

  7. Modeling the behavior of the computer-assisted instruction user

    SciTech Connect

    Stoddard, M.L.

    1983-01-01

    The field of computer-assisted instruction (CAI) contains abundant studies on the effectiveness of particular programs or systems. However, the nature of the field is such that the computer, not the user, is the focus of research. Few research studies have focused on the behavior of the individual CAI user. Morgan (1981) stated that descriptive studies are needed to clarify the important phenomena of user behavior. The need for such studies is particularly acute in computer-assisted instruction. Building a behavioral model would enable us to understand the problem-solving strategies and rules applied by the user during a CAI experience. Courseware developers could also use this information to design tutoring systems that are more responsive to individual differences than present CAI is. This paper proposes a naturalistic model for evaluating both affective and cognitive characteristics of the CAI user. It begins with a discussion of features of user behavior, followed by a description of an evaluation methodology that can lead to modeling user behavior. The paper concludes with a discussion of how implementation of this model can contribute to the fields of CAI and cognitive psychology.

  8. REGIONAL OXIDANT MODEL (ROM) USER'S GUIDE - PART 4: THE ROM SYSTEM USER TUTORIAL

    EPA Science Inventory

    This volume of the Regional Oxidant Model (ROM) User's Guide is intended to be a "cookbook" for unloading the ROM system code and benchmark (test case) data from the 19 distribution tapes. The ROM runs on the following computer systems: 1) VAX hardware for the preprocessors and th...

  9. An Investigation of the Integrated Model of User Technology Acceptance: Internet User Samples in Four Countries

    ERIC Educational Resources Information Center

    Fusilier, Marcelline; Durlabhji, Subhash; Cucchi, Alain

    2008-01-01

    National background of users may influence the process of technology acceptance. The present study explored this issue with the new, integrated technology use model proposed by Sun and Zhang (2006). Data were collected from samples of college students in India, Mauritius, Reunion Island, and United States. Questionnaire methodology and…

  10. User's instructions for the cardiovascular Walters model

    NASA Technical Reports Server (NTRS)

    Croston, R. C.

    1973-01-01

    The model is a combined, steady-state cardiovascular and thermal model. It was originally developed for interactive use, but was converted to batch-mode simulation on the Sigma 3 computer. Its purpose is to compute steady-state circulatory and thermal variables in response to exercise work loads and environmental factors. During a simulation run, several selected variables are printed at each time step, and end conditions are printed at the completion of the run.

  11. Theory of hadronic nonperturbative models

    SciTech Connect

    Coester, F.; Polyzou, W.N.

    1995-08-01

    As more data probing hadron structure become available, hadron models based on nonperturbative relativistic dynamics will be increasingly important for their interpretation. Relativistic Hamiltonian dynamics of few-body systems (constituent-quark models) and many-body systems (parton models) provides a precisely defined approach and a useful phenomenology. However, such models lack a quantitative foundation in quantum field theory. The specification of a quantum field theory by a Euclidean action provides a basis for the construction of nonperturbative models designed to maintain essential features of the field theory. For finite systems it is possible to satisfy axioms which guarantee the existence of a Hilbert space with a unitary representation of the Poincare group and the spectral condition, which ensures that the spectrum of the four-momentum operator is in the forward light cone. The separate axiom which guarantees locality of the field operators can be weakened for the construction of few-body models. In this context we are investigating algebraic and analytic properties of model Schwinger functions. This approach promises insight into the relations between hadronic models based on relativistic Hamiltonian dynamics on one hand and Bethe-Salpeter Green's-function equations on the other.

  12. USERS MANUAL: LANDFILL GAS EMISSIONS MODEL - VERSION 2.0

    EPA Science Inventory

    The document is a user's guide for a computer model, Version 2.0 of the Landfill Gas Emissions Model (LandGEM), for estimating air pollution emissions from municipal solid waste (MSW) landfills. The model can be used to estimate emission rates for methane, carbon dioxide, nonmet...

  13. HYDROCARBON SPILL SCREENING MODEL (HSSM) VOLUME 1: USER'S GUIDE

    EPA Science Inventory

    This user's guide describes the Hydrocarbon Spill Screening Model (HSSM). The model is intended for simulation of subsurface releases of light nonaqueous phase liquids (LNAPLs). The model consists of separate modules for LNAPL flow through the vadose zone, spreading in the capil...

  14. Do recommender systems benefit users? A modeling approach

    NASA Astrophysics Data System (ADS)

    Yeung, Chi Ho

    2016-04-01

    Recommender systems are present in many web applications to guide purchase choices. They increase sales and benefit sellers, but whether they benefit customers by providing relevant products remains less explored. While in many cases the recommended products are relevant to users, in other cases customers may be tempted to purchase the products only because they are recommended. Here we introduce a model to examine the benefit of recommender systems for users, and find that recommendations from the system can be equivalent to random draws if one always follows the recommendations and seldom purchases according to his or her own preference. Nevertheless, with sufficient information about user preferences, recommendations become accurate and an abrupt transition to this accurate regime is observed for some of the studied algorithms. On the other hand, we find that high estimated accuracy indicated by common accuracy metrics is not necessarily equivalent to high real accuracy in matching users with products. This disagreement between estimated and real accuracy serves as an alarm for operators and researchers who evaluate recommender systems merely with accuracy metrics. We tested our model with a real dataset and observed similar behaviors. Finally, a recommendation approach with improved accuracy is suggested. These results imply that recommender systems can benefit users, but the more frequently a user purchases the recommended products, the less relevant the recommended products are in matching user taste.
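    The paper's central contrast (following recommendations versus purchasing by one's own preference) can be illustrated with a deliberately crude toy in which the recommender is completely uninformed, so following it is exactly a random draw. This sketches only the qualitative effect, not the model studied in the paper.

```python
import random

random.seed(1)
taste = [random.random() for _ in range(50)]   # user's true relevance per item
best = max(range(50), key=taste.__getitem__)   # item the user would pick alone

def mean_relevance(follow_prob, trials=5000):
    # With probability follow_prob the user buys the recommended item
    # (here: a uniformly random one, the uninformed worst case); otherwise
    # the user buys by their own preference.
    total = 0.0
    for _ in range(trials):
        if random.random() < follow_prob:
            total += taste[random.randrange(50)]
        else:
            total += taste[best]
    return total / trials

print(round(mean_relevance(0.1), 2), round(mean_relevance(0.9), 2))
```

    Average relevance falls as the follow probability rises, mirroring the abstract's observation that heavy reliance on an uninformative recommender degrades the match with user taste.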

  15. METAPHOR (version 1): Users guide. [performability modeling

    NASA Technical Reports Server (NTRS)

    Furchtgott, D. G.

    1979-01-01

    General information concerning METAPHOR, an interactive software package to facilitate performability modeling and evaluation, is presented. Example systems are studied and their performabilities are calculated. Each available METAPHOR command and array generator is described. Complete METAPHOR sessions are included.

  16. Users of middle atmosphere models remarks

    NASA Technical Reports Server (NTRS)

    Gamble, Joe

    1987-01-01

    The procedure followed for shuttle operations is to calculate descent trajectories for each potential shuttle landing site, using the Global Reference Atmosphere Model (GRAM) to interactively compute density along the flight path 100 times to bound the statistics. The purpose is to analyze the flight dynamics, along with calculations of heat loads during reentry. The analysis program makes use of a modified version of the Jacchia-70 atmosphere, which includes He bulges over the poles and seasonal latitude variations at lower altitudes. For the troposphere, the 4-D Model is used up to 20 km and Groves from 30 km up to 90 km; the result is extrapolated over the globe and faired into the Jacchia atmosphere between 90 and 115 km. Since data on the Southern Hemisphere was lacking, the Northern Hemisphere data was mirrored and lagged by six months. Winds calculated from pressure data in the model sometimes show discontinuities. Modelers indicated that GRAM was not designed to produce winds, but good wind data is needed for the landing phase of shuttle operations. Use of atmospheric models during reentry is one application where a single integrated atmosphere model is clearly required.

  17. Utilizing Vector Space Models for User Modeling within e-Learning Environments

    ERIC Educational Resources Information Center

    Mangina, E.; Kilbride, J.

    2008-01-01

    User modeling has been found to enhance the effectiveness and/or usability of software systems through the representation of certain properties of a particular user. This paper presents the research and the results of the development of a user modeling system for the implementation of student models within e-learning environments, utilizing vector…

  18. Building integral projection models: a user's guide

    PubMed Central

    Rees, Mark; Childs, Dylan Z; Ellner, Stephen P; Coulson, Tim

    2014-01-01

    In order to understand how changes in individual performance (growth, survival or reproduction) influence population dynamics and evolution, ecologists are increasingly using parameterized mathematical models. For continuously structured populations, where some continuous measure of individual state influences growth, survival or reproduction, integral projection models (IPMs) are commonly used. We provide a detailed description of the steps involved in constructing an IPM, explaining how to: (i) translate your study system into an IPM; (ii) implement your IPM; and (iii) diagnose potential problems with your IPM. We emphasize how the study organism's life cycle, and the timing of censuses, together determine the structure of the IPM kernel and important aspects of the statistical analysis used to parameterize an IPM using data on marked individuals. An IPM based on population studies of Soay sheep is used to illustrate the complete process of constructing, implementing and evaluating an IPM fitted to sample data. We then look at very general approaches to parameterizing an IPM, using a wide range of statistical techniques (e.g. maximum likelihood methods, generalized additive models, nonparametric kernel density estimators). Methods for selecting models for parameterizing IPMs are briefly discussed. We conclude with key recommendations and a brief overview of applications that extend the basic model. The online Supporting Information provides commented R code for all our analyses. PMID:24219157
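    The paper's Supporting Information provides commented R code; as a language-neutral illustration of step (ii), here is a minimal survival-growth IPM discretized by the midpoint rule. The vital-rate functions and parameters below are invented for illustration, not the Soay sheep fits.

```python
import numpy as np

z = np.linspace(0.0, 10.0, 100)   # mesh over the continuous state (size)
h = z[1] - z[0]                   # midpoint-rule cell width

survival = 1.0 / (1.0 + np.exp(-(z - 4.0)))          # s(z), logistic in size

def growth_density(z_next, z_now):
    # g(z'|z): Gaussian growth to mean 1 + 0.8*z with sd 0.8 (invented)
    mu, sd = 1.0 + 0.8 * z_now, 0.8
    return np.exp(-0.5 * ((z_next - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

Z1, Z0 = np.meshgrid(z, z, indexing="ij")
K = growth_density(Z1, Z0) * survival[None, :] * h   # discretized kernel

# The dominant eigenvalue approximates the asymptotic growth rate; with no
# fecundity term it must stay below 1 (survival alone cannot grow a cohort).
lam = float(np.max(np.abs(np.linalg.eigvals(K))))
print(round(lam, 3))
```

    A full IPM would add a reproduction kernel to K; the diagnostic step the paper emphasizes includes checking that the mesh captures the kernel's mass (no "eviction" of individuals past the mesh boundaries).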

  19. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  20. User's instructions for the erythropoiesis regulatory model

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The model provides a method to analyze some of the events that could account for the decrease in red cell mass observed in crewmen returning from space missions. The model is based on the premise that erythrocyte production is governed by the balance between oxygen supply and demand at a renal sensing site. Oxygen supply is taken to be a function of arterial oxygen tension, mean corpuscular hemoglobin concentration, oxy-hemoglobin carrying capacity, hematocrit, and blood flow. Erythrocyte destruction is based on the law of mass action. The instantaneous hematocrit value is derived by integrating changes in production and destruction rates and accounting for the degree of plasma dilution.
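    The feedback loop described above can be caricatured in a few lines: production driven by the supply/demand imbalance at the sensing site, first-order (mass-action) destruction, and hematocrit obtained by integrating the balance. All constants below are invented for illustration, not the model's values.

```python
def simulate_hematocrit(demand, days=200, dt=0.1):
    hct = 0.45                       # initial hematocrit (fraction)
    k_prod, k_destr = 0.02, 0.02     # invented rate constants (per day)
    for _ in range(int(days / dt)):
        supply = hct                 # O2 supply taken proportional to hematocrit
        production = k_prod * max(demand - supply, 0.0)
        destruction = k_destr * hct  # law of mass action: first-order loss
        hct += dt * (production - destruction)
    return hct

# A lowered oxygen demand settles at a lower steady-state hematocrit,
# qualitatively mirroring the post-flight decrease in red cell mass.
print(round(simulate_hematocrit(0.45), 3), round(simulate_hematocrit(0.30), 3))
```

    The real model's supply term depends on arterial oxygen tension, hemoglobin properties, and blood flow rather than on hematocrit alone, but the negative-feedback structure is the same.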

  1. Snowmelt Runoff Model (SRM) User's Manual

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This 2008 edition of the User’s Manual presents a new computer program, the Windows Version 1.11 of the Snowmelt Runoff Model (WinSRM). The popular Version 4 is also preserved in the Appendix because it is still in demand to be used within its limits. The Windows version adds new capabilities: it ac...

  2. Simple Thermal Environment Model (STEM) User's Guide

    NASA Technical Reports Server (NTRS)

    Justus, C.G.; Batts, G. W.; Anderson, B. J.; James, B. F.

    2001-01-01

    This report presents a Simple Thermal Environment Model (STEM) for determining appropriate engineering design values to specify the thermal environment of Earth-orbiting satellites. The thermal environment of a satellite consists of three components: (1) direct solar radiation, (2) Earth-atmosphere reflected shortwave radiation, as characterized by Earth's albedo, and (3) Earth-atmosphere-emitted outgoing longwave radiation (OLR). This report, together with a companion "guidelines" report, provides methodology and guidelines for selecting "design points" for thermal environment parameters for satellites and spacecraft systems. The methods and models reported here are outgrowths of Earth Radiation Budget Experiment (ERBE) satellite data analysis and thermal environment specifications discussed by Anderson and Smith (1994). In large part, this report is intended to update (and supersede) those results.

  3. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE(TM) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  4. Supplement to wellbore models GWELL, GWNACL, and HOLA User's Guide

    SciTech Connect

    Hadgu, T.; Bodvarsson, G.S.

    1992-09-01

    A study was made on improving the applicability and ease of usage of the wellbore simulators HOLA, GWELL and GWNACL (Bjornsson, 1987; Aunzo et al., 1991). The study concentrated mainly on the usage of Option 2 (please refer to the User's Guide; Aunzo et al., 1991) and modeling flow of superheated steam when using these computer codes. Amendments were made to the simulators to allow implementation of a variety of input data. A wide range of input data was used to test the modifications to the codes. The study did not attempt to modify or improve the physics or formulations which were used in the models. It showed that a careful check of the input data is required. This report addresses these two areas of interest: usage of Option 2, and simulation of wellbore flow of superheated steam.

  5. Solid rocket booster performance evaluation model. Volume 2: Users manual

    NASA Technical Reports Server (NTRS)

    1974-01-01

    This users manual for the solid rocket booster performance evaluation model (SRB-II) contains descriptions of the model, the program options, the required program inputs, the program output format and the program error messages. SRB-II is written in FORTRAN and is operational on both the IBM 370/155 and the MSFC UNIVAC 1108 computers.

  6. Modeling mutual feedback between users and recommender systems

    NASA Astrophysics Data System (ADS)

    Zeng, An; Yeung, Chi Ho; Medo, Matúš; Zhang, Yi-Cheng

    2015-07-01

    Recommender systems daily influence our decisions on the Internet. While considerable attention has been given to issues such as recommendation accuracy and user privacy, the long-term mutual feedback between a recommender system and the decisions of its users has been neglected so far. We propose here a model of network evolution which allows us to study the complex dynamics induced by this feedback, including the hysteresis effect which is typical for systems with non-linear dynamics. Despite the popular belief that recommendation helps users to discover new things, we find that the long-term use of recommendation can contribute to the rise of extremely popular items and thus ultimately narrow the user choice. These results are supported by measurements of the time evolution of item popularity inequality in real systems. We show that this adverse effect of recommendation can be tamed by sacrificing part of short-term recommendation accuracy.

  7. Earthquake Early Warning Beta Users: Java, Modeling, and Mobile Apps

    NASA Astrophysics Data System (ADS)

    Strauss, J. A.; Vinci, M.; Steele, W. P.; Allen, R. M.; Hellweg, M.

    2014-12-01

    Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds of warning prior to ground shaking at a user's location. The goal of such a system is to reduce or minimize the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, the University of Washington, the USGS, and beta users in California and the Pacific Northwest. The beta users receive earthquake information very rapidly in real time and are providing feedback on performance and on potential uses within their organizations. Beta user interactions allow the ShakeAlert team to discern which alert delivery options are most effective, what changes would make the UserDisplay more useful in a pre-disaster situation, and, most importantly, what actions users plan to take in various scenarios. Actions could include personal safety approaches, such as drop, cover, and hold on; automated processes and procedures, such as opening elevator or fire station doors; or situational awareness. Users are beginning to determine which policy and technological changes may need to be enacted, and the funding required to implement their automated controls. Models and mobile apps are beginning to augment the basic Java desktop applet. Modeling allows beta users to test their early warning responses against various scenarios without having to wait for a real event. Mobile apps are also changing the possible response landscape, providing other avenues for people to receive information. All of these combine to improve business continuity and resiliency.

  8. Modeling Users, Context and Devices for Ambient Assisted Living Environments

    PubMed Central

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-01-01

    The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions and devices' capabilities helps smart environments to personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect information from these entities, it is necessary to design formal models which help designers to organize and give meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices, considering different approaches in the Ambient Assisted Living domain. In addition, we highlight ongoing standardization work in this area. We also discuss the techniques used, the characteristics modeled, and the advantages and drawbacks of each approach, and finally draw several conclusions about the reviewed works. PMID:24643006

  9. Tree Theory: A Theory-Generative Measurement Model.

    ERIC Educational Resources Information Center

    Airasian, Peter W.; Bart, William M.

    The inadequacies in present measurement models are indicated and a description is given of how tree theory, a theory-generative model, overcomes these inadequacies. Among the weaknesses cited in many measurement models are their untested assumptions of linear order and unidimensionality and their inability to generate non-associational…

  10. Geothermal loan guaranty cash flow model: description and users' manual

    SciTech Connect

    Keimig, M.A.; Rosenberg, J.I.; Entingh, D.J.

    1980-11-01

    This is the user's guide for the Geothermal Loan Guaranty Cash Flow Model (GCFM). GCFM is a Fortran code which designs and costs geothermal fields and electric power plants. It contains a financial analysis module which performs life-cycle costing analysis, taking into account various types of taxes, costs and financial structures. The financial module includes a discounted cash flow feature which calculates a levelized breakeven price for each run. The user's guide contains descriptions of the data requirements and instructions for using the model.
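    The levelized breakeven price in such a financial module is, in essence, the constant price at which discounted revenues equal discounted costs. A minimal sketch follows; the plant numbers are illustrative and ignore GCFM's tax treatment.

```python
def levelized_price(capital, annual_cost, annual_output, rate, years):
    # Present-value factors for each operating year
    disc = [(1.0 + rate) ** -t for t in range(1, years + 1)]
    pv_costs = capital + annual_cost * sum(disc)   # up-front plus recurring costs
    pv_output = annual_output * sum(disc)          # discounted energy delivered
    return pv_costs / pv_output                    # breakeven price per unit

# Hypothetical plant: $100M capital, $5M/yr O&M, 400 GWh/yr (in kWh),
# 10% discount rate, 30-year life
p = levelized_price(100e6, 5e6, 400e6, 0.10, 30)
print(round(p, 4))
```

    Selling every unit at this price makes the project's net present value zero at the chosen discount rate, which is what "breakeven" means in a discounted cash flow analysis.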

  11. Understanding Deep Representations Learned in Modeling Users Likes.

    PubMed

    Guntuku, Sharath Chandra; Zhou, Joey Tianyi; Roy, Sujoy; Lin, Weisi; Tsang, Ivor W

    2016-08-01

    Automatically understanding and discriminating different users' liking for an image is a challenging problem. This is because the relationship between image features (even semantic ones extracted by existing tools, viz., faces, objects, and so on) and users' likes is non-linear, influenced by several subtle factors. This paper presents a deep bi-modal knowledge representation of images based on their visual content and associated tags (text). A mapping step between the different levels of visual and textual representations allows for the transfer of semantic knowledge between the two modalities. Feature selection is applied before learning deep representation to identify the important features for a user to like an image. The proposed representation is shown to be effective in discriminating users based on images they like and also in recommending images that a given user likes, outperforming the state-of-the-art feature representations by ∼15-20%. Beyond this test-set performance, an attempt is made to qualitatively understand the representations learned by the deep architecture used to model user likes. PMID:27295666

  12. A mixing evolution model for bidirectional microblog user networks

    NASA Astrophysics Data System (ADS)

    Yuan, Wei-Guo; Liu, Yun

    2015-08-01

    Microblogs have been widely used as a new form of online social networking. Based on the user profile data collected from Sina Weibo, we find that the number of microblog user bidirectional friends approximately corresponds with the lognormal distribution. We then build two microblog user networks with real bidirectional relationships, both of which have not only small-world and scale-free but also some special properties, such as double power-law degree distribution, disassortative network, hierarchical and rich-club structure. Moreover, by detecting the community structures of the two real networks, we find both of their community scales follow an exponential distribution. Based on the empirical analysis, we present a novel evolution network model with mixed connection rules, including lognormal fitness preferential and random attachment, nearest neighbor interconnected in the same community, and global random associations in different communities. The simulation results show that our model is consistent with real network in many topology features.
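    One ingredient of the evolution model above, attachment weighted by a lognormal fitness, can be sketched as below. The network size, step count, and distribution parameters are invented, and the other connection rules (community-local links and cross-community associations) are omitted.

```python
import random

random.seed(7)
N = 200
fitness = [random.lognormvariate(0.0, 1.0) for _ in range(N)]  # lognormal fitness
degree = [1] * N                                               # seed degrees

for _ in range(2000):
    # Each step attaches one new edge endpoint with probability proportional
    # to fitness * degree (fitness-weighted preferential attachment).
    weights = [f * d for f, d in zip(fitness, degree)]
    chosen = random.choices(range(N), weights=weights)[0]
    degree[chosen] += 1

print(max(degree), sum(degree))
```

    The multiplicative fitness skews the rich-get-richer dynamic, so high-fitness nodes accumulate degree disproportionately; in the full model this interacts with the community-level rules to produce the reported double power-law and rich-club structure.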

  13. Designing visual displays and system models for safe reactor operations based on the user's perspective of the system

    SciTech Connect

    Brown-VanHoozer, S.A.

    1995-12-31

    Most designers are not schooled in the area of human-interaction psychology and therefore tend to rely on the traditional ergonomic aspects of human factors when designing complex human-interactive workstations related to reactor operations. They do not take into account the differences in user information-processing behavior and how these behaviors may affect individual and team performance when accessing visual displays or utilizing system models in process and control room areas. Unfortunately, by ignoring the importance of integrating the user interface at the information-processing level, the result can be sub-optimization and inherently error- and failure-prone systems. Therefore, to minimize or eliminate failures in human-interactive systems, it is essential that designers understand how each user's processing characteristics affect how the user gathers information and communicates it to the designer and other users. A different type of approach to achieving this understanding is Neuro Linguistic Programming (NLP). The material presented in this paper is based on two studies involving the design of visual displays, NLP, and the user's perspective model of a reactor system. The studies involve the methodology known as NLP and its use in expanding design choices from the user's "model of the world" in the areas of virtual reality, workstation design, team structure, decision and learning style patterns, safety operations, and pattern recognition.

  14. H2A Production Model, Version 2 User Guide

    SciTech Connect

    Steward, D.; Ramsden, T.; Zuboy, J.

    2008-09-01

    The H2A Production Model analyzes the technical and economic aspects of central and forecourt hydrogen production technologies. Using a standard discounted cash flow rate of return methodology, it determines the minimum hydrogen selling price, including a specified after-tax internal rate of return from the production technology. Users have the option of accepting default technology input values--such as capital costs, operating costs, and capacity factor--from established H2A production technology cases or entering custom values. Users can also modify the model's financial inputs. This new version of the H2A Production Model features enhanced usability and functionality. Input fields are consolidated and simplified. New capabilities include performing sensitivity analyses and scaling analyses to various plant sizes. This User Guide helps users already familiar with the basic tenets of H2A hydrogen production cost analysis get started using the new version of the model. It introduces the basic elements of the model then describes the function and use of each of its worksheets.

  15. An information theory criteria based blind method for enumerating active users in DS-CDMA system

    NASA Astrophysics Data System (ADS)

    Samsami Khodadad, Farid; Abed Hodtani, Ghosheh

    2014-11-01

    In this paper, a new blind algorithm for active user enumeration in asynchronous direct sequence code division multiple access (DS-CDMA) in a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator and performs better at higher signal-to-noise ratios (SNRs), whereas AIC is preferred at lower SNRs. We therefore propose an SNR-adaptive method, based on subspace analysis and a trained genetic algorithm, that attains the performance of both. Moreover, unlike previous methods, our method uses only a single antenna, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, and demonstrate the efficiency of the method.
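
The AIC/MDL enumeration the abstract builds on is classically computed from the eigenvalues of the sample covariance matrix (the Wax-Kailath form). The sketch below illustrates that standard criterion, not the authors' single-antenna subspace/genetic-algorithm method:

```python
import numpy as np

def enumerate_sources(eigvals, n_snapshots, criterion="MDL"):
    """Pick the model order k (number of active users/sources) that
    minimizes AIC or MDL built from the sample eigenvalues."""
    lam = np.sort(np.asarray(eigvals))[::-1]   # descending eigenvalues
    N, K = len(lam), n_snapshots
    scores = []
    for k in range(N):
        tail = lam[k:]                         # presumed noise eigenvalues
        gm = np.exp(np.mean(np.log(tail)))     # geometric mean
        am = np.mean(tail)                     # arithmetic mean
        negloglik = -K * (N - k) * np.log(gm / am)   # >= 0 since gm <= am
        penalty = k * (2 * N - k)              # free-parameter count
        if criterion == "AIC":
            scores.append(2 * negloglik + 2 * penalty)
        else:                                  # MDL: consistent at high SNR
            scores.append(negloglik + 0.5 * penalty * np.log(K))
    return int(np.argmin(scores))
```

The penalty term is what differs between the two criteria, which is exactly the trade-off the paper exploits.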

  16. Shawnee flue gas desulfurization computer model users manual

    SciTech Connect

    Sudhoff, F.A.; Torstrick, R.L.

    1985-03-01

    In conjunction with the US Environmental Protection Agency sponsored Shawnee test program, Bechtel National, Inc., and the Tennessee Valley Authority jointly developed a computer model capable of projecting preliminary design and economics for lime- and limestone-scrubbing flue gas desulfurization systems. The model is capable of projecting relative economics for spray tower, turbulent contact absorber, and venturi-spray tower scrubbing options. It may be used to project the effect on system design and economics of variations in required SO2 removal, scrubber operating parameters (gas velocity, liquid-to-gas (L/G) ratio, alkali stoichiometry, liquor hold time in slurry recirculation tanks), reheat temperature, and scrubber bypass. It may also be used to evaluate the effect of alternative waste disposal methods or additives (MgO or adipic acid) on costs for the selected process. Although the model is not intended to project the economics of an individual system to a high degree of accuracy, it allows prospective users to quickly project comparative design and costs for limestone and lime case variations on a common design and cost basis. The users manual provides a general description of the Shawnee FGD computer model and detailed instructions for its use. It describes and explains the required user-supplied input data, such as boiler size, coal characteristics, and SO2 removal requirements. Output includes a material balance, equipment list, and detailed capital investment and annual revenue requirements. The users manual provides information concerning the use of the overall model as well as sample runs to serve as a guide to prospective users in identifying applications. The FORTRAN-based model is maintained by TVA, from whom copies or individual runs are available. 25 refs., 3 figs., 36 tabs.

  17. Using Partial Credit and Response History to Model User Knowledge

    ERIC Educational Resources Information Center

    Van Inwegen, Eric G.; Adjei, Seth A.; Wang, Yan; Heffernan, Neil T.

    2015-01-01

    User modelling algorithms such as Performance Factors Analysis and Knowledge Tracing seek to determine a student's knowledge state by analyzing (among other features) right and wrong answers. Anyone who has ever graded an assignment by hand knows that some answers are "more wrong" than others; i.e. they display less of an understanding…

  18. Dynamic User Modeling within a Game-Based ITS

    ERIC Educational Resources Information Center

    Snow, Erica L.

    2015-01-01

    Intelligent tutoring systems are adaptive learning environments designed to support individualized instruction. The adaptation embedded within these systems is often guided by user models that represent one or more aspects of students' domain knowledge, actions, or performance. The proposed project focuses on the development and testing of user…

  19. USER-FRIENDLY DATA ENTRY ROUTINE FOR THE ESP MODEL

    EPA Science Inventory

    The report is a user's manual for an interactive data entry program that greatly simplifies the creation and modification of electrostatic precipitator (ESP) model data files. Routine use of the interactive program, written for IBM PC-compatible computers, will eliminate a major s...

  20. ABAREX -- A neutron spherical optical-statistical-model code -- A user's manual

    SciTech Connect

    Smith, A.B.; Lawson, R.D.

    1998-06-01

    The contemporary version of the neutron spherical optical-statistical-model code ABAREX is summarized with the objective of providing detailed operational guidance for the user. The physical concepts involved are very briefly outlined. The code is described in some detail and a number of explicit examples are given. With this document one should very quickly become fluent with the use of ABAREX. While the code has operated on a number of computing systems, this version is specifically tailored for the VAX/VMS workstation and/or the IBM-compatible personal computer.

  1. User interface for ground-water modeling: Arcview extension

    USGS Publications Warehouse

    Tsou, M.-S.; Whittemore, D.O.

    2001-01-01

    Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software that provides a robust user-friendly interface to facilitate data handling and display. An extension is an add-on program to ArcView that provides additional specialized functions. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension for facilitating modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the ArcView interface by external linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.

  2. Plotting program for aerodynamic lifting surface theory. [user manual for FORTRAN computer program

    NASA Technical Reports Server (NTRS)

    Medan, R. T.; Ray, K. S.

    1973-01-01

    A description of, and user's manual for, a USA FORTRAN IV computer program which plots the planform and control points of a wing are presented. The program also plots some of the configuration data, such as the aspect ratio. The planform data are stored on a disc file created by a geometry program. This program, the geometry program, and several other programs are used together in the analysis of lifting, thin wings in steady, subsonic flow according to a kernel function lifting surface theory.

  3. Airport-Noise Levels and Annoyance Model (ALAMO) user's guide

    NASA Technical Reports Server (NTRS)

    Deloach, R.; Donaldson, J. L.; Johnson, M. J.

    1986-01-01

    A guide for the use of the Airport-Noise Level and Annoyance MOdel (ALAMO) at the Langley Research Center computer complex is provided. This document is divided into five primary sections: the introduction, the purpose of the model, and in-depth descriptions of the following subsystems: baseline, noise reduction simulation, and track analysis. For each subsystem, the user is provided with a description of architecture, an explanation of subsystem use, sample results, and a case runner's check list. It is assumed that the user is familiar with the operations at the Langley Research Center (LaRC) computer complex, the Network Operating System (NOS 1.4) and CYBER Control Language. Incorporated within the ALAMO model is a census database system called SITE II.

  4. A modeling framework for resource-user-infrastructure systems

    NASA Astrophysics Data System (ADS)

    Muneepeerakul, R.; Qubbaj, M.; Anderies, J. M.; Aggarwal, R.; Janssen, M.

    2012-12-01

    A compact modeling framework is developed to supplement a conceptual framework of coupled natural-human systems. The framework consists of four components: resource (R), users (U), public infrastructure (PI), and public infrastructure providers (PIP), the last two of which have not been adequately addressed in many existing modeling studies. The modeling approach employed here is a set of replicator equations describing the dynamical frequencies of social strategies (of U and PIP), whose payoffs are explicit and dynamical functions of biophysical components (R and PI). Model development and preliminary results from specific implementation will be reported and discussed.
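
Replicator equations of the kind described above evolve the frequency of each social strategy in proportion to how its payoff compares with the population average. The sketch below uses a generic Hawk-Dove payoff matrix as a stand-in; it is not the R-U-PI-PIP payoff structure of the paper, and all names are illustrative:

```python
import numpy as np

def replicator_step(x, payoffs, dt=0.01):
    """One Euler step of the replicator equation
    dx_i/dt = x_i * (f_i(x) - phi), phi = population-average fitness."""
    f = payoffs @ x              # fitness of each strategy
    phi = x @ f                  # average fitness
    x = x + dt * x * (f - phi)
    return x / x.sum()           # renormalize against numerical drift

# Hawk-Dove with benefit V=2, cost C=3; interior equilibrium at x_H = V/C
A = np.array([[-0.5, 2.0],
              [ 0.0, 1.0]])
x = np.array([0.5, 0.5])         # initial strategy frequencies
for _ in range(5000):
    x = replicator_step(x, A)
```

In the paper's framework the payoff entries would themselves be dynamical functions of the biophysical state (R and PI) rather than constants.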

  5. The Snowmelt-Runoff Model (SRM) user's manual

    NASA Technical Reports Server (NTRS)

    Martinec, J.; Rango, A.; Major, E.

    1983-01-01

    A manual is presented to provide a means by which a user may apply the snowmelt runoff model (SRM) unaided. Model structure, conditions of application, and data requirements, including remote sensing, are described. Guidance is given for determining various model variables and parameters. Possible sources of error are discussed, and conversion of SRM from the simulation mode to the operational forecasting mode is explained. A computer program for running SRM is presented and is easily adaptable to most systems used by water resources agencies.

  6. SN_GUI: a graphical user interface for snowpack modeling

    NASA Astrophysics Data System (ADS)

    Spreitzhofer, G.; Fierz, C.; Lehning, M.

    2004-10-01

    SNOWPACK is a physical snow cover model. The model not only serves as a valuable research tool, but also runs operationally on a network of high Alpine automatic weather and snow measurement sites. In order to facilitate the operation of SNOWPACK and the interpretation of the results obtained by this model, a user-friendly graphical user interface for snowpack modeling, named SN_GUI, was created. This Java-based and thus platform-independent tool can be operated in two modes, one designed to fulfill the requirements of avalanche warning services (e.g. by providing information about critical layers within the snowpack that are closely related to avalanche activity), and the other one offering a variety of additional options satisfying the needs of researchers. The user of SN_GUI is graphically guided through the entire process of creating snow cover simulations. The starting point is the efficient creation of input parameter files for SNOWPACK, followed by the launching of SNOWPACK with a variety of parameter settings. Finally, after the successful termination of the run, a number of interactive display options may be used to visualize the model output. Among these are vertical profiles and time profiles for many parameters. Besides other features, SN_GUI allows the use of various color, time and coordinate scales, and the comparison of measured and modelled parameters.

  7. Halo modelling in chameleon theories

    SciTech Connect

    Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu E-mail: kazuya.koyama@port.ac.uk

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.

  8. Effectiveness of Anabolic Steroid Preventative Intervention among Gym Users: Applying Theory of Planned Behavior

    PubMed Central

    Jalilian, Farzad; Allahverdipour, Hamid; Moeini, Babak; Moghimbeigi, Abbas

    2011-01-01

    Background: Use of anabolic androgenic steroids (AAS) has been associated with adverse physical and psychiatric effects and is known to be a growing problem among young people. This study was conducted to evaluate the efficiency of an anabolic steroid preventative intervention among gym users in Iran, with the theory of planned behaviour applied as the theoretical framework. Methods: Overall, 120 male gym users participated in this study in intervention and control groups. This was a longitudinal randomized pretest-posttest control group panel study implementing a behaviour modification based intervention to prevent AAS use. Cross-tabulation and t-tests, using the SPSS statistical package, version 13, were used for the statistical analysis. Results: Significant improvements were found in average responses for knowledge about side effects of AAS (P<0.001), attitude toward AAS, and intention not to use AAS. Additionally, after the intervention, the rate of AAS and supplement use decreased in the intervention group. Conclusion: Comprehensive interventions against AAS abuse among gym users and adolescents would be effective in improving adolescents' healthy behaviors and strengthening their intention not to use AAS. PMID:24688897

  9. Stochastic models: theory and simulation.

    SciTech Connect

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
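
As a concrete (if much simplified) example of generating samples of a stochastic model, the sketch below draws one sample path of a zero-mean Gaussian process by factoring its covariance matrix. The kernel, grid, and function names are illustrative choices, not anything specified in the report:

```python
import numpy as np

def sample_gaussian_process(cov_fn, t, rng):
    """Draw one sample path of a zero-mean Gaussian process with
    covariance function cov_fn on the grid t, via the Cholesky factor
    of the covariance matrix (small jitter added for numerical stability)."""
    C = cov_fn(t[:, None], t[None, :])
    L = np.linalg.cholesky(C + 1e-8 * np.eye(len(t)))
    return L @ rng.standard_normal(len(t))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
# squared-exponential covariance with length scale 0.1
squared_exp = lambda s, u: np.exp(-0.5 * (s - u) ** 2 / 0.1 ** 2)
path = sample_gaussian_process(squared_exp, t, rng)
```

Each call with fresh random draws yields an independent sample, which is exactly what a Monte Carlo driver would feed into a deterministic simulation code as input or boundary data.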

  10. EpiPOD : community vaccination and dispensing model user's guide.

    SciTech Connect

    Berry, M.; Samsa, M.; Walsh, D.; Decision and Information Sciences

    2009-01-09

    EpiPOD is a modeling system that enables local, regional, and county health departments to evaluate and refine their plans for mass distribution of antiviral and antibiotic medications and vaccines. An intuitive interface requires users to input as few or as many plan specifics as are available in order to simulate a mass treatment campaign. Behind the input interface, a system dynamics model simulates pharmaceutical supply logistics, hospital and first-responder personnel treatment, population arrival dynamics and treatment, and disease spread. When the simulation is complete, users have estimates of the number of illnesses in the population at large, the number of ill persons seeking treatment, and queuing and delays within the mass treatment system--all metrics by which the plan can be judged.

  11. Agile IT: Thinking in User-Centric Models

    NASA Astrophysics Data System (ADS)

    Margaria, Tiziana; Steffen, Bernhard

    We advocate a new teaching direction for modern CS curricula: extreme model-driven development (XMDD), a new development paradigm designed to continuously involve the customer/application expert throughout the whole systems' life cycle. Based on the `One-Thing Approach', which works by successively enriching and refining one single artifact, system development becomes in essence a user-centric orchestration of intuitive service functionality. XMDD differs radically from classical software development, which, in our opinion is no longer adequate for the bulk of application programming - in particular when it comes to heterogeneous, cross organizational systems which must adapt to rapidly changing market requirements. Thus there is a need for new curricula addressing this model-driven, lightweight, and cooperative development paradigm that puts the user process in the center of the development and the application expert in control of the process evolution.

  12. Regional Ionospheric Modelling for Single-Frequency Users

    NASA Astrophysics Data System (ADS)

    Boisits, Janina; Joldzic, Nina; Weber, Robert

    2016-04-01

    Ionospheric signal delays are a main error source in GNSS-based positioning. Thus, single-frequency receivers, which are frequently used nowadays, require additional ionospheric information to mitigate these effects. Within the Austrian Research Promotion Agency (FFG) project Regiomontan (Regional Ionospheric Modelling for Single-Frequency Users) a new and as realistic as possible model is used to obtain precise GNSS ionospheric signal delays. These delays will be provided to single-frequency users to significantly increase positioning accuracy. The computational basis is the Thin-Shell Model. For regional modelling, the thin electron layer of the underlying model is approximated by a Taylor series up to degree two. The network used includes 22 GNSS reference stations in Austria and nearby. First results were calculated from smoothed code observations by forming the geometry-free linear combination. Satellite and station DCBs were applied. In a least squares adjustment the model parameters, consisting of the VTEC0 at the origin of the investigated area as well as the first and second derivatives of the electron content in longitude and latitude, were obtained with a temporal resolution of 1 hour. The height of the layer was kept fixed. The formal errors of the model parameters suggest an accuracy of the VTEC slightly better than 1 TECU for a user location within Austria. In a further step, the model parameters were derived from sole phase observations by using a levelling approach to mitigate common range biases. The formal errors of this model approach suggest an accuracy of about a few tenths of a TECU. For validation, the Regiomontan VTEC was compared to IGS TEC maps, showing very good agreement. Further, a comparison of pseudoranges was performed to calculate the 'true' error, by forming the ionosphere-free linear combination on the one hand and by applying the Regiomontan model to L1 pseudoranges on the other hand. The resulting differences are mostly
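
The degree-two Taylor expansion of regional VTEC described above amounts to a six-parameter least-squares fit per epoch. A sketch under that reading (the function and variable names are illustrative; DCB handling and the mapping to slant delays are omitted):

```python
import numpy as np

def fit_vtec_taylor(dlon, dlat, vtec_obs):
    """Least-squares fit of a degree-2 Taylor expansion of VTEC about
    the origin of the region:
    VTEC = a0 + a1*dlon + a2*dlat + a3*dlon^2 + a4*dlon*dlat + a5*dlat^2,
    where dlon/dlat are offsets from the expansion origin."""
    A = np.column_stack([np.ones_like(dlon), dlon, dlat,
                         dlon**2, dlon * dlat, dlat**2])
    coeffs, *_ = np.linalg.lstsq(A, vtec_obs, rcond=None)
    return coeffs
```

Here a0 plays the role of VTEC0 at the origin, and the remaining coefficients are the first and second derivatives estimated each hour.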

  13. Simplified analytical model of penetration with lateral loading -- User's guide

    SciTech Connect

    Young, C.W.

    1998-05-01

    The SAMPLL (Simplified Analytical Model of Penetration with Lateral Loading) computer code was originally developed in 1984 to realistically yet economically predict penetrator/target interactions. Since the code's inception, its use has spread throughout the conventional and nuclear penetrating weapons community. During the penetrator/target interaction, the resistance of the material being penetrated imparts both lateral and axial loads on the penetrator. These loads cause changes to the penetrator's motion (kinematics). SAMPLL uses empirically based algorithms, formulated from an extensive experimental data base, to replicate the loads the penetrator experiences during penetration. The lateral loads resulting from angle of attack and trajectory angle of the penetrator are explicitly treated in SAMPLL. The loads are summed and the kinematics calculated at each time step. SAMPLL has been continually improved, and the current version, Version 6.0, can handle cratering and spall effects, multiple target layers, penetrator damage/failure, and complex penetrator shapes. Version 6 uses the latest empirical penetration equations, and also automatically adjusts the penetrability index for certain target layers to account for layer thickness and confinement. This report describes the SAMPLL code, including assumptions and limitations, and includes a user's guide.

  14. Five-Factor Model personality profiles of drug users

    PubMed Central

    Terracciano, Antonio; Löckenhoff, Corinna E; Crum, Rosa M; Bienvenu, O Joseph; Costa, Paul T

    2008-01-01

    Background Personality traits are considered risk factors for drug use, and, in turn, the psychoactive substances impact individuals' traits. Furthermore, there is increasing interest in developing treatment approaches that match an individual's personality profile. To advance our knowledge of the role of individual differences in drug use, the present study compares the personality profiles of tobacco, marijuana, cocaine, and heroin users and non-users using the wide spectrum Five-Factor Model (FFM) of personality in a diverse community sample. Method Participants (N = 1,102; mean age = 57) were part of the Epidemiologic Catchment Area (ECA) program in Baltimore, MD, USA. The sample was drawn from a community with a wide range of socio-economic conditions. Personality traits were assessed with the Revised NEO Personality Inventory (NEO-PI-R), and psychoactive substance use was assessed with a systematic interview. Results Compared to never smokers, current cigarette smokers score lower on Conscientiousness and higher on Neuroticism. Similar, but more extreme, is the profile of cocaine/heroin users, who score very high on Neuroticism, especially Vulnerability, and very low on Conscientiousness, particularly Competence, Achievement-Striving, and Deliberation. By contrast, marijuana users score high on Openness to Experience, average on Neuroticism, but low on Agreeableness and Conscientiousness. Conclusion In addition to confirming high levels of negative affect and impulsive traits, this study highlights the links between drug use and low Conscientiousness. These links provide insight into the etiology of drug use and have implications for public health interventions. PMID:18405382

  15. Supercell thunderstorm modeling and theory

    NASA Astrophysics Data System (ADS)

    Rotunno, Richard

    Tornadoes occur in thunderstorms. Ferrel [1889] theorized that tornadoes form when the thunderstorm updraft encounters a preexisting "gyratory" wind field. Only lately has it been found that tornadoes/waterspouts can be produced by nonrotating thunderstorms forming in environments with a preexisting low-level gyratory wind field [Wakimoto and Wilson, 1989]. However, the most intense, and long-lived, tornadoes occur in a special type of thunderstorm known as the "supercell," which generates its own gyratory wind field. That it does so is interesting, but perhaps the most fascinating aspect of rotation in the supercell, which became clear in the past decade or so, is the rotating wind field's vital role in producing the supercell's extraordinary properties of long life and deviate motion. Thus the present review will focus on what was learned from modeling and theory about the rotation and propagation of, and the relation of tornadoes to, supercell thunderstorms.

  16. User's guide for the Photochemical Box Model (PBM)

    NASA Astrophysics Data System (ADS)

    Schere, K. L.; Demerjian, K. L.

    1984-11-01

    The user's guide for the photochemical box model (PBM) describes the structure and operation of the model and its preprocessors and provides the potential user with guidance in setting up input data. The PBM is a simple stationary single-cell model with a variable-height lid designed to provide volume-integrated hourly averages of O3 and other photochemical smog pollutants of interest for an urban area for a single day of simulation. The PBM is most appropriate for application in air stagnation conditions with light and variable winds. The PBM assumes that emission sources are homogeneously distributed across the surface face of the box volume and that the volume is well mixed at all times. The user must provide the PBM with initial species concentrations, hourly inputs of wind speed, source emission fluxes of CO, NO(x), and THC with hydrocarbon reactivity classes, and boundary species concentrations. Values of measured solar radiation and mixed layer depth may be specified at subhourly intervals throughout a simulation.
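
The well-mixed, variable-lid single cell at the heart of a box model like the PBM can be sketched as one Euler step of a mass balance. This is a simplification for illustration only: photochemistry, deposition, and horizontal advection through the box faces are omitted, and all names are hypothetical:

```python
def box_model_step(c, emission_flux, mix_height, dh_dt, c_aloft, dt):
    """One Euler step of a well-mixed single-cell box:
    dc/dt = E/h + max(dh/dt, 0) * (c_aloft - c) / h,
    where E is the surface emission flux and the second term dilutes the
    box with air from aloft only while the mixed layer grows."""
    entrain = max(dh_dt, 0.0) * (c_aloft - c) / mix_height
    return c + dt * (emission_flux / mix_height + entrain)
```

With a constant emission flux and a static lid, the concentration grows linearly at E/h; a rising lid entraining cleaner air aloft slows that growth, which is why the hourly mixed layer depth input matters.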

  17. Quiver gauge theories and integrable lattice models

    NASA Astrophysics Data System (ADS)

    Yagi, Junya

    2015-10-01

    We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.

  18. Design of personalized search engine based on user-webpage dynamic model

    NASA Astrophysics Data System (ADS)

    Li, Jihan; Li, Shanglin; Zhu, Yingke; Xiao, Bo

    2013-12-01

    Personalized search engine focuses on establishing a user-webpage dynamic model. In this model, users' personalized factors are introduced so that the search engine is better able to provide the user with targeted feedback. This paper constructs user and webpage dynamic vector tables, introduces singular value decomposition analysis in the processes of topic categorization, and extends the traditional PageRank algorithm.
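
The singular value decomposition step mentioned for topic categorization is typically a truncated SVD of a term-document matrix, as in latent semantic analysis. A minimal sketch of that standard technique (not the authors' system; the function name is illustrative):

```python
import numpy as np

def topic_space(term_doc, k):
    """Project documents (columns of a term-document matrix) into a
    k-dimensional latent topic space via truncated SVD (LSA-style)."""
    U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
    # keep the k strongest singular directions; one k-vector per document
    return (np.diag(s[:k]) @ Vt[:k]).T
```

Document vectors in this reduced space can then be compared (e.g. by cosine similarity) against a user's dynamic profile vector to personalize ranking.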

  19. NASA AVOSS Fast-Time Wake Prediction Models: User's Guide

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew

    2014-01-01

    The National Aeronautics and Space Administration (NASA) is developing and testing fast-time wake transport and decay models to safely enhance the capacity of the National Airspace System (NAS). The fast-time wake models are empirical algorithms used for real-time predictions of wake transport and decay based on aircraft parameters and ambient weather conditions. The aircraft dependent parameters include the initial vortex descent velocity and the vortex pair separation distance. The atmospheric initial conditions include vertical profiles of temperature or potential temperature, eddy dissipation rate, and crosswind. The current distribution includes the latest versions of the APA (3.4) and the TDP (2.1) models. This User's Guide provides detailed information on the model inputs, file formats, and the model output. An example of a model run and a brief description of the Memphis 1995 Wake Vortex Dataset is also provided.

  20. The European ALMA Regional Centre: a model of user support

    NASA Astrophysics Data System (ADS)

    Andreani, P.; Stoehr, F.; Zwaan, M.; Hatziminaoglou, E.; Biggs, A.; Diaz-Trigo, M.; Humphreys, E.; Petry, D.; Randall, S.; Stanke, T.; van Kampen, E.; Bárta, M.; Brand, J.; Gueth, F.; Hogerheijde, M.; Bertoldi, F.; Muxlow, T.; Richards, A.; Vlemmings, W.

    2014-08-01

    The ALMA Regional Centres (ARCs) form the interface between the ALMA observatory and the user community from the proposal preparation stage to the delivery of data and their subsequent analysis. The ARCs provide critical services to both the ALMA operations in Chile and to the user community. These services were split by the ALMA project into core and additional services. The core services are financed by the ALMA operations budget and are critical to the successful operation of ALMA. They are contractual obligations and must be delivered to the ALMA project. The additional services are not funded by the ALMA project and are not contractual obligations, but are critical to achieving ALMA's full scientific potential. A distributed network of ARC nodes (with ESO being the central ARC) has been set up throughout Europe at the following seven locations: Bologna, Bonn-Cologne, Grenoble, Leiden, Manchester, Ondrejov, Onsala. These ARC nodes are working together with the central node at ESO and provide both core and additional services to the ALMA user community. This paper presents the European ARC, and how it operates in Europe to support the ALMA community. This model, although complex in nature, is turning into a very successful one, providing a service to the scientific community that has so far been highly appreciated. The ARC could become a reference support model in an age where very large collaborations are required to build large facilities, and support is needed for geographically and culturally diverse communities.

  1. HIGHWAY, a transportation routing model: program description and users' manual

    SciTech Connect

    Joy, D.S.; Johnson, P.E.; Gibson, S.M.

    1982-12-01

    A computerized transportation routing model has been developed at the Oak Ridge National Laboratory to be used for predicting likely routes for shipping radioactive materials. The HIGHWAY data base is a computerized road atlas containing descriptions of the entire interstate highway system, the federal highway system, and most of the principal state roads. In addition to its prediction of the most likely commercial route, options incorporated in the HIGHWAY model can allow for maximum use of interstate highways or routes that will bypass urbanized areas containing populations > 100,000. The user may also interactively modify the data base to predict routes that bypass any particular state, city, town, or specific highway segment.
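
At heart, the route prediction described above is a shortest-path search over a road graph, with extra cost attached to places the shipment should bypass. HIGHWAY's actual data structures and weighting are not given in this abstract, so everything in the sketch below is illustrative:

```python
import heapq

def shortest_route(graph, start, goal, penalty_nodes=frozenset(), penalty=1e6):
    """Dijkstra over a road graph {node: [(neighbor, miles), ...]}.
    Nodes in penalty_nodes (e.g. a bypassed city or state) get a large
    added cost, so routes avoid them whenever an alternative exists."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    seen = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, []):
            cost = w + (penalty if v in penalty_nodes else 0.0)
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, node = [], goal      # walk predecessors back to the start
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]
```

Preferring interstate highways can be modeled the same way, by discounting interstate edge weights rather than penalizing nodes.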

  2. User-friendly graph editing for procedural modeling of buildings.

    PubMed

    Patow, Gustavo

    2012-01-01

    A proposed rule-based editing metaphor intuitively lets artists create buildings without changing their workflow. It's based on the realization that the rule base represents a directed acyclic graph and on a shift in the development paradigm from product-based to rule-based representations. Users can visually add or edit rules, connect them to control the workflow, and easily create commands that expand the artist's toolbox (for example, Boolean operations or local controlling operators). This approach opens new possibilities, from model verification to model editing through graph rewriting. PMID:24804948

  3. An optimization program based on the method of feasible directions: Theory and users guide

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

    1994-01-01

    The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960's, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed with a focus on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.
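    As a rough illustration of the idea behind a feasible direction (not the quadratic-program subproblem the code actually solves with its Karmarkar-based interior method), one can project the steepest-descent direction so it does not cross any active linear constraint. The function below is a hypothetical sketch:

```python
def feasible_direction(grad, active_normals):
    """Projected steepest-descent direction: a simplified stand-in for the
    method of feasible directions' search-direction subproblem.

    grad: objective gradient at the current point.
    active_normals: outward normals of the active linear constraints.
    Starts from -grad and removes any component that would leave the
    feasible region through an active constraint.
    """
    d = [-g for g in grad]
    for a in active_normals:
        dot = sum(di * ai for di, ai in zip(d, a))
        if dot > 0.0:  # direction points out of this active constraint
            nn = sum(ai * ai for ai in a)
            d = [di - dot * ai / nn for di, ai in zip(d, a)]
    return d
```

With no active constraints this is plain steepest descent; each active normal simply strips the outward component.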

  4. ACE2 Global Digital Elevation Model : User Analysis

    NASA Astrophysics Data System (ADS)

    Smith, R. G.; Berry, P. A. M.; Benveniste, J.

    2013-12-01

    Altimeter Corrected Elevations 2 (ACE2), first released in October 2009, is the Global Digital Elevation Model (GDEM) created by fusing the high accuracy of over 100 million altimeter retracked height estimates, derived primarily from the ERS-1 Geodetic Mission, with the high frequency content available within the near-global Shuttle Radar Topography Mission. This novel ACE2 GDEM is freely available at 3”, 9”, 30” and 5' and has been distributed via the web to over 680 subscribers. This paper presents the results of a detailed analysis of geographical distribution of subscribed users, along with fields of study and potential uses. Investigations have also been performed to determine the most popular spatial resolutions and the impact these have on the scope of data downloaded. The analysis has shown that, even though the majority of users have come from Europe and America, a significant number of website hits have been received from South America, Africa and Asia. Registered users also vary widely, from research institutions and major companies down to individual hobbyists looking at data for single projects.

  5. Engine structures modeling software system: Computer code. User's manual

    NASA Technical Reports Server (NTRS)

    1992-01-01

    ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.

  6. Groundwater modeling and remedial optimization design using graphical user interfaces

    SciTech Connect

    Deschaine, L.M.

    1997-05-01

    The ability to accurately predict the behavior of chemicals in groundwater systems under natural flow circumstances or remedial screening and design conditions is a cornerstone of the environmental industry. The ability to do this efficiently, and to effectively communicate the information to the client and regulators, is what differentiates effective consultants from ineffective ones. Recent advances in groundwater modeling graphical user interfaces (GUIs) are doing for numerical modeling what Windows did for DOS. GUIs facilitate both the modeling process and the information exchange. This Test Drive evaluates the performance of two GUIs--Groundwater Vistas and ModIME--on an actual groundwater model calibration and remedial design optimization project. In the early days of numerical modeling, data input consisted of large arrays of numbers that required intensive labor to input and troubleshoot. Model calibration was also manual, as was interpreting the reams of computer output for each of the tens or hundreds of simulations required to calibrate and perform optimal groundwater remedial design. During this period, the majority of the modeler's effort (and budget) was spent just getting the model running, as opposed to solving the environmental challenge at hand. GUIs take the majority of the grunt work out of the modeling process, thereby allowing the modeler to focus on designing optimal solutions.

  7. Care 3 model overview and user's guide, first revision

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Petersen, P. L.

    1985-01-01

    A manual was written to introduce the CARE III (Computer-Aided Reliability Estimation) capability to reliability and design engineers who are interested in predicting the reliability of highly reliable fault-tolerant systems. It was also structured to serve as a quick-look reference manual for more experienced users. The guide covers CARE III modeling and reliability predictions for execution on the CDC Cyber 170 series computers, the DEC VAX-11/700 series computers, and most machines that compile ANSI Standard FORTRAN 77.

  8. Workstation Modelling and Development: Clinical Definition of a Picture Archiving and Communications System (PACS) User Interface

    NASA Astrophysics Data System (ADS)

    Braudes, Robert E.; Mun, Seong K.; Sibert, John L.; Schnizlein, John; Horii, Steven C.

    1989-05-01

    A PACS must provide a user interface which is acceptable to all potential users of the system. Observations and interviews were conducted with six radiology services at the Georgetown University Medical Center, Department of Radiology, in order to evaluate user interface requirements for a PACS. Based on these observations, a conceptual model of radiology has been developed. These discussions also revealed significant differences in the user interface requirements between the various services. Several underlying factors have been identified which may be used as initial predictors of individual user interface styles. A user model has been developed which incorporates these factors into the specification of a tailored PACS user interface.

  9. User-friendly software for modeling collective spin wave excitations

    NASA Astrophysics Data System (ADS)

    Hahn, Steven; Peterson, Peter; Fishman, Randy; Ehlers, Georg

    There exists a great need for user-friendly, integrated software that assists in the scientific analysis of collective spin wave excitations measured with inelastic neutron scattering. SpinWaveGenie is a C++ software library that simplifies the modeling of collective spin wave excitations, allowing scientists to analyze neutron scattering data with sophisticated models quickly and efficiently. Furthermore, one can calculate the four-dimensional scattering function S(Q,E) to directly compare and fit calculations to experimental measurements. Its generality has been both enhanced and verified through successful modeling of a wide array of magnetic materials. Recently, we have spent considerable effort transforming SpinWaveGenie from an early prototype into a high-quality free and open source software package for the scientific community. S.E.H. acknowledges support by the Laboratory Director's fund, ORNL. Work was sponsored by the Division of Scientific User Facilities, Office of Basic Energy Sciences, US Department of Energy, under Contract No. DE-AC05-00OR22725 with UT-Battelle, LLC.

  10. Theory, modeling, and simulation annual report, 1992

    SciTech Connect

    Not Available

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  11. Users manual for a one-dimensional Lagrangian transport model

    USGS Publications Warehouse

    Schoellhamer, D.H.; Jobson, H.E.

    1986-01-01

    A Users Manual for the Lagrangian Transport Model (LTM) is presented. The LTM uses Lagrangian calculations that are based on a reference frame moving with the river flow. The Lagrangian reference frame eliminates the need to numerically solve the convective term of the convection-diffusion equation and provides significant numerical advantages over the more commonly used Eulerian reference frame. When properly applied, the LTM can simulate riverine transport and decay processes within the accuracy required by most water quality studies. The LTM is applicable to steady or unsteady one-dimensional unidirectional flows in fixed channels with tributary and lateral inflows. Application of the LTM is relatively simple, and optional capabilities improve the model's convenience. Appendices give file formats and three example LTM applications, including the incorporation of the QUAL II water quality model's reaction kinetics into the LTM. (Author's abstract)
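    The core Lagrangian idea, parcels that ride the flow so that only the reaction kinetics (here first-order decay) need be integrated, can be sketched as follows. The function and data layout are hypothetical simplifications, not the LTM's actual formats:

```python
import math

def lagrangian_transport(parcels, velocity, decay_k, dt, steps):
    """Advance (position, concentration) parcels in a flow-following frame.

    Because each parcel moves with the flow, the convective term never has
    to be solved numerically; positions simply advect, and concentrations
    undergo exact first-order decay. Uniform velocity is assumed here.
    """
    out = list(parcels)
    for _ in range(steps):
        out = [(x + velocity * dt, c * math.exp(-decay_k * dt))
               for x, c in out]
    return out
```

For example, with decay rate ln(2) per unit time a parcel's concentration halves each step while its position drifts downstream.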

  12. COSTEAM, an industrial steam generation cost model: updated users' manual

    SciTech Connect

    Murphy, Mary; Reierson, James; Lethi, Minh-Triet

    1980-10-01

    COSTEAM is a tool for designers and managers faced with choosing among alternative systems for generating process steam, whether for new or replacement applications. Such a decision requires a series of choices among overall system concepts, component characteristics, fuel types and financial assumptions, all of which are interdependent and affect the cost of steam. COSTEAM takes the user's input on key characteristics of a proposed process steam generation facility, and computes its capital, operating and maintenance costs. Versatility and simplicity of operation are major goals of the COSTEAM system. As a user, you can work to almost any level of detail necessary and appropriate to a given stage of planning. Since the values you specify are retained and used by the computer throughout each terminal session, you can set up a hypothetical steam generation system fixed in all characteristics but one or two of special interest. It is then quick and easy to obtain a series of results by changing only those one or two values between computer runs. This updated version of the Users' Manual contains instructions for using the expanded and improved COSTEAM model. COSTEAM has three technology submodels which address conventional coal, conventional oil and atmospheric fluidized bed combustion. The structure and calculation methods of COSTEAM are not discussed in this guide, and need not be understood in order to use the model. However, you may consult the companion volume of this report, COSTEAM Expansion and Improvements: Design of a Coal-Fired Atmospheric Fluidized Bed Submodel, an Oil-Fired Submodel, and Input/Output Improvements, MTR80W00048, which presents the design details.

  13. Beliefs and Attitudes Regarding Drug Treatment: Application of the Theory of Planned Behavior in African American Cocaine Users

    PubMed Central

    Booth, Brenda M.; Stewart, Katharine E.; Curran, Geoffrey M.; Cheney, Ann M.; Borders, Tyrone F.

    2014-01-01

    Background: The Theory of Planned Behavior (TPB) can provide insights into perceived need for cocaine treatment among African American cocaine users. Methods: A cross-sectional community sample of 400 (50% rural) not-in-treatment African American cocaine users was identified through respondent-driven sampling in one urban and two rural counties in Arkansas. Measures included self-reports of attitudes and beliefs about cocaine treatment, perceived need and perceived effectiveness of treatment, and positive and negative cocaine expectancies. Normative beliefs were measured by perceived stigma and consequences of stigma regarding drug use and drug treatment. Perceived control was measured by readiness for treatment, prior drug treatment, and perceived ability to cut down on cocaine use without treatment. Findings: Multiple regression analysis found that older age (standardized regression coefficient β = 0.15, P < 0.001), rural residence (β = −0.09, P = 0.025), effectiveness of treatment (β = 0.39, P < 0.001), negative cocaine expectancies (β = 0.138, P = 0.003), experiences of rejection (β = 0.18, P < 0.001), need for secrecy (β = 0.12, P = 0.002), and readiness for treatment (β = 0.15, P < 0.001) were independently associated with perceived need for cocaine treatment. Conclusions: TPB is a relevant model for understanding perceived need for treatment among African American cocaine users. Research has shown perceived need to be a major correlate of treatment participation. Study results should be applicable for designing interventions to encourage treatment participation. PMID:24930051

  14. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Benjamin A. Baker; Joseph Grimm

    2009-08-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating “what if” scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time-varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., “reactor types” not individual reactors and “separation types” not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU is designated as waste. VISION is comprised of several

  15. Intelligent User Interfaces for Information Analysis: A Cognitive Model

    SciTech Connect

    Schwarting, Irene S.; Nelson, Rob A.; Cowell, Andrew J.

    2006-01-29

    Intelligent user interfaces (IUIs) for information analysis (IA) need to be designed with an intrinsic understanding of the analytical objectives and the dimensions of the information space. These analytical objectives are oriented around the requirement to provide decision makers with courses of action. Most tools available to support analysis barely skim the surface of the dimensions and categories of information used in analysis, and almost none are designed to address the ultimate requirement of decision support. This paper presents a high-level model of the cognitive framework of information analysts in the context of doing their jobs. It is intended that this model will enable the derivation of design requirements for advanced IUIs for IA.

  16. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model are described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited for cases where surface wave heights are significant compared to the mean water depth, e.g., estuaries and coastal regions. The latter is suited for surface wave heights that are small compared to depth, because surface elevation was removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  17. SIMULATION MODEL FOR WATERSHED MANAGEMENT PLANNING. VOLUME 2. MODEL USER MANUAL

    EPA Science Inventory

    This report provides a user manual for the hydrologic, nonpoint source pollution simulation of the generalized planning model for evaluating forest and farming management alternatives. The manual contains an explanation of application of specific code and indicates changes that s...

  18. A Markov Chain Model for Changes in Users' Assessment of Search Results.

    PubMed

    Zhitomirsky-Geffet, Maayan; Bar-Ilan, Judit; Levene, Mark

    2016-01-01

    Previous research shows that users tend to change their assessment of search results over time. This is the first study to investigate the factors and reasons for these changes and to describe a stochastic model of user behaviour that may explain them. In particular, we hypothesise that most of the changes are local, i.e. between results with similar or close relevance to the query, which thus belong to the same "coarse" relevance category. According to the theory of coarse beliefs and categorical thinking, humans tend to divide the range of values under consideration into coarse categories, and are thus able to distinguish between values across categories but not within them. To test this hypothesis we conducted five experiments with about 120 subjects divided into 3 groups. Each student in every group was asked to rank and assign relevance scores to the same set of search results over two or three rounds, with a period of three to nine weeks between each round. The subjects of the last three-round experiment were then exposed to the differences in their judgements and were asked to explain them. We make use of a Markov chain model to measure change in users' judgments between the different rounds. The Markov chain demonstrates that the changes converge, and that a majority of the changes are local to a neighbouring relevance category. We found that most of the subjects were satisfied with their changes, and did not perceive them as mistakes but rather as a legitimate phenomenon, since they believe that time has influenced their relevance assessment. Both our quantitative analysis and user comments support the hypothesis of the existence of coarse relevance categories resulting from categorical thinking in the context of user evaluation of search results. PMID:27171426
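    The Markov chain analysis can be illustrated with a minimal sketch: estimate a row-stochastic transition matrix from two rounds of category assignments, and measure how many changes stay within the same or a neighbouring category. The function names and data layout below are hypothetical, not the study's actual code:

```python
from collections import Counter

def transition_matrix(round1, round2, n_categories):
    """Estimate a row-stochastic transition matrix between relevance
    categories assigned to the same items in two assessment rounds."""
    counts = Counter(zip(round1, round2))
    matrix = []
    for i in range(n_categories):
        row_total = sum(counts[(i, j)] for j in range(n_categories))
        matrix.append([counts[(i, j)] / row_total if row_total else 0.0
                       for j in range(n_categories)])
    return matrix

def local_change_fraction(round1, round2):
    """Fraction of judgments that stayed in the same or an adjacent category."""
    pairs = list(zip(round1, round2))
    return sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
```

A high local-change fraction is the kind of evidence the paper reads as support for coarse relevance categories.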

  19. Hanford Soil Inventory Model (SIM) Rev. 1 Users Guide

    SciTech Connect

    Simpson, Brett C.; Corbin, Rob A.; Anderson, Michael J.; Kincaid, Charles T.

    2006-09-25

    The focus of the development and application of a soil inventory model as part of the Remediation and Closure Science (RCS) Project managed by PNNL was to develop a probabilistic approach to estimate comprehensive, mass balance-based contaminant inventories for the Hanford Site post-closure setting. The outcome of this effort was the Hanford Soil Inventory Model (SIM). This document is a user's guide for the Hanford SIM. The principal project requirement for the SIM was to provide comprehensive quantitative estimates of contaminant inventory and its uncertainty for the various liquid waste sites, unplanned releases, and past tank farm leaks as a function of time and location at Hanford. The majority, but not all, of these waste sites are in the 200 Areas of Hanford, where chemical processing of spent fuel occurred. A computer model capable of performing these calculations and providing satisfactory quantitative output representing a robust description of contaminant inventory and uncertainty for use in other subsequent models was determined to be satisfactory to address the needs of the RCS Project. The ability to use familiar, commercially available software on high-performance personal computers for data input, modeling, and analysis, rather than custom software on a workstation or mainframe computer, was desired.

  20. User-Defined Material Model for Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)

    2006-01-01

    An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis, including failure initiation and material degradation, are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criterion, the maximum strain criterion, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach, where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.
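    A minimal sketch of two of the ingredients named above, a maximum-stress initiation check and ply-discounting degradation, is given below in Python for illustration only. An actual UMAT is Fortran with the ABAQUS-defined subroutine interface; the names, in-plane-only stress state, and knockdown factor here are assumptions:

```python
def max_stress_failed(stress, allowables):
    """Maximum-stress failure initiation check for one ply (sketch).

    stress and allowables are (sigma1, sigma2, tau12) tuples; allowables
    hold positive strength values. Failure initiates when any stress
    component exceeds its allowable in magnitude.
    """
    return any(abs(s) > a for s, a in zip(stress, allowables))

def discount_ply(moduli, failed, knockdown=1e-3):
    """Ply-discounting degradation: a failed ply keeps only a small
    residual stiffness (knockdown factor) to avoid a singular stiffness."""
    return {name: (value * knockdown if failed else value)
            for name, value in moduli.items()}
```

In a progressive failure loop, the check runs each load increment and any newly failed ply's constitutive coefficients are degraded before the next equilibrium iteration.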

  1. User's guide to the MESOI diffusion model: Version 1. 1 (for Data General Eclipse S/230 with AFOS)

    SciTech Connect

    Athey, G.F.; Ramsdell, J.V.

    1982-09-01

    MESOI is an interactive, Lagrangian puff trajectory model. The model theory is documented separately (Ramsdell and Athey, 1981). Version 1.1 is a modified form of the original Version 1.0. It is designed to run on a Data General Eclipse computer. The model has improved support features which make it useful as an emergency response tool. This report is intended to provide the user with the information necessary to successfully conduct model simulations using MESOI Version 1.1 and to use the support programs STAPREP and EXPLT. The user is also provided information on the use of the data file maintenance and review program UPDATE. Examples are given for the operation of the programs. Test data sets are described which allow the user to practice with the programs and to confirm proper implementation and execution.

  2. WASP7 BENTHIC ALGAE - MODEL THEORY AND USER'S GUIDE

    EPA Science Inventory

    The standard WASP7 eutrophication module includes nitrogen and phosphorus cycling, dissolved oxygen-organic matter interactions, and phytoplankton kinetics. In many shallow streams and rivers, however, the attached algae (benthic algae, or periphyton, attached to submerged substr...

  3. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report: describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  4. Dimer models and quiver gauge theories

    NASA Astrophysics Data System (ADS)

    Pichai, Ramadevi

    2013-12-01

    N = 1 quiver gauge theories on coincident D3 branes placed at a tip of a Calabi-Yau singularity C are dual to string theories on AdS5×X5, where X5 are Sasaki-Einstein spaces. We present a neat combinatorial approach, called the dimer model, to understand interrelations between toric quiver gauge theories and toric data representing the Calabi-Yau singularities.

  5. Modeling of rolling element bearing mechanics. Computer program user's manual

    NASA Technical Reports Server (NTRS)

    Greenhill, Lyn M.; Merchant, David H.

    1994-01-01

    This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code, which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It is comprised of two main programs: the Preprocessor for Bearing Analysis (PREBAN), which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  6. GCFM Users Guide Revision for Model Version 5.0

    SciTech Connect

    Keimig, Mark A.; Blake, Coleman

    1981-08-10

    This paper documents alterations made to the MITRE/DOE Geothermal Cash Flow Model (GCFM) in the period of September 1980 through September 1981. Version 4.0 of GCFM was installed on the computer at the DOE San Francisco Operations Office in August 1980. This Version has also been distributed to about a dozen geothermal industry firms, for examination and potential use. During late 1980 and 1981, a few errors detected in the Version 4.0 code were corrected, resulting in Version 4.1. If you are currently using GCFM Version 4.0, it is suggested that you make the changes to your code that are described in Section 2.0. User's manual changes listed in Section 3.0 and Section 4.0 should then also be made.

  7. CORCON-MOD3: An integrated computer model for analysis of molten core-concrete interactions. User's manual

    SciTech Connect

    Bradley, D.R.; Gardner, D.R.; Brockmann, J.E.; Griffith, R.O.

    1993-10-01

    The CORCON-Mod3 computer code was developed to mechanistically model the important core-concrete interaction phenomena, including those phenomena relevant to the assessment of containment failure and radionuclide release. The code can be applied to a wide range of severe accident scenarios and reactor plants. The code represents the current state of the art for simulating core debris interactions with concrete. This document comprises the user's manual and gives a brief description of the models and the assumptions and limitations in the code. Also discussed are the input parameters and the code output. Two sample problems are also given.

  8. Proliferation Risk Characterization Model Prototype Model - User and Programmer Guidelines

    SciTech Connect

    Dukelow, J.S.; Whitford, D.

    1998-12-01

    A model for the estimation of the risk of diversion of weapons-capable materials was developed. It represents both the threat of diversion and site vulnerability as a product of a small number of variables (two to eight), each of which can take on a small number (two to four) of qualitatively defined (but quantitatively implemented) values. The values of the overall threat and vulnerability variables are then converted to threat and vulnerability categories. The threat and vulnerability categories are used to define the likelihood of diversion, also defined categorically. The evaluator supplies an estimate of the consequences of a diversion, defined categorically, but with the categories based on the IAEA Attractiveness levels. Likelihood and Consequences categories are used to define the Risk, also defined categorically. The threat, vulnerability, and consequences input provided by the evaluator contains a representation of his/her uncertainty in each variable assignment, which is propagated all the way through to the calculation of the Risk categories. [Appendix G available on diskette only.]
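    The categorical pipeline described above (threat and vulnerability combine into likelihood; likelihood and consequences combine into risk) can be sketched with lookup tables. The two-level category scales and table entries below are hypothetical illustrations, not the model's actual assignments:

```python
# Hypothetical two-level category scales; the real model uses finer ones.
LIKELIHOOD = {  # (threat, vulnerability) -> likelihood category
    ("high", "high"): "high",
    ("high", "low"): "medium",
    ("low", "high"): "medium",
    ("low", "low"): "low",
}
RISK = {  # (likelihood, consequences) -> risk category
    ("high", "high"): "high",
    ("high", "low"): "medium",
    ("medium", "high"): "medium",
    ("medium", "low"): "low",
    ("low", "high"): "medium",
    ("low", "low"): "low",
}

def risk_category(threat, vulnerability, consequences):
    """Chain the two categorical lookups: threat x vulnerability gives
    likelihood, and likelihood x consequences gives the risk category."""
    likelihood = LIKELIHOOD[(threat, vulnerability)]
    return RISK[(likelihood, consequences)]
```

Uncertainty in the evaluator's inputs would, in the actual model, be carried as a distribution over these categories rather than a single value.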

  9. Crisis in Context Theory: An Ecological Model

    ERIC Educational Resources Information Center

    Myer, Rick A.; Moore, Holly B.

    2006-01-01

    This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…

  10. Graphical User Interface for Simulink Integrated Performance Analysis Model

    NASA Technical Reports Server (NTRS)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, saving time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow for test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI constructs a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared to other values.

  11. Long Fibre Composite Modelling Using Cohesive User's Element

    NASA Astrophysics Data System (ADS)

    Kozák, Vladislav; Chlup, Zdeněk

    2010-09-01

    The development of glass matrix composites reinforced by unidirectional long ceramic fibres has resulted in a family of very promising structural materials. The only disadvantage of such materials is their relatively high brittleness at room temperature. The main micromechanisms acting as toughening mechanisms are pull-out, crack bridging, and matrix cracking. Other mechanisms, such as crack deflection, also operate, but the primary one is the aforementioned pull-out, which is governed by the interface between fibre and matrix. The contribution shows a way to predict and/or optimise the mechanical behaviour of the composite by applying the cohesive zone method and writing a user's cohesive element for the FEM numerical package Abaqus. The presented results from numerical calculations are compared with experimental data. Crack extension is simulated by means of element-extinction algorithms. The principal effort is concentrated on cohesive zone modelling with a special traction-separation (bridging) law. Determination of the micro-mechanical parameters is based on a combination of static tests, microscopic observations, and numerical calibration procedures.

  12. Propeller aircraft interior noise model: User's manual for computer program

    NASA Astrophysics Data System (ADS)

    Wilby, E. G.; Pope, L. D.

    1985-01-01

    A computer program entitled PAIN (Propeller Aircraft Interior Noise) has been developed to permit calculation of the sound levels in the cabin of a propeller-driven airplane. The fuselage is modeled as a cylinder with a structurally integral floor, the cabin sidewall and floor being stiffened by ring frames, stringers and floor beams of arbitrary configurations. The cabin interior is covered with acoustic treatment and trim. The propeller noise consists of a series of tones at harmonics of the blade passage frequency. Input data required by the program include the mechanical and acoustical properties of the fuselage structure and sidewall trim. Also, the precise propeller noise signature must be defined on a grid that lies in the fuselage skin. The propeller data are generated with a propeller noise prediction program such as the NASA Langley ANOPP program. The program PAIN permits the calculation of the space-average interior sound levels for the first ten harmonics of a propeller rotating alongside the fuselage. User instructions for PAIN are given in the report. Development of the analytical model is presented in NASA CR 3813.

  13. Theories of addiction: methamphetamine users' explanations for continuing drug use and relapse.

    PubMed

    Newton, Thomas F; De La Garza, Richard; Kalechstein, Ari D; Tziortzis, Desey; Jacobsen, Caitlin A

    2009-01-01

    A variety of preclinical models have been constructed to emphasize unique aspects of addiction-like behavior. These include Negative Reinforcement ("Pain Avoidance"), Positive Reinforcement ("Pleasure Seeking"), Incentive Salience ("Craving"), Stimulus Response Learning ("Habits"), and Inhibitory Control Dysfunction ("Impulsivity"). We used a survey to better understand why methamphetamine-dependent research volunteers (N = 73) continue to use methamphetamine, or relapse to methamphetamine use after a period of cessation of use. All participants met DSM-IV criteria for methamphetamine abuse or dependence, and did not meet criteria for other current Axis I psychiatric disorders or dependence on other drugs of abuse, other than nicotine. The questionnaire consisted of a series of face-valid questions regarding drug use, which in this case referred to methamphetamine use. Examples of questions include: "Do you use drugs mostly to make bad feelings like boredom, loneliness, or apathy go away?", "Do you use drugs mostly because you want to get high?", "Do you use drugs mostly because of cravings?", "Do you find yourself getting ready to take drugs without thinking about it?", and "Do you impulsively take drugs?". The scale was anchored at 1 (not at all) and 7 (very much). For each question, the numbers of participants rating each question negatively (1 or 2), neither negatively nor affirmatively (3-5), and affirmatively (6 or 7) were tabulated. The greatest number of respondents (56%) affirmed that they used drugs due to "pleasure seeking." The next highest categories selected were "impulsivity" (27%) and "habits" (25%). Surprisingly, many participants reported that "pain avoidance" (30%) and "craving" (30%) were not important for their drug use.
Results from this study support the contention that methamphetamine users (and probably other drug users as well) are more heterogeneous than is often appreciated, and imply that treatment development might be more successful if
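The tabulation rule described above (ratings of 1-2 counted as negative, 3-5 as neutral, 6-7 as affirmative on the 7-point scale) can be sketched as a small scoring routine; the example ratings are invented, not the study's data:

```python
def tabulate(ratings):
    """Count negative (1-2), neutral (3-5), and affirmative (6-7)
    responses on a 7-point anchored scale, as in the survey analysis
    described above."""
    counts = {"negative": 0, "neutral": 0, "affirmative": 0}
    for r in ratings:
        if r <= 2:
            counts["negative"] += 1
        elif r <= 5:
            counts["neutral"] += 1
        else:
            counts["affirmative"] += 1
    return counts

def percent_affirmative(ratings):
    """Share of respondents (in percent) who affirmed the question."""
    return 100.0 * tabulate(ratings)["affirmative"] / len(ratings)
```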

  14. BPACK -- A computer model package for boiler reburning/co-firing performance evaluations. User's manual, Volume 1

    SciTech Connect

    Wu, K.T.; Li, B.; Payne, R.

    1992-06-01

    This manual presents and describes a package of computer models uniquely developed for boiler thermal performance and emissions evaluations by the Energy and Environmental Research Corporation. The model package permits boiler heat transfer, fuels combustion, and pollutant emissions predictions related to a number of practical boiler operations such as fuel switching, fuels co-firing, and reburning NO{sub x} reductions. The models are adaptable to most boiler/combustor designs and can handle burner fuels in solid, liquid, gaseous, and slurried forms. The models are also capable of performing predictions for combustion applications involving gaseous-fuel reburning, and co-firing of solid/gas, liquid/gas, gas/gas, and slurry/gas fuels. The model package is named BPACK (Boiler Package) and consists of six computer codes, of which three are main computational codes and the other three are input codes. The three main codes are: (a) a two-dimensional furnace heat-transfer and combustion code; (b) a detailed chemical-kinetics code; and (c) a boiler convective passage code. This user's manual presents the computer model package in two volumes. Volume 1 describes in detail a number of topics of general user interest, including the physical and chemical basis of the models and a complete description of the model applicability, options, input/output, and the default inputs. Volume 2 contains a detailed record of worked examples to assist users in applying the models and to illustrate the versatility of the codes.

  15. User Acceptance of Long-Term Evolution (LTE) Services: An Application of Extended Technology Acceptance Model

    ERIC Educational Resources Information Center

    Park, Eunil; Kim, Ki Joon

    2013-01-01

    Purpose: The aim of this paper is to propose an integrated path model in order to explore user acceptance of long-term evolution (LTE) services by examining potential causal relationships between key psychological factors and user intention to use the services. Design/methodology/approach: Online survey data collected from 1,344 users are analysed…

  16. Towards Model-Driven End-User Development in CALL

    ERIC Educational Resources Information Center

    Farmer, Rod; Gruba, Paul

    2006-01-01

    The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…

  17. Supersymmetric F-theory GUT models

    NASA Astrophysics Data System (ADS)

    Chung, Yu-Chieh

    F-theory is a twelve-dimensional geometric version of string theory and is believed to be a natural framework for GUT model building. The aim of this dissertation is to study how gauge theories realized by F-theory can accommodate GUT models. In this dissertation, we focus on local and semi-local GUT model building in F-theory. For local GUT models, we build SU(5) GUTs by using abelian U(1) fluxes via the SU(6) gauge group. Doing so, we obtain non-minimal spectra of the MSSM with doublet-triplet splitting by switching on abelian U(1)^2 fluxes. We also classify all supersymmetric U(1)^2 fluxes by requiring an exotic-free bulk spectrum. For semi-local GUT models, we start with an E8 singularity and obtain lower-rank gauge groups by unfolding the singularity governed by spectral covers. In this framework, the spectra can be calculated by the intersection numbers of spectral covers and matter curves. In particular, we use SU(4) spectral covers and abelian U(1)_X fluxes to build flipped SU(5) models. We show that three-generation spectra of flipped SU(5) models can be achieved by turning on suitable fluxes. To construct E6 GUTs, we consider SU(3) spectral covers breaking E8 down to E6. Also, a three-generation extended MSSM can be obtained by using non-abelian SU(2) x U(1)^2 fluxes.

  18. Graphical Model Theory for Wireless Sensor Networks

    SciTech Connect

    Davis, William B.

    2002-12-08

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including sensor validation and fusion; data compression and channel coding; expert systems with decentralized data structures and efficient local queries; and pattern classification and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.
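As a toy illustration of the decentralized computation that the junction tree algorithm generalizes, the following sketch computes a posterior marginal on a two-sensor chain by passing a single small local message rather than centralizing all data; the potentials are illustrative assumptions, not values from the paper:

```python
# Toy sum-product message passing on a two-variable chain A - B. Sensor A
# sums out its own variable locally and transmits only a two-entry message,
# illustrating the decentralization the junction tree algorithm provides.
phi_A = {0: 0.6, 1: 0.4}            # local evidence potential at sensor A
phi_B = {0: 0.5, 1: 0.5}            # local evidence potential at sensor B
psi = {(0, 0): 0.9, (0, 1): 0.1,    # pairwise compatibility between A and B
       (1, 0): 0.2, (1, 1): 0.8}

# Message from A to B: marginalize A locally, send only the small table.
msg_A_to_B = {b: sum(phi_A[a] * psi[(a, b)] for a in phi_A) for b in (0, 1)}

# B combines the incoming message with its own potential and normalizes.
unnorm = {b: phi_B[b] * msg_A_to_B[b] for b in (0, 1)}
Z = sum(unnorm.values())
marginal_B = {b: unnorm[b] / Z for b in (0, 1)}
```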

  19. Self Modeling: Expanding the Theories of Learning

    ERIC Educational Resources Information Center

    Dowrick, Peter W.

    2012-01-01

    Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…

  20. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
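The modular organization by hydrological stage can be pictured as a pipeline of interchangeable process components. The interface and the toy process equations below are hypothetical, not the package's actual API:

```python
# Hypothetical sketch of a modular hydrologic pipeline: each stage
# (potential evapotranspiration, soil moisture accounting, ...) is an
# interchangeable callable, mirroring the modular system described above.
# Function names and the toy equations are illustrative assumptions.
def toy_pet(temp_c):
    """Placeholder potential-evapotranspiration stage (mm/day)."""
    return max(0.0, 0.1 * temp_c)

def simple_bucket(precip, pet, state):
    """Placeholder soil-moisture stage: a bucket with overflow runoff."""
    capacity = 100.0
    storage = state["storage"] + precip - min(pet, state["storage"])
    runoff = max(0.0, storage - capacity)
    state["storage"] = min(storage, capacity)
    return runoff

def run_model(forcing, pet_module, soil_module):
    """Assemble a model from pluggable stage modules and run it daily."""
    state = {"storage": 50.0}
    flows = []
    for day in forcing:
        pet = pet_module(day["temp_c"])
        flows.append(soil_module(day["precip"], pet, state))
    return flows
```

Swapping `toy_pet` or `simple_bucket` for another callable with the same signature changes the model structure without touching the driver, which is the essence of the modular design described above.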

  1. Physician's information customizer (PIC): using a shareable user model to filter the medical literature.

    PubMed

    Pratt, W; Sim, I

    1995-01-01

    The practice of medicine is information-intensive. From reviewing the literature to formulating therapeutic plans, each physician handles information differently. Yet rarely does a representation of the user's information needs and preferences--a user model--get incorporated into information management tools, even though we might reasonably expect better acceptance and effectiveness if the tools' presentation and processing were customized to the user. We developed the Physician's Information Customizer (PIC), which generates a shareable user model that can be used in any medical information-management application. PIC elicits the stable, long-term attributes of a physician through simple questions about her specialty, research focus, areas of interest, patient characteristics (e.g., ages), and practice locale. To show the utility of this user model in customizing a medical informatics application, PIC custom-filters and ranks articles from Medline, using the user model to determine what would be most interesting to the user. Preliminary evaluation on all 99 unselected articles from a recent issue of six prominent medical journals shows that PIC ranks 66% of the articles as the user would. This demonstrates the feasibility of using easily acquired physician attributes to develop a user model that can successfully filter articles of interest from a large undifferentiated collection. Further testing and development is required to optimize the custom filter and to determine which characteristics should be included in the shareable user model and which should be obtained by individual applications. PMID:8591472
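The filtering step can be pictured as scoring articles by weighted overlap with the physician's profile attributes. This is a minimal sketch under assumed attribute names and weights; the abstract does not describe PIC's actual scoring function:

```python
# Minimal sketch of profile-based article filtering in the spirit of PIC.
# The profile attributes, weights, and article terms are illustrative
# assumptions, not PIC's actual user-model representation.
def score(article_terms, user_model):
    """Score an article by weighted overlap with user-model attributes."""
    return sum(w for term, w in user_model.items() if term in article_terms)

def rank(articles, user_model):
    """Return articles sorted by descending relevance to the user model."""
    return sorted(articles, key=lambda a: score(a["terms"], user_model),
                  reverse=True)

profile = {"cardiology": 2.0, "geriatric": 1.0, "clinical-trial": 0.5}
articles = [
    {"title": "A", "terms": {"cardiology", "clinical-trial"}},
    {"title": "B", "terms": {"dermatology"}},
]
ordered = rank(articles, profile)
```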

  2. A User Modeling System for Personalized Interaction and Tailored Retrieval in Interactive IR.

    ERIC Educational Resources Information Center

    Kelly, Diane; Belkin, Nicholas J.

    2002-01-01

    Presents a user modeling system for personalized interaction and tailored retrieval that tracks interactions over time, represents multiple information needs, allows for changes in information needs, acquires and updates the user model automatically, and accounts for contextual factors. Describes three models: general behavioral, personal…

  3. Understanding the Impact of User Frustration Intensities on Task Performance Using the OCC Theory of Emotions

    NASA Technical Reports Server (NTRS)

    Washington, Gloria

    2012-01-01

    Have you heard the saying "frustration is written all over your face"? Well, this saying is true, but the face is not the only place. Frustration is written all over your face and your body. The human body has various means to communicate an emotion without the utterance of a single word. The Media Equation says that people interact with computers as if they are human; this includes experiencing frustration. This research measures frustration by monitoring human body-based measures such as heart rate, posture, skin temperature, and respiration. The OCC Theory of Emotions is used to separate frustration into different levels or intensities. The results of this study showed that there are individual intensities of frustration at which task performance is not degraded. Results from this study can be used by usability testers to model how much frustration is needed before task performance measures start to decrease.

  4. A User-Centered Approach to Adaptive Hypertext Based on an Information Relevance Model

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Chen, James

    1994-01-01

    Rapid and effective access to information in large electronic documentation systems can be facilitated if information relevant in an individual user's context can be automatically supplied to that user. However, most of this knowledge on contextual relevance is not found within the contents of documents; rather, it is established incrementally by users during information access. We propose a new model for interactively learning contextual relevance during information retrieval, and incrementally adapting retrieved information to individual user profiles. The model, called a relevance network, records the relevance of references based on user feedback for specific queries and user profiles. It also generalizes such knowledge to later derive relevant references for similar queries and profiles. The relevance network lets users filter information by context of relevance. Compared to other approaches, it does not require any prior knowledge or training. More importantly, our approach to adaptivity is user-centered. It facilitates acceptance and understanding by users by giving them shared control over the adaptation without disturbing their primary task. Users easily control when to adapt and when to use the adapted system. Lastly, the model is independent of the particular application used to access information, and supports sharing of adaptations among users.
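The core record-and-rank behavior of a relevance network can be sketched as below. The class and method names are hypothetical, and the generalization to similar queries and profiles described in the abstract is omitted for brevity:

```python
from collections import defaultdict

class RelevanceNetwork:
    """Toy sketch of a relevance network: record user feedback on
    references per (query, profile) context, then rank candidates by the
    accumulated relevance. The generalization step to *similar* contexts
    described in the article is deliberately left out of this sketch."""

    def __init__(self):
        self.relevance = defaultdict(float)  # (context, ref) -> score

    def feedback(self, context, ref, delta):
        """Record positive or negative user feedback incrementally."""
        self.relevance[(context, ref)] += delta

    def retrieve(self, context, refs):
        """Rank candidate references by learned relevance in context."""
        return sorted(refs, key=lambda r: self.relevance[(context, r)],
                      reverse=True)
```

Because the network only accumulates feedback, no prior training corpus is needed, matching the "no prior knowledge nor training" claim above.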

  5. User's guide to the Yucca Mountain Integrating Model (YMIM) Version 2.1

    SciTech Connect

    Gansemer, J.; Lamont, A.

    1995-04-01

    The Yucca Mountain Integrating Model (YMIM) is an integrated model of the engineered barrier system. It contains models of the processes of waste container failure and nuclide release from the fuel rods. YMIM is driven by scenarios of container and rod temperature, near-field chemistry, and near-field hydrology provided by other modules. It is designed to be highly modular so that a model of an individual process can be easily modified or replaced without interfering with the models of other processes. This manual describes the process models and provides instructions for setting up and running YMIM Version 2.1.

  6. APEX user's guide - (Argonne production, expansion, and exchange model for electrical systems), version 3.0

    SciTech Connect

    VanKuiken, J.C.; Veselka, T.D.; Guziel, K.A.; Blodgett, D.W.; Hamilton, S.; Kavicky, J.A.; Koritarov, V.S.; North, M.J.; Novickas, A.A.; Paprockas, K.R.

    1994-11-01

    This report describes operating procedures and background documentation for the Argonne Production, Expansion, and Exchange Model for Electrical Systems (APEX). This modeling system was developed to provide the U.S. Department of Energy, Division of Fossil Energy, Office of Coal and Electricity with in-house capabilities for addressing policy options that affect electrical utilities. To meet this objective, Argonne National Laboratory developed a menu-driven programming package that enables the user to develop and conduct simulations of production costs, system reliability, spot market network flows, and optimal system capacity expansion. The APEX system consists of three basic simulation components, supported by various databases and data management software. The components include (1) the Investigation of Costs and Reliability in Utility Systems (ICARUS) model, (2) the Spot Market Network (SMN) model, and (3) the Production and Capacity Expansion (PACE) model. The ICARUS model provides generating-unit-level production-cost and reliability simulations with explicit recognition of planned and unplanned outages. The SMN model addresses optimal network flows with recognition of marginal costs, wheeling charges, and transmission constraints. The PACE model determines long-term (e.g., longer than 10 years) capacity expansion schedules on the basis of candidate expansion technologies and load growth estimates. In addition, the Automated Data Assembly Package (ADAP) and case management features simplify user-input requirements. The ADAP, ICARUS, and SMN modules are described in detail. The PACE module is expected to be addressed in a future publication.

  7. User's guide for the CALPUFF dispersion model. Final report

    SciTech Connect

    1995-07-01

    This report describes the CALPUFF dispersion model and associated processing programs. The CALPUFF model described in this report reflects improvements to the model including (1) new modules to treat buoyant rise and dispersion from area sources (such as forest fires), buoyant line sources, and volume sources, (2) an improved treatment of complex terrain, (3) additional model switches to facilitate its use in regulatory applications, (4) an enhanced treatment of wind shear through puff splitting, and (5) an optional PC-based GUI. CALPUFF has been coupled to the Emissions Production Model (EPM) developed by the Forest Service through an interface processor. EPM provides time-dependent emissions and heat release data for use in modeling controlled burns and wildfires.

  8. INDUSTRIAL COMBUSTION EMISSIONS (ICE) MODEL, VERSION 6.0. USER'S MANUAL

    EPA Science Inventory

    The report is a user's manual for the Industrial Combustion Emissions (ICE) model. It summarizes user options and software characteristics, and describes both the input data files and procedures for operating the model. It discusses proper formatting of files and creation of job ...

  9. Satellite services system analysis study. Volume 2: Satellite and services user model

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Satellite services needs are analyzed. Topics include methodology: a satellite user model; representative servicing scenarios; potential service needs; manned, remote, and automated involvement; and inactive satellites/debris. Satellite and services user model development is considered. Groundrules and assumptions, servicing, events, and sensitivity analysis are included. Selection of reference satellites is also discussed.

  10. EPA third-generation air quality modeling system: Models-3 user manual. Standard tutorial

    SciTech Connect

    1998-09-01

    Models-3 is a flexible software system designed to simplify the development and use of environmental assessment and other decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheric chemistry and physics. The initial version of Models-3 contains a Community Multi-scale Air Quality (CMAQ) modeling system for urban to regional scale air quality simulation of tropospheric ozone, acid deposition, visibility, and fine particles. Models-3 and CMAQ in combination form a powerful third-generation air quality modeling and assessment system that enables a user to execute air quality simulation models and visualize their results. Models-3/CMAQ also assists the model developer to assemble, test, and evaluate science process components and their impact on chemistry-transport model predictions by facilitating the interchange of science codes, transparent use of multiple computing platforms, and access to data across the network. Models-3/CMAQ provides flexibility to change key model specifications such as grid resolution and chemistry mechanism without rewriting the code. Models-3/CMAQ is intended to serve as a community framework for continual advancement and use of environmental assessment tools. This User Manual Tutorial serves as a guide to show the steps necessary to implement an application in Models-3/CMAQ.

  11. Model for a fundamental theory with supersymmetry

    NASA Astrophysics Data System (ADS)

    Yokoo, Seiichiro

    Physics in the year 2006 is tightly constrained by experiment, observation, and mathematical consistency. The Standard Model provides a remarkably precise description of particle physics, and general relativity is quite successful in describing gravitational phenomena. At the same time, it is clear that a more fundamental theory is needed for several distinct reasons. Here we consider a new approach, which begins with the unusually ambitious point of view that a truly fundamental theory should aspire to explaining the origins of Lorentz invariance, gravity, gauge fields and their symmetry, supersymmetry, fermionic fields, bosonic fields, quantum mechanics and spacetime. The present dissertation is organized so that it starts with the most conventional ideas for extending the Standard Model and ends with a microscopic statistical picture, which is actually the logical starting point of the theory, but which is also the most remote excursion from conventional physics. One motivation for the present work is the fact that a Euclidean path integral in quantum physics is equivalent to a partition function in statistical physics. This suggests that the most fundamental description of nature may be statistical. This dissertation may be regarded as an attempt to see how far one can go with this premise in explaining the observed phenomena, starting with the simplest statistical picture imaginable. It may be that nature is richer than the model assumed here, but the present results are quite suggestive, because, with a set of assumptions that are not unreasonable, one recovers the phenomena listed above. At the end, the present theory leads back to conventional physics, except that Lorentz invariance and supersymmetry are violated at extremely high energy. 
To be more specific, one obtains local Lorentz invariance (at low energy compared to the Planck scale), an SO(N) unified gauge theory (with N = 10 as the simplest possibility), supersymmetry of Standard Model fermions and

  12. Recursive renormalization group theory based subgrid modeling

    NASA Technical Reports Server (NTRS)

    Zhou, YE

    1991-01-01

    Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  13. Engaging Theories and Models to Inform Practice

    ERIC Educational Resources Information Center

    Kraus, Amanda

    2012-01-01

    Helping students prepare for the complex transition to life after graduation is an important responsibility shared by those in student affairs and others in higher education. This chapter explores theories and models that can inform student affairs practitioners and faculty in preparing students for life after college. The focus is on roles,…

  14. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  15. A Catastrophe Theory Model of Attitude Change.

    ERIC Educational Resources Information Center

    Flay, Brian R.

    Within the large body of literature on attitude change, many diverse and sometimes apparently conflicting findings have been reported. A catastrophe theory model of attitude change that attempts to synthesize many of these diverse findings is proposed. Attitude change is usually monotonic with message content or the strength of the persuasion…

  16. Incorporation of Decision and Game Theories in Early-Stage Complex Product Design to Model End-Use

    NASA Astrophysics Data System (ADS)

    Mesmer, Bryan L.

    The need for design models that accurately capture the complexities of products increases as products grow ever more complicated. The accuracies of these models depend upon the inputs and the methods used on those inputs to determine an output. Product designers must determine the dominant inputs and make assumptions concerning inputs that have less of an effect. Properly capturing the important inputs in the early design stages, where designs are being simulated, allows for modifications of the design at a relatively low cost. In this dissertation, an input that has a high impact on product performance but is usually neglected until later design stages is examined. The end-users of a product interact with the product and with each other in ways that affect the performance of that product. End-users are typically brought in at the later design stages, or as representations on the design team. They are rarely used as input variables in the product models. By incorporating the end-users in the early models and simulations, the end-users' impact on performance is captured when modifications to the designs are cheaper. The methodology of capturing end-user decision making in product models, developed in this dissertation, is created using the methods of decision and game theory. These theories give a mathematical basis for decision making based on the end-users' beliefs and preferences. Due to the variations that are present in end-users' preferences, their interactions with the product cause variations in the performance. This dissertation shows that capturing the end-user interactions in simulations enables the designer to create products that are more robust to the variations of the end-users. The manipulation of a game that an individual plays to drive an outcome desired by a designer is referred to as mechanism design. This dissertation also shows how a designer can influence the end-users' decisions to optimize the designer's goals. How product controlled
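The decision-theoretic core of such an end-user model, choosing the action that maximizes expected utility under the user's beliefs, can be sketched as follows; the actions, beliefs, and utility values are illustrative assumptions, not the dissertation's model:

```python
# Minimal sketch of an end-user modeled as an expected-utility maximizer,
# in the decision-theoretic spirit described above. The states, actions,
# and numbers are invented for illustration.
def expected_utility(action, beliefs, utility):
    """E[u | action] under the user's probabilistic beliefs over states."""
    return sum(p * utility[(action, state)] for state, p in beliefs.items())

def best_action(actions, beliefs, utility):
    """The action a rational end-user model is predicted to choose."""
    return max(actions, key=lambda a: expected_utility(a, beliefs, utility))

beliefs = {"cold_start": 0.3, "warm_start": 0.7}
utility = {("use_feature", "cold_start"): -1.0,
           ("use_feature", "warm_start"): 2.0,
           ("skip", "cold_start"): 0.0,
           ("skip", "warm_start"): 0.0}
choice = best_action(["use_feature", "skip"], beliefs, utility)
```

A designer simulating many users would draw `beliefs` and `utility` from a distribution, which is how end-user variation induces variation in predicted product performance.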

  17. Theory and modeling of stellar atmospheres

    NASA Astrophysics Data System (ADS)

    Hubeny, Ivan

    2010-08-01

    I will briefly outline basic concepts of stellar atmosphere theory. After summarizing the basic structural equations describing a stellar atmosphere, emphasis is given to efficient numerical methods developed to deal with the stellar atmosphere problem, namely the method of complete linearization and its recent variants, and the whole class of methods known by the name Accelerated Lambda Iteration. In the next part of the lectures I will briefly summarize existing computer codes, with an emphasis on our code TLUSTY, and list some of the most useful grids of model atmospheres that are publicly available. Next, I will show how model atmospheres and synthetic spectra are used in quantitative stellar spectroscopy to determine basic stellar parameters and chemical abundances. Finally, I will briefly describe the application of model atmosphere theory and models to related objects, such as accretion disks around various accretors and the atmospheres of substellar-mass objects: extrasolar giant planets and brown dwarfs.

  18. User-Centered Innovation: A Model for "Early Usability Testing."

    ERIC Educational Resources Information Center

    Sugar, William A.; Boling, Elizabeth

    The goal of this study is to show how some concepts and techniques from disciplines outside Instructional Systems Development (ISD) have the potential to extend and enhance the traditional view of ISD practice when they are employed very early in the ISD process. The concepts and techniques employed were user-centered design and usability, and…

  19. Theory, Modeling, and Simulation of Semiconductor Lasers

    NASA Technical Reports Server (NTRS)

    Ning, Cun-Zheng; Saini, Subbash (Technical Monitor)

    1998-01-01

    Semiconductor lasers play very important roles in many areas of information technology. In this talk, I will first give an overview of semiconductor laser theory. This will be followed by a description of different models and their shortcomings in modeling and simulation. Our recent efforts in constructing a fully space and time resolved simulation model will then be described. Simulation results based on our model will be presented. Finally the effort towards a self-consistent and comprehensive simulation capability for the opto-electronics integrated circuits (OEICs) will be briefly reviewed.

  20. A Study of Context-Awareness RBAC Model Using User Profile on Ubiquitous Computing

    NASA Astrophysics Data System (ADS)

    Jang, Bokman; Park, Sungdo; Chang, Hyokyung; Ahn, Hyosik; Choi, Euiin

    Recently, with the growth of IT, a ubiquitous computing environment has been taking shape in which information can be accessed anywhere and at any time through various devices, and computers can decide to provide useful services to users. In this environment, however, devices are connected over wireless networks, and reckless approaches to information resources can cause trouble for systems. Access authority management is therefore a very important issue for both information resources and systems, and it requires well-founded security policies. Conventional access control models have the problem that they do not consider the user's context information, such as the user's profile. In this paper we propose a context-awareness RBAC model based on user profiles, which provides efficient access control through active classification, inference, and judgment about the users who access the system and its resources.

  1. LDA-Based Unified Topic Modeling for Similar TV User Grouping and TV Program Recommendation.

    PubMed

    Pyo, Shinjee; Kim, Eunhui; Kim, Munchurl

    2015-08-01

    Social TV is a social media service via TV and social networks through which TV users exchange their experiences about TV programs that they are viewing. For social TV service, two technical aspects are envisioned: grouping of similar TV users to create social TV communities and recommending TV programs based on group and personal interests for personalizing TV. In this paper, we propose a unified topic model based on grouping of similar TV users and recommending TV programs as a social TV service. The proposed unified topic model employs two latent Dirichlet allocation (LDA) models. One is a topic model of TV users, and the other is a topic model of the description words for viewed TV programs. The two LDA models are then integrated via a topic proportion parameter for TV programs, which enforces the grouping of similar TV users and associated description words for watched TV programs at the same time in a unified topic modeling framework. The unified model identifies the semantic relation between TV user groups and TV program description word groups so that more meaningful TV program recommendations can be made. The unified topic model also overcomes the item ramp-up problem, so that new TV programs can be reliably recommended to TV users. Furthermore, from the topic model of TV users, TV users with similar tastes can be grouped as topics, which can then be recommended as social TV communities. To verify our proposed method of unified topic-modeling-based TV user grouping and TV program recommendation for social TV services, in our experiments, we used real TV viewing history data and electronic program guide data from a seven-month period collected by a TV poll agency. The experimental results show that the proposed unified topic model yields an average 81.4% precision for 50 topics in TV program recommendation and its performance is an average of 6.5% higher than that of the topic model of TV users only.
For TV user prediction with new TV programs, the average
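
    The user-grouping step can be sketched with a single off-the-shelf LDA, a deliberate simplification of the paper's unified two-LDA model: treat each user's viewing history as a "document" of program IDs, infer topic proportions, and group users by their dominant topic. The histories below are invented toy data:

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Toy viewing histories: each "document" is one user's watched programs.
    histories = [
        "news news news weather weather documentary news weather",
        "documentary news weather news news weather documentary news",
        "cartoon anime cartoon gameshow cartoon anime gameshow cartoon",
        "anime gameshow cartoon anime anime cartoon gameshow anime",
    ]

    vec = CountVectorizer()
    X = vec.fit_transform(histories)            # user-by-program count matrix
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    theta = lda.fit_transform(X)                # per-user topic proportions
    groups = theta.argmax(axis=1)               # dominant topic = "community"
    print(groups)
    ```

    In the paper's full model the program-description LDA is coupled to this one through a shared topic proportion parameter, which is what lets new programs be recommended despite the item ramp-up problem; that coupling is omitted here.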

  2. Application of model search to lattice theory.

    SciTech Connect

    Rose, M.; Wilkinson, K.; Mathematics and Computer Science

    2001-08-01

    We have used the first-order model-searching programs MACE and SEM to study various problems in lattice theory. First, we present a case study in which the two programs are used to examine the differences between the stages along the way from lattice theory to Boolean algebra. Second, we answer several questions posed by Norman Megill and Mladen Pavicic on ortholattices and orthomodular lattices. The questions from Megill and Pavicic arose in their study of quantum logics, which are being investigated in connection with proposed computing devices based on quantum mechanics. Previous questions of a similar nature were answered by McCune and MACE in [2].

  3. Viscous wing theory development. Volume 2: GRUMWING computer program user's manual

    NASA Technical Reports Server (NTRS)

    Chow, R. R.; Ogilvie, P. L.

    1986-01-01

    This report is a user's manual which describes the operation of the computer program GRUMWING. The program computes the viscous transonic flow over three-dimensional wings using a boundary-layer-type viscid-inviscid interaction approach. The inviscid solution is obtained by an approximate factorization (AFZ) method for the full potential equation. The boundary layer solution is based on integral entrainment methods.

  4. Informatic system for a global tissue-fluid biorepository with a graph theory-oriented graphical user interface.

    PubMed

    Butler, William E; Atai, Nadia; Carter, Bob; Hochberg, Fred

    2014-01-01

    The Richard Floor Biorepository supports collaborative studies of extracellular vesicles (EVs) found in human fluids and tissue specimens. The current emphasis is on biomarkers for central nervous system neoplasms but its structure may serve as a template for collaborative EV translational studies in other fields. The informatic system provides specimen inventory tracking with bar codes assigned to specimens, containers, and projects; it is hosted on globalized cloud computing resources and embeds a suite of shared documents, calendars, and video-conferencing features. Clinical data are recorded in relation to molecular EV attributes and may be tagged with terms drawn from a network of externally maintained ontologies, thus offering expansion of the system as the field matures. We fashioned the graphical user interface (GUI) around a web-based data visualization package. This system is now in an early stage of deployment, mainly focused on specimen tracking and clinical, laboratory, and imaging data capture in support of studies to optimize detection and analysis of brain tumour-specific mutations. It currently includes 4,392 specimens drawn from 611 subjects, the majority with brain tumours. As EV science evolves, we plan biorepository changes which may reflect multi-institutional collaborations, proteomic interfaces, additional biofluids, changes in operating procedures and kits for specimen handling, novel procedures for detection of tumour-specific EVs and for RNA extraction, and changes in the taxonomy of EVs. We have used an ontology-driven data model and web-based architecture with a graph theory-driven GUI to accommodate and stimulate the semantic web of EV science. PMID:25317275

  5. Density Functional Theory Models for Radiation Damage

    NASA Astrophysics Data System (ADS)

    Dudarev, S. L.

    2013-07-01

    Density functional theory models developed over the past decade provide unique information about the structure of nanoscale defects produced by irradiation and about the nature of short-range interaction between radiation defects, clustering of defects, and their migration pathways. These ab initio models, involving no experimental input parameters, appear to be as quantitatively accurate and informative as the most advanced experimental techniques developed for the observation of radiation damage phenomena. Density functional theory models have effectively created a new paradigm for the scientific investigation and assessment of radiation damage effects, offering new insight into the origin of temperature- and dose-dependent response of materials to irradiation, a problem of pivotal significance for applications.

  6. Crack propagation modeling using Peridynamic theory

    NASA Astrophysics Data System (ADS)

    Hafezi, M. H.; Alebrahim, R.; Kundu, T.

    2016-04-01

    Crack propagation and branching are modeled using nonlocal peridynamic theory. One major advantage of this nonlocal-theory-based analysis tool is its unifying approach to material behavior modeling, irrespective of whether or not a crack has formed in the material. No separate damage law is needed for crack initiation and propagation. This theory overcomes the weaknesses of existing continuum mechanics based numerical tools (e.g. FEM, XFEM, etc.) for identifying fracture modes and does not require any simplifying assumptions. Cracks grow autonomously and not necessarily along a prescribed path. However, in some special situations, such as ductile fracture, the damage evolution and failure depend on parameters characterizing the local stress state, rather than on the peridynamic damage modeling technique developed for brittle fracture. For brittle fracture modeling, the bond is simply broken when the failure criterion is satisfied. This simulation helps us to design a more reliable modeling tool for crack propagation and branching in both brittle and ductile materials. Peridynamic analysis has been found to be very demanding computationally, particularly for real-world structures (e.g. vehicles, aircraft, etc.). It also requires a very expensive visualization process. One goal of this paper is to make researchers aware of the impact of this cutting-edge simulation tool on the understanding of cracked material response. A computer code has been developed to implement the peridynamic theory based modeling tool for two-dimensional analysis. A good agreement between our predictions and previously published results is observed. Some interesting new results that have not been reported earlier by others are also obtained and presented in this paper. The final objective of this investigation is to increase the mechanics knowledge of self-similar and self-affine cracks.
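
    The bond-based damage measure described above (break a bond when its stretch exceeds a critical value, and define a node's damage as the fraction of its broken bonds) can be sketched in one dimension. The bar geometry, horizon, critical stretch, and the prescribed (rather than solved-for) displacement jump below are all arbitrary illustrative choices:

    ```python
    import numpy as np

    n, dx = 100, 0.01
    X = np.arange(n) * dx          # particle positions on a 1D bar
    horizon = 3.5 * dx             # peridynamic horizon delta
    s0 = 0.01                      # critical bond stretch (brittle criterion)

    # Prescribed displacement field with a jump at midspan (a "crack opening")
    u = np.where(X < X[n // 2], 0.0, 0.002)

    damage = np.zeros(n)
    for i in range(n):
        neighbors = np.where((np.abs(X - X[i]) <= horizon) & (np.arange(n) != i))[0]
        xi = X[neighbors] - X[i]            # reference bond vectors
        eta = u[neighbors] - u[i]           # relative displacements
        stretch = (np.abs(xi + eta) - np.abs(xi)) / np.abs(xi)
        damage[i] = (stretch > s0).mean()   # fraction of broken bonds

    # Damage localizes at the displacement jump and vanishes elsewhere
    print(damage[n // 2 - 1 : n // 2 + 1])
    ```

    In a full peridynamic code the displacements evolve dynamically under the pairwise bond forces and bonds break irreversibly during the time integration; here only the damage bookkeeping is shown.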

  7. The Oak Ridge National Laboratory automobile heat pump model: User's guide

    SciTech Connect

    Kyle, D.M.

    1993-05-01

    A computer program has been developed to predict the steady-state performance of vapor compression automobile air conditioners and heat pumps. The code is based on the residential heat pump model developed at Oak Ridge National Laboratory. Most calculations are based on fundamental physical principles, in conjunction with generalized correlations available in the research literature. Automobile air conditioning components that can be specified as inputs to the program include open and hermetic compressors; finned tube condensers; finned tube and plate-fin style evaporators; thermal expansion valve, capillary tube and short tube expansion devices; refrigerant mass; evaporator pressure regulator; and all interconnecting tubing. The program can be used with a variety of refrigerants, including R134a. Methodologies are discussed for using the model as a tool for designing all new systems or, alternatively, as a tool for simulating a known system for a variety of operating conditions.

  8. The partonic interpretation of reggeon theory models

    NASA Astrophysics Data System (ADS)

    Boreskov, K. G.; Kaidalov, A. B.; Khoze, V. A.; Martin, A. D.; Ryskin, M. G.

    2005-12-01

    We review the physical content of the two simplest models of reggeon field theory: namely the eikonal and the Schwimmer models. The AGK cutting rules are used to obtain the inclusive, the inelastic and the diffractive cross sections. The system of non-linear equations for these cross sections is written down and analytic expressions for its solution are obtained. We derive the rapidity gap dependence of the differential cross sections for diffractive dissociation in the Schwimmer model and in its eikonalized extension. The results are interpreted from the partonic viewpoint of the interaction at high energies.

  9. Vulnerability and the intention to anabolic steroids use among Iranian gym users: an application of the theory of planned behavior.

    PubMed

    Allahverdipour, Hamid; Jalilian, Farzad; Shaghaghi, Abdolreza

    2012-02-01

    This correlational study explored the psychological antecedents of 253 Iranian gym users' intentions to use anabolic-androgenic steroids (AAS), based on the Theory of Planned Behavior (TPB). The three predictor variables of (1) attitude, (2) subjective norms, and (3) perceived behavioral control accounted for 63% of the variation in the outcome measure of the intention to use the AAS. There is some support for using the TPB to design and implement interventions to modify and/or improve individuals' beliefs that athletic goals are achievable without the use of the AAS. PMID:22217129
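
    Analytically, a TPB study of this kind is a multiple regression of intention on the three predictors, with R² as the "variance accounted for" statistic. A sketch on synthetic data; the coefficients, noise level, and resulting R² are invented, not the study's:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 253                                   # same sample size as the study
    attitude = rng.normal(size=n)
    norms = rng.normal(size=n)
    control = rng.normal(size=n)
    # Synthetic intention scores (illustrative weights, not the study's)
    intention = (0.5 * attitude + 0.3 * norms + 0.4 * control
                 + rng.normal(scale=0.7, size=n))

    # Ordinary least squares with an intercept column
    X = np.column_stack([np.ones(n), attitude, norms, control])
    beta, *_ = np.linalg.lstsq(X, intention, rcond=None)
    pred = X @ beta
    r2 = 1 - ((intention - pred) ** 2).sum() / ((intention - intention.mean()) ** 2).sum()
    print(round(r2, 2))
    ```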

  10. Topos models for physics and topos theory

    SciTech Connect

    Wolters, Sander

    2014-08-15

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.

  11. UNAMAP: user's network for applied modeling of air pollution, Version 6. Model

    SciTech Connect

    Turner, D.B.; Busse, A.D.

    1986-08-01

    UNAMAP (Version 6) represents the 1986 update to the Users' Network for Applied Modeling of Air Pollution. UNAMAP consists of an ASCII magnetic tape containing FORTRAN codes and test data for 25 air-quality simulation models (AQSMs) as well as associated documentation. AQSMs and supporting programs and data are arranged in six sections: (1) Guideline (appendix A) models (files 2 through 9); (2) other models or processors (new models) (files 10 through 19 and 33); (3) other models and processors (revised) (files 20 through 27 and 32); (4) additional models for regulatory use (files 28 through 31); (5) data files (files 34 through 39); and (6) output print files (files 40 through 68). There are 68 files on this tape. Software Description: The system is written in FORTRAN for implementation on a UNIVAC 1100/82 using the 39R2 operating system.

  12. Unique Metadata Schemas: A Model for User-Centric Design of a Performance Support System

    ERIC Educational Resources Information Center

    Schatz, Steven C.

    2005-01-01

    Learning object technology is viewed as a method for fast retrieval. This effort focuses on developing unique schemas for a targeted group to aid efficient retrieval. In this article, I study a user-centric model for developing tags for K-12 educators that is based on user needs, expectations, and problems. I use a combination of techniques from human…

  13. Tracking and Analysis Framework (TAF) model documentation and user's guide

    SciTech Connect

    Bloyd, C.; Camp, J.; Conzelmann, G.

    1996-12-01

    With passage of the 1990 Clean Air Act Amendments, the United States embarked on a policy for controlling acid deposition that has been estimated to cost at least $2 billion. Title IV of the Act created a major innovation in environmental regulation by introducing market-based incentives, specifically by allowing electric utility companies to trade allowances to emit sulfur dioxide (SO2). The National Acid Precipitation Assessment Program (NAPAP) has been tasked by Congress to assess what Senator Moynihan has termed this "grand experiment." Such a comprehensive assessment of the economic and environmental effects of this legislation has been a major challenge. To help NAPAP face this challenge, the U.S. Department of Energy (DOE) has sponsored development of an integrated assessment model, known as the Tracking and Analysis Framework (TAF). This section summarizes TAF's objectives and its overall design.

  14. Solid Waste Projection Model: Database User's Guide. Version 1.4

    SciTech Connect

    Blackburn, C.L.

    1993-10-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for using Version 1.4 of the SWPM database: system requirements and preparation, entering and maintaining data, and performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions and does not provide instruction in the use of Paradox, the database management system in which the SWPM database is established.

  15. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    NASA Astrophysics Data System (ADS)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    Developments in computer technology over the last decade have changed the ways computers are used. Emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform, and environment) makes the development of such ubiquitous applications for smart environments, and especially of their user interfaces, a challenging and time-consuming task. We propose a model-based approach which allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language defining the fundamentals and constraints of the user interface, a runtime architecture exploits the description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting the applications also to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  16. Prospects for Advanced RF Theory and Modeling

    SciTech Connect

    Batchelor, D.B.

    1999-04-12

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed.

  17. User's guide to the MESOI diffusion model and to the utility programs UPDATE and LOGRVU

    SciTech Connect

    Athey, G.F.; Allwine, K.J.; Ramsdell, J.V.

    1981-11-01

    MESOI is an interactive, Lagrangian puff trajectory diffusion model. The model is documented separately (Ramsdell and Athey, 1981); this report is intended to provide MESOI users with the information needed to successfully conduct model simulations. The user is also provided with guidance in the use of the data file maintenance and review programs, UPDATE and LOGRVU. Complete examples are given for the operation of all three programs, and an appendix documents UPDATE and LOGRVU.

  18. Quantum mechanical model in gravity theory

    NASA Astrophysics Data System (ADS)

    Losyakov, V. V.

    2016-05-01

    We consider a model of a real massive scalar field defined as homogeneous on a d-dimensional sphere such that the sphere radius, time scale, and scalar field are related by the equations of the general theory of relativity. We quantize this system with three degrees of freedom, define the observables, and find dynamical mean values of observables in the regime where the scalar field mass is much less than the Planck mass.

  19. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia.

    PubMed

    Segkouli, Sofia; Paliokas, Ioannis; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos

    2015-01-01

    Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting to their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing ICT products accessibility for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied at cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multitasks, and monitored the performance of infotainment related tasks to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interfaces' design supported by increased tasks' complexity to capture a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data to be used for more reliable interfaces' evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282

  1. Physics models and user's guide for the neutral beam module of the SUPERCODE

    SciTech Connect

    Mandrekas, J.

    1992-08-01

    This report contains a description of the neutral beam heating and current drive module, Beams, that was developed at Georgia Tech for the SUPERCODE, the new systems and operations code for the ITER EDA. The NB module calculates profiles of the neutral beam deposition, fast ion pressure, beam heating power, and neutral-beam-driven current density. It also computes global parameters such as current drive efficiencies, beam shinethrough, fast beam ion beta, and the fusion power and neutron production due to beam-plasma interactions. The most important consideration during the development of this module was to make it computationally fast without compromising physical accuracy. We believe that through careful selection of physical models and optimized coding, these conflicting requirements have been largely met. As a result, the SUPERCODE now has the ability to perform self-consistent calculations involving NB heating and current drive. This capability is very important for the study of sub-ignited, hybrid, or steady-state ITER and post-TFIR reactor operating scenarios. It is also the first time that a systems code has had such capabilities, usually found only in 1-1/2D plasma transport codes.

  2. Communications network design and costing model users manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.

  3. Conceptual Models and Theory-Embedded Principles on Effective Schooling.

    ERIC Educational Resources Information Center

    Scheerens, Jaap

    1997-01-01

    Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…

  4. Theory, modeling and simulation: Annual report 1993

    SciTech Connect

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation (TMS) program, which is one of seven research directorates in the EMSL, will play a critical role in understanding the molecular processes important in restoring DOE's research, development, and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  5. General topology meets model theory, on 𝔭 and 𝔱

    PubMed Central

    Malliaris, Maryanthe; Shelah, Saharon

    2013-01-01

    Cantor proved in 1874 [Cantor G (1874) J Reine Angew Math 77:258–262] that the continuum is uncountable, and Hilbert’s first problem asks whether it is the smallest uncountable cardinal. A program arose to study cardinal invariants of the continuum, which measure the size of the continuum in various ways. By Gödel [Gödel K (1939) Proc Natl Acad Sci USA 25(4):220–224] and Cohen [Cohen P (1963) Proc Natl Acad Sci USA 50(6):1143–1148], Hilbert’s first problem is independent of ZFC (Zermelo-Fraenkel set theory with the axiom of choice). Much work both before and since has been done on inequalities between these cardinal invariants, but some basic questions have remained open despite Cohen’s introduction of forcing. The oldest and perhaps most famous of these is whether “𝔭 = 𝔱,” which was proved in a special case by Rothberger [Rothberger F (1948) Fund Math 35:29–46], building on Hausdorff [Hausdorff (1936) Fund Math 26:241–255]. In this paper we explain how our work on the structure of Keisler’s order, a large-scale classification problem in model theory, led to the solution of this problem in ZFC as well as of an a priori unrelated open question in model theory. PMID:23836659

  6. Modeling Integrated Water-User Decisions with Intermittent Supplies

    NASA Astrophysics Data System (ADS)

    Lund, J. R.; Rosenberg, D.

    2006-12-01

    We present an economic-engineering method to estimate urban water use demands with intermittent water supplies. A two-stage, probabilistic optimization formulation includes a wide variety of water supply enhancement and conservation actions that individual households can adopt to meet multiple water quality uses with uncertain water availability. We embed the optimization in Monte-Carlo simulations to show aggregate effects at a utility (citywide) scale for a population of user conditions and decisions. Parametric analysis provides derivations of supply curves to subsidize conservation, demand responses to alternative pricing, and customer willingness-to-pay to avoid shortages. Results show a good empirical fit for the average and distribution of billed residential water use in Amman, Jordan. Additional outputs give likely market penetration rates for household conservation actions, associated water savings, and subsidies required to entice further adoption. We discuss new insights to size, target, market, and finance conservation programs and interpret a demand curve with block pricing.
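
    The two-stage structure described above can be sketched as follows: a first-stage decision (here, storage capacity) is fixed before the intermittent supply is known, and a second-stage recourse (vendor water purchases) absorbs any shortfall; Monte Carlo sampling estimates the expected cost of each candidate decision. All prices, demands, and distributions below are invented for illustration, not drawn from the Amman data:

    ```python
    import random

    def expected_cost(capacity, demand=10.0, price_store=2.0, price_vendor=5.0,
                      n_draws=2000, seed=1):
        """First stage: pay for storage `capacity` up front. Second stage
        (recourse): buy vendor water for any shortfall under random supply."""
        rng = random.Random(seed)  # common draws so capacities compare fairly
        total = 0.0
        for _ in range(n_draws):
            supply = rng.uniform(0, 14)           # intermittent network supply
            stored = min(capacity, supply)
            shortfall = max(0.0, demand - stored)
            total += price_store * capacity + price_vendor * shortfall
        return total / n_draws

    # Optimize the here-and-now decision over a capacity grid
    capacities = [c / 2 for c in range(0, 29)]    # 0.0 .. 14.0
    best = min(capacities, key=expected_cost)
    print(best, round(expected_cost(best), 2))
    ```

    Sweeping the vendor price or the demand level in this sketch is the analogue of the parametric analysis in the abstract: it traces out how the optimal household decision, and hence aggregate citywide demand, responds to prices and shortage risk.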

  7. Algorithm for model validation: Theory and applications

    PubMed Central

    Sornette, D.; Davis, A. B.; Ide, K.; Vixie, K. R.; Pisarenko, V.; Kamm, J. R.

    2007-01-01

    Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. Thus, we replace static claims on the impossibility of validating a given model by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the methodology first with the maturation of quantum mechanics as the arguably best established physics theory and then with several concrete examples drawn from some of our primary scientific interests: a cellular automaton model for earthquakes, a multifractal random walk model for financial time series, an anomalous diffusion model for solar radiation transport in the cloudy atmosphere, and a computational fluid dynamics code for the Richtmyer–Meshkov instability. PMID:17420476

  8. Algorithm for model validation: theory and applications.

    PubMed

    Sornette, D; Davis, A B; Ide, K; Vixie, K R; Pisarenko, V; Kamm, J R

    2007-04-17

    Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. Thus, we replace static claims on the impossibility of validating a given model by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the methodology first with the maturation of quantum mechanics as the arguably best established physics theory and then with several concrete examples drawn from some of our primary scientific interests: a cellular automaton model for earthquakes, a multifractal random walk model for financial time series, an anomalous diffusion model for solar radiation transport in the cloudy atmosphere, and a computational fluid dynamics code for the Richtmyer-Meshkov instability. PMID:17420476
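The iterative build-up of trust described above can be caricatured in a few lines (a toy rendering only; the authors' actual update rule is not reproduced here): each validation step moves a trust index by an amount that grows with predictive success and with the novelty of the experiment relative to earlier ones, so redundant experiments barely move it.

```python
# Toy trust-update loop (invented factors, not the authors' formula): each
# experiment multiplies trust by a factor above 1 when the model predicts well,
# below 1 when it fails, scaled by how novel the experiment is.
def update_trust(trust, predicted_well, novelty):
    """novelty in [0, 1]; redundant experiments (novelty ~ 0) barely move trust."""
    factor = 1.0 + novelty * (0.5 if predicted_well else -0.5)
    return trust * factor

trust = 1.0
# (success?, novelty) for a hypothetical sequence of validation experiments
experiments = [(True, 0.9), (True, 0.2), (False, 0.8), (True, 0.7)]
for ok, nov in experiments:
    trust = update_trust(trust, ok, nov)
print(round(trust, 3))
```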

  9. OCD: The offshore and coastal dispersion model. Volume 1. User's guide

    SciTech Connect

    DiCristofaro, D.C.; Hanna, S.R.

    1989-11-01

    The Offshore and Coastal Dispersion (OCD) Model has been developed to simulate the effect of offshore emissions from point, area, or line sources on the air quality of coastal regions. The OCD model was adapted from the EPA guideline model MPTER (EPA, 1980). Modifications were made to incorporate overwater plume transport and dispersion as well as changes that occur as the plume crosses the shoreline. This is a revised OCD model, the fourth version to date. This volume is the User's Guide, which includes a model overview, technical description, user instructions, and notes on model evaluation and results.

  10. Users guide for the hydroacoustic coverage assessment model (HydroCAM)

    SciTech Connect

    Farrell, T., LLNL

    1997-12-01

    A model for predicting the detection and localization performance of hydroacoustic monitoring networks has been developed. The model accounts for major factors affecting global-scale acoustic propagation in the ocean, including horizontal refraction, travel time variability due to spatial and temporal fluctuations in the ocean, and detailed characteristics of the source. Graphical user interfaces are provided to set up the models and visualize the results. The model produces maps of network detection coverage and localization area of uncertainty, as well as intermediate results such as predicted path amplitudes, travel time, and travel time variance. This Users Guide for the model is organized into three sections. First, a summary of functionality available in the model is presented, including example output products. The second section provides detailed descriptions of each of the models contained in the system. The last section describes how to run the model, including a summary of each data input form in the user interface.

  11. Empirical Analysis and Modeling of Users' Topic Interests in Online Forums

    PubMed Central

    Xiong, Fei; Liu, Yun

    2012-01-01

    Bulletin Board Systems (BBSs) have demonstrated their usefulness in spreading information. In BBS forums, a few posts that address currently popular social topics attract a lot of attention, and different users are interested in many different discussion topics. We investigate topic cluster features and user interests of an actual BBS forum, analyzing user posting and replying behavior. According to the growing process of BBS, we suggest a network model in which each agent only replies to the posts that belong to its specific topics of interest. A post that is replied to will be immediately assigned the highest priority on the post list. Simulation results show that characteristics of our model are similar to those of the real BBS. The model with heterogeneous user interests promotes the occurrence of popular posts, and the user relationship network possesses a large clustering coefficient. Bursts and long waiting times exist in user replying behavior, leading to a non-Poisson user activity pattern. In addition, the model produces an evolving trend of Gini coefficients for posts' and clusters' participants analogous to that of real BBS forums. PMID:23251401
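The replying rule described in this abstract can be sketched as a short simulation (parameters and the single-interest-per-agent simplification are illustrative, not the paper's calibration): each agent replies only to posts near the top of the list that match its topic of interest, and a replied-to post jumps to the highest-priority position.

```python
import random

# Minimal sketch of the BBS replying dynamics (illustrative parameters): agents
# scan the visible top of the post list, reply only to posts matching their
# interests, and a replied-to post is bumped to the front.
random.seed(1)
TOPICS = list(range(10))
agents = [{random.choice(TOPICS)} for _ in range(200)]        # one interest each
posts = [{"topic": random.choice(TOPICS), "replies": 0} for _ in range(50)]

for _ in range(2000):
    agent = random.choice(agents)
    for i, post in enumerate(posts[:10]):    # scan the visible top of the list
        if post["topic"] in agent:
            post["replies"] += 1
            posts.insert(0, posts.pop(i))    # bump to highest priority
            break

popular = max(p["replies"] for p in posts)
print("most-replied post has", popular, "replies")
```

Even this stripped-down rule produces a heavy concentration of replies on a few posts, the qualitative signature the abstract reports.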

  12. USER'S GUIDE TO THE MESOPUFF II MODEL AND RELATED PROCESSOR PROGRAMS

    EPA Science Inventory

    A complete set of user instructions are provided for the MESOPUFF II regional-scale air quality modeling package. The MESOPUFF II model is a Lagrangian variable-trajectory puff superposition model suitable for modeling the transport, diffusion, and removal of air pollutants from ...

  13. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    EPA Science Inventory

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  14. User's manual for the Human Exposure Model (HEM). Interim report

    SciTech Connect

    Not Available

    1986-06-01

    This document describes the Human Exposure Model, furnishes contact personnel to establish access to the UNIVAC System, and provides step-by-step instructions for operating both the SHED and SHEAR portions of the model. The manual also lists caveats that should be considered when using the HEM and criteria to distinguish situations that are appropriately modeled by each portion of HEM. The intended audience ranges from someone with limited knowledge of modeling to someone well acquainted with the UNIVAC.

  15. USER GUIDE FOR THE ENHANCED HYDRODYNAMICAL-NUMERICAL MODEL

    EPA Science Inventory

    This guide provides the documentation required for use of the Enhanced Hydrodynamical-Numerical Model on operational problems. The enhanced model is a multilayer Hansen-type model extended to handle near-shore processes by including: Non-linear term extension to facilitate small...

  16. Jobs and Economic Development Impact (JEDI) Model Geothermal User Reference Guide

    SciTech Connect

    Johnson, C.; Augustine, C.; Goldberg, M.

    2012-09-01

    The Geothermal Jobs and Economic Development Impact (JEDI) model, developed through the National Renewable Energy Laboratory (NREL), is a user-friendly, Excel-based tool that estimates the economic impacts of constructing and operating hydrothermal and Enhanced Geothermal System (EGS) power generation projects at the local level; JEDI models are available for a range of conventional and renewable energy technologies. The JEDI Model Geothermal User Reference Guide was developed to assist users in using and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted.

  17. Standard Model as a Double Field Theory.

    PubMed

    Choi, Kang-Sin; Park, Jeong-Hyuck

    2015-10-23

    We show that, without any extra physical degree introduced, the standard model can be readily reformulated as a double field theory. Consequently, the standard model can couple to an arbitrary stringy gravitational background in an O(4,4) T-duality covariant manner and manifest two independent local Lorentz symmetries, Spin(1,3)×Spin(3,1). While the diagonal gauge fixing of the twofold spin groups leads to the conventional formulation on the flat Minkowskian background, the enhanced symmetry makes the standard model more rigid, and also stringy, than it appeared. The CP violating θ term may no longer be allowed by the symmetry, and hence the strong CP problem can be solved. There are now stronger constraints imposed on the possible higher order corrections. We speculate that the quarks and the leptons may belong to the two different spin classes. PMID:26551099

  18. Queuing theory models for computer networks

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

    A set of simple queuing theory models that can predict the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. Because the models omit fine detail of the network traffic rates, traffic patterns, and the hardware used to implement the networks, the impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed quickly. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. The Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
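The kind of spreadsheet-level queuing calculation described above can be illustrated with the textbook M/M/1 formula for mean response time, T = 1/(μ − λ) (the report's actual model set is not reproduced here; this is the simplest member of the family):

```python
# Mean response time of an M/M/1 queue applied to a single network channel:
# T = 1 / (mu - lam), valid only while the arrival rate stays below the
# service rate.
def mm1_response_time(arrival_rate, service_rate):
    """Mean time a message spends in the system (waiting + service), seconds."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)

# e.g. 80 messages/s offered to a channel that serves 100 messages/s
print(mm1_response_time(80.0, 100.0))  # 0.05 s
```

The formula makes the key planning behavior visible: response time diverges as the offered load approaches channel capacity, which is exactly the regime network-expansion studies need to identify.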

  19. Standard Model as a Double Field Theory

    NASA Astrophysics Data System (ADS)

    Choi, Kang-Sin; Park, Jeong-Hyuck

    2015-10-01

    We show that, without any extra physical degree introduced, the standard model can be readily reformulated as a double field theory. Consequently, the standard model can couple to an arbitrary stringy gravitational background in an O(4,4) T-duality covariant manner and manifest two independent local Lorentz symmetries, Spin(1,3)×Spin(3,1). While the diagonal gauge fixing of the twofold spin groups leads to the conventional formulation on the flat Minkowskian background, the enhanced symmetry makes the standard model more rigid, and also stringy, than it appeared. The CP violating θ term may no longer be allowed by the symmetry, and hence the strong CP problem can be solved. There are now stronger constraints imposed on the possible higher order corrections. We speculate that the quarks and the leptons may belong to the two different spin classes.

  20. User's manual for interactive LINEAR: A FORTRAN program to derive linear aircraft models

    NASA Technical Reports Server (NTRS)

    Antoniewicz, Robert F.; Duke, Eugene L.; Patterson, Brian P.

    1988-01-01

    An interactive FORTRAN program that provides the user with a powerful and flexible tool for the linearization of aircraft aerodynamic models is documented in this report. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied linear or nonlinear aerodynamic model. The nonlinear equations of motion used are six-degree-of-freedom equations with stationary atmosphere and flat, nonrotating earth assumptions. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.

  1. Compass models: Theory and physical motivations

    NASA Astrophysics Data System (ADS)

    Nussinov, Zohar; van den Brink, Jeroen

    2015-01-01

    Compass models are theories of matter in which the couplings between the internal spin (or other relevant field) components are inherently spatially (typically, direction) dependent. A simple illustrative example is furnished by the 90° compass model on a square lattice in which only couplings of the form τixτjx (where {τia}a denote Pauli operators at site i ) are associated with nearest-neighbor sites i and j separated along the x axis of the lattice while τiyτjy couplings appear for sites separated by a lattice constant along the y axis. Similar compass-type interactions can appear in diverse physical systems. For instance, compass models describe Mott insulators with orbital degrees of freedom where interactions sensitively depend on the spatial orientation of the orbitals involved as well as the low-energy effective theories of frustrated quantum magnets, and a host of other systems such as vacancy centers, and cold atomic gases. The fundamental interdependence between internal (spin, orbital, or other) and external (i.e., spatial) degrees of freedom which underlies compass models generally leads to very rich behaviors, including the frustration of (semi-)classical ordered states on nonfrustrated lattices, and to enhanced quantum effects, prompting, in certain cases, the appearance of zero-temperature quantum spin liquids. As a consequence of these frustrations, new types of symmetries and their associated degeneracies may appear. These intermediate symmetries lie midway between the extremes of global symmetries and local gauge symmetries and lead to effective dimensional reductions. In this article, compass models are reviewed in a unified manner, paying close attention to exact consequences of these symmetries and to thermal and quantum fluctuations that stabilize orders via order-out-of-disorder effects. This is complemented by a survey of numerical results. In addition to reviewing past works, a number of other models are introduced and new results
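The 90° compass-model couplings described in this abstract are conventionally written as a Hamiltonian of the following form (a standard presentation in this literature; J_x and J_y denote the bond coupling constants):

```latex
% 90-degree compass model on a square lattice: x-type couplings on bonds along
% the x axis, y-type couplings on bonds along the y axis.
H = -J_x \sum_{i} \tau_i^{x}\, \tau_{i+\hat{e}_x}^{x}
    \;-\; J_y \sum_{i} \tau_i^{y}\, \tau_{i+\hat{e}_y}^{y}
```

The direction-dependence of the couplings, visible term by term here, is what frustrates simultaneous energy minimization on all bonds and drives the intermediate symmetries the review discusses.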

  2. Polarimetric clutter modeling: Theory and application

    NASA Technical Reports Server (NTRS)

    Kong, J. A.; Lin, F. C.; Borgeaud, M.; Yueh, H. A.; Swartz, A. A.; Lim, H. H.; Shim, R. T.; Novak, L. M.

    1988-01-01

    The two-layer anisotropic random medium model is used to investigate fully polarimetric scattering properties of earth terrain media. The polarization covariance matrices for the untilted and tilted uniaxial random medium are evaluated using the strong fluctuation theory and distorted Born approximation. In order to account for the azimuthal randomness in the growth direction of leaves in tree and grass fields, an averaging scheme over the azimuthal direction is also applied. It is found that characteristics of terrain clutter can be identified through the analysis of each element of the covariance matrix. Theoretical results are illustrated by the comparison with experimental data provided by MIT Lincoln Laboratory for tree and grass fields.

  3. HIGHWAY 3.1: An enhanced HIGHWAY routing model: Program description, methodology, and revised user's manual

    SciTech Connect

    Johnson, P.E.; Joy, D.S.; Clarke, D.B.; Jacobi, J.M.

    1993-03-01

    The HIGHWAY program provides a flexible tool for predicting highway routes for transporting radioactive materials in the United States. The HIGHWAY data base is essentially a computerized road atlas that currently describes over 240,000 miles of highways. Complete descriptions of all Interstate System and most US highways (except those that parallel a nearby Interstate highway) are included in the data base. Many of the principal state highways and a number of local and county highways are also identified. The data base also includes locations of nuclear facilities and major airports. Several different types of routes may be calculated, depending on a set of user-supplied constraints. Routes are calculated by minimizing the total impedance between the origin and the destination. Basically, the impedance is defined as a function of distance and driving time along a particular highway segment. Several routing constraints can be imposed during the calculations. One of the special features of the HIGHWAY model is its ability to calculate routes that maximize use of Interstate System highways. This feature allows the user to predict routes for shipments of radioactive materials that conform to the US Department of Transportation routing regulations. Other features of the model include the ability to predict routes that bypass a specific state, city, town, or highway segment. Two special features have been incorporated in HIGHWAY, version 3.1. The first is the ability to automatically calculate alternative routes. Frequently, there are a number of routes between the source and destination that vary slightly in distance and estimated driving time. The HIGHWAY program offers a selection of different but nearly equal routes. The second special feature is the capability to calculate route-specific population density statistics. The population density distribution is calculated for each highway segment in the route and is reported on a state-by-state basis.
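Minimizing total impedance between origin and destination, as described above, is a shortest-path computation; a sketch with Dijkstra's algorithm follows (the toy graph, the weighting, and the `impedance` blend of distance and driving time are invented for illustration; HIGHWAY defines its own impedance function over its 240,000-mile data base):

```python
import heapq

# Sketch of route-impedance minimization: Dijkstra's algorithm over a toy road
# graph, with edge impedance a hypothetical weighted blend of distance and
# driving time (HIGHWAY's actual function differs).
def impedance(miles, hours, w_dist=0.5, w_time=0.5):
    return w_dist * miles + w_time * hours * 60.0

roads = {  # node -> [(neighbor, miles, hours)]; entirely made-up segments
    "origin": [("A", 50, 0.8), ("B", 70, 1.0)],
    "A": [("dest", 60, 1.1)],
    "B": [("dest", 30, 0.5)],
    "dest": [],
}

def best_route(graph, start, goal):
    heap = [(0.0, start, [start])]
    seen = set()
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, mi, hr in graph[node]:
            heapq.heappush(heap, (cost + impedance(mi, hr), nbr, path + [nbr]))
    return None

print(best_route(roads, "origin", "dest"))
```

Constraints such as "bypass this state" or "prefer Interstates" amount to inflating or pruning the impedance of the affected segments before running the same search.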

  4. Solar Advisor Model User Guide for Version 2.0

    SciTech Connect

    Gilman, P.; Blair, N.; Mehos, M.; Christensen, C.; Janzou, S.; Cameron, C.

    2008-08-01

    The Solar Advisor Model (SAM) provides a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies and markets, from photovoltaic systems for residential and commercial markets to concentrating solar power and large photovoltaic systems for utility markets. This manual describes Version 2.0 of the software, which can model photovoltaic and concentrating solar power technologies for electric applications for several markets. The current version of the Solar Advisor Model does not model solar heating and lighting technologies.

  5. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    USGS Publications Warehouse

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable

  6. Using chemical organization theory for model checking

    PubMed Central

    Kaleta, Christoph; Richter, Stephan; Dittrich, Peter

    2009-01-01

    Motivation: The increasing number and complexity of biomodels makes automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for derivation of valuable information. In formalisms like Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. Results: First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations drastically changes when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT is able to identify those species and reactions more accurately [in 26 cases (14%)] that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. Availability: All data and a Java applet to check SBML models are available from http://www.minet.uni-jena.de/csb/prj/ot/tools Contact: dittrich@minet.uni-jena.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19468053
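In chemical organization theory, an organization is a species set that is both closed and self-maintaining; the closure half is simple enough to sketch directly (toy reaction network invented for illustration; the self-maintenance test, which requires a flux vector, is omitted):

```python
# Closure check from chemical organization theory: a species set is closed if
# no reaction whose reactants all lie in the set produces a species outside it.
# The reaction network below is a toy example.
reactions = [({"a"}, {"b"}), ({"b"}, {"a", "c"}), ({"c"}, {"c"})]

def is_closed(species):
    return all(products <= species
               for reactants, products in reactions
               if reactants <= species)

print(is_closed({"a", "b", "c"}))  # True
print(is_closed({"a", "b"}))       # False: b -> a + c produces c outside the set
```

The paper's point is that when SBML kinetic laws hide stoichiometry (e.g. via modifier species), the reactant and product sets fed into a check like this one can be wrong, changing which sets qualify as organizations.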

  7. Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation.

    NASA Astrophysics Data System (ADS)

    Lawley, Russell; Barron, Mark; Lee, Katy

    2014-05-01

    NERC - British Geological Survey, Environmental Science Centre, Keyworth, Nottingham, UK, NG12 5GG. The boundaries mapped in traditional field geological survey are subject to a wide range of inherent uncertainties. A map at a survey-scale of 1:10,000 is created by a combination of terrain interpretation, direct observations from boreholes and exposures (often sparsely distributed), and indirect interpretation of proxy variables such as soil properties, vegetation and remotely sensed images. A critical factor influencing the quality of the final map is the skill and experience of the surveyor to bring this information together in a coherent conceptual model. The users of geological data comprising or based on mapped boundaries are increasingly aware of these uncertainties, and want to know how to manage them. The growth of 3D modelling, which takes 2D surveys as a starting point, adds urgency to the need for a better understanding of survey uncertainties; particularly where 2D mapping of variable vintage has been compiled into a national coverage. Previous attempts to apply confidence on the basis of metrics such as data density, survey age or survey techniques have proved useful for isolating single, critical, factors but do not generally succeed in evaluating geological mapping 'in the round', because they cannot account for the 'conceptual' skill set of the surveyor. The British Geological Survey (BGS) is using expert elicitation methods to gain a better understanding of uncertainties within the national geological map of Great Britain. The expert elicitation approach starts with the assumption that experienced surveyors have an intuitive sense of the uncertainty of the boundaries that they map, based on a tacit model of geology and its complexity and the nature of the surveying process. The objective of

  8. STORM WATER MANAGEMENT MODEL, VERSION 4. PART A: USER'S MANUAL

    EPA Science Inventory

    The EPA Storm Water Management Model (SWMM) is a comprehensive mathematical model for simulation of urban runoff water quality and quantity in storm and combined sewer systems. All aspects of the urban hydrologic and quality cycles are simulated, including surface and subsurface ...

  9. USER'S GUIDE FOR PEM-2: POLLUTION EPISODIC MODEL (VERSION 2)

    EPA Science Inventory

    The Pollution Episodic Model Version 2 (PEM-2) is an urban-scale model designed to predict short term average ground-level concentrations and deposition fluxes of one or two gaseous or particulate pollutants at multiple receptors. The two pollutants may be non-reactive, or chemic...

  10. FABRIC FILTER MODEL FORMAT CHANGE; VOLUME II. USER'S GUIDE

    EPA Science Inventory

    The report describes an improved mathematical model for use by control personnel to determine the adequacy of existing or proposed filter systems designed to minimize coal fly ash emissions. Several time-saving steps have been introduced to facilitate model application by Agency ...