Science.gov

Sample records for model theory user

  1. The Sandia GeoModel: theory and user's guide.

    SciTech Connect

    Brannon, Rebecca Moss; Fossum, Arlo Frederick

    2004-08-01

    The mathematical and physical foundations and domain of applicability of Sandia's GeoModel are presented along with descriptions of the source code and user instructions. The model is designed to be used in conventional finite element architectures, and (to date) it has been installed in five host codes without requiring customization of the model subroutines for any of these different installations. Although developed for application to geological materials, the GeoModel actually applies to a much broader class of materials, including rock-like engineered materials (such as concretes and ceramics) and even to metals when simplified parameters are used. Nonlinear elasticity is supported through an empirically fitted function that has been found to be well-suited to a wide variety of materials. Fundamentally, the GeoModel is a generalized plasticity model. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response including microcrack growth and pore collapse. The GeoModel supports deformation-induced anisotropy in a limited capacity through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). Aside from kinematic hardening, however, the governing equations are otherwise isotropic. The GeoModel is a genuine unification and generalization of simpler models. The GeoModel can employ up to 40 material input and control parameters in the rare case when all features are used. Simpler idealizations (such as linear elasticity, or von Mises yield, or Mohr-Coulomb failure) can be replicated by simply using fewer parameters. For high-strain-rate applications, the GeoModel supports rate dependence through an overstress model.
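
    The rate dependence mentioned above follows the overstress idea: stress is allowed to exceed the quasistatic yield surface, and inelastic strain accumulates at a rate driven by that excess. Below is a minimal one-dimensional Perzyna-type sketch in Python; it is a generic textbook illustration rather than the GeoModel's actual formulation, and the modulus, yield stress, viscosity (eta), and exponent (m) are invented parameters.

        import numpy as np

        def perzyna_update(stress_trial, yield_stress, eta=1.0, m=1.0, dt=1.0e-4, E=200.0e9):
            """One explicit overstress (Perzyna-type) relaxation step in 1D.

            If the trial stress exceeds the quasistatic yield stress, viscoplastic
            strain accumulates at a rate proportional to the normalized overstress,
            which relaxes the stress back toward the yield surface over time.
            """
            overstress = abs(stress_trial) - yield_stress
            if overstress <= 0.0:
                return stress_trial, 0.0                      # purely elastic step
            eps_vp_rate = (1.0 / eta) * (overstress / yield_stress) ** m
            d_eps_vp = eps_vp_rate * dt * np.sign(stress_trial)
            stress_new = stress_trial - E * d_eps_vp          # relax the overstress elastically
            return stress_new, d_eps_vp

        # Example: trial stress 10% above a 300 MPa yield stress (one small time step)
        print(perzyna_update(330.0e6, 300.0e6))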

  2. WASP7 Stream Transport - Model Theory and User's Guide: Supplement to Water Quality Analysis Simulation Program (WASP) User Documentation

    EPA Science Inventory

    The standard WASP7 stream transport model calculates water flow through a branching stream network that may include both free-flowing and ponded segments. This supplemental user manual documents the hydraulic algorithms, including the transport and hydrogeometry equations, the m...

  3. A Comprehensive and Systematic Model of User Evaluation of Web Search Engines: I. Theory and Background.

    ERIC Educational Resources Information Center

    Su, Louise T.

    2003-01-01

    Reports on a project that proposes and tests a comprehensive and systematic model of user evaluation of Web search engines. This article describes the model, including a set of criteria and measures and a method for implementation. A literature review portrays settings for developing the model and places applications of the model in contemporary…

  4. Repository Integration Program: RIP performance assessment and strategy evaluation model theory manual and user's guide

    SciTech Connect

    1995-11-01

    This report describes the theory and capabilities of RIP (Repository Integration Program). RIP is a powerful and flexible computational tool for carrying out probabilistic integrated total system performance assessments for geologic repositories. The primary purpose of RIP is to provide a management tool for guiding system design and site characterization. In addition, the performance assessment model (and the process of eliciting model input) can act as a mechanism for integrating the large amount of available information into a meaningful whole (in a sense, allowing one to keep the "big picture" and the ultimate aims of the project clearly in focus). Such an integration is useful both for project managers and project scientists. RIP is based on a "top down" approach to performance assessment that concentrates on the integration of the entire system, and utilizes relatively high-level descriptive models and parameters. The key point in the application of such a "top down" approach is that the simplified models and associated high-level parameters must incorporate an accurate representation of their uncertainty. RIP is designed in a very flexible manner such that details can be readily added to various components of the model without modifying the computer code. Uncertainty is also handled in a very flexible manner, and both parameter and model (process) uncertainty can be explicitly considered. Uncertainty is propagated through the integrated PA model using an enhanced Monte Carlo method. RIP must rely heavily on subjective assessment (expert opinion) for much of its input. The process of eliciting the high-level input parameters required for RIP is critical to its successful application. As a result, in order for any project to successfully apply a tool such as RIP, an enormous amount of communication and cooperation must exist between the data collectors, the process modelers, and the performance assessment modelers.
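
    The "top down" approach described above amounts, computationally, to sampling the high-level input parameters from their elicited uncertainty distributions and pushing each sample through the integrated system model. A minimal plain Monte Carlo sketch in Python follows; it is a generic illustration only (RIP itself uses an enhanced Monte Carlo method), and the distributions, the system_model function, and the consequence metric are invented placeholders, not RIP inputs or outputs.

        import numpy as np

        rng = np.random.default_rng(seed=42)

        def system_model(leach_rate, travel_time, dilution):
            """Placeholder integrated performance-assessment model: one high-level
            'consequence' metric computed from three uncertain inputs."""
            return leach_rate * np.exp(-travel_time / 1.0e4) / dilution

        n = 10_000
        # uncertain high-level parameters, e.g. elicited from expert judgment
        leach_rate = rng.lognormal(mean=-2.0, sigma=0.5, size=n)
        travel_time = rng.uniform(1.0e3, 5.0e4, size=n)          # years
        dilution = rng.triangular(10.0, 100.0, 1000.0, size=n)

        consequence = system_model(leach_rate, travel_time, dilution)
        print("mean:", consequence.mean())
        print("95th percentile:", np.quantile(consequence, 0.95))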

  5. Section 3. The SPARROW Surface Water-Quality Model: Theory, Application and User Documentation

    USGS Publications Warehouse

    Schwarz, G.E.; Hoos, A.B.; Alexander, R.B.; Smith, R.A.

    2006-01-01

    SPARROW (SPAtially Referenced Regressions On Watershed attributes) is a watershed modeling technique for relating water-quality measurements made at a network of monitoring stations to attributes of the watersheds containing the stations. The core of the model consists of a nonlinear regression equation describing the non-conservative transport of contaminants from point and diffuse sources on land to rivers and through the stream and river network. The model predicts contaminant flux, concentration, and yield in streams and has been used to evaluate alternative hypotheses about the important contaminant sources and watershed properties that control transport over large spatial scales. This report provides documentation for the SPARROW modeling technique and computer software to guide users in constructing and applying basic SPARROW models. The documentation gives details of the SPARROW software, including the input data and installation requirements, and guidance in the specification, calibration, and application of basic SPARROW models, as well as descriptions of the model output and its interpretation. The documentation is intended for both researchers and water-resource managers with interest in using the results of existing models and developing and applying new SPARROW models. The documentation of the model is presented in two parts. Part 1 provides a theoretical and practical introduction to SPARROW modeling techniques, which includes a discussion of the objectives, conceptual attributes, and model infrastructure of SPARROW. Part 1 also includes background on the commonly used model specifications and the methods for estimating and evaluating parameters, evaluating model fit, and generating water-quality predictions and measures of uncertainty. Part 2 provides a user's guide to SPARROW, which includes a discussion of the software architecture and details of the model input requirements and output files, graphs, and maps. The text documentation and computer

  6. WASP4, a hydrodynamic and water-quality model - model theory, user's manual, and programmer's guide

    SciTech Connect

    Ambrose, R.B.; Wool, T.A.; Connolly, J.P.; Schanz, R.W.

    1988-01-01

    The Water Quality Analysis Simulation Program Version 4 (WASP4) is a dynamic compartment-modeling system that can be used to analyze a variety of water-quality problems in a diverse set of water bodies. WASP4 simulates the transport and transformation of conventional and toxic pollutants in the water column and benthos of ponds, streams, lakes, reservoirs, rivers, estuaries, and coastal waters. The WASP4 modeling system covers four major subjects--hydrodynamics, conservative mass transport, eutrophication-dissolved oxygen kinetics, and toxic chemical-sediment dynamics. The WASP4 modeling system consists of two stand-alone computer programs, DYNHYD4 and WASP4, that can be run in conjunction or separately. The hydrodynamic program, DYNHYD4, simulates the movement of water and the water quality program, WASP4, simulates the movement and interaction of pollutants within the water. The latter program is supplied with two kinetic submodels to simulate two of the major classes of water-quality problems--conventional pollution (dissolved oxygen, biochemical oxygen demand, nutrients, and eutrophication) and toxic pollution (organic chemicals, heavy metals, and sediment). The substitution of either sub-model constitutes the models EUTRO4 and TOXI4, respectively.

  7. KAYENTA: theory and user's guide.

    SciTech Connect

    Brannon, Rebecca Moss; Fossum, Arlo Frederick; Strack, Otto Eric

    2009-03-01

    The physical foundations and domain of applicability of the Kayenta constitutive model are presented along with descriptions of the source code and user instructions. Kayenta, which is an outgrowth of the Sandia GeoModel, includes features and fitting functions appropriate to a broad class of materials including rocks, rock-like engineered materials (such as concretes and ceramics), and metals. Fundamentally, Kayenta is a computational framework for generalized plasticity models. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response including microcrack growth and pore collapse. Kayenta supports optional anisotropic elasticity associated with ubiquitous joint sets. Kayenta supports optional deformation-induced anisotropy through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). The governing equations are otherwise isotropic. Because Kayenta is a unification and generalization of simpler models, it can be run using as few as 2 parameters (for linear elasticity) to as many as 40 material and control parameters in the exceptionally rare case when all features are used. For high-strain-rate applications, Kayenta supports rate dependence through an overstress model. Isotropic damage is modeled through loss of stiffness and strength.

  8. KAYENTA: Theory and User's Guide

    SciTech Connect

    Brannon, Rebecca Moss; Fuller, Timothy Jesse; Strack, Otto Eric; Fossum, Arlo Frederick; Sanchez, Jason James

    2015-02-01

    The physical foundations and domain of applicability of the Kayenta constitutive model are presented along with descriptions of the source code and user instructions. Kayenta, which is an outgrowth of the Sandia GeoModel, includes features and fitting functions appropriate to a broad class of materials including rocks, rock-like engineered materials (such as concretes and ceramics), and metals. Fundamentally, Kayenta is a computational framework for generalized plasticity models. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response (including microcrack growth and pore collapse) that can result in non-recovered strain upon removal of loads on a material element. Kayenta supports optional anisotropic elasticity associated with joint sets, as well as optional deformation-induced anisotropy through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). The governing equations are otherwise isotropic. Because Kayenta is a unification and generalization of simpler models, it can be run using as few as 2 parameters (for linear elasticity) to as many as 40 material and control parameters in the exceptionally rare case when all features are used. For high-strain-rate applications, Kayenta supports rate dependence through an overstress model. Isotropic damage is modeled through loss of stiffness and strength.

  9. Applying the Technology Acceptance Model and flow theory to Cyworld user behavior: implication of the Web2.0 user acceptance.

    PubMed

    Shin, Dong-Hee; Kim, Won-Yong; Kim, Won-Young

    2008-06-01

    This study explores attitudinal and behavioral patterns when using Cyworld by adopting an expanded Technology Acceptance Model (TAM). A model for Cyworld acceptance is used to examine how various factors modified from the TAM influence acceptance and its antecedents. This model is examined through an empirical study of Cyworld users, analyzed with structural equation modeling techniques. The model shows reasonably good measurement properties and the constructs are validated. The results not only confirm the model but also reveal general factors applicable to Web2.0. A set of constructs in the model can be regarded as Web2.0-specific factors, acting to enhance attitudes and intention.

  10. Outside users payload model

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The outside users payload model, which continues a series of documents and replaces and supersedes the July 1984 edition, is presented. The time period covered by this model is 1985 through 2000. The following sections are included: (1) definition of the scope of the model; (2) discussion of the methodology used; (3) overview of total demand; (4) summary of the estimated market segmentation by launch vehicle; (5) summary of the estimated market segmentation by user type; (6) details of the STS market forecast; (7) summary of transponder trends; (8) model overview by mission category; and (9) detailed mission models. All known non-NASA, non-DOD reimbursable payloads forecast to be flown by non-Soviet-bloc countries are included in this model with the exception of Spacelab payloads and small self-contained payloads. Certain DOD-sponsored or cosponsored payloads are included if they are reimbursable launches.

  11. GWSCREEN: A semi-analytical model for assessment of the groundwater pathway from surface or buried contamination: Theory and user's manual

    SciTech Connect

    Rood, A.S.

    1992-03-01

    GWSCREEN was developed for assessment of the groundwater pathway from leaching of radioactive and nonradioactive substances from surface or buried sources. The code was designed for implementation in the Track 1 and Track 2 assessment of Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) sites identified as low probability hazard at the Idaho National Engineering Laboratory (DOE, 1991). The code calculates the limiting soil concentration such that regulatory contaminant levels in groundwater are not exceeded. The code uses a mass conservation approach to model three processes: contaminant release from a source volume, contaminant transport in the unsaturated zone, and contaminant transport in the saturated zone. The source model considers the sorptive properties and solubility of the contaminant. Transport in the unsaturated zone is described by a plug flow model. Transport in the saturated zone is calculated with a semi-analytical solution to the advection dispersion equation for transient mass flux input.
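
    The chain of processes just described (delay through the unsaturated zone, then advective-dispersive transport in the aquifer) can be illustrated with the toy Python sketch below. It uses a simple plug-flow travel time and the textbook Ogata-Banks solution for a continuous source, which is not GWSCREEN's semi-analytical transient-flux solution; all parameter values are invented, and SciPy is assumed to be available.

        import numpy as np
        from scipy.special import erfc

        def plug_flow_delay(depth_m, darcy_flux_m_per_yr, water_content):
            """Plug-flow travel time (years) through the unsaturated zone."""
            return depth_m * water_content / darcy_flux_m_per_yr

        def ogata_banks(x, t, v, D, c0=1.0):
            """Textbook 1D advection-dispersion solution for a continuous source:
            relative concentration at distance x (m) and time t (yr), for pore
            velocity v (m/yr) and dispersion coefficient D (m^2/yr)."""
            a = erfc((x - v * t) / (2.0 * np.sqrt(D * t)))
            b = np.exp(v * x / D) * erfc((x + v * t) / (2.0 * np.sqrt(D * t)))
            return 0.5 * c0 * (a + b)

        delay = plug_flow_delay(depth_m=10.0, darcy_flux_m_per_yr=0.1, water_content=0.2)
        t = 50.0                                      # years since the release began
        if t > delay:                                 # contaminant has reached the water table
            print(ogata_banks(x=100.0, t=t - delay, v=30.0, D=300.0))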

  12. HTGR Cost Model Users' Manual

    SciTech Connect

    A.M. Gandrik

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Cost Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Cost Model calculates an estimate of the capital costs, annual operating and maintenance costs, and decommissioning costs for a high-temperature gas-cooled reactor. The user can generate these costs for multiple reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first of a kind, or nth of a kind project phases; for a single or four-pack configuration; and for a reactor size of 350 or 600 MWt. This users' manual contains the mathematical models and operating instructions for the HTGR Cost Model. Instructions, screenshots, and examples are provided to guide the user through the HTGR Cost Model. This model was designed for users who are familiar with the HTGR design and Excel. Modification of the HTGR Cost Model should only be performed by users familiar with Excel and Visual Basic.

  13. FORSPAN Model Users Guide

    USGS Publications Warehouse

    Klett, T.R.; Charpentier, Ronald R.

    2003-01-01

    The USGS FORSPAN model is designed for the assessment of continuous accumulations of crude oil, natural gas, and natural gas liquids (collectively called petroleum). Continuous (also called "unconventional") accumulations have large spatial dimensions and lack well-defined down-dip petroleum/water contacts. Oil and natural gas therefore are not localized by buoyancy in water in these accumulations. Continuous accumulations include "tight gas reservoirs," coalbed gas, oil and gas in shale, oil and gas in chalk, and shallow biogenic gas. The FORSPAN model treats a continuous accumulation as a collection of petroleum-containing cells for assessment purposes. Each cell is capable of producing oil or gas, but the cells may vary significantly from one another in their production (and thus economic) characteristics. The potential additions to reserves from continuous petroleum resources are calculated by statistically combining probability distributions of the estimated number of untested cells having the potential for additions to reserves with the estimated volume of oil and natural gas that each of the untested cells may potentially produce (total recovery). One such statistical method for combination of number of cells with total recovery, used by the USGS, is called ACCESS.
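
    The combination step described above is, in effect, a compound distribution: an uncertain number of productive cells, each with an uncertain total recovery. A brute-force Monte Carlo sketch of that idea in Python follows; it is illustrative only (the uniform cell-count range and lognormal recovery parameters are invented), and it is not the USGS ACCESS aggregation method.

        import numpy as np

        rng = np.random.default_rng(7)
        n_trials = 5_000

        # uncertain number of untested cells with potential additions to reserves
        n_cells = rng.integers(low=200, high=2_000, size=n_trials)

        # per-cell total recovery, lognormal purely for illustration
        totals = np.empty(n_trials)
        for i, k in enumerate(n_cells):
            totals[i] = rng.lognormal(mean=0.0, sigma=1.0, size=k).sum()

        # fractiles of the kind reported in resource assessments (F95, F50, F5)
        for q, label in ((0.05, "F95"), (0.50, "F50"), (0.95, "F5")):
            print(label, round(float(np.quantile(totals, q)), 1))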

  14. Cognitive Behavioral Theories Used to Explain Injection Risk Behavior among Injection Drug Users: A Review and Suggestions for the Integration of Cognitive and Environmental Models

    ERIC Educational Resources Information Center

    Wagner, Karla Dawn; Unger, Jennifer B.; Bluthenthal, Ricky N.; Andreeva, Valentina A.; Pentz, Mary Ann

    2010-01-01

    Injection drug users (IDUs) are at risk for HIV and viral hepatitis, and risky injection behavior persists despite decades of intervention. Cognitive behavioral theories (CBTs) are commonly used to help understand risky injection behavior. The authors review findings from CBT-based studies of injection risk behavior among IDUs. An extensive…

  15. UNSAT-H Version 3.0: Unsaturated Soil Water and Heat Flow Model Theory, User Manual, and Examples

    SciTech Connect

    MJ Fayer

    2000-06-12

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. During the last 4 years, the UNSAT-H model received support from the Immobilized Waste Program (IWP) of the Hanford Site's River Protection Project. This program is designing and assessing the performance of on-site disposal facilities to receive radioactive wastes that are currently stored in single- and double-shell tanks at the Hanford Site (LMHC 1999). The IWP is interested in estimates of recharge rates for current conditions and long-term scenarios involving the vadose zone disposal of tank wastes. Simulation modeling with UNSAT-H is one of the methods being used to provide those estimates (e.g., Rockhold et al. 1995; Fayer et al. 1999). To achieve the above goals for assessing water dynamics and estimating recharge rates, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow as one-dimensional processes. The UNSAT-H model simulates liquid water flow using Richards' equation (Richards 1931), water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements.
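
    For reference, the three governing relations named above take the following familiar one-dimensional forms (standard textbook statements, not transcribed from the UNSAT-H report; here theta is volumetric water content, h matric head, K hydraulic conductivity, S a sink term, q_v vapor flux, D_v vapor diffusivity, rho_v vapor density, T temperature, C_h volumetric heat capacity, and k_T thermal conductivity):

        % Richards' equation for liquid flow (z positive upward)
        \frac{\partial \theta}{\partial t}
          = \frac{\partial}{\partial z}\left[ K(h)\left(\frac{\partial h}{\partial z} + 1\right) \right] - S(z,t)

        % Fick's law for water-vapor diffusion
        q_v = -D_v \, \frac{\partial \rho_v}{\partial z}

        % Fourier equation for sensible heat flow
        C_h \frac{\partial T}{\partial t}
          = \frac{\partial}{\partial z}\left( k_T \, \frac{\partial T}{\partial z} \right)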

  16. UNSAT-H Version 3.0: Unsaturated Soil Water and Heat Flow Model: Theory, User Manual, and Examples

    SciTech Connect

    Fayer, Michael J.

    2000-06-15

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. To achieve the above goals for assessing water dynamics and estimating recharge rates, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow. The UNSAT-H model simulates liquid water flow using the Richards equation, water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements. This report includes eight example problems. The first four are verification tests of UNSAT-H capabilities. The second four example problems are demonstrations of real-world situations.

  17. Cohesive Zone Model User Element

    2007-04-17

    Cohesive Zone Model User Element (CZM UEL) is an implementation of a Cohesive Zone Model as an element for use in finite element simulations. CZM UEL computes a nodal force vector and stiffness matrix from a vector of nodal displacements. It is designed for structural analysts using finite element software to predict crack initiation, crack propagation, and the effect of a crack on the rest of a structure.
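
    For a single integration point, the force/stiffness computation described above reduces to evaluating a traction-separation law and its derivative. The Python sketch below shows a one-dimensional bilinear (linear-softening) law for monotonic opening; it is a generic illustration, and the peak traction and critical openings are invented values rather than parameters of the CZM UEL.

        def bilinear_cohesive(delta, t_max=10.0, delta_0=0.01, delta_f=0.1):
            """Return (traction, tangent stiffness) for a 1D bilinear cohesive law.

            delta    -- current opening displacement
            t_max    -- peak cohesive traction
            delta_0  -- opening at peak traction (end of the linear rise)
            delta_f  -- opening at complete failure (zero traction)
            """
            if delta <= 0.0:
                return 0.0, t_max / delta_0             # closed: initial stiffness
            if delta < delta_0:                         # linear elastic rise
                k = t_max / delta_0
                return k * delta, k
            if delta < delta_f:                         # linear softening branch
                t = t_max * (delta_f - delta) / (delta_f - delta_0)
                return t, -t_max / (delta_f - delta_0)
            return 0.0, 0.0                             # fully failed

        for opening in (0.005, 0.05, 0.2):
            print(opening, bilinear_cohesive(opening))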

  18. EFDC1D - A ONE DIMENSIONAL HYDRODYNAMIC AND SEDIMENT TRANSPORT MODEL FOR RIVER AND STREAM NETWORKS: MODEL THEORY AND USERS GUIDE

    EPA Science Inventory

    This technical report describes the new one-dimensional (1D) hydrodynamic and sediment transport model EFDC1D. This model can be applied to stream networks. The model code and two sample data sets are included on the distribution CD. EFDC1D can simulate bi-directional unstea...

  19. GWSCREEN: A semi-analytical model for assessment of the groundwater pathway from surface or buried contamination. Theory and user's manual, Version 2.0: Revision 2

    SciTech Connect

    Rood, A.S.

    1994-06-01

    Multimedia exposure assessment of hazardous chemicals and radionuclides requires that all pathways of exposure be investigated. The GWSCREEN model was designed to perform initial screening calculations for groundwater pathway impacts resulting from the leaching of surficial and buried contamination at CERCLA sites identified as low probability hazard at the INEL. In Version 2.0, an additional model was added to calculate impacts to groundwater from the operation of a percolation pond. The model was designed to make best use of the data that would potentially be available. These data include the area and depth of contamination, sorptive properties and solubility limit of the contaminant, depth to aquifer, and the physical properties of the aquifer (porosity, velocity, and dispersivity). For the pond model, data on effluent flow rates and operation time are required. Model output includes the limiting soil concentration such that, after leaching and transport to the aquifer, regulatory contaminant levels in groundwater are not exceeded. Also, groundwater concentration as a function of time may be calculated. The model considers only drinking water consumption and does not include the transfer of contamination to food products due to irrigation with contaminated water. Radiological dose, carcinogenic risk, and the hazard quotient are calculated for the peak time using the user-defined input mass (or activity). Appendices contain sample problems and the source code listing.

  20. User's manual for GILDA: An infinite lattice diffusion theory calculation

    SciTech Connect

    Le, T.T.

    1991-11-01

    GILDA is a static two-dimensional diffusion theory code that performs either buckling (B^2) or k-effective (k_eff) calculations for an infinite hexagonal lattice which is constructed by repeating identical seven-cell zones (one cell is one or seven identical homogenized hexes). GILDA was written by J. W. Stewart in 1973. This user's manual is intended to provide all of the information necessary to set up and execute a GILDA calculation and to interpret the output results. It is assumed that the user is familiar with the computer (VAX/VMS or IBM/MVS) and the JOSHUA system database on which the code is implemented. Users who are not familiar with the JOSHUA database are advised to consult additional references to understand the structure of JOSHUA records and data sets before turning to section 4 of this manual. Sections 2 and 3 of this manual serve as a theory document in which the basic diffusion theory and the numerical approximations behind the code are described. Section 4 describes the functions of the program's subroutines. Section 5 describes the input data and guides the user through setting up a problem. Section 6 describes the output results and the error messages which may be encountered during execution. Users who only wish to learn how to run the code without understanding the theory can start from section 4 and use sections 2 and 3 as references. Finally, the VAX/VMS and the IBM execution command files together with sample input records are provided in the appendices at the end of this manual.
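
    As a point of reference for the buckling/k-effective options mentioned above, the one-group diffusion-theory relation between the two quantities is simple. GILDA itself solves a two-dimensional lattice problem, so the Python snippet below is only a textbook one-group illustration with invented cross-section values.

        def k_eff_one_group(nu_sigma_f, sigma_a, D, buckling):
            """One-group k-effective for a given buckling B^2:
            k_eff = nu*Sigma_f / (Sigma_a + D*B^2)."""
            return nu_sigma_f / (sigma_a + D * buckling)

        def material_buckling(nu_sigma_f, sigma_a, D):
            """Buckling at which the one-group lattice is exactly critical (k_eff = 1):
            B_m^2 = (nu*Sigma_f - Sigma_a) / D."""
            return (nu_sigma_f - sigma_a) / D

        # invented one-group constants (macroscopic cross sections in 1/cm, D in cm)
        nu_sigma_f, sigma_a, D = 0.0065, 0.0060, 1.2
        print("critical buckling:", material_buckling(nu_sigma_f, sigma_a, D))
        print("k_eff at B^2 = 1e-4:", k_eff_one_group(nu_sigma_f, sigma_a, D, 1.0e-4))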

  1. The User-Oriented Evaluator's Role in Formulating a Program Theory: Using a Theory-Driven Approach

    ERIC Educational Resources Information Center

    Christie, Christina A.; Alkin, Marvin C.

    2003-01-01

    Program theory plays a prominent role in many evaluations, not only in theory-driven evaluations. This paper presents a case study of the process of developing and refining a program's theory within a user-oriented evaluation. In user-oriented (or utilization-focused) evaluations, primary users can play a role in defining their own program theory.…

  2. Cognitive Behavioral Theories Used to Explain Injection Risk Behavior Among Injection Drug Users: A Review and Suggestions for the Integration of Cognitive and Environmental Models

    PubMed Central

    Wagner, Karla D.; Unger, Jennifer B.; Bluthenthal, Ricky N.; Andreeva, Valentina A.; Pentz, Mary Ann

    2011-01-01

    Injection drug users (IDUs) are at risk for HIV and viral hepatitis, and risky injection behavior persists despite decades of intervention. Cognitive behavioral theories (CBT) are commonly used to help understand risky injection behavior. We review findings from CBT-based studies of injection risk behavior among IDUs. An extensive literature search was conducted in Spring 2007. In total 33 studies were reviewed—26 epidemiological and 7 intervention studies. Findings suggest that some theoretical constructs have received fairly consistent support (e.g., self-efficacy, social norms), while others have yielded inconsistent or null results (e.g., perceived susceptibility, knowledge, behavioral intentions, perceived barriers, perceived benefits, response efficacy, perceived severity). We offer some possible explanations for these inconsistent findings, including differences in theoretical constructs and measures across studies and a need to examine the environmental structures that influence risky behaviors. Greater integration of CBT with a risk environment perspective may yield more conclusive findings and more effective interventions in the future. PMID:20705809

  3. Information filtering via collaborative user clustering modeling

    NASA Astrophysics Data System (ADS)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Yu, Lu; Liu, Chuang; Liu, Hao; Yan, Xiao-Yong

    2014-02-01

    The past few years have witnessed the great success of recommender systems, which can significantly help users find personalized items in the information era. One of the most widely applied recommendation methods is Matrix Factorization (MF). However, most of the research on this topic has focused on mining the direct relationships between users and items. In this paper, we optimize the standard MF by integrating a user clustering regularization term. Our model considers not only the user-item rating information but also the user information. In addition, we compared the proposed model with three other typical methods: User-Mean (UM), Item-Mean (IM) and standard MF. Experimental results on two real-world datasets, MovieLens 1M and MovieLens 100k, show that our method performs better than the other three methods in the accuracy of recommendation.
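
    The optimization described above amounts to adding a clustering penalty to the usual regularized matrix-factorization objective. The compact stochastic-gradient sketch in Python below is one plausible reading of that idea, not the authors' code: each user vector is also pulled toward its cluster centroid with weight beta, and the toy data, fixed clustering, and all hyperparameters are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        n_users, n_items, k = 100, 200, 10
        ratings = [(rng.integers(n_users), rng.integers(n_items), rng.integers(1, 6))
                   for _ in range(2_000)]                 # toy (user, item, rating) triples
        cluster_of = rng.integers(0, 5, size=n_users)     # a fixed user clustering

        P = 0.1 * rng.standard_normal((n_users, k))       # user factors
        Q = 0.1 * rng.standard_normal((n_items, k))       # item factors
        lr, lam, beta = 0.01, 0.05, 0.05                  # step size, L2 weight, cluster weight

        for epoch in range(20):
            centroids = np.array([P[cluster_of == c].mean(axis=0) for c in range(5)])
            for u, i, r in ratings:
                err = r - P[u] @ Q[i]
                # standard MF gradients plus a pull toward the user's cluster centroid
                P[u] += lr * (err * Q[i] - lam * P[u] - beta * (P[u] - centroids[cluster_of[u]]))
                Q[i] += lr * (err * P[u] - lam * Q[i])

        rmse = np.sqrt(np.mean([(r - P[u] @ Q[i]) ** 2 for u, i, r in ratings]))
        print("training RMSE:", round(float(rmse), 3))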

  4. User's manual for GILDA: An infinite lattice diffusion theory calculation

    SciTech Connect

    Le, T.T.

    1991-11-01

    GILDA is a static two-dimensional diffusion theory code that performs either buckling (B^2) or k-effective (k_eff) calculations for an infinite hexagonal lattice which is constructed by repeating identical seven-cell zones (one cell is one or seven identical homogenized hexes). GILDA was written by J. W. Stewart in 1973. This user's manual is intended to provide all of the information necessary to set up and execute a GILDA calculation and to interpret the output results. It is assumed that the user is familiar with the computer (VAX/VMS or IBM/MVS) and the JOSHUA system database on which the code is implemented. Users who are not familiar with the JOSHUA database are advised to consult additional references to understand the structure of JOSHUA records and data sets before turning to section 4 of this manual. Sections 2 and 3 of this manual serve as a theory document in which the basic diffusion theory and the numerical approximations behind the code are described. Section 4 describes the functions of the program's subroutines. Section 5 describes the input data and guides the user through setting up a problem. Section 6 describes the output results and the error messages which may be encountered during execution. Users who only wish to learn how to run the code without understanding the theory can start from section 4 and use sections 2 and 3 as references. Finally, the VAX/VMS and the IBM execution command files together with sample input records are provided in the appendices at the end of this manual.

  5. Predicting Facebook users' online privacy protection: risk, trust, norm focus theory, and the theory of planned behavior.

    PubMed

    Saeri, Alexander K; Ogilvie, Claudette; La Macchia, Stephen T; Smith, Joanne R; Louis, Winnifred R

    2014-01-01

    The present research adopts an extended theory of the planned behavior model that included descriptive norms, risk, and trust to investigate online privacy protection in Facebook users. Facebook users (N = 119) completed a questionnaire assessing their attitude, subjective injunctive norm, subjective descriptive norm, perceived behavioral control, implicit perceived risk, trust of other Facebook users, and intentions toward protecting their privacy online. Behavior was measured indirectly 2 weeks after the study. The data show partial support for the theory of planned behavior and strong support for the independence of subjective injunctive and descriptive norms. Risk also uniquely predicted intentions over and above the theory of planned behavior, but there were no unique effects of trust on intentions, nor of risk or trust on behavior. Implications are discussed.

  7. ChISELS 1.0: theory and user manual: a theoretical modeler of deposition and etch processes in microsystems fabrication.

    SciTech Connect

    Plimpton, Steven James; Schmidt, Rodney Cannon; Ho, Pauline; Musson, Lawrence Cale

    2006-09-01

    Chemically Induced Surface Evolution with Level-Sets--ChISELS--is a parallel code for modeling 2D and 3D material depositions and etches at feature scales on patterned wafers at low pressures. Designed for efficient use on a variety of computer architectures ranging from single-processor workstations to advanced massively parallel computers running MPI, ChISELS is a platform on which to build and improve upon previous feature-scale modeling tools while taking advantage of the most recent advances in load balancing and scalable solution algorithms. Evolving interfaces are represented using the level-set method, and the evolution equations are time-integrated using a semi-Lagrangian approach [1]. The computational meshes used are quad-trees (2D) and oct-trees (3D), constructed such that grid refinement is localized to regions near the surface interfaces. As the interface evolves, the mesh is dynamically reconstructed as needed for the grid to remain fine only around the interface. For parallel computation, a domain decomposition scheme with dynamic load balancing is used to distribute the computational work across processors. A ballistic transport model is employed to solve for the fluxes incident on each of the surface elements. Surface chemistry is computed by either coupling to the CHEMKIN software [2] or by providing user-defined subroutines. This report describes the theoretical underpinnings, methods, and practical use instructions for the ChISELS 1.0 computer code.

  8. The Chaos Theory of Careers: A User's Guide

    ERIC Educational Resources Information Center

    Bright, Jim E. H.; Pryor, Robert G. L.

    2005-01-01

    The purpose of this article is to set out the key elements of the Chaos Theory of Careers. The complexity of influences on career development presents a significant challenge to traditional predictive models of career counseling. Chaos theory can provide a more appropriate description of career behavior, and the theory can be applied with clients…

  9. HTGR Application Economic Model Users' Manual

    SciTech Connect

    A.M. Gandrik

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Application Economic Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Application Economic Model calculates either the required selling price of power and/or heat for a given internal rate of return (IRR) or the IRR for power and/or heat being sold at the market price. The user can generate these economic results for a range of reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first of a kind, or nth of a kind project phases; for up to 16 reactor modules; and for module ratings of 200, 350, or 600 MWt. This users' manual contains the mathematical models and operating instructions for the HTGR Application Economic Model. Instructions, screenshots, and examples are provided to guide the user through the HTGR Application Economic Model. This model was designed for users who are familiar with the HTGR design, Excel, and engineering economics. Modification of the HTGR Application Economic Model should only be performed by users familiar with the HTGR and its applications, Excel, and Visual Basic.

  10. Theory Modeling and Simulation

    SciTech Connect

    Shlachter, Jack

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  11. The Modular Modeling System (MMS): User's Manual

    USGS Publications Warehouse

    Leavesley, G.H.; Restrepo, P.J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.

    1996-01-01

    The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.

  12. Parallel community climate model: Description and user's guide

    SciTech Connect

    Drake, J.B.; Flanery, R.E.; Semeraro, B.D.; Worley, P.H.

    1996-07-15

    This report gives an overview of a parallel version of the NCAR Community Climate Model, CCM2, implemented for MIMD massively parallel computers using a message-passing programming paradigm. The parallel implementation was developed on an Intel iPSC/860 with 128 processors and on the Intel Delta with 512 processors, and the initial target platform for the production version of the code is the Intel Paragon with 2048 processors. Because the implementation uses standard, portable message-passing libraries, the code has been easily ported to other multiprocessors supporting a message-passing programming paradigm. The parallelization strategy used is to decompose the problem domain into geographical patches and assign each processor the computation associated with a distinct subset of the patches. With this decomposition, the physics calculations involve only grid points and data local to a processor and are performed in parallel. Using parallel algorithms developed for the semi-Lagrangian transport, the fast Fourier transform and the Legendre transform, both physics and dynamics are computed in parallel with minimal data movement and modest change to the original CCM2 source code. Sequential or parallel history tapes are written and input files (in history tape format) are read sequentially by the parallel code to promote compatibility with production use of the model on other computer systems. A validation exercise has been performed with the parallel code and is detailed along with some performance numbers on the Intel Paragon and the IBM SP2. A discussion of reproducibility of results is included. A user's guide for the PCCM2 version 2.1 on the various parallel machines completes the report. Procedures for compilation, setup and execution are given. A discussion of code internals is included for those who may wish to modify and use the program in their own research.

  13. GEOS-5 Chemistry Transport Model User's Guide

    NASA Technical Reports Server (NTRS)

    Kouatchou, J.; Molod, A.; Nielsen, J. E.; Auer, B.; Putman, W.; Clune, T.

    2015-01-01

    The Goddard Earth Observing System version 5 (GEOS-5) General Circulation Model (GCM) makes use of the Earth System Modeling Framework (ESMF) to enable model configurations with many functions. One of the options of the GEOS-5 GCM is the GEOS-5 Chemistry Transport Model (GEOS-5 CTM), which is an offline simulation of chemistry and constituent transport driven by a specified meteorology and other model output fields. This document describes the basic components of the GEOS-5 CTM, and is a user's guide on how to obtain and run simulations on the NCCS Discover platform. In addition, we provide information on how to change the model configuration input files to meet users' needs.

  14. User behavioral model in hypertext environment

    NASA Astrophysics Data System (ADS)

    Moskvin, Oleksii M.; Sailarbek, Saltanat; Gromaszek, Konrad

    2015-12-01

    Users traversing Internet resources play an important role, yet in practice their activities are rarely taken into account by resource owners when adjusting and optimizing hypertext structure. An optimal hypertext structure allows users to locate pages of interest, which are the goals of their informational search, more quickly. This paper presents a model that analyzes the behavior of the user audience in order to infer users' goals within a particular hypertext segment and to find optimal routes to those goals in terms of route length and informational value. The proposed model applies mainly to systems that evaluate hypertext networks and optimize their referential structure for faster information retrieval.

  15. CONSTRUCTION OF EDUCATIONAL THEORY MODELS.

    ERIC Educational Resources Information Center

    MACCIA, ELIZABETH S.; AND OTHERS

    This study delineated models which have potential use in generating educational theory. A theory models method was formulated. By selecting and ordering concepts from other disciplines, the investigators formulated seven theory models. The final step of devising educational theory from the theory models was performed only to the extent required to…

  16. Wake Vortex Inverse Model User's Guide

    NASA Technical Reports Server (NTRS)

    Lai, David; Delisi, Donald

    2008-01-01

    NorthWest Research Associates (NWRA) has developed an inverse model for inverting landing aircraft vortex data. The data used for the inversion are the time evolution of the lateral transport position and vertical position of both the port and starboard vortices. The inverse model performs iterative forward model runs using various estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Forward model predictions of lateral transport and altitude are then compared with the observed data. Differences between the data and model predictions guide the choice of vortex parameter values, crosswind profile and circulation evolution in the next iteration. Iterations are performed until a user-defined criterion is satisfied. Currently, the inverse model is set to stop when the improvement in the rms deviation between the data and model predictions is less than 1 percent for two consecutive iterations. The forward model used in this inverse model is a modified version of the Shear-APA model. A detailed description of this forward model, the inverse model, and its validation are presented in a different report (Lai, Mellman, Robins, and Delisi, 2007). This document is a User's Guide for the Wake Vortex Inverse Model. Section 2 presents an overview of the inverse model program. Execution of the inverse model is described in Section 3. When executing the inverse model, a user is requested to provide the name of an input file which contains the inverse model parameters, the various datasets, and directories needed for the inversion. A detailed description of the list of parameters in the inversion input file is presented in Section 4. A user has an option to save the inversion results of each lidar track in a mat-file (a condensed data file in Matlab format). These saved mat-files can be used for post-inversion analysis. A description of the contents of the saved files is given in Section 5. An example of an inversion input
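
    The stopping rule quoted above (iterate until the improvement in rms deviation is below 1 percent for two consecutive iterations) maps onto a very small amount of control logic. The Python skeleton below shows only that outer loop; the straight-line forward model, the synthetic observations, and the blended least-squares parameter update are invented placeholders standing in for the Shear-APA forward model and the real parameter search.

        import numpy as np

        def forward_model(params, t):
            """Placeholder forward model: predicted lateral position versus time."""
            x0, v = params
            return x0 + v * t

        def rms(a, b):
            return float(np.sqrt(np.mean((a - b) ** 2)))

        t_obs = np.linspace(0.0, 60.0, 31)
        x_obs = 10.0 + 2.3 * t_obs + np.random.default_rng(1).normal(0.0, 1.0, t_obs.size)

        params = np.array([0.0, 1.0])                 # initial parameter guess
        prev_rms, small_improvements = np.inf, 0
        while small_improvements < 2:                 # stop after two consecutive <1% improvements
            # placeholder update: blend toward a least-squares refit of the data
            A = np.column_stack([np.ones_like(t_obs), t_obs])
            params = 0.5 * params + 0.5 * np.linalg.lstsq(A, x_obs, rcond=None)[0]
            cur_rms = rms(forward_model(params, t_obs), x_obs)
            improvement = (prev_rms - cur_rms) / prev_rms if np.isfinite(prev_rms) else 1.0
            small_improvements = small_improvements + 1 if improvement < 0.01 else 0
            prev_rms = cur_rms

        print("converged parameters:", params, "rms:", round(prev_rms, 3))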

  17. User's appraisal of yield model evaluation criteria

    NASA Technical Reports Server (NTRS)

    Warren, F. B. (Principal Investigator)

    1982-01-01

    The five major potential USDA users of AgRISTAR crop yield forecast models rated the Yield Model Development (YMD) project Test and Evaluation Criteria by the importance placed on them. These users agreed that the "TIMELINES" and "RELIABILITY" of the forecast yields would be of major importance in determining if a proposed yield model was worthy of adoption. Although there was considerable difference of opinion as to the relative importance of the other criteria, "COST", "OBJECTIVITY", "ADEQUACY", and "MEASURES OF ACCURACY" generally were felt to be more important than "SIMPLICITY" and "CONSISTENCY WITH SCIENTIFIC KNOWLEDGE". However, some of the comments which accompanied the ratings did indicate that several of the definitions and descriptions of the criteria were confusing.

  18. Pragmatic User Model Implementation in an Intelligent Help System.

    ERIC Educational Resources Information Center

    Fernandez-Manjon, Baltasar; Fernandez-Valmayor, Alfredo; Fernandez-Chamizo, Carmen

    1998-01-01

    Describes Aran, a knowledge-based system designed to help users deal with problems related to Unix operation. Highlights include adaptation to the individual user; user modeling knowledge; stereotypes; content of the individual user model; instantiation, acquisition, and maintenance of the individual model; dynamic acquisition of objective and…

  19. Data Mining for User Modeling and Personalization in Ubiquitous Spaces

    NASA Astrophysics Data System (ADS)

    Jaimes, Alejandro

    User modeling (UM) has traditionally been concerned with analyzing a user's interaction with a system and with developing cognitive models that aid in the design of user interfaces and interaction mechanisms. Elements of a user model may include representation of goals, plans, preferences, tasks, and/or abilities about one or more types of users, classification of a user into subgroups or stereotypes, the formation of assumptions about the user based on the interaction history, and the generalization of the interaction histories of many users into groups, among many others.

  1. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    USGS Publications Warehouse

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.

  2. Modeling a Theory-Based Approach to Examine the Influence of Neurocognitive Impairment on HIV Risk Reduction Behaviors Among Drug Users in Treatment.

    PubMed

    Huedo-Medina, Tania B; Shrestha, Roman; Copenhaver, Michael

    2016-08-01

    Although it is well established that people who use drugs (PWUDs) are characterized by significant neurocognitive impairment (NCI), there has been no examination of how NCI may impede one's ability to accrue the expected HIV prevention benefits stemming from an otherwise efficacious intervention. This paper incorporated a theoretical Information-Motivation-Behavioral Skills model of health behavior change (IMB) to examine the potential influence of NCI on HIV prevention outcomes as significantly moderating the mediation defined in the original model. The analysis included 304 HIV-negative opioid-dependent individuals enrolled in a community-based methadone maintenance treatment program who reported drug- and/or sex-related HIV risk behaviors in the past 6 months. Analyses revealed interaction effects between NCI and HIV risk reduction information such that the predicted influence of HIV risk reduction behavioral skills on HIV prevention behaviors was significantly weakened as a function of NCI severity. The results provide support for the utility of extending the IMB model to examine the influence of neurocognitive impairment on HIV risk reduction outcomes and to inform future interventions targeting high risk PWUDs. PMID:27052845

  3. Stimulation model for lenticular sands: Volume 2, Users manual

    SciTech Connect

    Rybicki, E.F.; Luiskutty, C.T.; Sutrick, J.S.; Palmer, I.D.; Shah, G.H.; Tomutsa, L.

    1987-07-01

    This User's Manual contains information for four fracture/proppant models. TUPROP1 contains a Geertsma and de Klerk type fracture model. The section of the program utilizing the proppant fracture geometry data from the pseudo three-dimensional highly elongated fracture model is called TUPROPC. The analogous proppant section of the program that was modified to accept fracture shape data from SA3DFRAC is called TUPROPS. TUPROPS also includes fracture closure. Finally there is the penny fracture and its proppant model, PENNPROP. In the first three chapters, the proppant sections are based on the same theory for determining the proppant distribution but have modifications to support variable height fractures and modifications to accept fracture geometry from three different fracture models. Thus, information about each proppant model in the User's Manual builds on information supplied in the previous chapter. The exception to the development of combined treatment models is the penny fracture and its proppant model. In this case, a completely new proppant model was developed. A description of how to use the combined treatment model for the penny fracture is contained in Chapter 4. 2 refs.

  4. Modelling of User Preferences and Needs in Boolean Retrieval Systems.

    ERIC Educational Resources Information Center

    Danilowicz, Czeslaw

    1994-01-01

    Discusses end-user searching in Boolean information retrieval systems; considers the role of search intermediaries; and proposes a model of user preferences that incorporates a user's profile. Highlights include document representation; information queries; document output ranking; calculating user profiles; and selecting documents for a local…

  5. Theory and modeling group

    NASA Technical Reports Server (NTRS)

    Holman, Gordon D.

    1989-01-01

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  6. The NATA code; theory and analysis. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Bade, W. L.; Yos, J. M.

    1975-01-01

    The NATA code is a computer program for calculating quasi-one-dimensional gas flow in axisymmetric nozzles and rectangular channels, primarily to describe conditions in electric arc-heated wind tunnels. The program provides solutions based on frozen chemistry, chemical equilibrium, and nonequilibrium flow with finite reaction rates. The shear and heat flux on the nozzle wall are calculated and boundary layer displacement effects on the inviscid flow are taken into account. The program contains compiled-in thermochemical, chemical kinetic and transport cross section data for high-temperature air, CO2-N2-Ar mixtures, helium, and argon. It calculates stagnation conditions on axisymmetric or two-dimensional models and conditions on the flat surface of a blunt wedge. Included in the report are: definitions of the inputs and outputs; precoded data on gas models, reactions, thermodynamic and transport properties of species, and nozzle geometries; explanations of diagnostic outputs and code abort conditions; test problems; and a user's manual for an auxiliary program (NOZFIT) used to set up analytical curvefits to nozzle profiles.

  7. Modeling users' activity on Twitter networks: validation of Dunbar's number

    NASA Astrophysics Data System (ADS)

    Goncalves, Bruno; Perra, Nicola; Vespignani, Alessandro

    2012-02-01

    Microblogging and mobile devices appear to augment human social capabilities, which raises the question whether they remove cognitive or biological constraints on human communication. In this paper we analyze a dataset of Twitter conversations collected across six months involving 1.7 million individuals and test the theoretical cognitive limit on the number of stable social relationships known as Dunbar's number. We find that the data are in agreement with Dunbar's result; users can entertain a maximum of 100-200 stable relationships. Thus, the ``economy of attention'' is limited in the online world by cognitive and biological constraints as predicted by Dunbar's theory. We propose a simple model for users' behavior that includes finite priority queuing and time resources that reproduces the observed social behavior.

  8. Composing user models through logic analysis.

    PubMed

    Bergeron, B P; Shiffman, R N; Rouse, R L; Greenes, R A

    1991-01-01

    The evaluation of tutorial strategies, interface designs, and courseware content is an area of active research in the medical education community. Many of the evaluation techniques that have been developed (e.g., program instrumentation) commonly produce data that are difficult to decipher or to interpret effectively. We have explored the use of decision tables to automatically simplify and categorize data for the composition of user models--descriptions of students' learning styles and preferences. An approach to user modeling that is based on decision tables has numerous advantages compared with traditional manual techniques or methods that rely on rule-based expert systems or neural networks. Decision tables provide a mechanism whereby overwhelming quantities of data can be condensed into an easily interpreted and manipulated form. Compared with conventional rule-based expert systems, decision tables are more amenable to modification. Unlike classification systems based on neural networks, the entries in decision tables are readily available for inspection and manipulation. Decision tables, as descriptions of observed behavior, also provide automatic checks for ambiguity in the tracking data. PMID:1807690
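
    As a concrete, entirely invented illustration of the decision-table idea described above, the condition/action pairing can be as simple as a lookup keyed on discretized tracking observations. The categories, thresholds, and labels in the Python sketch below are hypothetical, not those of the authors' system; because every condition combination is enumerated, a missing or duplicated row is itself an immediate ambiguity check.

        # Hypothetical table: (help requests, error rate, revisits) -> learner description
        DECISION_TABLE = {
            ("high", "high", "high"): "needs guided tutorial",
            ("high", "high", "low"):  "impulsive explorer",
            ("high", "low",  "high"): "methodical reviewer",
            ("high", "low",  "low"):  "confidence seeker",
            ("low",  "high", "high"): "struggling independent learner",
            ("low",  "high", "low"):  "trial-and-error learner",
            ("low",  "low",  "high"): "thorough self-directed learner",
            ("low",  "low",  "low"):  "efficient self-directed learner",
        }

        def discretize(value, threshold):
            return "high" if value >= threshold else "low"

        def classify(help_requests, error_rate, revisits):
            """Condense raw tracking counts into a user-model category."""
            key = (discretize(help_requests, 5),
                   discretize(error_rate, 0.3),
                   discretize(revisits, 10))
            return DECISION_TABLE[key]

        print(classify(help_requests=7, error_rate=0.1, revisits=12))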

  9. Cognitive Modeling of Video Game Player User Experience

    NASA Technical Reports Server (NTRS)

    Bohil, Corey J.; Biocca, Frank A.

    2010-01-01

    This paper argues for the use of cognitive modeling to gain a detailed and dynamic look into user experience during game play. Applying cognitive models to game play data can help researchers understand a player's attentional focus, memory status, learning state, and decision strategies (among other things) as these cognitive processes occur throughout game play. This is in stark contrast to the common approach of trying to assess the long-term impact of games on cognitive functioning after game play has ended. We describe what cognitive models are, what they can be used for, and how game researchers could benefit by adopting these methods. We also provide details of a single model - based on decision field theory - that has been successfully applied to data sets from memory, perception, and decision making experiments, and has recently found application in real world scenarios. We examine possibilities for applying this model to game-play data.

  10. ModelMate - A graphical user interface for model analysis

    USGS Publications Warehouse

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  11. The capillary hysteresis model HYSTR: User's guide

    SciTech Connect

    Niemi, A.; Bodvarsson, G.S.

    1991-11-01

    The potential disposal of nuclear waste in the unsaturated zone at Yucca Mountain, Nevada, has generated increased interest in the study of fluid flow through unsaturated media. In the near future, large-scale field tests will be conducted at the Yucca Mountain site, and work is now being done to design and analyze these tests. As part of these efforts a capillary hysteresis model has been developed. A computer program to calculate the hysteretic relationship between capillary pressure (φ) and liquid saturation (S_l) has been written that is designed to be easily incorporated into any numerical unsaturated flow simulator that computes capillary pressure as a function of liquid saturation. This report gives a detailed description of the model along with information on how it can be interfaced with a transport code. Although the model was developed specifically for calculations related to nuclear waste disposal, it should be applicable to any capillary hysteresis problem for which the secondary and higher order scanning curves can be approximated from the first order scanning curves. HYSTR is a set of subroutines to calculate capillary pressure for a given liquid saturation under hysteretic conditions.
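
    The sketch below illustrates the general idea of a scanning curve interpolated between bounding drying and wetting retention curves, using a van Genuchten form with hypothetical parameters; it is a generic illustration, not the HYSTR algorithm or its simulator interface.

      # Generic capillary-hysteresis illustration (not the HYSTR algorithm): separate
      # bounding drying and wetting curves of van Genuchten form, plus a first-order
      # wetting scanning curve blended between them.  Parameter values are hypothetical.

      def van_genuchten_pc(s_eff, alpha, n):
          """Capillary pressure [Pa] for effective saturation s_eff in (0, 1)."""
          m = 1.0 - 1.0 / n
          s_eff = min(max(s_eff, 1e-6), 1.0 - 1e-6)
          return (1.0 / alpha) * (s_eff ** (-1.0 / m) - 1.0) ** (1.0 / n)

      def pc_drying(s):  return van_genuchten_pc(s, alpha=1.0e-4, n=2.0)
      def pc_wetting(s): return van_genuchten_pc(s, alpha=2.0e-4, n=2.0)

      def pc_scanning(s, s_reversal):
          """First-order wetting scanning curve: departs from the drying curve at the
          reversal saturation and approaches the wetting curve as s increases."""
          w = min(max((s - s_reversal) / (1.0 - s_reversal), 0.0), 1.0)
          return (1.0 - w) * pc_drying(s) + w * pc_wetting(s)

      print(pc_drying(0.5), pc_wetting(0.5), pc_scanning(0.6, s_reversal=0.4))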

  12. Multiple Concentric Cylinder Model (MCCM) user's guide

    NASA Technical Reports Server (NTRS)

    Williams, Todd O.; Pindera, Marek-Jerzy

    1994-01-01

    A user's guide for the computer program mccm.f is presented. The program is based on a recently developed solution methodology for the inelastic response of an arbitrarily layered, concentric cylinder assemblage under thermomechanical loading which is used to model the axisymmetric behavior of unidirectional metal matrix composites in the presence of various microstructural details. These details include the layered morphology of certain types of ceramic fibers, as well as multiple fiber/matrix interfacial layers recently proposed as a means of reducing fabrication-induced and in-service residual stresses. The computer code allows efficient characterization and evaluation of new fibers and/or new coating systems on existing fibers with a minimum of effort, taking into account inelastic and temperature-dependent properties and different morphologies of the fiber and the interfacial region. It also facilitates efficient design of engineered interfaces for unidirectional metal matrix composites.

  13. Videogrammetric Model Deformation Measurement System User's Manual

    NASA Technical Reports Server (NTRS)

    Dismond, Harriett R.

    2002-01-01

    The purpose of this manual is to provide the user of the NASA VMD system, running the MDef software, Version 1.10, all information required to operate the system. The NASA Videogrammetric Model Deformation system consists of an automated videogrammetric technique used to measure the change in wing twist and bending under aerodynamic load in a wind tunnel. The basic instrumentation consists of a single CCD video camera and a frame grabber interfaced to a computer. The technique is based upon a single-view photogrammetric determination of two-dimensional coordinates of wing targets with a fixed (and known) third coordinate, namely the spanwise location. The major consideration in the development of the measurement system was that productivity must not be appreciably reduced.

  14. Predicting radiative heat transfer in thermochemical nonequilibrium flow fields. Theory and user's manual for the LORAN code

    NASA Technical Reports Server (NTRS)

    Chambers, Lin Hartung

    1994-01-01

    The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.

  15. Modeling and flow theory

    SciTech Connect

    Not Available

    1981-10-01

    (1) We recommend the establishment of an experimental test facility, appropriately instrumented, dedicated to research on theoretical modeling concepts. Validation of models for the various flow regimes, and establishment of the limitations of concepts used in the construction of models, are sorely needed areas of research. There exists no mechanism currently for funding of such research on a systematic basis. Such a facility would provide information fundamental to progress in the physics of turbulent multi-phase flow, which would also have impact on the understanding of coal utilization processes; (2) combustion research appears to have special institutional barriers to information exchange because it is an established, commercial ongoing effort, with heavy reliance on empirical data for proprietary configurations; (3) for both gasification and combustion reactors, current models appear to handle adequately some, perhaps even most, gross aspects of the reactors such as overall efficiency and major chemical output constituents. However, new and more stringent requirements concerning NOX, SOX and POX (small particulate) production require greater understanding of process details and spatial inhomogeneities, hence refinement of current models to include some greater detail is necessary; (4) further progress in the theory of single-phase turbulent flow would benefit our understanding of both combustors and gasifiers; and (5) another area in which theoretical development would be extremely useful is multi-phase flow.

  16. SubDyn User's Guide and Theory Manual

    SciTech Connect

    Damiani, Rick; Jonkman, Jason; Hayman, Greg

    2015-09-01

    SubDyn is a time-domain structural-dynamics module for multimember fixed-bottom substructures created by the National Renewable Energy Laboratory (NREL) through U.S. Department of Energy Wind and Water Power Program support. The module has been coupled into the FAST aero-hydro-servo-elastic computer-aided engineering (CAE) tool. Substructure types supported by SubDyn include monopiles, tripods, jackets, and other lattice-type substructures common for offshore wind installations in shallow and transitional water depths. SubDyn can also be used to model lattice support structures for land-based wind turbines. This document is organized as follows. Section 1 details how to obtain the SubDyn and FAST software archives and run either the stand-alone SubDyn or SubDyn coupled to FAST. Section 2 describes the SubDyn input files. Section 3 discusses the output files generated by SubDyn; these include echo files, a summary file, and the results file. Section 4 provides modeling guidance when using SubDyn. The SubDyn theory is covered in Section 5. Section 6 outlines future work, and Section 7 contains a list of references. Example input files are shown in Appendixes A and B. A summary of available output channels is found in Appendix C. Instructions for compiling the stand-alone SubDyn program are detailed in Appendix D. Appendix E tracks the major changes we have made to SubDyn for each public release.

  17. Theory, Modeling and Simulation Annual Report 2000

    SciTech Connect

    Dixon, David A.; Garrett, Bruce C.; Straatsma, Tp; Jones, Donald R.; Studham, Ronald S.; Harrison, Robert J.; Nichols, Jeffrey A.

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  18. Theory, Modeling and Simulation Annual Report 2000

    SciTech Connect

    Dixon, David A; Garrett, Bruce C; Straatsma, TP; Jones, Donald R; Studham, Scott; Harrison, Robert J; Nichols, Jeffrey A

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM and S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  19. Towards a Ubiquitous User Model for Profile Sharing and Reuse

    PubMed Central

    de Lourdes Martinez-Villaseñor, Maria; Gonzalez-Mendoza, Miguel; Hernandez-Gress, Neil

    2012-01-01

    People interact with systems and applications through several devices and are willing to share information about preferences, interests and characteristics. Social networking profiles, data from advanced sensors attached to personal gadgets, and semantic web technologies such as FOAF and microformats are valuable sources of personal information that could provide a fair understanding of the user, but profile information is scattered over different user models. Some researchers in the ubiquitous user modeling community envision the need to share user models' information from heterogeneous sources. In this paper, we address the syntactic and semantic heterogeneity of user models in order to enable user modeling interoperability. We present a dynamic user profile structure based on the Simple Knowledge Organization System (SKOS) to provide knowledge representation for a ubiquitous user model. We propose a two-tier matching strategy for concept schema alignment to enable user modeling interoperability. Our proposal is demonstrated in the application scenario of sharing and reusing data in order to deal with overweight and obesity. PMID:23201995

  20. Towards a ubiquitous user model for profile sharing and reuse.

    PubMed

    Martinez-Villaseñor, Maria de Lourdes; Gonzalez-Mendoza, Miguel; Hernandez-Gress, Neil

    2012-09-28

    People interact with systems and applications through several devices and are willing to share information about preferences, interests and characteristics. Social networking profiles, data from advanced sensors attached to personal gadgets, and semantic web technologies such as FOAF and microformats are valuable sources of personal information that could provide a fair understanding of the user, but profile information is scattered over different user models. Some researchers in the ubiquitous user modeling community envision the need to share user models' information from heterogeneous sources. In this paper, we address the syntactic and semantic heterogeneity of user models in order to enable user modeling interoperability. We present a dynamic user profile structure based on the Simple Knowledge Organization System (SKOS) to provide knowledge representation for a ubiquitous user model. We propose a two-tier matching strategy for concept schema alignment to enable user modeling interoperability. Our proposal is demonstrated in the application scenario of sharing and reusing data in order to deal with overweight and obesity.

  1. Macro System Model (MSM) User Guide, Version 1.3

    SciTech Connect

    Ruth, M.; Diakov, V.; Sa, T.; Goldsby, M.

    2011-09-01

    This user guide describes the macro system model (MSM). The MSM has been designed to allow users to analyze the financial, environmental, transitional, geographical, and R&D issues associated with the transition to a hydrogen economy. Basic end users can use the MSM to answer cross-cutting questions that were previously difficult to answer in a consistent and timely manner due to various assumptions and methodologies among different models.

  2. JEDI Marine and Hydrokinetic Model: User Reference Guide

    SciTech Connect

    Goldberg, M.; Previsic, M.

    2011-04-01

    The Jobs and Economic Development Impact Model (JEDI) for Marine and Hydrokinetics (MHK) is a user-friendly spreadsheet-based tool designed to demonstrate the economic impacts associated with developing and operating MHK power systems in the United States. The JEDI MHK User Reference Guide was developed to assist users in using and understanding the model. This guide provides information on the model's underlying methodology, as well as the sources and parameters used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted.

  3. Evaluation Theory, Models, and Applications

    ERIC Educational Resources Information Center

    Stufflebeam, Daniel L.; Shinkfield, Anthony J.

    2007-01-01

    "Evaluation Theory, Models, and Applications" is designed for evaluators and students who need to develop a commanding knowledge of the evaluation field: its history, theory and standards, models and approaches, procedures, and inclusion of personnel as well as program evaluation. This important book shows how to choose from a growing array of…

  4. Modeling User Behavior and Attention in Search

    ERIC Educational Resources Information Center

    Huang, Jeff

    2013-01-01

    In Web search, query and click log data are easy to collect but they fail to capture user behaviors that do not lead to clicks. As search engines reach the limits inherent in click data and are hungry for more data in a competitive environment, mining cursor movements, hovering, and scrolling becomes important. This dissertation investigates how…

  5. The 3DGRAPE book: Theory, users' manual, examples

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.

    1989-01-01

    A users' manual for a new three-dimensional grid generator called 3DGRAPE is presented. The program, written in FORTRAN, is capable of making zonal (blocked) computational grids in or about almost any shape. Grids are generated by the solution of Poisson's differential equations in three dimensions. The program automatically finds its own values for inhomogeneous terms which give near-orthogonality and controlled grid cell height at boundaries. Grids generated by 3DGRAPE have been applied to both viscous and inviscid aerodynamic problems, and to problems in other fluid-dynamic areas. The smoothness for which elliptic methods are known is seen here, including smoothness across zonal boundaries. An introduction giving the history, motivation, capabilities, and philosophy of 3DGRAPE is presented first. Then follows a chapter on the program itself. The input is then described in detail. A chapter on reading the output and debugging follows. Three examples are then described, including sample input data and plots of output. Last is a chapter on the theoretical development of the method.

  6. Artificial intelligence techniques for modeling database user behavior

    NASA Technical Reports Server (NTRS)

    Tanner, Steve; Graves, Sara J.

    1990-01-01

    The design and development of the adaptive modeling system are described. This system models how a user accesses a relational database management system in order to improve its performance by discovering user access patterns. In the current system, these patterns are used to improve the user interface and may be used to speed data retrieval, support query optimization and support a more flexible data representation. The system models both syntactic and semantic information about the user's access and employs both procedural and rule-based logic to manipulate the model.

  7. A Driving Behaviour Model of Electrical Wheelchair Users

    PubMed Central

    Hamam, Y.; Djouani, K.; Daachi, B.; Steyn, N.

    2016-01-01

    In spite of the presence of powered wheelchairs, some of the users still experience steering challenges and manoeuvring difficulties that limit their capacity of navigating effectively. For such users, steering support and assistive systems may be very necessary. To appreciate the assistance, there is need that the assistive control is adaptable to the user's steering behaviour. This paper contributes to wheelchair steering improvement by modelling the steering behaviour of powered wheelchair users, for integration into the control system. More precisely, the modelling is based on the improved Directed Potential Field (DPF) method for trajectory planning. The method has facilitated the formulation of a simple behaviour model that is also linear in parameters. To obtain the steering data for parameter identification, seven individuals participated in driving the wheelchair in different virtual worlds on the augmented platform. The obtained data facilitated the estimation of user parameters, using the ordinary least square method, with satisfactory regression analysis results. PMID:27148362

  8. A Driving Behaviour Model of Electrical Wheelchair Users.

    PubMed

    Onyango, S O; Hamam, Y; Djouani, K; Daachi, B; Steyn, N

    2016-01-01

    In spite of the presence of powered wheelchairs, some of the users still experience steering challenges and manoeuvring difficulties that limit their capacity of navigating effectively. For such users, steering support and assistive systems may be very necessary. To appreciate the assistance, there is need that the assistive control is adaptable to the user's steering behaviour. This paper contributes to wheelchair steering improvement by modelling the steering behaviour of powered wheelchair users, for integration into the control system. More precisely, the modelling is based on the improved Directed Potential Field (DPF) method for trajectory planning. The method has facilitated the formulation of a simple behaviour model that is also linear in parameters. To obtain the steering data for parameter identification, seven individuals participated in driving the wheelchair in different virtual worlds on the augmented platform. The obtained data facilitated the estimation of user parameters, using the ordinary least square method, with satisfactory regression analysis results. PMID:27148362
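
    Because the steering-behaviour model described above is linear in its parameters, the identification step can be illustrated with ordinary least squares on recorded driving data. The features and coefficients in the sketch below are hypothetical placeholders, not the paper's Directed Potential Field terms.

      import numpy as np

      # Sketch of linear-in-parameters identification by ordinary least squares.
      # The feature columns and "true" coefficients are invented for illustration;
      # in practice X and y would come from the recorded virtual-driving sessions.

      rng = np.random.default_rng(0)
      n = 500
      X = np.column_stack([rng.uniform(0.5, 5.0, n),     # e.g. distance to goal
                           rng.uniform(-1.0, 1.0, n),    # e.g. bearing to goal
                           rng.uniform(0.2, 3.0, n)])    # e.g. distance to nearest obstacle
      true_theta = np.array([0.8, 2.0, -0.5])
      y = X @ true_theta + rng.normal(0.0, 0.1, n)       # observed steering command

      theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
      print("estimated user parameters:", np.round(theta_hat, 3))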

  9. Characterizing Drug Non-Users as Distinctive in Prevention Messages: Implications of Optimal Distinctiveness Theory

    PubMed Central

    Comello, Maria Leonora G.

    2011-01-01

    Optimal Distinctiveness Theory posits that highly valued groups are those that can simultaneously satisfy needs to belong and to be different. The success of drug-prevention messages with a social-identity theme should therefore depend on the extent to which the group is portrayed as capable of meeting these needs. Specifically, messages that portray non-users as a large and undifferentiated majority may not be as successful as messages that emphasize uniqueness of non-users. This prediction was examined using marijuana prevention messages that depicted non-users as a distinctive or a majority group. Distinctiveness characterization lowered behavioral willingness to use marijuana among non-users (Experiment 1) and served as a source of identity threat (contingent on gender) among users (Experiment 2). PMID:21409672

  10. A Computational Theory of Modelling

    NASA Astrophysics Data System (ADS)

    Rossberg, Axel G.

    2003-04-01

    A metatheory is developed which characterizes the relationship between a modelled system, which complies with some ``basic theory'', and a model, which does not, and yet reproduces important aspects of the modelled system. A model is represented by an (in a certain sense, s.b.) optimal algorithm which generates data that describe the model's state or evolution complying with a ``reduced theory''. Theories are represented by classes of (in a similar sense, s.b.) optimal algorithms that test if their input data comply with the theory. The metatheory does not prescribe the formalisms (data structure, language) to be used for the description of states or evolutions. Transitions to other formalisms and loss of accuracy, common to theory reduction, are explicitly accounted for. The basic assumption of the theory is that resources such as the code length (~ programming time) and the computation time for modelling and testing are costly, but the relative cost of each resource is unknown. Thus, if there is an algorithm a for which there is no other algorithm b solving the same problem but using less of each resource, then a is considered optimal. For tests (theories), the set X of wrongly admitted inputs is treated as another resource. It is assumed that X1 is cheaper than X2 when X1 ⊂ X2 (X1 ≠ X2). Depending on the problem, the algorithmic complexity of a reduced theory can be smaller or larger than that of the basic theory. The theory might help to distinguish actual properties of complex systems from mere mental constructs. An application to complex spatio-temporal patterns is discussed.

  11. Quantify uncertain emergency search techniques (QUEST) -- Theory and user's guide

    SciTech Connect

    Johnson, M.M.; Goldsby, M.E.; Plantenga, T.D.; Porter, T.L.; West, T.H.; Wilcox, W.B.; Hensley, W.K.

    1998-01-01

    As recent world events show, criminal and terrorist access to nuclear materials is a growing national concern. The national laboratories are taking the lead in developing technologies to counter these potential threats to the national security. Sandia National Laboratories, with support from Pacific Northwest National Laboratory and the Bechtel Nevada Remote Sensing Laboratory, has developed QUEST (a model to Quantify Uncertain Emergency Search Techniques) to enhance the performance of organizations in the search for lost or stolen nuclear material. In addition, QUEST supports a wide range of other applications, such as environmental monitoring, nuclear facilities inspections, and searcher training. QUEST simulates the search for nuclear materials and calculates detector response for various source types and locations. The probability of detecting a radioactive source during a search is a function of many different variables, including source type, search location and structure geometry (including shielding), search dynamics (path and speed), and detector type and size. Through calculation of dynamic detector response, QUEST makes possible quantitative comparisons of various sensor technologies and search patterns. The QUEST model can be used as a tool to examine the impact of new detector technologies, explore alternative search concepts, and provide interactive search/inspector training.

  12. Modeling the behavior of the computer-assisted instruction user

    SciTech Connect

    Stoddard, M.L.

    1983-01-01

    The field of computer-assisted instruction (CAI) contains abundant studies on the effectiveness of particular programs or systems. However, the nature of the field is such that the computer is the focus of research, not the users. Few research studies have focused on the behavior of the individual CAI user. Morgan (1981) stated that descriptive studies are needed to clarify what the important phenomena of user behavior are. The need for such studies is particularly acute in computer-assisted instruction. Building a behavioral model would enable us to understand problem-solving strategies and rules applied by the user during a CAI experience. Also, courseware developers could use this information to design tutoring systems that are more responsive to individual differences than our present CAI is. This paper proposes a naturalistic model for evaluating both affective and cognitive characteristics of the CAI user. It begins with a discussion of features of user behavior, followed by a description of evaluation methodology that can lead to modeling user behavior. The paper concludes with a discussion of how implementation of this model can contribute to the fields of CAI and cognitive psychology.

  13. The hydrogen futures simulation model (H2Sim) user's guide.

    SciTech Connect

    Kamery, William; Baker, Arnold Barry; Drennen, Thomas E.; Rosthal, Jennifer Elizabeth

    2004-11-01

    The Hydrogen Futures Simulation Model (H2Sim) is a high level, internally consistent, strategic tool for exploring the options of a hydrogen economy. Once the user understands how to use the basic functions, H2Sim can be used to examine a wide variety of scenarios, such as testing different options for the hydrogen pathway, altering key assumptions regarding hydrogen production, storage, transportation, and end use costs, and determining the effectiveness of various options on carbon mitigation. This User's Guide explains how to run the model for the first-time user.

  14. FEM3C, An improved three-dimensional heavy-gas dispersion model: User's manual

    SciTech Connect

    Chan, S.T.

    1994-03-01

    FEM3C is another upgraded version of FEM3 (a three-dimensional Finite Element Model), which was developed primarily for simulating the atmospheric dispersion of heavier-than-air gas (or heavy gas) releases, based on solving the fully three-dimensional, time-dependent conservation equations of mass, momentum, energy, and species of an inert gas or a pollutant in the form of vapor/droplets. A generalized anelastic approximation, together with the ideal gas law for the density of the gas/air mixture, is invoked to preclude sound waves and allow large density variations in both space and time. The numerical algorithm utilizes a modified Galerkin finite element method to discretize spatially the time-dependent conservation equations of mass, momentum, energy, and species. A consistent pressure Poisson equation is formed and solved separately from the time-dependent equations, which are sequentially solved and integrated in time via a modified forward Euler method. The model can handle instantaneous, finite-duration, and continuous releases. Also, it is capable of treating terrain and obstructions. Besides a K-theory model using similarity functions, an advanced turbulence model based on solving the k-ε transport equations is available as well. Imbedded in the code are also options for solving the Boussinesq equations. In this report, an overview of the model is given, user's guides for using the model are provided, and example problems are presented to illustrate the usage of the model.

  15. An Investigation of the Integrated Model of User Technology Acceptance: Internet User Samples in Four Countries

    ERIC Educational Resources Information Center

    Fusilier, Marcelline; Durlabhji, Subhash; Cucchi, Alain

    2008-01-01

    National background of users may influence the process of technology acceptance. The present study explored this issue with the new, integrated technology use model proposed by Sun and Zhang (2006). Data were collected from samples of college students in India, Mauritius, Reunion Island, and United States. Questionnaire methodology and…

  16. [Systematized care in cardiac preoperative: theory of human caring in the perspective of nurses and users].

    PubMed

    Amorim, Thais Vasconselos; Arreguy-Sena, Cristina; Alves, Marcelo da Silva; Salimena, Anna Maria de Oliveira

    2014-01-01

    This is a case study that aimed to understand, through the adoption of the Theory of Human Caring, the meanings of the therapeutic interpersonal relationship between nurse and user in the preoperative nursing visit after the experience of the surgical process. The convenience sample was composed of three nurses and three users of an institution that has updated records to perform highly complex cardiovascular surgery, comprising nine combinations of therapeutic interactions. Instruments structured according to the theory of Jean Watson and the North American Nursing Diagnosis Association, Nursing Intervention Classification, and Nursing Outcomes Classification taxonomies were used. The legal and ethical aspects of research involving human subjects were assured. The results revealed three clusters conveying the significance of preoperative visits for users and five clusters capturing the perception of nurses during this clinical experience.

  17. How Homeless Sector Workers Deal with the Death of Service Users: A Grounded Theory Study

    ERIC Educational Resources Information Center

    Lakeman, Richard

    2011-01-01

    Homeless sector workers often encounter the deaths of service users. A modified grounded theory methodology project was used to explore how workers make sense of, respond to, and cope with sudden death. In-depth interviews were undertaken with 16 paid homeless sector workers who had experienced the death of someone with whom they worked.…

  18. Precipitation-runoff modeling system; user's manual

    USGS Publications Warehouse

    Leavesley, G.H.; Lichty, R.W.; Troutman, B.M.; Saindon, L.G.

    1983-01-01

    The concepts, structure, theoretical development, and data requirements of the precipitation-runoff modeling system (PRMS) are described. The precipitation-runoff modeling system is a modular-design, deterministic, distributed-parameter modeling system developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow, sediment yields, and general basin hydrology. Basin response to normal and extreme rainfall and snowmelt can be simulated to evaluate changes in water balance relationships, flow regimes, flood peaks and volumes, soil-water relationships, sediment yields, and groundwater recharge. Parameter-optimization and sensitivity analysis capabilities are provided to fit selected model parameters and evaluate their individual and joint effects on model output. The modular design provides a flexible framework for continued model system enhancement and hydrologic modeling research and development. (Author's abstract)

  19. User's instructions for the cardiovascular Walters model

    NASA Technical Reports Server (NTRS)

    Croston, R. C.

    1973-01-01

    The model is a combined, steady-state cardiovascular and thermal model. It was originally developed for interactive use, but was converted to batch mode simulation for the Sigma 3 computer. The purpose of the model is to compute steady-state circulatory and thermal variables in response to exercise work loads and environmental factors. During a computer simulation run, several selected variables are printed at each time step. End conditions are also printed at the completion of the run.

  20. Modeling User Behavior in Computer Learning Tasks.

    ERIC Educational Resources Information Center

    Mantei, Marilyn M.

    Model building techniques from Artificial Intelligence and Information-Processing Psychology are applied to human-computer interface tasks to evaluate existing interfaces and suggest new and better ones. The model is in the form of an augmented transition network (ATN) grammar which is built by applying grammar induction heuristics on a sequential…

  1. Randomized Item Response Theory Models

    ERIC Educational Resources Information Center

    Fox, Jean-Paul

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers on sensitive questions. A new method is developed to measure latent variables using the RR technique because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by an item response theory (IRT) model. The RR…
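
    The sketch below illustrates the combination in miniature: a Rasch model gives the probability of the latent true answer, and a forced-response randomizing device perturbs what is actually observed. All parameter values are hypothetical, and the code shows the general idea rather than the estimation method of the cited paper.

      import math

      # Forced-response randomized response combined with a Rasch item model.
      # With probability c the respondent answers truthfully; otherwise the
      # randomizing device forces a "yes" with probability d.  Values are hypothetical.

      def p_true_yes(theta, b):
          """Rasch probability that the latent true answer is 'yes'."""
          return 1.0 / (1.0 + math.exp(-(theta - b)))

      def p_observed_yes(theta, b, c=0.8, d=0.5):
          """Probability of an observed 'yes' after randomization."""
          return c * p_true_yes(theta, b) + (1.0 - c) * d

      for theta in (-2.0, 0.0, 2.0):
          print(theta, round(p_true_yes(theta, 0.0), 3), round(p_observed_yes(theta, 0.0), 3))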

  2. USERS MANUAL: LANDFILL GAS EMISSIONS MODEL - VERSION 2.0

    EPA Science Inventory

    The document is a user's guide for a computer model, Version 2.0 of the Landfill Gas Emissions Model (LandGEM), for estimating air pollution emissions from municipal solid waste (MSW) landfills. The model can be used to estimate emission rates for methane, carbon dioxide, nonmet...

  3. HYDROCARBON SPILL SCREENING MODEL (HSSM) VOLUME 1: USER'S GUIDE

    EPA Science Inventory

    This user's guide describes the Hydrocarbon Spill Screening Model (HSSM). The model is intended for simulation of subsurface releases of light nonaqueous phase liquids (LNAPLs). The model consists of separate modules for LNAPL flow through the vadose zone, spreading in the capil...

  4. Do recommender systems benefit users? a modeling approach

    NASA Astrophysics Data System (ADS)

    Yeung, Chi Ho

    2016-04-01

    Recommender systems are present in many web applications to guide purchase choices. They increase sales and benefit sellers, but whether they benefit customers by providing relevant products remains less explored. While in many cases the recommended products are relevant to users, in other cases customers may be tempted to purchase the products only because they are recommended. Here we introduce a model to examine the benefit of recommender systems for users, and find that recommendations from the system can be equivalent to random draws if one always follows the recommendations and seldom purchases according to his or her own preference. Nevertheless, with sufficient information about user preferences, recommendations become accurate and an abrupt transition to this accurate regime is observed for some of the studied algorithms. On the other hand, we find that high estimated accuracy indicated by common accuracy metrics is not necessarily equivalent to high real accuracy in matching users with products. This disagreement between estimated and real accuracy serves as an alarm for operators and researchers who evaluate recommender systems merely with accuracy metrics. We tested our model with a real dataset and observed similar behaviors. Finally, a recommendation approach with improved accuracy is suggested. These results imply that recommender systems can benefit users, but the more frequently a user purchases the recommended products, the less relevant the recommended products are in matching user taste.
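
    A toy simulation in the spirit of the abstract (not the paper's actual model) is sketched below: each purchase either follows the recommendation, with probability f, or the user's own hidden taste, and the recommender simply promotes the most purchased item. All parameters are hypothetical.

      import random

      # Toy sketch: high f (always follow recommendations) starves the system of
      # preference information, so purchases match user taste no better than chance.

      def simulate(f, n_users=300, n_items=50, n_rounds=200, seed=0):
          rng = random.Random(seed)
          taste = [rng.randrange(n_items) for _ in range(n_users)]   # hidden preference
          counts = [1] * n_items                                      # global popularity
          matched = total = 0
          for _ in range(n_rounds):
              u = rng.randrange(n_users)
              recommended = counts.index(max(counts))
              item = recommended if rng.random() < f else taste[u]
              counts[item] += 1
              matched += (item == taste[u])
              total += 1
          return matched / total   # fraction of purchases matching the user's own taste

      for f in (0.0, 0.5, 1.0):
          print(f, round(simulate(f), 3))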

  5. METAPHOR (version 1): Users guide. [performability modeling

    NASA Technical Reports Server (NTRS)

    Furchtgott, D. G.

    1979-01-01

    General information concerning METAPHOR, an interactive software package to facilitate performability modeling and evaluation, is presented. Example systems are studied and their performabilities are calculated. Each available METAPHOR command and array generator is described. Complete METAPHOR sessions are included.

  6. Utilizing Vector Space Models for User Modeling within e-Learning Environments

    ERIC Educational Resources Information Center

    Mangina, E.; Kilbride, J.

    2008-01-01

    User modeling has been found to enhance the effectiveness and/or usability of software systems through the representation of certain properties of a particular user. This paper presents the research and the results of the development of a user modeling system for the implementation of student models within e-learning environments, utilizing vector…

  7. User's instructions for the erythropoiesis regulatory model

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The purpose of the model is to provide a method for analyzing some of the events that could account for the decrease in red cell mass observed in crewmen returning from space missions. The model is based on the premise that erythrocyte production is governed by the balance between oxygen supply and demand at a renal sensing site. Oxygen supply is taken to be a function of arterial oxygen tension, mean corpuscular hemoglobin concentration, oxy-hemoglobin carrying capacity, hematocrit, and blood flow. Erythrocyte destruction is based on the law of mass action. The instantaneous hematocrit value is derived by integrating changes in production and destruction rates and accounting for the degree of plasma dilution.
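
    The feedback structure described above can be sketched as a small difference-equation model; the functional forms and coefficients below are hypothetical placeholders, not the equations of the NASA model.

      # Red cell production rises when renal oxygen delivery falls below demand,
      # destruction follows first-order (mass-action) kinetics, and hematocrit is
      # obtained by integrating the difference.  All constants are hypothetical.

      def o2_delivery(hct, blood_flow=1.0, arterial_o2=1.0):
          return blood_flow * arterial_o2 * hct          # simplified supply term

      def simulate(hct0=0.30, o2_demand=0.40, days=120, dt=0.5,
                   k_prod=0.05, k_destr=0.0116):         # ~86-day red cell turnover
          hct = hct0
          for _ in range(int(days / dt)):
              deficit = max(o2_demand - o2_delivery(hct), 0.0)
              production = k_prod * deficit              # supply/demand imbalance drives production
              destruction = k_destr * hct                # law of mass action
              hct += dt * (production - destruction)
          return hct

      print("steady-state hematocrit estimate:", round(simulate(), 3))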

  8. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  9. Modeling User Interactions with Instructional Design Software.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    As one of a series of studies being conducted to develop a useful (predictive) model of the instructional design process that is appropriate to military technical training settings, this study performed initial evaluations on two pieces of instructional design software developed by M. David Merrill and colleagues at Utah State University i.e.,…

  10. The Michigan Space Weather Modeling Framework (SWMF) Graphical User Interface

    NASA Astrophysics Data System (ADS)

    de Zeeuw, D.; Gombosi, T.; Toth, G.; Ridley, A.

    2007-05-01

    The Michigan Space Weather Modeling Framework (SWMF) is a powerful tool available for the community that has been used to model from the Sun to Earth and beyond. As a research tool, however, it still requires user experience with parallel compute clusters and visualization tools. Thus, we have developed a graphical user interface (GUI) that assists with configuring, compiling, and running the SWMF, as well as visualizing the model output. This is accomplished through a portable web interface. Live examples will be demonstrated and visualization of several archived events will be shown.

  11. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE(TM) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  12. Simple Thermal Environment Model (STEM) User's Guide

    NASA Technical Reports Server (NTRS)

    Justus, C.G.; Batts, G. W.; Anderson, B. J.; James, B. F.

    2001-01-01

    This report presents a Simple Thermal Environment Model (STEM) for determining appropriate engineering design values to specify the thermal environment of Earth-orbiting satellites. The thermal environment of a satellite consists of three components: (1) direct solar radiation, (2) Earth-atmosphere reflected shortwave radiation, as characterized by Earth's albedo, and (3) Earth-atmosphere-emitted outgoing longwave radiation (OLR). This report, together with a companion "guidelines" report, provides methodology and guidelines for selecting "design points" for thermal environment parameters for satellites and spacecraft systems. The methods and models reported here are outgrowths of Earth Radiation Budget Experiment (ERBE) satellite data analysis and thermal environment specifications discussed by Anderson and Smith (1994). In large part, this report is intended to update (and supersede) those results.
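
    As a back-of-the-envelope illustration of the three components named above, the sketch below evaluates generic textbook values of the direct solar, albedo, and OLR fluxes seen by a nadir-facing surface in low Earth orbit; the numbers are long-term averages, not STEM design points.

      # Generic orbital thermal environment arithmetic (illustrative averages only).

      SOLAR_CONSTANT = 1367.0     # W/m^2, mean direct solar flux at 1 AU
      ALBEDO = 0.30               # Earth-average shortwave albedo
      OLR = 240.0                 # W/m^2, Earth-average outgoing longwave radiation

      def earth_view_factor(altitude_km, r_earth_km=6378.0):
          """View factor from a nadir-facing flat plate to the Earth disk."""
          ratio = r_earth_km / (r_earth_km + altitude_km)
          return ratio ** 2

      def environment_fluxes(altitude_km=500.0):
          vf = earth_view_factor(altitude_km)
          return {
              "direct_solar": SOLAR_CONSTANT,          # sunlit side, normal incidence
              "albedo": ALBEDO * SOLAR_CONSTANT * vf,  # reflected shortwave, subsolar case
              "olr": OLR * vf,                         # emitted longwave
          }

      print(environment_fluxes())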

  13. Supplement to wellbore models GWELL, GWNACL, and HOLA User's Guide

    SciTech Connect

    Hadgu, T.; Bodvarsson, G.S.

    1992-09-01

    A study was made on improving the applicability and ease of usage of the wellbore simulators HOLA, GWELL and GWNACL (Bjornsson, 1987; Aunzo et al., 1991). The study concentrated mainly on the usage of Option 2 (please refer to the User's Guide; Aunzo et al., 1991) and modeling flow of superheated steam when using these computer codes. Amendments were made to the simulators to allow implementation of a variety of input data. A wide range of input data was used to test the modifications to the codes. The study did not attempt to modify or improve the physics or formulations which were used in the models. It showed that a careful check of the input data is required. This report addresses these two areas of interest: usage of Option 2, and simulation of wellbore flow of superheated steam.

  14. Theory of hadronic nonperturbative models

    SciTech Connect

    Coester, F.; Polyzou, W.N.

    1995-08-01

    As more data probing hadron structure become available, hadron models based on nonperturbative relativistic dynamics will be increasingly important for their interpretation. Relativistic Hamiltonian dynamics of few-body systems (constituent-quark models) and many-body systems (parton models) provides a precisely defined approach and a useful phenomenology. However, such models lack a quantitative foundation in quantum field theory. The specification of a quantum field theory by a Euclidean action provides a basis for the construction of nonperturbative models designed to maintain essential features of the field theory. For finite systems it is possible to satisfy axioms which guarantee the existence of a Hilbert space with a unitary representation of the Poincare group and the spectral condition which ensures that the spectrum of the four-momentum operator is in the forward light cone. The separate axiom which guarantees locality of the field operators can be weakened for the construction of few-body models. In this context we are investigating algebraic and analytic properties of model Schwinger functions. This approach promises insight into the relations between hadronic models based on relativistic Hamiltonian dynamics on one hand and Bethe-Salpeter Green's-function equations on the other.

  15. Modeling mutual feedback between users and recommender systems

    NASA Astrophysics Data System (ADS)

    Zeng, An; Yeung, Chi Ho; Medo, Matúš; Zhang, Yi-Cheng

    2015-07-01

    Recommender systems daily influence our decisions on the Internet. While considerable attention has been given to issues such as recommendation accuracy and user privacy, the long-term mutual feedback between a recommender system and the decisions of its users has been neglected so far. We propose here a model of network evolution which allows us to study the complex dynamics induced by this feedback, including the hysteresis effect which is typical for systems with non-linear dynamics. Despite the popular belief that recommendation helps users to discover new things, we find that the long-term use of recommendation can contribute to the rise of extremely popular items and thus ultimately narrow the user choice. These results are supported by measurements of the time evolution of item popularity inequality in real systems. We show that this adverse effect of recommendation can be tamed by sacrificing part of short-term recommendation accuracy.

  16. Solid rocket booster performance evaluation model. Volume 2: Users manual

    NASA Technical Reports Server (NTRS)

    1974-01-01

    This users manual for the solid rocket booster performance evaluation model (SRB-II) contains descriptions of the model, the program options, the required program inputs, the program output format and the program error messages. SRB-II is written in FORTRAN and is operational on both the IBM 370/155 and the MSFC UNIVAC 1108 computers.

  17. Earthquake Early Warning Beta Users: Java, Modeling, and Mobile Apps

    NASA Astrophysics Data System (ADS)

    Strauss, J. A.; Vinci, M.; Steele, W. P.; Allen, R. M.; Hellweg, M.

    2014-12-01

    Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds warning prior to ground shaking at a user's location. The goal and purpose of such a system is to reduce, or minimize, the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, University of Washington, the USGS, and beta users in California and the Pacific Northwest. The beta users receive earthquake information very rapidly in real-time and are providing feedback on their experiences of performance and potential uses within their organization. Beta user interactions allow the ShakeAlert team to discern: which alert delivery options are most effective, what changes would make the UserDisplay more useful in a pre-disaster situation, and most importantly, what actions users plan to take for various scenarios. Actions could include: personal safety approaches, such as drop cover, and hold on; automated processes and procedures, such as opening elevator or fire stations doors; or situational awareness. Users are beginning to determine which policy and technological changes may need to be enacted, and funding requirements to implement their automated controls. The use of models and mobile apps are beginning to augment the basic Java desktop applet. Modeling allows beta users to test their early warning responses against various scenarios without having to wait for a real event. Mobile apps are also changing the possible response landscape, providing other avenues for people to receive information. All of these combine to improve business continuity and resiliency.

  18. Modeling Users, Context and Devices for Ambient Assisted Living Environments

    PubMed Central

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-01-01

    The participation of users within AAL environments is increasing thanks to the capabilities of the current wearable devices. Furthermore, considering users' preferences, context conditions and devices' capabilities helps smart environments to personalize services and resources for them. Being aware of different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give some meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices considering different approaches in the Ambient Assisted Living domain. We also point out different ongoing standardization efforts in this area. Finally, we discuss the techniques used, the characteristics modeled, and the advantages and drawbacks of each approach, and draw several conclusions about the reviewed works. PMID:24643006

  19. Geothermal loan guaranty cash flow model: description and users' manual

    SciTech Connect

    Keimig, M.A.; Rosenberg, J.I.; Entingh, D.J.

    1980-11-01

    This is the user's guide for the Geothermal Loan Guaranty Cash Flow Model (GCFM). GCFM is a Fortran code which designs and costs geothermal fields and electric power plants. It contains a financial analysis module which performs life cycle costing analysis taking into account various types of taxes, costs and financial structures. The financial module includes a discounted cash flow feature which calculates a levelized breakeven price for each run. The user's guide contains descriptions of the data requirements and instructions for using the model.

  20. Understanding Deep Representations Learned in Modeling Users Likes.

    PubMed

    Guntuku, Sharath Chandra; Zhou, Joey Tianyi; Roy, Sujoy; Lin, Weisi; Tsang, Ivor W

    2016-08-01

    Automatically understanding and discriminating different users' liking for an image is a challenging problem. This is because the relationship between image features (even semantic ones extracted by existing tools, viz., faces, objects, and so on) and users' likes is non-linear, influenced by several subtle factors. This paper presents a deep bi-modal knowledge representation of images based on their visual content and associated tags (text). A mapping step between the different levels of visual and textual representations allows for the transfer of semantic knowledge between the two modalities. Feature selection is applied before learning deep representation to identify the important features for a user to like an image. The proposed representation is shown to be effective in discriminating users based on images they like and also in recommending images that a given user likes, outperforming the state-of-the-art feature representations by ∼15%-20%. Beyond this test-set performance, an attempt is made to qualitatively understand the representations learned by the deep architecture used to model user likes. PMID:27295666

  1. A mixing evolution model for bidirectional microblog user networks

    NASA Astrophysics Data System (ADS)

    Yuan, Wei-Guo; Liu, Yun

    2015-08-01

    Microblogs have been widely used as a new form of online social networking. Based on the user profile data collected from Sina Weibo, we find that the number of a microblog user's bidirectional friends approximately follows a lognormal distribution. We then build two microblog user networks with real bidirectional relationships, both of which exhibit not only small-world and scale-free properties but also some special ones, such as a double power-law degree distribution, disassortativity, and hierarchical and rich-club structure. Moreover, by detecting the community structures of the two real networks, we find that both of their community size distributions follow an exponential distribution. Based on the empirical analysis, we present a novel evolution network model with mixed connection rules, including lognormal fitness preferential and random attachment, nearest neighbor interconnection in the same community, and global random associations in different communities. The simulation results show that our model is consistent with the real networks in many topological features.
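
    Two of the ingredients named above -- lognormal fitness and a mixture of preferential and random attachment -- are combined in the schematic growth rule below; the community-level wiring is omitted and all parameters are illustrative, so this is a sketch of the flavour of the model rather than a reimplementation.

      import random

      # Schematic network growth: each new node attaches to m existing nodes, either
      # uniformly at random (probability p_random) or in proportion to fitness times
      # degree.  Parameters and the lognormal fitness spread are illustrative.

      def grow_network(n_final=2000, m=3, p_random=0.2, seed=7):
          rng = random.Random(seed)
          fitness = [rng.lognormvariate(0.0, 1.0) for _ in range(n_final)]
          degree = [0] * n_final
          for new in range(m + 1, n_final):
              targets = set()
              while len(targets) < m:
                  if rng.random() < p_random:
                      t = rng.randrange(new)                       # uniformly random old node
                  else:
                      weights = [fitness[i] * (degree[i] + 1) for i in range(new)]
                      t = rng.choices(range(new), weights=weights, k=1)[0]
                  targets.add(t)
              for t in targets:
                  degree[new] += 1
                  degree[t] += 1
          return degree

      deg = grow_network()
      print("max degree:", max(deg), " mean degree:", round(sum(deg) / len(deg), 2))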

  2. Designing visual displays and system models for safe reactor operations based on the user's perspective of the system

    SciTech Connect

    Brown-VanHoozer, S.A.

    1995-12-31

    Most designers are not schooled in the area of human-interaction psychology and therefore tend to rely on the traditional ergonomic aspects of human factors when designing complex human-interactive workstations related to reactor operations. They do not take into account the differences in user information processing behavior and how these behaviors may affect individual and team performance when accessing visual displays or utilizing system models in process and control room areas. Unfortunately, ignoring the importance of integrating the user interface at the information-processing level can result in sub-optimization and inherently error- and failure-prone systems. Therefore, to minimize or eliminate failures in human-interactive systems, it is essential that the designers understand how each user's processing characteristics affect how the user gathers information, and how the user communicates the information to the designer and other users. A different type of approach to achieving this understanding is Neuro Linguistic Programming (NLP). The material presented in this paper is based on two studies involving the design of visual displays, NLP, and the user's perspective model of a reactor system. The studies involve the methodology known as NLP, and its use in expanding design choices from the user's ``model of the world,'' in the areas of virtual reality, workstation design, team structure, decision and learning style patterns, safety operations, pattern recognition, and much, much more.

  3. User's guide for the Fugitive Dust Model (FDM). Final report

    SciTech Connect

    Winges, K.D.

    1988-06-01

    This document provides a technical description and User's Instructions for the Fugitive Dust Model. The FDM is a Gaussian-plume-based dispersion model specifically designed for computation of fugitive-dust concentrations and deposition rates. Its chief advantage over other models is an advanced deposition algorithm. A validation study has been performed and is included as an appendix. The document also includes sample input and output printouts and a complete listing of the FORTRAN computer code.
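
    For orientation, the textbook Gaussian-plume ground-level concentration calculation is sketched below; the power-law dispersion coefficients and source parameters are illustrative, and the deposition algorithm that distinguishes FDM is not represented.

      import math

      # Ground-level concentration from a continuous point source (textbook
      # Gaussian plume with ground reflection).  Coefficients are illustrative.

      def sigma(x_m, a, b):
          """Plume spread (m) as a power law of downwind distance."""
          return a * x_m ** b

      def ground_concentration(q_gps, u_mps, x_m, y_m, h_m,
                               ay=0.22, by=0.90, az=0.20, bz=0.85):
          sy, sz = sigma(x_m, ay, by), sigma(x_m, az, bz)
          return (q_gps / (math.pi * u_mps * sy * sz)
                  * math.exp(-y_m ** 2 / (2.0 * sy ** 2))
                  * math.exp(-h_m ** 2 / (2.0 * sz ** 2)))

      # 10 g/s fugitive-dust source, 3 m/s wind, receptor 500 m downwind on the centerline
      print(ground_concentration(q_gps=10.0, u_mps=3.0, x_m=500.0, y_m=0.0, h_m=2.0), "g/m^3")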

  4. H2A Production Model, Version 2 User Guide

    SciTech Connect

    Steward, D.; Ramsden, T.; Zuboy, J.

    2008-09-01

    The H2A Production Model analyzes the technical and economic aspects of central and forecourt hydrogen production technologies. Using a standard discounted cash flow rate of return methodology, it determines the minimum hydrogen selling price, including a specified after-tax internal rate of return from the production technology. Users have the option of accepting default technology input values--such as capital costs, operating costs, and capacity factor--from established H2A production technology cases or entering custom values. Users can also modify the model's financial inputs. This new version of the H2A Production Model features enhanced usability and functionality. Input fields are consolidated and simplified. New capabilities include performing sensitivity analyses and scaling analyses to various plant sizes. This User Guide helps users already familiar with the basic tenets of H2A hydrogen production cost analysis get started using the new version of the model. It introduces the basic elements of the model then describes the function and use of each of its worksheets.
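
    The pricing logic described above can be illustrated with a schematic discounted-cash-flow calculation: find the hydrogen price at which the project's net present value is zero at the target rate of return. The sketch below omits the taxes, depreciation, and construction schedule that the real H2A model handles, and the plant numbers are hypothetical.

      # Schematic minimum-selling-price calculation: bisection on NPV(price) = 0
      # at the target rate of return.  Deliberately simplified relative to H2A.

      def npv(price, capital, annual_om, annual_kg, rate, years):
          cash_flows = [-capital] + [(price * annual_kg - annual_om)] * years
          return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

      def minimum_selling_price(capital, annual_om, annual_kg, rate=0.10, years=20):
          lo, hi = 0.0, 100.0                      # $/kg search bracket
          for _ in range(60):
              mid = 0.5 * (lo + hi)
              lo, hi = (mid, hi) if npv(mid, capital, annual_om, annual_kg, rate, years) < 0 else (lo, mid)
          return 0.5 * (lo + hi)

      # Hypothetical plant: $400M capital, $50M/yr O&M, 50 million kg/yr hydrogen
      print(round(minimum_selling_price(400e6, 50e6, 50e6), 2), "$/kg")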

  5. Shawnee flue gas desulfurization computer model users manual

    SciTech Connect

    Sudhoff, F.A.; Torstrick, R.L.

    1985-03-01

    In conjunction with the US Environmental Protection Agency sponsored Shawnee test program, Bechtel National, Inc., and the Tennessee Valley Authority jointly developed a computer model capable of projecting preliminary design and economics for lime- and limestone-scrubbing flue gas desulfurization systems. The model is capable of projecting relative economics for spray tower, turbulent contact absorber, and venturi-spray tower scrubbing options. It may be used to project the effect on system design and economics of variations in required SO2 removal, scrubber operating parameters (gas velocity, liquid-to-gas (L/G) ratio, alkali stoichiometry, liquor hold time in slurry recirculation tanks), reheat temperature, and scrubber bypass. It may also be used to evaluate the effects of alternative waste disposal methods or additives (MgO or adipic acid) on costs for the selected process. Although the model is not intended to project the economics of an individual system to a high degree of accuracy, it allows prospective users to quickly project comparative design and costs for limestone and lime case variations on a common design and cost basis. The users manual provides a general description of the Shawnee FGD computer model and detailed instructions for its use. It describes and explains the user-supplied input data which are required, such as boiler size, coal characteristics, and SO2 removal requirements. Output includes a material balance, equipment list, and detailed capital investment and annual revenue requirements. The users manual provides information concerning the use of the overall model as well as sample runs to serve as a guide to prospective users in identifying applications. The FORTRAN-based model is maintained by TVA, from whom copies or individual runs are available. 25 refs., 3 figs., 36 tabs.

  6. Solid rocket booster thermal radiation model. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Lee, A. L.

    1976-01-01

    A user's manual was prepared for the computer program of a solid rocket booster (SRB) thermal radiation model. The following information was included: (1) structure of the program, (2) input information required, (3) examples of input cards and output printout, (4) program characteristics, and (5) program listing.

  7. Using Partial Credit and Response History to Model User Knowledge

    ERIC Educational Resources Information Center

    Van Inwegen, Eric G.; Adjei, Seth A.; Wang, Yan; Heffernan, Neil T.

    2015-01-01

    User modelling algorithms such as Performance Factors Analysis and Knowledge Tracing seek to determine a student's knowledge state by analyzing (among other features) right and wrong answers. Anyone who has ever graded an assignment by hand knows that some answers are "more wrong" than others; i.e. they display less of an understanding…

  8. Dynamic User Modeling within a Game-Based ITS

    ERIC Educational Resources Information Center

    Snow, Erica L.

    2015-01-01

    Intelligent tutoring systems are adaptive learning environments designed to support individualized instruction. The adaptation embedded within these systems is often guided by user models that represent one or more aspects of students' domain knowledge, actions, or performance. The proposed project focuses on the development and testing of user…

  9. A generalized development model for testing GPS user equipment

    NASA Technical Reports Server (NTRS)

    Hemesath, N.

    1978-01-01

    The generalized development model (GDM) program, which was intended to establish how well GPS user equipment can perform under a combination of jamming and dynamics, is described. The systems design and the characteristics of the GDM are discussed. The performance aspects of the GDM are listed and the application of the GDM to civil aviation is examined.

  10. ABAREX -- A neutron spherical optical-statistical-model code -- A user's manual

    SciTech Connect

    Smith, A.B.; Lawson, R.D.

    1998-06-01

    The contemporary version of the neutron spherical optical-statistical-model code ABAREX is summarized with the objective of providing detailed operational guidance for the user. The physical concepts involved are very briefly outlined. The code is described in some detail and a number of explicit examples are given. With this document one should very quickly become fluent with the use of ABAREX. While the code has operated on a number of computing systems, this version is specifically tailored for the VAX/VMS work station and/or the IBM-compatible personal computer.

  11. COSCREEN98 user's manual for the FDOT intersection air quality (CO) screening model

    SciTech Connect

    Cooper, C.D.; Keely, D.K.

    1999-03-01

    As with the original FDOT carbon monoxide (CO) screening test -- COSCREEN -- this new version, COSCREEN98, attempts to screen major intersections by using a hypothetical four-legged intersection. The original COSCREEN is based on CALINE3 analysis of an intersection for a specific set of temperatures. COSCREEN98 requires only slightly more input data than the original COSCREEN, but predicts CO concentrations near the intersection much more accurately. The user inputs the region, the environment, the traffic volume, the approach speed, the year, and up to ten real receptors. This model then uses MOBILE5a and CAL3QHC2 to evaluate the intersection. CAL3QHC2 is the model accepted by the US Environmental Protection Agency (EPA) for dispersion modeling at intersections. Using MOBILE5a, emission factors are automatically developed for each analysis based on the location of the project, year, and speed. The location of the project impacts the temperatures and MOBILE5a options, which are automatically set by COSCREEN98. The selected environment (urban, suburban or rural) impacts the surface roughness value and the background CO concentrations. The addition of the suburban option provides the user with additional flexibility. The one-hour and eight-hour CO concentrations (including background concentrations) are calculated at each specified receptor.

  12. A general graphical user interface for automatic reliability modeling

    NASA Technical Reports Server (NTRS)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1991-01-01

    Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have field texts, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.

  13. User interface for ground-water modeling: ArcView extension

    USGS Publications Warehouse

    Tsou, M.-S.; Whittemore, D.O.

    2001-01-01

    Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software that provides a robust user-friendly interface to facilitate data handling and display. An extension is an add-on program to ArcView that provides additional specialized functions. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension for facilitating modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the ArcView interface by external linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.

  14. Airport-Noise Levels and Annoyance Model (ALAMO) user's guide

    NASA Technical Reports Server (NTRS)

    Deloach, R.; Donaldson, J. L.; Johnson, M. J.

    1986-01-01

    A guide for the use of the Airport-Noise Level and Annoyance MOdel (ALAMO) at the Langley Research Center computer complex is provided. This document is divided into five primary sections: the introduction, the purpose of the model, and in-depth descriptions of the baseline, noise reduction simulation, and track analysis subsystems. For each subsystem, the user is provided with a description of architecture, an explanation of subsystem use, sample results, and a case runner's check list. It is assumed that the user is familiar with the operations at the Langley Research Center (LaRC) computer complex, the Network Operating System (NOS 1.4) and CYBER Control Language. Incorporated within the ALAMO model is a census database system called SITE II.

  15. The Snowmelt-Runoff Model (SRM) user's manual

    NASA Technical Reports Server (NTRS)

    Martinec, J.; Rango, A.; Major, E.

    1983-01-01

    A manual to provide a means by which a user may apply the snowmelt runoff model (SRM) unaided is presented. Model structure, conditions of application, and data requirements, including remote sensing, are described. Guidance is given for determining various model variables and parameters. Possible sources of error are discussed and conversion of the snowmelt runoff model (SRM) from the simulation mode to the operational forecasting mode is explained. A computer program for running SRM is presented that is easily adaptable to most systems used by water resources agencies.
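
    For orientation, the daily SRM recursion as usually published combines a degree-day snowmelt term and a rain term with a recession coefficient. The sketch below implements one step of that recursion; variable names, the unit-conversion comment, and the example values are ours, and the manual itself remains the authoritative formulation.

```python
def srm_daily_discharge(q_prev, degree_days, snow_cover, precip,
                        c_snow, c_rain, ddf, area_km2, k):
    """One step of the daily SRM recursion (discharge in m^3/s).

    q_prev          previous day's discharge (m^3/s)
    degree_days     T + delta-T for the day (deg C * d)
    snow_cover      fractional snow-covered area S (0-1)
    precip          precipitation contributing to runoff (cm)
    c_snow, c_rain  runoff coefficients for snowmelt and for rain
    ddf             degree-day factor a (cm per deg C * d)
    k               recession coefficient
    """
    melt_depth = c_snow * ddf * degree_days * snow_cover    # cm of snowmelt runoff
    rain_depth = c_rain * precip                            # cm of rain runoff
    # cm/day over area_km2 converted to m^3/s: cm * km^2 * 10000 / 86400
    inflow = (melt_depth + rain_depth) * area_km2 * 10000.0 / 86400.0
    return inflow * (1.0 - k) + q_prev * k

# Example: 4 degree-days, 60% snow cover, no rain, 500 km^2 basin, k = 0.9
print(round(srm_daily_discharge(20.0, 4.0, 0.6, 0.0, 0.5, 0.6, 0.45, 500.0, 0.9), 1))
```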

  16. SN_GUI: a graphical user interface for snowpack modeling

    NASA Astrophysics Data System (ADS)

    Spreitzhofer, G.; Fierz, C.; Lehning, M.

    2004-10-01

    SNOWPACK is a physical snow cover model. The model not only serves as a valuable research tool, but also runs operationally on a network of high Alpine automatic weather and snow measurement sites. In order to facilitate the operation of SNOWPACK and the interpretation of the results obtained by this model, a user-friendly graphical user interface for snowpack modeling, named SN_GUI, was created. This Java-based and thus platform-independent tool can be operated in two modes, one designed to fulfill the requirements of avalanche warning services (e.g. by providing information about critical layers within the snowpack that are closely related to the avalanche activity), and the other one offering a variety of additional options satisfying the needs of researchers. The user of SN_GUI is graphically guided through the entire process of creating snow cover simulations. The starting point is the efficient creation of input parameter files for SNOWPACK, followed by the launching of SNOWPACK with a variety of parameter settings. Finally, after the successful termination of the run, a number of interactive display options may be used to visualize the model output. Among these are vertical profiles and time profiles for many parameters. Besides other features, SN_GUI allows the use of various color, time and coordinate scales, and the comparison of measured and observed parameters.

  17. EpiPOD : community vaccination and dispensing model user's guide.

    SciTech Connect

    Berry, M.; Samsa, M.; Walsh, D.; Decision and Information Sciences

    2009-01-09

    EpiPOD is a modeling system that enables local, regional, and county health departments to evaluate and refine their plans for mass distribution of antiviral and antibiotic medications and vaccines. An intuitive interface requires users to input as few or as many plan specifics as are available in order to simulate a mass treatment campaign. Behind the input interface, a system dynamics model simulates pharmaceutical supply logistics, hospital and first-responder personnel treatment, population arrival dynamics and treatment, and disease spread. When the simulation is complete, users have estimates of the number of illnesses in the population at large, the number of ill persons seeking treatment, and queuing and delays within the mass treatment system--all metrics by which the plan can be judged.

  18. Agile IT: Thinking in User-Centric Models

    NASA Astrophysics Data System (ADS)

    Margaria, Tiziana; Steffen, Bernhard

    We advocate a new teaching direction for modern CS curricula: extreme model-driven development (XMDD), a new development paradigm designed to continuously involve the customer/application expert throughout the whole systems' life cycle. Based on the `One-Thing Approach', which works by successively enriching and refining one single artifact, system development becomes in essence a user-centric orchestration of intuitive service functionality. XMDD differs radically from classical software development, which, in our opinion, is no longer adequate for the bulk of application programming - in particular when it comes to heterogeneous, cross-organizational systems which must adapt to rapidly changing market requirements. Thus there is a need for new curricula addressing this model-driven, lightweight, and cooperative development paradigm that puts the user process in the center of the development and the application expert in control of the process evolution.

  19. Regional Ionospheric Modelling for Single-Frequency Users

    NASA Astrophysics Data System (ADS)

    Boisits, Janina; Joldzic, Nina; Weber, Robert

    2016-04-01

    Ionospheric signal delays are a main error source in GNSS-based positioning. Thus, single-frequency receivers, which are frequently used nowadays, require additional ionospheric information to mitigate these effects. Within the Austrian Research Promotion Agency (FFG) project Regiomontan (Regional Ionospheric Modelling for Single-Frequency Users) a new and as realistic as possible model is used to obtain precise GNSS ionospheric signal delays. These delays will be provided to single-frequency users to significantly increase positioning accuracy. The computational basis is the Thin-Shell Model. For regional modelling a thin electron layer of the underlying model is approximated by a Taylor series up to degree two. The network used includes 22 GNSS Reference Stations in Austria and nearby. First results were calculated from smoothed code observations by forming the geometry-free linear combination. Satellite and station DCBs were applied. In a least squares adjustment the model parameters, consisting of the VTEC0 at the origin of the investigated area, as well as the first and the second derivatives of the electron content in longitude and latitude, were obtained with a temporal resolution of 1 hour. The height of the layer was kept fixed. The formal errors of the model parameters suggest an accuracy of the VTEC slightly better than 1TECU for a user location within Austria. In a further step, the model parameters were derived from sole phase observations by using a levelling approach to mitigate common range biases. The formal errors of this model approach suggest an accuracy of about a few tenths of a TECU. For validation, the Regiomontan VTEC was compared to IGS TEC maps depicting a very good agreement. Further, a comparison of pseudoranges has been performed to calculate the 'true' error by forming the ionosphere-free linear combination on the one hand, and by applying the Regiomontan model to L1 pseudoranges on the other hand. The resulting differences are mostly
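
    The abstract describes the estimation only in words. Assuming the slant TEC has already been formed from the geometry-free combination (with satellite and receiver DCBs applied) and mapped to vertical, a least-squares fit of the second-order Taylor surface mentioned above might look like the sketch below; the variable names, units, and synthetic example are ours, not the project's.

```python
import numpy as np

def fit_vtec_surface(dlat, dlon, vtec_obs):
    """Least-squares fit of a second-order Taylor surface for VTEC about the
    area origin:
        VTEC(dlat, dlon) ~ VTEC0 + a1*dlat + a2*dlon
                           + a3*dlat**2 + a4*dlat*dlon + a5*dlon**2
    dlat, dlon are pierce-point offsets from the origin (radians); vtec_obs are
    vertical TEC observations (TECU) for one hourly batch."""
    A = np.column_stack([np.ones_like(dlat), dlat, dlon,
                         dlat ** 2, dlat * dlon, dlon ** 2])
    coeffs, *_ = np.linalg.lstsq(A, vtec_obs, rcond=None)
    return coeffs                      # [VTEC0, a1, a2, a3, a4, a5]

# Synthetic hourly batch: 200 pierce points and a known surface plus noise
rng = np.random.default_rng(1)
dlat = rng.uniform(-0.05, 0.05, 200)
dlon = rng.uniform(-0.08, 0.08, 200)
truth = 12.0 + 30.0 * dlat - 15.0 * dlon + 50.0 * dlat * dlon
print(fit_vtec_surface(dlat, dlon, truth + 0.3 * rng.standard_normal(200)))
```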

  20. Simplified analytical model of penetration with lateral loading -- User's guide

    SciTech Connect

    Young, C.W.

    1998-05-01

    The SAMPLL (Simplified Analytical Model of Penetration with Lateral Loading) computer code was originally developed in 1984 to realistically yet economically predict penetrator/target interactions. Since the code's inception, its use has spread throughout the conventional and nuclear penetrating weapons community. During the penetrator/target interaction, the resistance of the material being penetrated imparts both lateral and axial loads on the penetrator. These loads cause changes to the penetrator's motion (kinematics). SAMPLL uses empirically based algorithms, formulated from an extensive experimental data base, to replicate the loads the penetrator experiences during penetration. The lateral loads resulting from angle of attack and trajectory angle of the penetrator are explicitly treated in SAMPLL. The loads are summed and the kinematics calculated at each time step. SAMPLL has been continually improved, and the current version, Version 6.0, can handle cratering and spall effects, multiple target layers, penetrator damage/failure, and complex penetrator shapes. Version 6 uses the latest empirical penetration equations, and also automatically adjusts the penetrability index for certain target layers to account for layer thickness and confinement. This report describes the SAMPLL code, including assumptions and limitations, and includes a user's guide.

  1. Effectiveness of Anabolic Steroid Preventative Intervention among Gym Users: Applying Theory of Planned Behavior

    PubMed Central

    Jalilian, Farzad; Allahverdipour, Hamid; Moeini, Babak; Moghimbeigi, Abbas

    2011-01-01

    Background: Use of anabolic androgenic steroids (AAS) has been associated with adverse physical and psychiatric effects, and it is a growing problem among young people. This study was conducted to evaluate the efficiency of an anabolic steroid preventative intervention among gym users in Iran, with the theory of planned behaviour applied as the theoretical framework. Methods: Overall, 120 male gym users participated in this study as intervention and control groups. This was a longitudinal randomized pretest-posttest series control group design panel study to implement a behaviour modification based intervention to prevent AAS use. Cross-tabulation and t-tests using the SPSS statistical package, version 13, were used for the statistical analysis. Results: Significant improvements were found in average responses for knowledge about the side effects of AAS (P<0.001), attitude toward AAS, and intention not to use AAS. Additionally, after the intervention, the rate of AAS and supplement use decreased in the intervention group. Conclusion: Comprehensive interventions against AAS abuse among gym users and adolescents would be effective in improving adolescents' healthy behaviors and their intention not to use AAS. PMID:24688897

  2. NASA AVOSS Fast-Time Wake Prediction Models: User's Guide

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew

    2014-01-01

    The National Aeronautics and Space Administration (NASA) is developing and testing fast-time wake transport and decay models to safely enhance the capacity of the National Airspace System (NAS). The fast-time wake models are empirical algorithms used for real-time predictions of wake transport and decay based on aircraft parameters and ambient weather conditions. The aircraft dependent parameters include the initial vortex descent velocity and the vortex pair separation distance. The atmospheric initial conditions include vertical profiles of temperature or potential temperature, eddy dissipation rate, and crosswind. The current distribution includes the latest versions of the APA (3.4) and the TDP (2.1) models. This User's Guide provides detailed information on the model inputs, file formats, and the model output. An example of a model run and a brief description of the Memphis 1995 Wake Vortex Dataset is also provided.

  3. HIGHWAY, a transportation routing model: program description and users' manual

    SciTech Connect

    Joy, D.S.; Johnson, P.E.; Gibson, S.M.

    1982-12-01

    A computerized transportation routing model has been developed at the Oak Ridge National Laboratory to be used for predicting likely routes for shipping radioactive materials. The HIGHWAY data base is a computerized road atlas containing descriptions of the entire interstate highway system, the federal highway system, and most of the principal state roads. In addition to its prediction of the most likely commercial route, options incorporated in the HIGHWAY model can allow for maximum use of interstate highways or routes that will bypass urbanized areas containing populations > 100,000. The user may also interactively modify the data base to predict routes that bypass any particular state, city, town, or specific highway segment.

  4. Halo modelling in chameleon theories

    SciTech Connect

    Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.

  5. Stochastic models: theory and simulation.

    SciTech Connect

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
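
    As one concrete instance of the simple sample-generation algorithms the report refers to, the sketch below draws independent paths of a zero-mean stationary Gaussian process from a Cholesky factor of its covariance matrix. The squared-exponential covariance and all parameter values are illustrative choices of ours, not taken from the report.

```python
import numpy as np

def gaussian_process_samples(t, correlation_length, sigma, n_samples, seed=0):
    """Independent sample paths of a zero-mean stationary Gaussian process with a
    squared-exponential covariance, generated from a Cholesky factor of the
    covariance matrix evaluated on the time grid t."""
    rng = np.random.default_rng(seed)
    dt = t[:, None] - t[None, :]
    cov = sigma ** 2 * np.exp(-0.5 * (dt / correlation_length) ** 2)
    chol = np.linalg.cholesky(cov + 1e-10 * np.eye(len(t)))   # jitter for stability
    return chol @ rng.standard_normal((len(t), n_samples))

samples = gaussian_process_samples(np.linspace(0.0, 10.0, 200), 1.0, 2.0, 5)
print(samples.shape)    # (200, 5): five independent sample paths on the grid
```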

  6. Engine structures modeling software system: Computer code. User's manual

    NASA Technical Reports Server (NTRS)

    1992-01-01

    ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.

  7. Interactive Rapid Dose Assessment Model (IRDAM): user's guide

    SciTech Connect

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This User's Guide provides instruction in the setup and operation of the equipment necessary to run IRDAM. Instructions are also given on how to load the magnetic disks and access the interactive part of the program. Two other companion volumes to this one provide additional information on IRDAM. Reactor Accident Assessment Methods (NUREG/CR-3012, Volume 2) describes the technical bases for IRDAM including methods, models and assumptions used in calculations. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios.

  8. An optimization program based on the method of feasible directions: Theory and users guide

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

    1994-01-01

    The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed focusing on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.
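
    For readers unfamiliar with the method, the heart of each iteration is a direction-finding subproblem. The sketch below poses a Zoutendijk-style usable-feasible-direction search as a small linear program; the report's code instead solves a quadratic program with a Karmarkar-type interior method, so this illustrates the general idea rather than that implementation.

```python
import numpy as np
from scipy.optimize import linprog

def usable_feasible_direction(grad_f, active_constraint_grads, push=1.0):
    """Zoutendijk-style direction-finding subproblem, posed as a linear program:
        minimise beta
        subject to  grad_f . d <= beta,  grad_g_j . d <= push * beta,  |d_i| <= 1.
    A strictly negative optimal beta means d is simultaneously a descent direction
    and points into the feasible region at the active constraints."""
    n = len(grad_f)
    c = np.zeros(n + 1)
    c[-1] = 1.0                                           # minimise beta
    rows = [np.append(grad_f, -1.0)]
    rows += [np.append(g, -push) for g in active_constraint_grads]
    A_ub, b_ub = np.array(rows), np.zeros(len(rows))
    bounds = [(-1.0, 1.0)] * n + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:n], res.x[-1]

# Example: objective gradient [1, 2] with one active constraint gradient [0, -1]
d, beta = usable_feasible_direction(np.array([1.0, 2.0]), [np.array([0.0, -1.0])])
print(d, beta)    # beta < 0, so d is a usable feasible descent direction
```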

  9. Groundwater modeling and remedial optimization design using graphical user interfaces

    SciTech Connect

    Deschaine, L.M.

    1997-05-01

    The ability to accurately predict the behavior of chemicals in groundwater systems under natural flow circumstances or remedial screening and design conditions is the cornerstone of the environmental industry. The ability to do this efficiently and to effectively communicate the information to the client and regulators is what differentiates effective consultants from ineffective consultants. Recent advances in groundwater modeling graphical user interfaces (GUIs) are doing for numerical modeling what Windows™ did for DOS™. GUIs facilitate both the modeling process and the information exchange. This Test Drive evaluates the performance of two GUIs--Groundwater Vistas and ModIME--on an actual groundwater model calibration and remedial design optimization project. In the early days of numerical modeling, data input consisted of large arrays of numbers that required intensive labor to input and troubleshoot. Model calibration was also manual, as was interpreting the reams of computer output for each of the tens or hundreds of simulations required to calibrate and perform optimal groundwater remedial design. During this period, the majority of the modeler's effort (and budget) was spent just getting the model running, as opposed to solving the environmental challenge at hand. GUIs take the majority of the grunt work out of the modeling process, thereby allowing the modeler to focus on designing optimal solutions.

  10. User-friendly software for modeling collective spin wave excitations

    NASA Astrophysics Data System (ADS)

    Hahn, Steven; Peterson, Peter; Fishman, Randy; Ehlers, Georg

    There exists a great need for user-friendly, integrated software that assists in the scientific analysis of collective spin wave excitations measured with inelastic neutron scattering. SpinWaveGenie is a C++ software library that simplifies the modeling of collective spin wave excitations, allowing scientists to analyze neutron scattering data with sophisticated models fast and efficiently. Furthermore, one can calculate the four-dimensional scattering function S(Q,E) to directly compare and fit calculations to experimental measurements. Its generality has been both enhanced and verified through successful modeling of a wide array of magnetic materials. Recently, we have spent considerable effort transforming SpinWaveGenie from an early prototype to a high quality free open source software package for the scientific community. S.E.H. acknowledges support by the Laboratory's Director's fund, ORNL. Work was sponsored by the Division of Scientific User Facilities, Office of Basic Energy Sciences, US Department of Energy, under Contract No. DE-AC05-00OR22725 with UT-Battelle, LLC.

  11. Users manual for a one-dimensional Lagrangian transport model

    USGS Publications Warehouse

    Schoellhamer, D.H.; Jobson, H.E.

    1986-01-01

    A Users Manual for the Lagrangian Transport Model (LTM) is presented. The LTM uses Lagrangian calculations that are based on a reference frame moving with the river flow. The Lagrangian reference frame eliminates the need to numerically solve the convective term of the convection-diffusion equation and provides significant numerical advantages over the more commonly used Eulerian reference frame. When properly applied, the LTM can simulate riverine transport and decay processes within the accuracy required by most water quality studies. The LTM is applicable to steady or unsteady one-dimensional unidirectional flows in fixed channels with tributary and lateral inflows. Application of the LTM is relatively simple and optional capabilities improve the model's convenience. Appendices give file formats and three example LTM applications that include the incorporation of the QUAL II water quality model's reaction kinetics into the LTM. (Author's abstract)
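
    As a minimal illustration of the Lagrangian idea the abstract highlights, the sketch below moves parcels with the local flow velocity and decays them in place, so no numerical treatment of the convective term is needed. Dispersion, tributary inflows, and the QUAL II reaction kinetics handled by the real LTM are omitted, and all names and values are ours.

```python
import math

def step_parcels(parcels, velocity_fn, decay_rate, dt):
    """Advance Lagrangian parcels one time step: move each parcel with the local
    flow velocity, then apply first-order decay.  Because the reference frame
    moves with the water, no numerical convection term is solved."""
    advanced = []
    for x, c in parcels:
        x_new = x + velocity_fn(x) * dt            # advection: ride the flow
        c_new = c * math.exp(-decay_rate * dt)     # first-order decay in transit
        advanced.append((x_new, c_new))
    return advanced

# Example: three parcels in a reach with u = 0.5 m/s, decay 0.1 per day, hourly steps
parcels = [(0.0, 10.0), (100.0, 8.0), (200.0, 6.0)]
for _ in range(24):
    parcels = step_parcels(parcels, lambda x: 0.5, 0.1 / 86400.0, 3600.0)
print(parcels)
```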

  12. Investigating Agile User-Centered Design in Practice: A Grounded Theory Perspective

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    This paper investigates how the integration of agile methods and User-Centered Design (UCD) is carried out in practice. For this study, we have applied grounded theory as a suitable qualitative approach to determine what is happening in actual practice. The data was collected by semi-structured interviews with professionals who have already worked with an integrated agile UCD methodology. Further data was collected by observing these professionals in their working context, and by studying their documents, where possible. The emerging themes that the study found show that there is an increasing realization of the importance of usability in software development among agile team members. Requirements emerge over time, and usability tests based on both low- and high-fidelity prototypes are widely used in agile teams. There is an appreciation of each other's work from both UCD professionals and developers, and both sides can learn from each other.

  13. COSTEAM, an industrial steam generation cost model: updated users' manual

    SciTech Connect

    Murphy, Mary; Reierson, James; Lethi, Minh- Triet

    1980-10-01

    COSTEAM is a tool for designers and managers faced with choosing among alternative systems for generating process steam, whether for new or replacement applications. Such a decision requires a series of choices among overall system concepts, component characteristics, fuel types and financial assumptions, all of which are interdependent and affect the cost of steam. COSTEAM takes the user's input on key characteristics of a proposed process steam generation facility, and computes its capital, operating and maintenance costs. Versatility and simplicity of operation are major goals of the COSTEAM system. As a user, you can work to almost any level of detail necessary and appropriate to a given stage of planning. Since the values you specify are retained and used by the computer throughout each terminal session, you can set up a hypothetical steam generation system fixed in all characteristics but one or two of special interest. It is then quick and easy to obtain a series of results by changing only those one or two values between computer runs. This updated version of the Users' Manual contains instructions for using the expanded and improved COSTEAM model. COSTEAM has three technology submodels which address conventional coal, conventional oil and atmospheric fluidized bed combustion. The structure and calculation methods of COSTEAM are not discussed in this guide, and need not be understood in order to use the model. However, you may consult the companion volume of this report, COSTEAM Expansion and Improvements: Design of a Coal-Fired Atmospheric Fluidized Bed Submodel, an Oil-Fired Submodel, and Input/Output Improvements, MTR80W00048, which presents the design details.

  14. Galaxy Alignments: Theory, Modelling & Simulations

    NASA Astrophysics Data System (ADS)

    Kiessling, Alina; Cacciato, Marcello; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D.; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L.; Rassat, Anais

    2015-11-01

    The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in the large-scale structure tend to align nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of both the shapes and angular momenta of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both N-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the field, providing a solid basis for future work.

  15. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Benjamin A. Baker; Joseph Grimm

    2009-08-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating “what if” scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., “reactor types” not individual reactors and “separation types” not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU are designated as wastes. VISION is comprised of several

  16. Theory, modeling, and simulation annual report, 1992

    SciTech Connect

    Not Available

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  17. Modeling Actions of PubMed Users with N-Gram Language Models*

    PubMed Central

    Lin, Jimmy; Wilbur, W. John

    2008-01-01

    Transaction logs from online search engines are valuable for two reasons: First, they provide insight into human information-seeking behavior. Second, log data can be used to train user models, which can then be applied to improve retrieval systems. This article presents a study of logs from PubMed®, the public gateway to the MEDLINE® database of bibliographic records from the medical and biomedical primary literature. Unlike most previous studies on general Web search, our work examines user activities with a highly-specialized search engine. We encode user actions as string sequences and model these sequences using n-gram language models. The models are evaluated in terms of perplexity and in a sequence prediction task. They help us better understand how PubMed users search for information and provide an enabler for improving users’ search experience. PMID:19684883
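
    As a toy version of the approach, the sketch below trains an add-one-smoothed bigram model over character-encoded action sequences. The study itself uses higher-order n-grams, a much larger action alphabet, and standard language-modeling toolkits; the action codes in the example are invented for illustration.

```python
from collections import Counter

def train_bigram(sequences):
    """Add-one-smoothed bigram model over user-action strings.  Each action is a
    single character; '^' and '$' mark session start and end."""
    unigrams, bigrams, vocab = Counter(), Counter(), set()
    for seq in sequences:
        padded = '^' + seq + '$'
        vocab.update(padded)
        unigrams.update(padded[:-1])               # contexts (everything but '$')
        bigrams.update(zip(padded, padded[1:]))

    def prob(prev, cur):
        return (bigrams[(prev, cur)] + 1) / (unigrams[prev] + len(vocab))

    return prob

# Toy action codes: Q = issue query, V = view abstract, F = view full text
prob = train_bigram(['QVF', 'QQVF', 'QV'])
print(prob('Q', 'V'))    # P(view abstract | query)
```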

  18. Hanford Soil Inventory Model (SIM) Rev. 1 Users Guide

    SciTech Connect

    Simpson, Brett C.; Corbin, Rob A.; Anderson, Michael J.; Kincaid, Charles T.

    2006-09-25

    The focus of the development and application of a soil inventory model as part of the Remediation and Closure Science (RCS) Project managed by PNNL was to develop a probabilistic approach to estimate comprehensive, mass balanced-based contaminant inventories for the Hanford Site post-closure setting. The outcome of this effort was the Hanford Soil Inventory Model (SIM). This document is a user's guide for the Hanford SIM. The principal project requirement for the SIM was to provide comprehensive quantitative estimates of contaminant inventory and its uncertainty for the various liquid waste sites, unplanned releases, and past tank farm leaks as a function of time and location at Hanford. The majority, but not all of these waste sites are in the 200 Areas of Hanford where chemical processing of spent fuel occurred. A computer model capable of performing these calculations and providing satisfactory quantitative output representing a robust description of contaminant inventory and uncertainty for use in other subsequent models was determined to be satisfactory to address the needs of the RCS Project. The ability to use familiar, commercially available software on high-performance personal computers for data input, modeling, and analysis, rather than custom software on a workstation or mainframe computer for modeling, was desired.

  19. User-Defined Material Model for Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)

    2006-01-01

    An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.

  20. A Markov Chain Model for Changes in Users' Assessment of Search Results.

    PubMed

    Zhitomirsky-Geffet, Maayan; Bar-Ilan, Judit; Levene, Mark

    2016-01-01

    Previous research shows that users tend to change their assessment of search results over time. This is a first study that investigates the factors and reasons for these changes, and describes a stochastic model of user behaviour that may explain these changes. In particular, we hypothesise that most of the changes are local, i.e. between results with similar or close relevance to the query, and thus belong to the same "coarse" relevance category. According to the theory of coarse beliefs and categorical thinking, humans tend to divide the range of values under consideration into coarse categories, and are thus able to distinguish only between cross-category values but not within them. To test this hypothesis we conducted five experiments with about 120 subjects divided into 3 groups. Each student in every group was asked to rank and assign relevance scores to the same set of search results over two or three rounds, with a period of three to nine weeks between each round. The subjects of the last three-round experiment were then exposed to the differences in their judgements and were asked to explain them. We make use of a Markov chain model to measure change in users' judgments between the different rounds. The Markov chain demonstrates that the changes converge, and that a majority of the changes are local to a neighbouring relevance category. We found that most of the subjects were satisfied with their changes, and did not perceive them as mistakes but rather as a legitimate phenomenon, since they believe that time has influenced their relevance assessment. Both our quantitative analysis and user comments support the hypothesis of the existence of coarse relevance categories resulting from categorical thinking in the context of user evaluation of search results.
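
    A minimal sketch of the kind of model described: estimate a row-stochastic transition matrix between coarse relevance categories from two assessment rounds and inspect how much probability mass sits on or next to the diagonal. The category labels and data below are invented, and this is not the paper's exact estimator.

```python
import numpy as np

def transition_matrix(round_a, round_b, n_categories):
    """Row-stochastic transition matrix between relevance categories assigned to
    the same search results in two assessment rounds."""
    counts = np.zeros((n_categories, n_categories))
    for a, b in zip(round_a, round_b):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# Eight results judged in two rounds on three coarse categories (0=low, 1=mid, 2=high)
p = transition_matrix([0, 0, 1, 1, 2, 2, 2, 1], [0, 1, 1, 1, 2, 2, 1, 1], 3)
print(p)    # mass concentrated on and next to the diagonal -> mostly "local" changes
```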

  1. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model are described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter is suited for small surface wave heights compared to depth because surface elevation was removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  2. User's guide to the MESOI diffusion model: Version 1.1 (for Data General Eclipse S/230 with AFOS)

    SciTech Connect

    Athey, G.F.; Ramsdell, J.V.

    1982-09-01

    MESOI is an interactive, Lagrangian puff trajectory model. The model theory is documented separately (Ramsdell and Athey, 1981). Version 1.1 is a modified form of the original 1.0. It is designed to run on a Data General Eclipse computer. The model has improved support features which make it useful as an emergency response tool. This report is intended to provide the user with the information necessary to successfully conduct model simulations using MESOI Version 1.1 and to use the support programs STAPREP and EXPLT. The user is also provided information on the use of the data file maintenance and review program UPDATE. Examples are given for the operation of the program. Test data sets are described which allow the user to practice with the programs and to confirm proper implementation and execution.

  3. Developing a User-process Model for Designing Menu-based Interfaces: An Exploratory Study.

    ERIC Educational Resources Information Center

    Ju, Boryung; Gluck, Myke

    2003-01-01

    The purpose of this study was to organize menu items based on a user-process model and implement a new version of current software for enhancing usability of interfaces. A user-process model was developed, drawn from actual users' understanding of their goals and strategies to solve their information needs by using Dervin's Sense-Making Theory…

  4. The Logic of the SIGGS Theory Model.

    ERIC Educational Resources Information Center

    Maccia, Elizabeth S.; Maccia, George S.

    This paper describes the SIGGS Theory Model as it applies to educational systems. This model, designed to simulate a wide variety of systems, uses sets (S), information (I), graph theory (G), and general systems (GS). The educational system is viewed as a set; in one example this set consists of teacher, student, curriculum, and setting. This…

  5. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report: describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  6. Modeling of rolling element bearing mechanics. Computer program user's manual

    NASA Astrophysics Data System (ADS)

    Greenhill, Lyn M.; Merchant, David H.

    1994-10-01

    This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It is comprised of two main programs: the Preprocessor for Bearing Analysis (PREBAN) which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  7. GCFM Users Guide Revision for Model Version 5.0

    SciTech Connect

    Keimig, Mark A.; Blake, Coleman

    1981-08-10

    This paper documents alterations made to the MITRE/DOE Geothermal Cash Flow Model (GCFM) in the period of September 1980 through September 1981. Version 4.0 of GCFM was installed on the computer at the DOE San Francisco Operations Office in August 1980. This Version has also been distributed to about a dozen geothermal industry firms, for examination and potential use. During late 1980 and 1981, a few errors detected in the Version 4.0 code were corrected, resulting in Version 4.1. If you are currently using GCFM Version 4.0, it is suggested that you make the changes to your code that are described in Section 2.0. User's manual changes listed in Section 3.0 and Section 4.0 should then also be made.

  8. Modeling of rolling element bearing mechanics. Computer program user's manual

    NASA Technical Reports Server (NTRS)

    Greenhill, Lyn M.; Merchant, David H.

    1994-01-01

    This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It is comprised of two main programs: the Preprocessor for Bearing Analysis (PREBAN) which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  9. CORCON-MOD3: An integrated computer model for analysis of molten core-concrete interactions. User's manual

    SciTech Connect

    Bradley, D.R.; Gardner, D.R.; Brockmann, J.E.; Griffith, R.O.

    1993-10-01

    The CORCON-Mod3 computer code was developed to mechanistically model the important core-concrete interaction phenomena, including those phenomena relevant to the assessment of containment failure and radionuclide release. The code can be applied to a wide range of severe accident scenarios and reactor plants. The code represents the current state of the art for simulating core debris interactions with concrete. This document comprises the user's manual and gives a brief description of the models and the assumptions and limitations in the code. Also discussed are the input parameters and the code output. Two sample problems are also given.

  10. WASP7 BENTHIC ALGAE - MODEL THEORY AND USER'S GUIDE

    EPA Science Inventory

    The standard WASP7 eutrophication module includes nitrogen and phosphorus cycling, dissolved oxygen-organic matter interactions, and phytoplankton kinetics. In many shallow streams and rivers, however, the attached algae (benthic algae, or periphyton, attached to submerged substr...

  11. Proliferation Risk Characterization Model Prototype Model - User and Programmer Guidelines

    SciTech Connect

    Dukelow, J.S.; Whitford, D.

    1998-12-01

    A model for the estimation of the risk of diversion of weapons-capable materials was developed. It represents both the threat of diversion and site vulnerability as a product of a small number of variables (two to eight), each of which can take on a small number (two to four) of qualitatively defined (but quantitatively implemented) values. The values of the overall threat and vulnerability variables are then converted to threat and vulnerability categories. The threat and vulnerability categories are used to define the likelihood of diversion, also defined categorically. The evaluator supplies an estimate of the consequences of a diversion, defined categorically, but with the categories based on the IAEA Attractiveness levels. Likelihood and Consequences categories are used to define the Risk, also defined categorically. The threat, vulnerability, and consequences input provided by the evaluator contains a representation of his/her uncertainty in each variable assignment which is propagated all the way through to the calculation of the Risk categories. [Appendix G available on diskette only.]

  12. Towards a Theory of Conceptual Modelling

    NASA Astrophysics Data System (ADS)

    Thalheim, Bernhard

    Conceptual modelling is a widely applied practice and has led to a large body of knowledge on constructs that might be used for modelling and on methods that might be useful for modelling. It is commonly accepted that database application development is based on conceptual modelling. It is however surprising that only very few publications have been published on a theory of conceptual modelling.

  13. A practical tool for modeling biospecimen user fees.

    PubMed

    Matzke, Lise; Dee, Simon; Bartlett, John; Damaraju, Sambasivarao; Graham, Kathryn; Johnston, Randal; Mes-Masson, Anne-Marie; Murphy, Leigh; Shepherd, Lois; Schacter, Brent; Watson, Peter H

    2014-08-01

    The question of how best to attribute the unit costs of the annotated biospecimen product that is provided to a research user is a common issue for many biobanks. Some of the factors influencing user fees are capital and operating costs, internal and external demand and market competition, and moral standards that dictate that fees must have an ethical basis. It is therefore important to establish a transparent and accurate costing tool that can be utilized by biobanks and aid them in establishing biospecimen user fees. To address this issue, we built a biospecimen user fee calculator tool, accessible online at www.biobanking.org. The tool was built to allow input of: i) annual operating and capital costs; ii) costs categorized by the major core biobanking operations; iii) specimen products requested by a biobank user; and iv) services provided by the biobank beyond core operations (e.g., histology, tissue micro-array); as well as v) several user-defined variables to allow the calculator to be adapted to different biobank operational designs. To establish default values for variables within the calculator, we first surveyed the members of the Canadian Tumour Repository Network (CTRNet) management committee. We then enrolled four different participants from CTRNet biobanks to test the hypothesis that the calculator tool could change approaches to user fees. Participants were first asked to estimate user fee pricing for three hypothetical user scenarios based on their biobanking experience (estimated pricing) and then to calculate fees for the same scenarios using the calculator tool (calculated pricing). Results demonstrated significant variation in estimated pricing that was reduced by calculated pricing, and that higher user fees are consistently derived when using the calculator. We conclude that adoption of this online calculator for user fee determination is an important first step towards harmonization and realistic user fees.
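
    The calculator itself is an online spreadsheet-style tool; as a rough sketch of the unit-cost logic such a tool can embody, one might amortize capital, add annual operating cost, divide by banking throughput, and scale by the request, added services, and a cost-recovery fraction. All parameter names and values below are ours, not the calculator's.

```python
def specimen_user_fee(annual_operating, capital, amortization_years,
                      annual_cases_banked, cases_requested,
                      services_cost=0.0, cost_recovery_fraction=1.0):
    """Rough per-request fee: (operating cost + amortized capital) per banked case,
    times the number of cases requested, plus added services (e.g. histology, TMA),
    scaled by the fraction of true cost the biobank chooses to recover."""
    annual_cost = annual_operating + capital / amortization_years
    unit_cost = annual_cost / annual_cases_banked
    return cost_recovery_fraction * (unit_cost * cases_requested + services_cost)

# Example: $400k/yr operations, $250k equipment over 10 yr, 2,000 cases banked per
# year, a request for 50 cases plus $500 of histology, at 40% cost recovery
print(round(specimen_user_fee(400e3, 250e3, 10, 2000, 50, 500.0, 0.4), 2))
```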

  14. Graphical User Interface for Simulink Integrated Performance Analysis Model

    NASA Technical Reports Server (NTRS)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible in order to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow for test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI will construct a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared to other values.

  15. A user credit assessment model based on clustering ensemble for broadband network new media service supervision

    NASA Astrophysics Data System (ADS)

    Liu, Fang; Cao, San-xing; Lu, Rui

    2012-04-01

    This paper proposes a user credit assessment model based on a clustering ensemble, aimed at the problem of users illegally spreading pirated and pornographic media content on self-service oriented broadband network new media platforms. The idea is to assess new media users' credit by establishing an index system based on user credit behaviors; illegal users can then be identified from the credit assessment results, thereby curbing the transmission of harmful video and audio on the network. The proposed clustering ensemble combines the advantages of swarm intelligence clustering, which is well suited to user credit behavior analysis, with K-means clustering, which can eliminate the scattered users left in the swarm intelligence clustering result, so that all users are classified by credit automatically. Verification experiments based on a standard credit application dataset from the UCI machine learning repository, together with a comparison against a single swarm intelligence clustering model, indicate that the clustering ensemble has a stronger ability to distinguish creditworthiness, especially in predicting the user clusters with the best and worst credit, which helps operators apply incentive or punitive measures accurately. Moreover, compared with a logistic regression based model under the same conditions, the clustering ensemble model is more robust and has better prediction accuracy.
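
    A minimal sketch of the two-stage ensemble idea is given below, using an off-the-shelf clusterer in place of the paper's swarm intelligence algorithm and K-means as the refinement step; the credit-behavior features are synthetic.

```python
# Two-stage clustering ensemble sketch: a first clustering proposes groups and their
# centres, then K-means (seeded with those centres) reassigns scattered points.
# Swarm intelligence clustering is replaced here by agglomerative clustering.
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans

rng = np.random.default_rng(0)
# Synthetic "credit behavior" features for 300 users around 3 behavior profiles.
X = np.vstack([rng.normal(loc=c, scale=0.6, size=(100, 4)) for c in (0.0, 3.0, 6.0)])

# Stage 1: base clustering (stand-in for swarm intelligence clustering).
base_labels = AgglomerativeClustering(n_clusters=3).fit_predict(X)
centres = np.vstack([X[base_labels == k].mean(axis=0) for k in range(3)])

# Stage 2: K-means initialised with the stage-1 centres cleans up scattered users.
final_labels = KMeans(n_clusters=3, init=centres, n_init=1, random_state=0).fit_predict(X)

# Rank clusters by a simple "credit score" (here: mean feature value) to flag the
# best- and worst-credit groups, as the assessment model does.
scores = [X[final_labels == k].mean() for k in range(3)]
print("worst-credit cluster:", int(np.argmin(scores)), "best-credit cluster:", int(np.argmax(scores)))
```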

  16. Long Fibre Composite Modelling Using Cohesive User's Element

    NASA Astrophysics Data System (ADS)

    Kozák, Vladislav; Chlup, Zdeněk

    2010-09-01

    The development of glass matrix composites reinforced by unidirectional long ceramic fibres has resulted in a family of very promising structural materials. The only disadvantage of such materials is their relatively high brittleness at room temperature. The main micromechanisms acting as toughening mechanisms are pull-out, crack bridging and matrix cracking. Other mechanisms such as crack deflection also operate, but the primary mechanism is the aforementioned pull-out, which is governed by the interface between fibre and matrix. This contribution shows how the mechanical behaviour of the composite can be predicted and/or optimised by applying the cohesive zone method and implementing a user-defined cohesive element in the FEM package Abaqus. The presented results from numerical calculations are compared with experimental data. Crack extension is simulated by means of element extinction algorithms. The principal effort is concentrated on the application of the cohesive zone model with a special traction-separation (bridging) law and on cohesive zone modelling. Determination of the micromechanical parameters is based on a combination of static tests, microscopic observations and numerical calibration procedures.
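
    For illustration, the kind of traction-separation (bridging) law a cohesive element evaluates can be sketched as below; the bilinear form and the parameter values are generic assumptions, not the calibrated law used in the paper.

```python
# Bilinear traction-separation law: traction rises linearly to a peak strength,
# then softens linearly to zero at the critical opening (complete decohesion).
# Parameter values are generic illustrations, not calibrated for any composite.

SIGMA_MAX = 50.0e6      # peak cohesive strength [Pa]
DELTA_0 = 1.0e-6        # opening at peak traction [m]
DELTA_C = 20.0e-6       # critical opening, traction falls to zero [m]

def traction(delta: float) -> float:
    """Cohesive traction for a given crack opening displacement delta [m]."""
    if delta <= 0.0:
        return 0.0
    if delta <= DELTA_0:                       # elastic (rising) branch
        return SIGMA_MAX * delta / DELTA_0
    if delta <= DELTA_C:                       # softening (bridging) branch
        return SIGMA_MAX * (DELTA_C - delta) / (DELTA_C - DELTA_0)
    return 0.0                                 # fully separated

# Fracture energy is the area under the law: 0.5 * sigma_max * delta_c.
g_c = 0.5 * SIGMA_MAX * DELTA_C
print(f"G_c = {g_c:.1f} J/m^2, traction at 5 um opening = {traction(5e-6)/1e6:.1f} MPa")
```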

  17. Long Fibre Composite Modelling Using Cohesive User's Element

    SciTech Connect

    Kozak, Vladislav; Chlup, Zdenek

    2010-09-30

    The development of glass matrix composites reinforced by unidirectional long ceramic fibres has resulted in a family of very promising structural materials. The only disadvantage of such materials is their relatively high brittleness at room temperature. The main micromechanisms acting as toughening mechanisms are pull-out, crack bridging and matrix cracking. Other mechanisms such as crack deflection also operate, but the primary mechanism is the aforementioned pull-out, which is governed by the interface between fibre and matrix. This contribution shows how the mechanical behaviour of the composite can be predicted and/or optimised by applying the cohesive zone method and implementing a user-defined cohesive element in the FEM package Abaqus. The presented results from numerical calculations are compared with experimental data. Crack extension is simulated by means of element extinction algorithms. The principal effort is concentrated on the application of the cohesive zone model with a special traction-separation (bridging) law and on cohesive zone modelling. Determination of the micromechanical parameters is based on a combination of static tests, microscopic observations and numerical calibration procedures.

  18. Propeller aircraft interior noise model: User's manual for computer program

    NASA Technical Reports Server (NTRS)

    Wilby, E. G.; Pope, L. D.

    1985-01-01

    A computer program entitled PAIN (Propeller Aircraft Interior Noise) has been developed to permit calculation of the sound levels in the cabin of a propeller-driven airplane. The fuselage is modeled as a cylinder with a structurally integral floor, the cabin sidewall and floor being stiffened by ring frames, stringers and floor beams of arbitrary configurations. The cabin interior is covered with acoustic treatment and trim. The propeller noise consists of a series of tones at harmonics of the blade passage frequency. Input data required by the program include the mechanical and acoustical properties of the fuselage structure and sidewall trim. Also, the precise propeller noise signature must be defined on a grid that lies in the fuselage skin. The propeller data are generated with a propeller noise prediction program such as the NASA Langley ANOPP program. The program PAIN permits the calculation of the space-average interior sound levels for the first ten harmonics of a propeller rotating alongside the fuselage. User instructions for PAIN are given in the report. Development of the analytical model is presented in NASA CR 3813.

  19. User Acceptance of Long-Term Evolution (LTE) Services: An Application of Extended Technology Acceptance Model

    ERIC Educational Resources Information Center

    Park, Eunil; Kim, Ki Joon

    2013-01-01

    Purpose: The aim of this paper is to propose an integrated path model in order to explore user acceptance of long-term evolution (LTE) services by examining potential causal relationships between key psychological factors and user intention to use the services. Design/methodology/approach: Online survey data collected from 1,344 users are analysed…

  20. BPACK -- A computer model package for boiler reburning/co-firing performance evaluations. User's manual, Volume 1

    SciTech Connect

    Wu, K.T.; Li, B.; Payne, R.

    1992-06-01

    This manual presents and describes a package of computer models uniquely developed for boiler thermal performance and emissions evaluations by the Energy and Environmental Research Corporation. The model package permits boiler heat transfer, fuels combustion, and pollutant emissions predictions related to a number of practical boiler operations such as fuel-switching, fuels co-firing, and reburning NOx reductions. The models are adaptable to most boiler/combustor designs and can handle burner fuels in solid, liquid, gaseous, and slurried forms. The models are also capable of performing predictions for combustion applications involving gaseous-fuel reburning, and co-firing of solid/gas, liquid/gas, gas/gas, and slurry/gas fuels. The model package is named BPACK (Boiler Package) and consists of six computer codes, three of which are main computational codes and three of which are input codes. The three main codes are: (a) a two-dimensional furnace heat-transfer and combustion code; (b) a detailed chemical-kinetics code; and (c) a boiler convective passage code. This user's manual presents the computer model package in two volumes. Volume 1 describes in detail a number of topics of general user interest, including the physical and chemical basis of the models, a complete description of the model applicability, options, input/output, and the default inputs. Volume 2 contains a detailed record of the worked examples to assist users in applying the models, and to illustrate the versatility of the codes.

  1. User Interface Models for Multidisciplinary Bibliographic Information Dissemination Centers.

    ERIC Educational Resources Information Center

    Zipperer, W. C.

    Two information dissemination centers at University of California at Los Angeles and University of Georgia studied the interactions between computer based search facilities and their users. The study, largely descriptive in nature, investigated the interaction processes between data base users and profile analysis or information specialists in…

  2. Towards Model-Driven End-User Development in CALL

    ERIC Educational Resources Information Center

    Farmer, Rod; Gruba, Paul

    2006-01-01

    The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…

  3. Crisis in Context Theory: An Ecological Model

    ERIC Educational Resources Information Center

    Myer, Rick A.; Moore, Holly B.

    2006-01-01

    This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…

  4. Theories of addiction: methamphetamine users' explanations for continuing drug use and relapse.

    PubMed

    Newton, Thomas F; De La Garza, Richard; Kalechstein, Ari D; Tziortzis, Desey; Jacobsen, Caitlin A

    2009-01-01

    A variety of preclinical models have been constructed to emphasize unique aspects of addiction-like behavior. These include Negative Reinforcement ("Pain Avoidance"), Positive Reinforcement ("Pleasure Seeking"), Incentive Salience ("Craving"), Stimulus Response Learning ("Habits"), and Inhibitory Control Dysfunction ("Impulsivity"). We used a survey to better understand why methamphetamine-dependent research volunteers (N = 73) continue to use methamphetamine, or relapse to methamphetamine use after a period of cessation of use. All participants met DSM-IV criteria for methamphetamine abuse or dependence, and did not meet criteria for other current Axis I psychiatric disorders or dependence on other drugs of abuse, other than nicotine. The questionnaire consisted of a series of face-valid questions regarding drug use, which in this case referred to methamphetamine use. Examples of questions include: "Do you use drugs mostly to make bad feelings like boredom, loneliness, or apathy go away?", "Do you use drugs mostly because you want to get high?", "Do you use drugs mostly because of cravings?", "Do you find yourself getting ready to take drugs without thinking about it?", and "Do you impulsively take drugs?". The scale was anchored at 1 (not at all) and 7 (very much). For each question, the numbers of participants rating each question negatively (1 or 2), neither negatively or affirmatively (3-5), and affirmatively (6 or 7) were tabulated. The greatest number of respondents (56%) affirmed that they used drugs due to "pleasure seeking." The next highest categories selected were "impulsivity" (27%) and "habits"(25%). Surprisingly, many participants reported that "pain avoidance" (30%) and "craving" (30%) were not important for their drug use. Results from this study support the contention that methamphetamine users (and probably other drug users as well) are more heterogeneous than is often appreciated, and imply that treatment development might be more successful if
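
    The tabulation described (counting negative, neutral and affirmative responses on the 1-7 scale for each question) is straightforward to reproduce; the ratings below are made-up illustrative data, not the study's responses.

```python
# Tabulate 7-point ratings into negative (1-2), neutral (3-5) and affirmative (6-7)
# counts per question, as in the survey analysis. Ratings here are made up.
from collections import Counter

def categorize(rating: int) -> str:
    if rating <= 2:
        return "negative"
    if rating <= 5:
        return "neutral"
    return "affirmative"

responses = {   # question -> list of participant ratings (illustrative only)
    "pleasure seeking": [7, 6, 6, 4, 7, 2],
    "pain avoidance":   [1, 2, 5, 6, 1, 3],
}

for question, ratings in responses.items():
    counts = Counter(categorize(r) for r in ratings)
    n = len(ratings)
    pct = {k: round(100 * v / n) for k, v in counts.items()}
    print(question, pct)
```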

  5. Metaphor, Model, and Theory in Education Research.

    ERIC Educational Resources Information Center

    Dickmeyer, Nathan

    1989-01-01

    The concepts of metaphor, model, and theory are defined and used to show how social science research in general, and education research in particular, has differed from Popper's description of natural science research. (IAH)

  6. Spud 1.0: generalising and automating the user interfaces of scientific computer models

    NASA Astrophysics Data System (ADS)

    Ham, D. A.; Farrell, P. E.; Gorman, G. J.; Maddison, J. R.; Wilson, C. R.; Kramer, S. C.; Shipton, J.; Collins, G. S.; Cotter, C. J.; Piggott, M. D.

    2009-03-01

    The interfaces by which users specify the scenarios to be simulated by scientific computer models are frequently primitive, under-documented and ad-hoc text files which make using the model in question difficult and error-prone and significantly increase the development cost of the model. In this paper, we present a model-independent system, Spud, which formalises the specification of model input formats in terms of formal grammars. This is combined with an automated graphical user interface which guides users to create valid model inputs based on the grammar provided, and a generic options reading module, libspud, which minimises the development cost of adding model options. Together, this provides a user friendly, well documented, self validating user interface which is applicable to a wide range of scientific models and which minimises the developer input required to maintain and extend the model interface.
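
    The general idea of validating model options against a declared schema before the model reads them can be sketched as below. This is a generic illustration only; it is neither Spud's grammar format nor the libspud API.

```python
# Generic sketch of schema-driven option validation: the schema plays the role of
# the formal grammar, and options are read only through it. Not the Spud/libspud API.

SCHEMA = {  # option path -> (expected type, required?)
    "/timestepping/dt": (float, True),
    "/timestepping/final_time": (float, True),
    "/io/output_prefix": (str, False),
}

def validate(options: dict) -> list[str]:
    """Return a list of problems; an empty list means the input is valid."""
    problems = []
    for path, (typ, required) in SCHEMA.items():
        if path not in options:
            if required:
                problems.append(f"missing required option {path}")
        elif not isinstance(options[path], typ):
            problems.append(f"{path} should be {typ.__name__}")
    problems += [f"unknown option {p}" for p in options if p not in SCHEMA]
    return problems

user_input = {"/timestepping/dt": 0.1, "/timestepping/final_time": "ten"}
print(validate(user_input))   # -> ['/timestepping/final_time should be float']
```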

  7. Spud 1.0: generalising and automating the user interfaces of scientific computer models

    NASA Astrophysics Data System (ADS)

    Ham, D. A.; Farrell, P. E.; Gorman, G. J.; Maddison, J. R.; Wilson, C. R.; Kramer, S. C.; Shipton, J.; Collins, G. S.; Cotter, C. J.; Piggott, M. D.

    2008-07-01

    The interfaces by which users specify the scenarios to be simulated by scientific computer models are frequently primitive, under-documented and ad-hoc text files which make using the model in question difficult and error-prone and significantly increase the development cost of the model. In this paper, we present a model-independent system, Spud, which formalises the specification of model input formats in terms of formal grammars. This is combined with an automated graphical user interface which guides users to create valid model inputs based on the grammar provided, and a generic options reading module which minimises the development cost of adding model options. Together, this provides a user friendly, well documented, self validating user interface which is applicable to a wide range of scientific models and which minimises the developer input required to maintain and extend the model interface.

  8. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
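
    The modular idea (interchangeable process components assembled per stage of the hydrological cycle) can be sketched as below; the component functions and parameter values are toy placeholders, not the package's actual modules.

```python
# Toy modular rainfall-runoff step: interchangeable process functions are picked
# per stage (PET, soil moisture accounting) and chained each time step.
# All process formulations and parameters here are placeholders.

def hamon_like_pet(temp_c: float) -> float:
    return max(0.0, 0.2 * temp_c)                  # mm/day, crude stand-in

def simple_bucket(storage: float, precip: float, pet: float, capacity=150.0, k=0.05):
    storage = min(capacity, storage + precip) - min(pet, storage)
    runoff = k * max(storage, 0.0)                 # linear-reservoir outflow
    return max(storage - runoff, 0.0), runoff

MODEL = {"pet": hamon_like_pet, "soil": simple_bucket}   # swap components here

def run(precip, temps, storage=50.0):
    flows = []
    for p, t in zip(precip, temps):
        storage, q = MODEL["soil"](storage, p, MODEL["pet"](t))
        flows.append(round(q, 2))
    return flows

print(run(precip=[10, 0, 25, 5], temps=[12, 15, 8, 10]))
```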

  9. Solvation models: theory and validation.

    PubMed

    Purisima, Enrico O; Sulea, Traian

    2014-01-01

    Water plays an active role in many fundamental phenomena in cellular systems such as molecular recognition, folding and conformational equilibria, reaction kinetics and phase partitioning. Hence, our ability to account for the energetics of these processes is highly dependent on the models we use for calculating solvation effects. For example, theoretical prediction of protein-ligand binding modes (i.e., docking) and binding affinities (i.e., scoring) requires an accurate description of the change in hydration that accompanies solute binding. In this review, we discuss the challenges of constructing solvation models that capture these effects, with an emphasis on continuum models and on more recent developments in the field. In our discussion of methods, relatively greater attention will be given to boundary element solutions to the Poisson equation and to nonpolar solvation models, two areas that have become increasingly important but are likely to be less familiar to many readers. The other focus will be upon the trending efforts for evaluating solvation models in order to uncover limitations, biases, and potentially attractive directions for their improvement and applicability. The prospective and retrospective performance of a variety of solvation models in the SAMPL blind challenges will be discussed in detail. After just a few years, these benchmarking exercises have already had a tangible effect in guiding the improvement of solvation models.
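
    As a concrete (and deliberately simple) example of a continuum-electrostatics estimate of hydration, the Born model below gives the polar solvation free energy of a single spherical ion; it illustrates the class of models discussed, not any of the SAMPL methods.

```python
# Born model: polar solvation free energy of a spherical ion of charge z*e and
# radius a transferred from vacuum into a dielectric continuum (e.g. water).
# This is the textbook continuum-electrostatics illustration, not a SAMPL method.
import math

E_CHARGE = 1.602176634e-19     # C
EPS0 = 8.8541878128e-12        # F/m
N_A = 6.02214076e23            # 1/mol

def born_energy_kj_per_mol(z: int, radius_nm: float, eps_r: float = 78.4) -> float:
    a = radius_nm * 1e-9
    dg = -(z * E_CHARGE) ** 2 * N_A / (8 * math.pi * EPS0 * a) * (1.0 - 1.0 / eps_r)
    return dg / 1000.0

# A monovalent cation with a ~0.2 nm effective Born radius:
print(f"{born_energy_kj_per_mol(1, 0.20):.0f} kJ/mol")   # roughly -340 kJ/mol
```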

  10. Elementary Theory of Covariance Modeling

    NASA Technical Reports Server (NTRS)

    Cohn, S.; Atlas, Robert (Technical Monitor)

    2002-01-01

    The contents include: 1. State space, spectral space, and observation space; 2. Variances and correlations; 3. Isotropic and anisotropic correlation modeling on the sphere; 4. Operational ozone data assimilation; and 5. Kalman filtering for trace constituents.

  11. A User-Centered Approach to Adaptive Hypertext Based on an Information Relevance Model

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Chen, James

    1994-01-01

    Rapid and effective access to information in large electronic documentation systems can be facilitated if information relevant in an individual user's context can be automatically supplied to that user. However, most of this knowledge on contextual relevance is not found within the contents of documents; rather, it is established incrementally by users during information access. We propose a new model for interactively learning contextual relevance during information retrieval, and incrementally adapting retrieved information to individual user profiles. The model, called a relevance network, records the relevance of references based on user feedback for specific queries and user profiles. It also generalizes such knowledge to later derive relevant references for similar queries and profiles. The relevance network lets users filter information by context of relevance. Compared to other approaches, it does not require any prior knowledge or training. More importantly, our approach to adaptivity is user-centered. It facilitates acceptance and understanding by users by giving them shared control over the adaptation without disturbing their primary task. Users easily control when to adapt and when to use the adapted system. Lastly, the model is independent of the particular application used to access information, and supports sharing of adaptations among users.
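
    One simple reading of the relevance-network idea — record feedback per (query, profile) context and reuse it for similar later requests — is sketched below; the data structures and the overlap-based similarity rule are illustrative assumptions, not the paper's model.

```python
# Toy relevance network: store user feedback keyed by (query terms, profile), then
# rank references for a new query by relevance accumulated in similar contexts.
# Data structures and the overlap-based similarity are illustrative assumptions.
from collections import defaultdict

relevance = defaultdict(float)   # (frozenset(query terms), profile, reference) -> score

def record_feedback(query: str, profile: str, reference: str, rating: float):
    relevance[(frozenset(query.lower().split()), profile, reference)] += rating

def suggest(query: str, profile: str):
    terms = frozenset(query.lower().split())
    scores = defaultdict(float)
    for (q_terms, prof, ref), score in relevance.items():
        if prof != profile:
            continue
        overlap = len(terms & q_terms) / max(len(terms | q_terms), 1)
        scores[ref] += overlap * score          # generalize to similar queries
    return sorted(scores.items(), key=lambda kv: -kv[1])

record_feedback("engine start temperature limits", "propulsion", "doc_17", 2.0)
record_feedback("engine start pressure limits", "propulsion", "doc_42", 1.0)
print(suggest("engine temperature limits", "propulsion"))
```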

  12. APEX user's guide - (Argonne production, expansion, and exchange model for electrical systems), version 3.0

    SciTech Connect

    VanKuiken, J.C.; Veselka, T.D.; Guziel, K.A.; Blodgett, D.W.; Hamilton, S.; Kavicky, J.A.; Koritarov, V.S.; North, M.J.; Novickas, A.A.; Paprockas, K.R.

    1994-11-01

    This report describes operating procedures and background documentation for the Argonne Production, Expansion, and Exchange Model for Electrical Systems (APEX). This modeling system was developed to provide the U.S. Department of Energy, Division of Fossil Energy, Office of Coal and Electricity with in-house capabilities for addressing policy options that affect electrical utilities. To meet this objective, Argonne National Laboratory developed a menu-driven programming package that enables the user to develop and conduct simulations of production costs, system reliability, spot market network flows, and optimal system capacity expansion. The APEX system consists of three basic simulation components, supported by various databases and data management software. The components include (1) the investigation of Costs and Reliability in Utility Systems (ICARUS) model, (2) the Spot Market Network (SMN) model, and (3) the Production and Capacity Expansion (PACE) model. The ICARUS model provides generating-unit-level production-cost and reliability simulations with explicit recognition of planned and unplanned outages. The SMN model addresses optimal network flows with recognition of marginal costs, wheeling charges, and transmission constraints. The PACE model determines long-term (e.g., longer than 10 years) capacity expansion schedules on the basis of candidate expansion technologies and load growth estimates. In addition, the Automated Data Assembly Package (ADAP) and case management features simplify user-input requirements. The ADAP, ICARUS, and SMN modules are described in detail. The PACE module is expected to be addressed in a future publication.

  13. Bilinear modeling of EMG signals to extract user-independent features for multiuser myoelectric interface.

    PubMed

    Matsubara, Takamitsu; Morimoto, Jun

    2013-08-01

    In this study, we propose a multiuser myoelectric interface that can easily adapt to novel users. When a user performs different motions (e.g., grasping and pinching), different electromyography (EMG) signals are measured. When different users perform the same motion (e.g., grasping), different EMG signals are also measured. Therefore, designing a myoelectric interface that can be used by multiple users to perform multiple motions is difficult. To cope with this problem, we propose for EMG signals a bilinear model that is composed of two linear factors: 1) user dependent and 2) motion dependent. By decomposing the EMG signals into these two factors, the extracted motion-dependent factors can be used as user-independent features. We can construct a motion classifier on the extracted feature space to develop the multiuser interface. For novel users, the proposed adaptation method estimates the user-dependent factor through only a few interactions. The bilinear EMG model with the estimated user-dependent factor can extract the user-independent features from the novel user data. We applied our proposed method to a recognition task of five hand gestures for robotic hand control using four-channel EMG signals measured from subject forearms. Our method resulted in 73% accuracy, which was statistically significantly different from the accuracy of standard nonmultiuser interfaces, as the result of a two-sample t-test at a significance level of 1%.
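
    A minimal numerical sketch of the bilinear idea — each observed feature vector factors into a user-dependent mixing matrix times a motion-dependent (user-independent) factor — is given below on synthetic data; it is a schematic alternating least-squares illustration, not the authors' estimation procedure.

```python
# Bilinear factor sketch: features f[u, m] ~ A_u @ s_m, with A_u user-dependent and
# s_m motion-dependent. Alternating least squares on synthetic data; schematic only.
import numpy as np

rng = np.random.default_rng(1)
n_users, n_motions, n_feat, k = 4, 5, 12, 3

A_true = rng.normal(size=(n_users, n_feat, k))          # user-dependent factors
S_true = rng.normal(size=(k, n_motions))                # motion-dependent factors
F = np.einsum("ufk,km->ufm", A_true, S_true) + 0.01 * rng.normal(size=(n_users, n_feat, n_motions))

A = rng.normal(size=(n_users, n_feat, k))               # estimates
S = rng.normal(size=(k, n_motions))

for _ in range(50):
    # Fix S, solve each user's mixing matrix by least squares: F_u ~ A_u @ S.
    for u in range(n_users):
        A[u] = np.linalg.lstsq(S.T, F[u].T, rcond=None)[0].T
    # Fix A, solve each motion's shared factor by stacking all users' equations.
    A_stack = A.reshape(n_users * n_feat, k)
    for m in range(n_motions):
        S[:, m] = np.linalg.lstsq(A_stack, F[:, :, m].reshape(-1), rcond=None)[0]

err = np.linalg.norm(np.einsum("ufk,km->ufm", A, S) - F) / np.linalg.norm(F)
print(f"relative reconstruction error: {err:.3e}")      # small -> motion factors recovered
```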

  14. Self Modeling: Expanding the Theories of Learning

    ERIC Educational Resources Information Center

    Dowrick, Peter W.

    2012-01-01

    Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…

  15. Understanding the Impact of User Frustration Intensities on Task Performance Using the OCC Theory of Emotions

    NASA Technical Reports Server (NTRS)

    Washington, Gloria

    2012-01-01

    Have you heard the saying "frustration is written all over your face"? Well this saying is true, but that is not the only place. Frustration is written all over your face and your body. The human body has various means to communicate an emotion without the utterance of a single word. The Media Equation says that people interact with computers as if they are human: this includes experiencing frustration. This research measures frustration by monitoring human body-based measures such as heart rate, posture, skin temperature, and respiration. The OCC Theory of Emotions is used to separate frustration into different levels or intensities. The results of this study showed that individual intensities of frustration exist, so that task performance is not degraded. Results from this study can be used by usability testers to model how much frustration is needed before task performance measures start to decrease.

  16. User's guide for the CALPUFF dispersion model. Final report

    SciTech Connect

    1995-07-01

    This report describes the CALPUFF dispersion model and associated processing programs. The CALPUFF model described in this report reflects improvements to the model including (1) new modules to treat buoyant rise and dispersion from area sources (such as forest fires), buoyant line sources, and volume sources, (2) an improved treatment of complex terrain, (3) additional model switches to facilitate its use in regulatory applications, (4) an enhanced treatment of wind shear through puff splitting, and (5) an optional PC-based GUI. CALPUFF has been coupled to the Emissions Production Model (EPM) developed by the Forest Service through an interface processor. EPM provides time-dependent emissions and heat release data for use in modeling controlled burns and wildfires.

  17. User-Centered Innovation: A Model for "Early Usability Testing."

    ERIC Educational Resources Information Center

    Sugar, William A.; Boling, Elizabeth

    The goal of this study is to show how some concepts and techniques from disciplines outside Instructional Systems Development (ISD) have the potential to extend and enhance the traditional view of ISD practice when they are employed very early in the ISD process. The concepts and techniques employed were user-centered in design and usability, and…

  18. A Study of Context-Awareness RBAC Model Using User Profile on Ubiquitous Computing

    NASA Astrophysics Data System (ADS)

    Jang, Bokman; Park, Sungdo; Chang, Hyokyung; Ahn, Hyosik; Choi, Euiin

    Recently, with the growth of IT, computing is shifting to a ubiquitous environment in which information can be accessed everywhere and at any time using various devices, and in which the computer can decide to provide useful services to users. In this computing environment, however, systems are connected to wireless networks and a variety of devices, and reckless approaches to information resources can cause trouble for the system. Access authority management is therefore a very important issue, both for protecting information resources and for adapting the system by establishing the security policy a system needs. The conventional RBAC model has the problem that it does not consider the user's context information, such as the user's profile. This paper proposes a context-awareness RBAC model based on user profiles, which provides efficient access control through active classification, inference and judgment about the users who access the system and its resources.
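
    A minimal sketch of what a context-aware RBAC check adds over plain RBAC — the permission is granted only if the role allows the operation and the user's current context satisfies the policy's conditions — is given below; roles, contexts and conditions are illustrative assumptions.

```python
# Context-aware RBAC sketch: permission = role allows operation AND the user's
# current context (device, time) satisfies the policy's constraints.
# Roles, permissions and context conditions here are illustrative only.
from datetime import time

ROLE_PERMISSIONS = {
    "content_editor": {"publish_video"},
    "viewer": {"watch_video"},
}

CONTEXT_POLICY = {   # extra conditions evaluated against the user's context/profile
    "publish_video": lambda ctx: ctx["device"] == "managed"
                                 and time(8, 0) <= ctx["local_time"] <= time(20, 0),
    "watch_video":   lambda ctx: True,
}

def is_permitted(role: str, operation: str, context: dict) -> bool:
    if operation not in ROLE_PERMISSIONS.get(role, set()):
        return False                       # classic RBAC check
    return CONTEXT_POLICY.get(operation, lambda ctx: False)(context)

ctx = {"device": "managed", "local_time": time(22, 30)}
print(is_permitted("content_editor", "publish_video", ctx))   # False: outside allowed hours
```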

  19. LDA-Based Unified Topic Modeling for Similar TV User Grouping and TV Program Recommendation.

    PubMed

    Pyo, Shinjee; Kim, Eunhui; Kim, Munchurl

    2015-08-01

    Social TV is a social media service via TV and social networks through which TV users exchange their experiences about TV programs that they are viewing. For social TV service, two technical aspects are envisioned: grouping of similar TV users to create social TV communities and recommending TV programs based on group and personal interests for personalizing TV. In this paper, we propose a unified topic model based on grouping of similar TV users and recommending TV programs as a social TV service. The proposed unified topic model employs two latent Dirichlet allocation (LDA) models. One is a topic model of TV users, and the other is a topic model of the description words for viewed TV programs. The two LDA models are then integrated via a topic proportion parameter for TV programs, which enforces the grouping of similar TV users and associated description words for watched TV programs at the same time in a unified topic modeling framework. The unified model identifies the semantic relation between TV user groups and TV program description word groups so that more meaningful TV program recommendations can be made. The unified topic model also overcomes an item ramp-up problem such that new TV programs can be reliably recommended to TV users. Furthermore, from the topic model of TV users, TV users with similar tastes can be grouped as topics, which can then be recommended as social TV communities. To verify our proposed method of unified topic-modeling-based TV user grouping and TV program recommendation for social TV services, in our experiments, we used real TV viewing history data and electronic program guide data from a seven-month period collected by a TV poll agency. The experimental results show that the proposed unified topic model yields an average 81.4% precision for 50 topics in TV program recommendation and its performance is an average of 6.5% higher than that of the topic model of TV users only. For TV user prediction with new TV programs, the average
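
    For readers unfamiliar with LDA, the sketch below fits a single topic model to toy "viewing history" documents with scikit-learn; the coupled two-model formulation described in the paper is not reproduced here.

```python
# Single-LDA sketch: treat each TV user's viewing history as a document of program
# titles/description words, fit LDA, and read off topic mixtures for grouping users.
# Toy data; the paper's coupled two-LDA model is not reproduced here.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

histories = [                      # one "document" per TV user (illustrative)
    "news politics debate news weather",
    "cartoon animation kids cartoon movie",
    "football match highlights football news",
    "kids animation movie cartoon",
]

counts = CountVectorizer().fit_transform(histories)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

user_topics = lda.transform(counts)            # rows: users, columns: topic proportions
for i, mix in enumerate(user_topics):
    print(f"user {i}: dominant topic {mix.argmax()}, mixture {mix.round(2)}")
```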

  1. The Oak Ridge National Laboratory automobile heat pump model: User's guide

    SciTech Connect

    Kyle, D.M.

    1993-05-01

    A computer program has been developed to predict the steady-state performance of vapor compression automobile air conditioners and heat pumps. The code is based on the residential heat pump model developed at Oak Ridge National Laboratory. Most calculations are based on fundamental physical principles, in conjunction with generalized correlations available in the research literature. Automobile air conditioning components that can be specified as inputs to the program include open and hermetic compressors; finned tube condensers; finned tube and plate-fin style evaporators; thermal expansion valve, capillary tube and short tube expansion devices; refrigerant mass; evaporator pressure regulator; and all interconnecting tubing. The program can be used with a variety of refrigerants, including R134a. Methodologies are discussed for using the model as a tool for designing all new systems or, alternatively, as a tool for simulating a known system for a variety of operating conditions.

  2. Model for a fundamental theory with supersymmetry

    NASA Astrophysics Data System (ADS)

    Yokoo, Seiichiro

    Physics in the year 2006 is tightly constrained by experiment, observation, and mathematical consistency. The Standard Model provides a remarkably precise description of particle physics, and general relativity is quite successful in describing gravitational phenomena. At the same time, it is clear that a more fundamental theory is needed for several distinct reasons. Here we consider a new approach, which begins with the unusually ambitious point of view that a truly fundamental theory should aspire to explaining the origins of Lorentz invariance, gravity, gauge fields and their symmetry, supersymmetry, fermionic fields, bosonic fields, quantum mechanics and spacetime. The present dissertation is organized so that it starts with the most conventional ideas for extending the Standard Model and ends with a microscopic statistical picture, which is actually the logical starting point of the theory, but which is also the most remote excursion from conventional physics. One motivation for the present work is the fact that a Euclidean path integral in quantum physics is equivalent to a partition function in statistical physics. This suggests that the most fundamental description of nature may be statistical. This dissertation may be regarded as an attempt to see how far one can go with this premise in explaining the observed phenomena, starting with the simplest statistical picture imaginable. It may be that nature is richer than the model assumed here, but the present results are quite suggestive, because, with a set of assumptions that are not unreasonable, one recovers the phenomena listed above. At the end, the present theory leads back to conventional physics, except that Lorentz invariance and supersymmetry are violated at extremely high energy. To be more specific, one obtains local Lorentz invariance (at low energy compared to the Planck scale), an SO( N) unified gauge theory (with N = 10 as the simplest possibility), supersymmetry of Standard Model fermions and

  3. Engaging Theories and Models to Inform Practice

    ERIC Educational Resources Information Center

    Kraus, Amanda

    2012-01-01

    Helping students prepare for the complex transition to life after graduation is an important responsibility shared by those in student affairs and others in higher education. This chapter explores theories and models that can inform student affairs practitioners and faculty in preparing students for life after college. The focus is on roles,…

  4. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  5. Theories and Models of Ethnic Inequality.

    ERIC Educational Resources Information Center

    Hirschman, Charles

    Theories of racial and ethnic relations have been plentiful, but the empirical testing of hypotheses has not led to a cumulative growth of knowledge. As yet, no strong paradigm of research has emerged. The growth of empirical studies of racial/ethnic inequality in the United States over the last decade suggests that formal models of the process of…

  6. Viscous wing theory development. Volume 2: GRUMWING computer program user's manual

    NASA Technical Reports Server (NTRS)

    Chow, R. R.; Ogilvie, P. L.

    1986-01-01

    This report is a user's manual which describes the operation of the computer program, GRUMWING. The program computes the viscous transonic flow over three-dimensional wings using a boundary layer type viscid-inviscid interaction approach. The inviscid solution is obtained by an approximate factorization (AFZ) method for the full potential equation. The boundary layer solution is based on integral entrainment methods.

  7. Factors from the transtheoretical model differentiating between solar water disinfection (SODIS) user groups.

    PubMed

    Kraemer, Silvie M; Mosler, Hans-Joachim

    2011-01-01

    Solar water disinfection (SODIS) is a sustainable household water treatment technique that could prevent millions of deaths caused by diarrhoea. The behaviour change process necessary to move from drinking raw water to drinking SODIS is analysed with the Transtheoretical Model of Change (TTM). User groups and psychological factors that differentiate between types of users are identified. Results of a 1.5 year longitudinal study in Zimbabwe reveal distinguishing factors between groups, from which it can be deduced that they drive the development of user groups. Implications are drawn for campaigns with the aim of bringing all user types to a regular use.

  8. Informatic system for a global tissue-fluid biorepository with a graph theory-oriented graphical user interface.

    PubMed

    Butler, William E; Atai, Nadia; Carter, Bob; Hochberg, Fred

    2014-01-01

    The Richard Floor Biorepository supports collaborative studies of extracellular vesicles (EVs) found in human fluids and tissue specimens. The current emphasis is on biomarkers for central nervous system neoplasms but its structure may serve as a template for collaborative EV translational studies in other fields. The informatic system provides specimen inventory tracking with bar codes assigned to specimens and containers and projects, is hosted on globalized cloud computing resources, and embeds a suite of shared documents, calendars, and video-conferencing features. Clinical data are recorded in relation to molecular EV attributes and may be tagged with terms drawn from a network of externally maintained ontologies thus offering expansion of the system as the field matures. We fashioned the graphical user interface (GUI) around a web-based data visualization package. This system is now in an early stage of deployment, mainly focused on specimen tracking and clinical, laboratory, and imaging data capture in support of studies to optimize detection and analysis of brain tumour-specific mutations. It currently includes 4,392 specimens drawn from 611 subjects, the majority with brain tumours. As EV science evolves, we plan biorepository changes which may reflect multi-institutional collaborations, proteomic interfaces, additional biofluids, changes in operating procedures and kits for specimen handling, novel procedures for detection of tumour-specific EVs, and for RNA extraction and changes in the taxonomy of EVs. We have used an ontology-driven data model and web-based architecture with a graph theory-driven GUI to accommodate and stimulate the semantic web of EV science.

  10. Theory, Modeling, and Simulation of Semiconductor Lasers

    NASA Technical Reports Server (NTRS)

    Ning, Cun-Zheng; Saini, Subbash (Technical Monitor)

    1998-01-01

    Semiconductor lasers play very important roles in many areas of information technology. In this talk, I will first give an overview of semiconductor laser theory. This will be followed by a description of different models and their shortcomings in modeling and simulation. Our recent efforts in constructing a fully space and time resolved simulation model will then be described. Simulation results based on our model will be presented. Finally the effort towards a self-consistent and comprehensive simulation capability for the opto-electronics integrated circuits (OEICs) will be briefly reviewed.

  11. Tracking and Analysis Framework (TAF) model documentation and user's guide

    SciTech Connect

    Bloyd, C.; Camp, J.; Conzelmann, G.

    1996-12-01

    With passage of the 1990 Clean Air Act Amendments, the United States embarked on a policy for controlling acid deposition that has been estimated to cost at least $2 billion. Title IV of the Act created a major innovation in environmental regulation by introducing market-based incentives - specifically, by allowing electric utility companies to trade allowances to emit sulfur dioxide (SO2). The National Acid Precipitation Assessment Program (NAPAP) has been tasked by Congress to assess what Senator Moynihan has termed this "grand experiment." Such a comprehensive assessment of the economic and environmental effects of this legislation has been a major challenge. To help NAPAP face this challenge, the U.S. Department of Energy (DOE) has sponsored development of an integrated assessment model, known as the Tracking and Analysis Framework (TAF). This section summarizes TAF's objectives and its overall design.

  12. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    NASA Astrophysics Data System (ADS)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    The developments in computer technology in the last decade have changed the ways computers are used. The emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform and environment) makes the development of such ubiquitous applications for smart environments, and especially of their user interfaces, a challenging and time-consuming task. We propose a model-based approach, which allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language defining the fundamentals and constraints of the user interface, a runtime architecture exploits the description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting applications even to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  13. User's guide to the MESOI diffusion model and to the utility programs UPDATE and LOGRVU

    SciTech Connect

    Athey, G.F.; Allwine, K.J.; Ramsdell, J.V.

    1981-11-01

    MESOI is an interactive, Lagrangian puff trajectory diffusion model. The model is documented separately (Ramsdell and Athey, 1981); this report is intended to provide MESOI users with the information needed to successfully conduct model simulations. The user is also provided with guidance in the use of the data file maintenance and review programs, UPDATE and LOGRVU. Complete examples are given for the operation of all three programs, and an appendix documents UPDATE and LOGRVU.
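
    For orientation, the standard Gaussian puff relation that Lagrangian puff models of this kind are built around is sketched below; the dispersion coefficients and the simple form used here are generic textbook assumptions and do not necessarily match MESOI's exact formulation.

```python
# Generic Gaussian puff: concentration at (x, y, z) from a puff of mass Q centred at
# (xc, yc, zc) with dispersion sigmas. Illustrative textbook form, not MESOI's code;
# ground reflection and other refinements are omitted.
import math

def puff_concentration(q, pos, centre, sigma_y, sigma_z):
    (x, y, z), (xc, yc, zc) = pos, centre
    norm = q / ((2 * math.pi) ** 1.5 * sigma_y * sigma_y * sigma_z)
    expo = -((x - xc) ** 2 + (y - yc) ** 2) / (2 * sigma_y ** 2) - (z - zc) ** 2 / (2 * sigma_z ** 2)
    return norm * math.exp(expo)

# 1 kg puff, receptor 200 m downwind at ground level, puff centre at 10 m height:
c = puff_concentration(1.0, (200.0, 0.0, 0.0), (200.0, 0.0, 10.0), sigma_y=50.0, sigma_z=20.0)
print(f"{c:.2e} kg/m^3")
```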

  14. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia

    PubMed Central

    Segkouli, Sofia; Paliokas, Ioannis; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos

    2015-01-01

    Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting to their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing ICT products accessibility for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied at cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multitasks, and monitored the performance of infotainment related tasks to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interfaces' design supported by increased tasks' complexity to capture a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data to be used for more reliable interfaces' evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282

  15. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia.

    PubMed

    Segkouli, Sofia; Paliokas, Ioannis; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos

    2015-01-01

    Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting to their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing ICT products accessibility for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied at cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multitasks, and monitored the performance of infotainment related tasks to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interfaces' design supported by increased tasks' complexity to capture a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data to be used for more reliable interfaces' evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282

  17. An approach to validation and verification of the communications load model with supporting user's guide

    NASA Astrophysics Data System (ADS)

    Cox, William R.

    1987-09-01

    This thesis investigates the issues of validation and verification of the Communications Load Model (CLM) being used in the Battle Group Communications Simulation Facility at the Naval Air Development Center. The processes involved in creating user input files are explained and evaluated. A user's guide is included to assist the user in interpreting input into the proper data structure and format for use by the model. Structure and function of the model and its components are examined. Calculations of results predicted by scenario inputs are compared to actual program output. The analysis is used to determine appropriate methodology to be utilized in validation and verification of the CLM.

  18. Vulnerability and the intention to anabolic steroids use among Iranian gym users: an application of the theory of planned behavior.

    PubMed

    Allahverdipour, Hamid; Jalilian, Farzad; Shaghaghi, Abdolreza

    2012-02-01

    This correlational study explored the psychological antecedents of 253 Iranian gym users' intentions to use the anabolic-androgenic steroids (AAS), based on the Theory of Planned Behavior (TPB). The three predictor variables of (1) attitude, (2) subjective norms, and (3) perceived behavioral control accounted for 63% of the variation in the outcome measure of the intention to use the AAS. There is some support to use the TPB to design and implement interventions to modify and/or improve individuals' beliefs that athletic goals are achievable without the use of the AAS. PMID:22217129

  20. Communications network design and costing model users manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.

  1. Crack propagation modeling using Peridynamic theory

    NASA Astrophysics Data System (ADS)

    Hafezi, M. H.; Alebrahim, R.; Kundu, T.

    2016-04-01

    Crack propagation and branching are modeled using nonlocal peridynamic theory. One major advantage of this nonlocal theory based analysis tool is the unifying approach towards material behavior modeling - irrespective of whether a crack has formed in the material or not. No separate damage law is needed for crack initiation and propagation. This theory overcomes the weaknesses of existing continuum mechanics based numerical tools (e.g. FEM, XFEM etc.) for identifying fracture modes and does not require any simplifying assumptions. Cracks grow autonomously and not necessarily along a prescribed path. However, in some special situations such as ductile fracture, the damage evolution and failure depend on parameters characterizing the local stress state, rather than on the peridynamic damage modeling technique developed for brittle fracture. For brittle fracture modeling the bond is simply broken when the failure criterion is satisfied. This simulation helps us to design a more reliable modeling tool for crack propagation and branching in both brittle and ductile materials. Peridynamic analysis has been found to be very demanding computationally, particularly for real-world structures (e.g. vehicles, aircraft, etc.). It also requires a very expensive visualization process. The goal of this paper is to make researchers aware of the impact of this cutting-edge simulation tool for a better understanding of the cracked material response. A computer code has been developed to implement the peridynamic theory based modeling tool for two-dimensional analysis. A good agreement between our predictions and previously published results is observed. Some interesting new results that have not been reported earlier by others are also obtained and presented in this paper. The final objective of this investigation is to increase the mechanics knowledge of self-similar and self-affine cracks.
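
    The bond-breaking rule mentioned for brittle fracture — a bond fails when its stretch exceeds a critical value — can be sketched as below; the one-dimensional discretisation and the critical stretch value are illustrative, not the paper's two-dimensional implementation.

```python
# Bond-based peridynamics sketch: for each pair of material points within the horizon,
# compute bond stretch s = (|xi + eta| - |xi|) / |xi| and break the bond if s > s_crit.
# 1D chain of points with an imposed displacement jump; values are illustrative.
import numpy as np

coords = np.arange(0.0, 10.0, 1.0)                     # reference positions of material points
disp = np.where(coords < 5.0, 0.0, 0.05)               # displacement jump across x = 5
HORIZON, S_CRIT = 3.0, 0.02                            # interaction radius, critical stretch

broken = []
n = len(coords)
for i in range(n):
    for j in range(i + 1, n):
        xi = coords[j] - coords[i]                     # reference bond vector
        if abs(xi) > HORIZON:
            continue                                   # outside the horizon: no bond
        eta = disp[j] - disp[i]                        # relative displacement
        stretch = (abs(xi + eta) - abs(xi)) / abs(xi)
        if stretch > S_CRIT:
            broken.append((i, j, round(stretch, 3)))

# Only bonds crossing the displacement jump exceed the critical stretch and break.
print("broken bonds (i, j, stretch):", broken)
```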

  2. Spud and FLML: generalising and automating the user interfaces of scientific computer models

    NASA Astrophysics Data System (ADS)

    Ham, D. A.; Farrell, P. E.; Maddison, J. R.; Gorman, G. J.; Wilson, C. R.; Kramer, S. C.; Shipton, J.; Collins, G. S.; Cotter, C. J.; Piggott, M. D.

    2009-04-01

    The interfaces by which users specify the scenarios to be simulated by scientific computer models are frequently primitive, under-documented and ad-hoc text files which make using the model in question difficult and error-prone and significantly increase the development cost of the model. We present a model-independent system, Spud[1], which formalises the specification of model input formats in terms of formal grammars. This is combined with an automatically generated graphical user interface which guides users to create valid model inputs based on the grammar provided, and a generic options reading module which minimises the development cost of adding model options. We further present FLML, the Fluidity Markup Language. FLML applies Spud to the Imperial College Ocean Model (ICOM) resulting in a graphically driven system which radically improves the usability of ICOM. As well as a step forward for ICOM, FLML illustrates how the Spud system can be applied to an existing complex ocean model highlighting the potential of Spud as a user interface for other codes in the ocean modelling community. [1] Ham, D. A. et al., Spud 1.0: generalising and automating the user interfaces of scientific computer models, Geosci. Model Dev. Discuss., 1, 125-146, 2008.

  3. Asymptotic theory of an infectious disease model.

    PubMed

    Whitman, Alan M; Ashrafiuon, Hashem

    2006-08-01

    In this paper, we present asymptotic theory as a viable alternative solution method for infectious disease models. We consider a particular model of a pathogen attacking a host whose immune system responds defensively, that has been studied previously [Mohtashemi and Levins in J. Math. Biol. 43: 446-470 (2001)]. On rendering this model dimensionless, we can reduce the number of parameters to two and note that one of them has a large value that suggests an asymptotic analysis. On doing this analysis, we obtain a satisfying qualitative description of the dynamic evolution of each population, together with simple analytic expressions for their main features, from which we can compute accurate quantitative values.

  4. Modeling Integrated Water-User Decisions with Intermittent Supplies

    NASA Astrophysics Data System (ADS)

    Lund, J. R.; Rosenberg, D.

    2006-12-01

    We present an economic-engineering method to estimate urban water use demands with intermittent water supplies. A two-stage, probabilistic optimization formulation includes a wide variety of water supply enhancement and conservation actions that individual households can adopt to meet multiple water quality uses with uncertain water availability. We embed the optimization in Monte-Carlo simulations to show aggregate effects at a utility (citywide) scale for a population of user conditions and decisions. Parametric analysis provides derivations of supply curves to subsidize conservation, demand responses to alternative pricing, and customer willingness-to-pay to avoid shortages. Results show a good empirical fit for the average and distribution of billed residential water use in Amman, Jordan. Additional outputs give likely market penetration rates for household conservation actions, associated water savings, and subsidies required to entice further adoption. We discuss new insights to size, target, market, and finance conservation programs and interpret a demand curve with block pricing.
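
    A toy sketch of the two-stage idea, with invented household actions, costs, and an assumed shortage penalty (the paper's actual optimization and Monte-Carlo structure are far richer), could be:

```python
import itertools
import random

# Hypothetical action menu: (name, upfront cost, litres/day saved or stored)
ACTIONS = [("low-flow fixtures", 40.0, 30.0),
           ("roof tank", 120.0, 80.0),
           ("greywater reuse", 90.0, 50.0)]
DEMAND = 250.0          # litres/day the household wants
SHORTAGE_PENALTY = 2.0  # assumed willingness-to-pay per litre of unmet demand

def expected_cost(chosen, supply_samples):
    """Stage 1: pay for chosen actions; stage 2: pay a penalty on the remaining shortfall."""
    upfront = sum(a[1] for a in chosen)
    saved = sum(a[2] for a in chosen)
    shortfalls = [max(DEMAND - s - saved, 0.0) for s in supply_samples]
    return upfront + SHORTAGE_PENALTY * sum(shortfalls) / len(shortfalls)

random.seed(1)
supply = [random.uniform(100.0, 260.0) for _ in range(1000)]  # Monte-Carlo intermittent supply

best = min((subset for r in range(len(ACTIONS) + 1)
            for subset in itertools.combinations(ACTIONS, r)),
           key=lambda subset: expected_cost(subset, supply))
print("least-cost portfolio:", [a[0] for a in best])
```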

  5. Users guide for the hydroacoustic coverage assessment model (HydroCAM)

    SciTech Connect

    Farrell, T., LLNL

    1997-12-01

    A model for predicting the detection and localization performance of hydroacoustic monitoring networks has been developed. The model accounts for major factors affecting global-scale acoustic propagation in the ocean, including horizontal refraction, travel time variability due to spatial and temporal fluctuations in the ocean, and detailed characteristics of the source. Graphical user interfaces are provided to set up the models and visualize the results. The model produces maps of network detection coverage and localization area of uncertainty, as well as intermediate results such as predicted path amplitudes, travel time and travel time variance. This Users Guide for the model is organized into three sections. First, a summary of functionality available in the model is presented, including example output products. The second section provides detailed descriptions of each of the models contained in the system. The last section describes how to run the model, including a summary of each data input form in the user interface.

  6. BioAssemblyModeler (BAM): User-Friendly Homology Modeling of Protein Homo- and Heterooligomers

    PubMed Central

    Xu, Qifang; Andrake, Mark; Dunbrack, Roland L.

    2014-01-01

    Many if not most proteins function in oligomeric assemblies of one or more protein sequences. The Protein Data Bank provides coordinates for biological assemblies for each entry, at least 60% of which are dimers or larger assemblies. BioAssemblyModeler (BAM) is a graphical user interface to the basic steps in homology modeling of protein homooligomers and heterooligomers from the biological assemblies provided in the PDB. BAM takes as input up to six different protein sequences and begins by assigning Pfam domains to the target sequences. The program utilizes a complete assignment of Pfam domains to sequences in the PDB, PDBfam (http://dunbrack2.fccc.edu/protcid/pdbfam), to obtain templates that contain any or all of the domains assigned to the target sequence(s). The contents of the biological assemblies of potential templates are provided, and alignments of the target sequences to the templates are produced with a profile-profile alignment algorithm. BAM provides for visual examination and mouse-editing of the alignments supported by target and template secondary structure information and a 3D viewer of the template biological assembly. Side-chain coordinates for a model of the biological assembly are built with the program SCWRL4. A built-in protocol navigation system guides the user through all stages of homology modeling from input sequences to a three-dimensional model of the target complex. Availability: http://dunbrack.fccc.edu/BAM. PMID:24922057

  7. Topos models for physics and topos theory

    SciTech Connect

    Wolters, Sander

    2014-08-15

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.

  8. Empirical Analysis and Modeling of Users' Topic Interests in Online Forums

    PubMed Central

    Xiong, Fei; Liu, Yun

    2012-01-01

    Bulletin Board Systems (BBSs) have demonstrated their usefulness in spreading information. In BBS forums, a few posts that address currently popular social topics attract a lot of attention, and different users are interested in many different discussion topics. We investigate topic cluster features and user interests of an actual BBS forum, analyzing user posting and replying behavior. According to the growing process of BBS, we suggest a network model in which each agent only replies to the posts that belong to its specific topics of interest. A post that is replied to will be immediately assigned the highest priority on the post list. Simulation results show that characteristics of our model are similar to those of the real BBS. The model with heterogeneous user interests promotes the occurrence of popular posts, and the user relationship network possesses a large clustering coefficient. Bursts and long waiting time exist in user replying behavior, leading to non-Poisson user activity pattern. In addition, the model produces an analogous evolving trend of Gini coefficients for posts' and clusters' participants as BBS forums. PMID:23251401
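
    A compact, hedged simulation in the spirit of the mechanism described (agents reply only to posts in their topics of interest, and a replied-to post jumps to the top of the list) might look as follows; all parameter values are illustrative:

```python
import random

random.seed(42)
NUM_TOPICS, NUM_USERS, STEPS, VISIBLE = 20, 200, 5000, 10

# Each user is interested in a small random subset of topics.
interests = [set(random.sample(range(NUM_TOPICS), 3)) for _ in range(NUM_USERS)]
posts = []  # front of the list = top of the forum; each post is (topic, reply_count)

for _ in range(STEPS):
    user = random.randrange(NUM_USERS)
    # Scan only the visible front page for a post matching the user's interests.
    for idx, (topic, replies) in enumerate(posts[:VISIBLE]):
        if topic in interests[user]:
            posts.insert(0, posts.pop(idx))          # replied-to post moves to the top
            posts[0] = (topic, replies + 1)
            break
    else:
        posts.insert(0, (random.choice(list(interests[user])), 0))  # otherwise start a new post

reply_counts = sorted((r for _, r in posts), reverse=True)
print("posts:", len(posts), "most-replied counts:", reply_counts[:5])
```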

  9. Prospects for Advanced RF Theory and Modeling

    SciTech Connect

    Batchelor, D.B.

    1999-04-12

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed.

  10. A Membrane Model from Implicit Elasticity Theory

    PubMed Central

    Freed, A. D.; Liao, J.; Einstein, D. R.

    2014-01-01

    A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079

  11. Quantum mechanical model in gravity theory

    NASA Astrophysics Data System (ADS)

    Losyakov, V. V.

    2016-05-01

    We consider a model of a real massive scalar field defined as homogeneous on a d-dimensional sphere such that the sphere radius, time scale, and scalar field are related by the equations of the general theory of relativity. We quantize this system with three degrees of freedom, define the observables, and find dynamical mean values of observables in the regime where the scalar field mass is much less than the Planck mass.

  12. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    EPA Science Inventory

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  13. Jobs and Economic Development Impact (JEDI) Model Geothermal User Reference Guide

    SciTech Connect

    Johnson, C.; Augustine, C.; Goldberg, M.

    2012-09-01

    The Geothermal Jobs and Economic Development Impact (JEDI) model, developed through the National Renewable Energy Laboratory (NREL), is an Excel-based, user-friendly tool that estimates the local economic impacts of constructing and operating hydrothermal and Enhanced Geothermal System (EGS) power generation projects; it is one of a family of JEDI models covering a range of conventional and renewable energy technologies. The JEDI Model Geothermal User Reference Guide was developed to assist users in using and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted.

  14. User's manual for interactive LINEAR: A FORTRAN program to derive linear aircraft models

    NASA Technical Reports Server (NTRS)

    Antoniewicz, Robert F.; Duke, Eugene L.; Patterson, Brian P.

    1988-01-01

    An interactive FORTRAN program that provides the user with a powerful and flexible tool for the linearization of aircraft aerodynamic models is documented in this report. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied linear or nonlinear aerodynamic model. The nonlinear equations of motion used are six-degree-of-freedom equations with stationary atmosphere and flat, nonrotating earth assumptions. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.
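
    LINEAR itself is a FORTRAN program; as a hedged sketch of the underlying idea, numerically extracting state and control matrices from user-supplied nonlinear dynamics by central differences can be written as:

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Return A = df/dx and B = df/du at (x0, u0) by central differences.

    f  : callable f(x, u) -> dx/dt, the nonlinear equations of motion
    x0 : trim state, u0 : trim control
    """
    x0, u0 = np.asarray(x0, float), np.asarray(u0, float)
    n, m = x0.size, u0.size
    A, B = np.zeros((n, n)), np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Example: short-period-like toy dynamics (illustrative only, not an aircraft model from the report)
f = lambda x, u: np.array([-1.2 * x[0] + 0.9 * x[1] + 0.3 * u[0],
                           -4.0 * x[0] - 1.5 * x[1] + 6.0 * u[0]])
A, B = linearize(f, [0.0, 0.0], [0.0])
print(A, B, sep="\n")
```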

  15. Conceptual Models and Theory-Embedded Principles on Effective Schooling.

    ERIC Educational Resources Information Center

    Scheerens, Jaap

    1997-01-01

    Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…

  16. HIGHWAY 3.1: An enhanced HIGHWAY routing model: Program description, methodology, and revised user's manual

    SciTech Connect

    Johnson, P.E.; Joy, D.S.; Clarke, D.B.; Jacobi, J.M.

    1993-03-01

    The HIGHWAY program provides a flexible tool for predicting highway routes for transporting radioactive materials in the United States. The HIGHWAY data base is essentially a computerized road atlas that currently describes over 240,000 miles of highways. Complete descriptions of all Interstate System and most US highways (except those that parallel a nearby Interstate highway) are included in the data base. Many of the principal state highways and a number of local and county highways are also identified. The data base also includes locations of nuclear facilities and major airports. Several different types of routes may be calculated, depending on a set of user-supplied constraints. Routes are calculated by minimizing the total impedance between the origin and the destination. Basically, the impedance is defined as a function of distance and driving time along a particular highway segment. Several routing constraints can be imposed during the calculations. One of the special features of the HIGHWAY model is its ability to calculate routes that maximize use of Interstate System highways. This feature allows the user to predict routes for shipments of radioactive materials that conform to the US Department of Transportation routing regulations. Other features of the model include the ability to predict routes that bypass a specific state, city, town, or highway segment. Two special features have been incorporated in HIGHWAY, version 3.1. The first is the ability to automatically calculate alternative routes. Frequently, there are a number of routes between the source and destination that vary slightly in distance and estimated driving time. The HIGHWAY program offers a selection of different but nearly equal routes. The second special feature is the capability to calculate route-specific population density statistics. The population density distribution is calculated for each highway segment in the route and is reported on a state-by-state basis.
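
    A minimal sketch of route selection by minimizing a distance-and-time impedance over a network (the segment data and weighting below are invented, not the HIGHWAY data base):

```python
import heapq

def impedance(miles, minutes, w_dist=1.0, w_time=1.0):
    """Illustrative impedance: a weighted sum of segment distance and driving time."""
    return w_dist * miles + w_time * minutes

# Hypothetical highway segments: node -> list of (neighbor, miles, minutes)
network = {
    "A": [("B", 50, 55), ("C", 40, 60)],
    "B": [("D", 45, 50)],
    "C": [("D", 70, 65)],
    "D": [],
}

def best_route(network, origin, destination):
    """Dijkstra search over total impedance from origin to destination."""
    frontier = [(0.0, origin, [origin])]
    seen = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == destination:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, miles, minutes in network[node]:
            heapq.heappush(frontier, (cost + impedance(miles, minutes), nbr, path + [nbr]))
    return float("inf"), []

print(best_route(network, "A", "D"))   # -> (200.0, ['A', 'B', 'D'])
```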

  17. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    USGS Publications Warehouse

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable

  18. Solar Advisor Model User Guide for Version 2.0

    SciTech Connect

    Gilman, P.; Blair, N.; Mehos, M.; Christensen, C.; Janzou, S.; Cameron, C.

    2008-08-01

    The Solar Advisor Model (SAM) provides a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies and markets, from photovoltaic systems for residential and commercial markets to concentrating solar power and large photovoltaic systems for utility markets. This manual describes Version 2.0 of the software, which can model photovoltaic and concentrating solar power technologies for electric applications for several markets. The current version of the Solar Advisor Model does not model solar heating and lighting technologies.

  19. Harm reduction theory: users' culture, micro-social indigenous harm reduction, and the self-organization and outside-organizing of users' groups.

    PubMed

    Friedman, Samuel R; de Jong, Wouter; Rossi, Diana; Touzé, Graciela; Rockwell, Russell; Des Jarlais, Don C; Elovich, Richard

    2007-03-01

    This paper discusses the user side of harm reduction, focusing to some extent on the early responses to the HIV/AIDS epidemic in each of four sets of localities-New York City, Rotterdam, Buenos Aires, and sites in Central Asia. Using available qualitative and quantitative information, we present a series of vignettes about user activities in four different localities in behalf of reducing drug-related harm. Some of these activities have been micro-social (small group) activities; others have been conducted by formal organizations of users that the users organized at their own initiative. In spite of the limitations of the methodology, the data suggest that users' activities have helped limit HIV spread. These activities are shaped by broader social contexts, such as the extent to which drug scenes are integrated with broader social networks and the way the political and economic systems impinge on drug users' lives. Drug users are active agents in their own individual and collective behalf, and in helping to protect wider communities. Harm reduction activities and research should take note of and draw upon both the micro-social and formal organizations of users. Finally, both researchers and policy makers should help develop ways to enable and support both micro-social and formally organized action by users.

  20. Theory, modeling and simulation: Annual report 1993

    SciTech Connect

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  1. Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation.

    NASA Astrophysics Data System (ADS)

    Lawley, Russell; Barron, Mark; Lee, Katy

    2014-05-01

    The boundaries mapped in traditional field geological survey are subject to a wide range of inherent uncertainties. A map at a survey-scale of 1:10,000 is created by a combination of terrain interpretation, direct observations from boreholes and exposures (often sparsely distributed), and indirect interpretation of proxy variables such as soil properties, vegetation and remotely sensed images. A critical factor influencing the quality of the final map is the skill and experience of the surveyor to bring this information together in a coherent conceptual model. The users of geological data comprising or based on mapped boundaries are increasingly aware of these uncertainties, and want to know how to manage them. The growth of 3D modelling, which takes 2D surveys as a starting point, adds urgency to the need for a better understanding of survey uncertainties; particularly where 2D mapping of variable vintage has been compiled into a national coverage. Previous attempts to apply confidence on the basis of metrics such as data density, survey age or survey techniques have proved useful for isolating single, critical, factors but do not generally succeed in evaluating geological mapping 'in the round', because they cannot account for the 'conceptual' skill set of the surveyor. The British Geological Survey (BGS) is using expert elicitation methods to gain a better understanding of uncertainties within the national geological map of Great Britain. The expert elicitation approach starts with the assumption that experienced surveyors have an intuitive sense of the uncertainty of the boundaries that they map, based on a tacit model of geology and its complexity and the nature of the surveying process. The objective of

  2. STORM WATER MANAGEMENT MODEL USER'S MANUAL VERSION 5.0

    EPA Science Inventory

    The EPA Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. SWMM was first developed in 1971 and has undergone several major upgrade...

  3. Algorithm for model validation: Theory and applications

    PubMed Central

    Sornette, D.; Davis, A. B.; Ide, K.; Vixie, K. R.; Pisarenko, V.; Kamm, J. R.

    2007-01-01

    Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. Thus, we replace static claims on the impossibility of validating a given model by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the methodology first with the maturation of quantum mechanics as the arguably best established physics theory and then with several concrete examples drawn from some of our primary scientific interests: a cellular automaton model for earthquakes, a multifractal random walk model for financial time series, an anomalous diffusion model for solar radiation transport in the cloudy atmosphere, and a computational fluid dynamics code for the Richtmyer–Meshkov instability. PMID:17420476

  4. User's guide for waste tank corrosion data model code

    SciTech Connect

    Mackey, D.B.; Divine, J.R.

    1986-12-01

    Corrosion tests were conducted on A-516 and A-537 carbon steel in simulated Double Shell Slurry, Future PUREX, and Hanford Facilities wastes. The corrosion rate data, gathered between 25 and 180°C, were statistically "modeled" for each waste; a fourth model was developed that utilized the combined data. The report briefly describes the modeling procedure and details on how to access information through a computerized data system. Copies of the report and operating information may be obtained from the author (DB Mackey) at 509-376-9844 or FTS 444-9844.

  5. Self model theory: learning from the future.

    PubMed

    Dowrick, Peter W

    2012-03-01

    This paper synthesizes findings and theoretical propositions across behavioral, cognitive, and neuropsychological theories, with significant new conceptualizations bearing upon processes of learning and performance. There is a need to explain ultrarapid learning within the framework of cognitive science. In video self modeling and in challenging circumstances, the speed of behavior change appears to be derived from feedforward, in which component behaviors (in the repertoire) are reconfigured to produce a new skill or level of performance. It is argued that 'self modeling' is fundamental to learning, and peer/other modeling serves as an alternative. Learning in this way produces a cognitive self-simulation which can be accessed to trigger a behavioral response in a future context. Related neurological processes are indicated by 'mental time travel' (MTT) and specific brain activity during the imagination (simulation) of future personal events. There is evidence that some brain mechanisms (mirror neurons), involved in immediate imitation, are differentially responsive to images of self versus other. MTT (to future events) in cognitive neuroscience has so far been discussed only in terms of prediction and planning not behavior change. These issues are brought together by self model theory. Conclusions drawn in this paper include discussions of the value in 'learning from the future' as a ubiquitous human ability. Overall, the propositions of this theory should stimulate diverse future research, linking neurological and behavioral contributions to cognitive science. WIREs Cogn Sci 2012, 3:215-230. doi: 10.1002/wcs.1156 For further resources related to this article, please visit the WIREs website.

  6. Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training

    NASA Astrophysics Data System (ADS)

    Macris, A.; Malamateniou, F.; Vassilacopoulos, G.

    Successful business process design requires active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise in business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge/content reusability and semantic web enabling) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e. training scenarios serving specific training needs) so that it is made reusable.

  7. A Neural Network Approach to Intention Modeling for User-Adapted Conversational Agents

    PubMed Central

    Griol, David

    2016-01-01

    Spoken dialogue systems have been proposed to enable a more natural and intuitive interaction with the environment and human-computer interfaces. In this contribution, we present a framework based on neural networks that allows modeling of the user's intention during the dialogue and uses this prediction to dynamically adapt the dialogue model of the system taking into consideration the user's needs and preferences. We have evaluated our proposal to develop a user-adapted spoken dialogue system that facilitates tourist information and services and provide a detailed discussion of the positive influence of our proposal in the success of the interaction, the information and services provided, and the quality perceived by the users. PMID:26819592

  8. A review of game-theoretic models of road user behaviour.

    PubMed

    Elvik, Rune

    2014-01-01

    This paper reviews game-theoretic models that have been developed to explain road user behaviour in situations where road users interact with each other. The paper includes the following game-theoretic models:
    1. A general model of the interaction between road users and their possible reaction to measures improving safety (behavioural adaptation).
    2. Choice of vehicle size as a Prisoners’ dilemma game.
    3. Speed choice as a co-ordination game.
    4. Speed compliance as a game between drivers and the police.
    5. Merging into traffic from an acceleration lane as a mixed-strategy game.
    6. Choice of level of attention in following situations as an evolutionary game.
    7. Choice of departure time to avoid congestion as a variant of a Prisoners’ dilemma game.
    8. Interaction between cyclists crossing the road and car drivers.
    9. Dipping headlights at night well ahead of the point when glare becomes noticeable.
    10. Choice of evasive action in a situation when cars are on collision course.
    The models reviewed are different in many respects, but a common feature of the models is that they can explain how informal norms of behaviour can develop among road users and be sustained even if these informal norms violate the formal regulations of the traffic code. Game-theoretic models are not applicable to every conceivable interaction between road users or to situations in which road users choose behaviour without interacting with other road users. Nevertheless, it is likely that game-theoretic models can be applied more widely than they have been until now.
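
    As a hedged illustration of one item in the list above (speed choice as a co-ordination game), the pure-strategy equilibria of a small two-driver payoff matrix can be enumerated; the payoffs are invented:

```python
import numpy as np
from itertools import product

# Hypothetical coordination game: both drivers prefer matching the other's speed.
# Rows/cols: 0 = "drive slow", 1 = "drive fast"; row_pay[r, c] is the row driver's payoff.
row_pay = np.array([[3, 1],
                    [1, 2]])
col_pay = row_pay.T  # symmetric game

def pure_nash(row_pay, col_pay):
    """Return all pure-strategy Nash equilibria by checking mutual best responses."""
    eq = []
    for r, c in product(range(row_pay.shape[0]), range(row_pay.shape[1])):
        row_best = row_pay[r, c] >= row_pay[:, c].max()
        col_best = col_pay[r, c] >= col_pay[r, :].max()
        if row_best and col_best:
            eq.append((r, c))
    return eq

print(pure_nash(row_pay, col_pay))   # -> [(0, 0), (1, 1)]: both slow or both fast
```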

  9. User's guide to the Texas climatological model. Final report

    SciTech Connect

    Not Available

    1980-08-01

    The Texas Climatological Model Version 2 (TCM-2) (Part of the UNAMAP version 4 collection) is a Fortran computer program designed to predict ground-level, long-term concentrations of atmospheric pollutants. The Model uses techniques that require much less computer time than most climatological models. Predictions are based upon the steady-state Gaussian plume hypothesis, Briggs plume rise formulations, Pasquill-Gifford dispersion coefficient approximations, and exponential pollutant decay. Long-term ground-level concentrations may be determined for one or two pollutants. Any number of point sources and area sources may be input to the model. Long-term meteorological conditions are input by a meteorological joint frequency function which gives the probability of occurrence for each of 576 different cases. Five scenarios of meteorological data and source emission inventories may be input to the model for one run. Plume rise is calculated by the most representative of six different methods and optionally can use only final rise. An option allows the simulation of dispersion found in urban areas. TCM-2 is well suited for, but not limited to, the following applications: Stack parameter design studies; Fuel conversion studies; Monitoring network design; Control strategy evaluation for SIP; Evaluation of the impact of new sources or source modifications for permit application review; Control technology evaluation; and Prevention of significant deterioration.
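
    A minimal sketch of the steady-state Gaussian plume calculation that underlies models of this kind; the dispersion coefficients below are placeholders rather than the Pasquill-Gifford fits TCM-2 actually applies:

```python
import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Ground-reflected, steady-state Gaussian plume concentration (g/m^3).

    q       : emission rate (g/s)
    u       : wind speed (m/s)
    y, z    : crosswind and vertical receptor coordinates (m)
    h       : effective stack height after plume rise (m)
    sigma_y, sigma_z : dispersion coefficients at the receptor's downwind distance (m)
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                np.exp(-(z + h)**2 / (2 * sigma_z**2)))   # image source models ground reflection
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Example receptor on the plume centerline at ground level, with assumed sigmas.
print(gaussian_plume(q=100.0, u=4.0, y=0.0, z=0.0, h=50.0, sigma_y=80.0, sigma_z=40.0))
```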

  10. Queuing theory models for computer networks

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

    A set of simple queuing theory models which can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. The impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed using them because of the lack of fine detail in the network traffic rates, traffic patterns, and the hardware used to implement the networks. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. This Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
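
    A hedged sketch of the kind of simple queuing formula such spreadsheet models rest on (textbook M/M/1 results; the actual spreadsheet's structure is not described in the abstract):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 queue metrics (rates in messages per second)."""
    rho = arrival_rate / service_rate            # channel utilization
    if rho >= 1.0:
        raise ValueError("queue is unstable: arrival rate exceeds capacity")
    avg_in_system = rho / (1.0 - rho)            # mean number of messages in the system
    avg_response = 1.0 / (service_rate - arrival_rate)   # mean time in system (s)
    return {"utilization": rho,
            "mean_messages": avg_in_system,
            "mean_response_s": avg_response}

# Example: a LAN segment carrying 800 msg/s with capacity for 1000 msg/s.
print(mm1_metrics(800.0, 1000.0))   # utilization 0.8, mean response 5 ms
```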

  11. Petroleum Refinery Jobs and Economic Development Impact (JEDI) Model User Reference Guide

    SciTech Connect

    Goldberg, M.

    2013-12-31

    The Jobs and Economic Development Impact (JEDI) models, developed through the National Renewable Energy Laboratory (NREL), are user-friendly tools utilized to estimate the economic impacts at the local level of constructing and operating fuel and power generation projects for a range of conventional and renewable energy technologies. The JEDI Petroleum Refinery Model User Reference Guide was developed to assist users in employing and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted. Based on project-specific inputs from the user, the model estimates job creation, earning and output (total economic activity) for a given petroleum refinery. This includes the direct, indirect and induced economic impacts to the local economy associated with the refinery's construction and operation phases. Project cost and job data used in the model are derived from the most current cost estimations available. Local direct and indirect economic impacts are estimated using economic multipliers derived from IMPLAN software. By determining the regional economic impacts and job creation for a proposed refinery, the JEDI Petroleum Refinery model can be used to field questions about the added value refineries may bring to the local community.

  12. Compass models: Theory and physical motivations

    NASA Astrophysics Data System (ADS)

    Nussinov, Zohar; van den Brink, Jeroen

    2015-01-01

    Compass models are theories of matter in which the couplings between the internal spin (or other relevant field) components are inherently spatially (typically, direction) dependent. A simple illustrative example is furnished by the 90° compass model on a square lattice in which only couplings of the form $\tau_i^x \tau_j^x$ (where $\{\tau_i^a\}_a$ denote Pauli operators at site $i$) are associated with nearest-neighbor sites $i$ and $j$ separated along the $x$ axis of the lattice while $\tau_i^y \tau_j^y$ couplings appear for sites separated by a lattice constant along the $y$ axis. Similar compass-type interactions can appear in diverse physical systems. For instance, compass models describe Mott insulators with orbital degrees of freedom where interactions sensitively depend on the spatial orientation of the orbitals involved as well as the low-energy effective theories of frustrated quantum magnets, and a host of other systems such as vacancy centers, and cold atomic gases. The fundamental interdependence between internal (spin, orbital, or other) and external (i.e., spatial) degrees of freedom which underlies compass models generally leads to very rich behaviors, including the frustration of (semi-)classical ordered states on nonfrustrated lattices, and to enhanced quantum effects, prompting, in certain cases, the appearance of zero-temperature quantum spin liquids. As a consequence of these frustrations, new types of symmetries and their associated degeneracies may appear. These intermediate symmetries lie midway between the extremes of global symmetries and local gauge symmetries and lead to effective dimensional reductions. In this article, compass models are reviewed in a unified manner, paying close attention to exact consequences of these symmetries and to thermal and quantum fluctuations that stabilize orders via order-out-of-disorder effects. This is complemented by a survey of numerical results. In addition to reviewing past works, a number of other models are introduced and new results

  13. User-owned utility models for rural electrification

    SciTech Connect

    Waddle, D.

    1997-12-01

    The author discusses the history of rural electric cooperatives (REC) in the United States, and the broader question of whether such organizations can serve as a model for rural electrification in other countries. The author points out the features of such cooperatives which have given them stability and strength, and emphasizes that for success of such programs, many of these same features must be present. He definitely feels the cooperative models are not outdated, but they need strong local support, and a governmental structure which is supportive, and in particular not negative.

  14. Mars Global Reference Atmospheric Model 2010 Version: Users Guide

    NASA Technical Reports Server (NTRS)

    Justh, H. L.

    2014-01-01

    This Technical Memorandum (TM) presents the Mars Global Reference Atmospheric Model 2010 (Mars-GRAM 2010) and its new features. Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Additionally, this TM includes instructions on obtaining the Mars-GRAM source code and data files as well as running Mars-GRAM. It also contains sample Mars-GRAM input and output files and an example of how to incorporate Mars-GRAM as an atmospheric subroutine in a trajectory code.

  15. Incentive theory: II. Models for choice.

    PubMed

    Killeen, P R

    1982-09-01

    Incentive theory is extended to account for concurrent chained schedules of reinforcement. The basic model consists of additive contributions from the primary and secondary effects of reinforcers, which serve to direct the behavior activated by reinforcement. The activation is proportional to the rate of reinforcement and interacts multiplicatively with the directive effects. The two free parameters are q, the slope of the delay of reinforcement gradient, whose value is constant across many experiments, and b, a bias parameter. The model is shown to provide an excellent description of all results from studies that have varied the terminal-link schedules, and of many of the results from studies that have varied initial-link schedules. The model is extended to diverse modifications of the terminal links, such as varied amount of reinforcement, varied signaling of the terminal-link schedules, and segmentation of the terminal-link schedules. It is demonstrated that incentive theory provides an accurate and integrated account of many of the phenomena of choice.

  16. User's manual for LINEAR, a FORTRAN program to derive linear aircraft models

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.; Patterson, Brian P.; Antoniewicz, Robert F.

    1987-01-01

    This report documents a FORTRAN program that provides a powerful and flexible tool for the linearization of aircraft models. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied nonlinear aerodynamic model. The system model determined by LINEAR consists of matrices for both state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.

  17. Economic contract theory tests models of mutualism

    PubMed Central

    Weyl, E. Glen; Frederickson, Megan E.; Yu, Douglas W.; Pierce, Naomi E.

    2010-01-01

    Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host–symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume–rhizobia and yucca–moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature. PMID:20733067

  18. Economic contract theory tests models of mutualism.

    PubMed

    Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E

    2010-09-01

    Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.

  19. Policy Building--An Extension to User Modeling

    ERIC Educational Resources Information Center

    Yudelson, Michael V.; Brunskill, Emma

    2012-01-01

    In this paper we combine a logistic regression student model with an exercise selection procedure. As opposed to the body of prior work on strategies for selecting practice opportunities, we are working on an assumption of a finite amount of opportunities to teach the student. Our goal is to prescribe activities that would maximize the amount…

  20. Supporting user-defined granularities in a spatiotemporal conceptual model

    USGS Publications Warehouse

    Khatri, V.; Ram, S.; Snodgrass, R.T.; O'Brien, G. M.

    2002-01-01

    Granularities are integral to spatial and temporal data. A large number of applications require storage of facts along with their temporal and spatial context, which needs to be expressed in terms of appropriate granularities. For many real-world applications, a single granularity in the database is insufficient. In order to support any type of spatial or temporal reasoning, the semantics related to granularities needs to be embedded in the database. Specifying granularities related to facts is an important part of conceptual database design because under-specifying the granularity can restrict an application, affect the relative ordering of events and impact the topological relationships. Closely related to granularities is indeterminacy, i.e., an occurrence time or location associated with a fact that is not known exactly. In this paper, we present an ontology for spatial granularities that is a natural analog of temporal granularities. We propose an upward-compatible, annotation-based spatiotemporal conceptual model that can comprehensively capture the semantics related to spatial and temporal granularities, and indeterminacy without requiring new spatiotemporal constructs. We specify the formal semantics of this spatiotemporal conceptual model via translation to a conventional conceptual model. To underscore the practical focus of our approach, we describe an on-going case study. We apply our approach to a hydrogeologic application at the United States Geologic Survey and demonstrate that our proposed granularity-based spatiotemporal conceptual model is straightforward to use and is comprehensive.

  1. Polarimetric clutter modeling: Theory and application

    NASA Technical Reports Server (NTRS)

    Kong, J. A.; Lin, F. C.; Borgeaud, M.; Yueh, H. A.; Swartz, A. A.; Lim, H. H.; Shim, R. T.; Novak, L. M.

    1988-01-01

    The two-layer anisotropic random medium model is used to investigate fully polarimetric scattering properties of earth terrain media. The polarization covariance matrices for the untilted and tilted uniaxial random medium are evaluated using the strong fluctuation theory and distorted Born approximation. In order to account for the azimuthal randomness in the growth direction of leaves in tree and grass fields, an averaging scheme over the azimuthal direction is also applied. It is found that characteristics of terrain clutter can be identified through the analysis of each element of the covariance matrix. Theoretical results are illustrated by the comparison with experimental data provided by MIT Lincoln Laboratory for tree and grass fields.

  2. A matrix model from string field theory

    NASA Astrophysics Data System (ADS)

    Zeze, Syoji

    2016-09-01

    We demonstrate that a Hermitian matrix model can be derived from level truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and U(N) vectors which are responsible for the D-brane at the tachyon vacuum. Effective potential for the scalar is evaluated both for finite and large N. Increase of potential height is observed in both cases. The large N matrix integral is identified with a system of N ZZ branes and a ghost FZZT brane.

  3. Efficient perturbation theory for quantum lattice models.

    PubMed

    Hafermann, H; Li, G; Rubtsov, A N; Katsnelson, M I; Lichtenstein, A I; Monien, H

    2009-05-22

    We present a novel approach to long-range correlations beyond dynamical mean-field theory, through a ladder approximation to dual fermions. The new technique is applied to the two-dimensional Hubbard model. We demonstrate that the transformed perturbation series for the nonlocal dual fermions has superior convergence properties over standard diagrammatic techniques. The critical Néel temperature of the mean-field solution is suppressed in the ladder approximation, in accordance with quantum Monte Carlo results. An illustration of how the approach captures and allows us to distinguish short- and long-range correlations is given.

  4. IoT-Based User-Driven Service Modeling Environment for a Smart Space Management System

    PubMed Central

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-01-01

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service. PMID:25420153

  5. IoT-based user-driven service modeling environment for a smart space management system.

    PubMed

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-11-20

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service.

  6. Can behavioral theory inform the understanding of depression and medication nonadherence among HIV-positive substance users?

    PubMed

    Magidson, Jessica F; Listhaus, Alyson; Seitz-Brown, C J; Safren, Steven A; Lejuez, C W; Daughters, Stacey B

    2015-04-01

    Medication adherence is highly predictive of health outcomes across chronic conditions, particularly HIV/AIDS. Depression is consistently associated with worse adherence, yet few studies have sought to understand how depression relates to adherence. This study tested three components of behavioral depression theory--goal-directed activation, positive reinforcement, and environmental punishment--as potential indirect effects in the relation between depressive symptoms and medication nonadherence among low-income, predominantly African American substance users (n = 83). Medication nonadherence was assessed as frequency of doses missed across common reasons for nonadherence. Non-parametric bootstrapping was used to evaluate the indirect effects. Of the three intermediary variables, there was only an indirect effect of environmental punishment; depressive symptoms were associated with greater nonadherence through greater environmental punishment. Goal-directed activation and positive reinforcement were unrelated to adherence. Findings suggest the importance of environmental punishment in the relation between depression and medication adherence and may inform future intervention efforts for this population.
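
    A hedged sketch of non-parametric bootstrapping of an indirect (a x b) effect on synthetic data; the simple two-regression mediation structure and values below are illustrative, not the study's actual analysis:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 83  # sample size reported in the abstract

# Synthetic data with a built-in indirect path: depression -> punishment -> nonadherence.
depression = rng.normal(size=n)
punishment = 0.5 * depression + rng.normal(scale=0.8, size=n)
nonadherence = 0.4 * punishment + 0.1 * depression + rng.normal(scale=0.8, size=n)

def indirect_effect(x, m, y):
    """a*b indirect effect from two OLS fits: m ~ x, then y ~ m + x."""
    a = np.polyfit(x, m, 1)[0]
    Xm = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(Xm, y, rcond=None)[0][1]
    return a * b

boot = []
for _ in range(5000):
    idx = rng.integers(0, n, n)                       # resample cases with replacement
    boot.append(indirect_effect(depression[idx], punishment[idx], nonadherence[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect: {indirect_effect(depression, punishment, nonadherence):.3f}, "
      f"95% CI [{lo:.3f}, {hi:.3f}]")                # a CI excluding 0 suggests mediation
```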

  7. Using chemical organization theory for model checking

    PubMed Central

    Kaleta, Christoph; Richter, Stephan; Dittrich, Peter

    2009-01-01

    Motivation: The increasing number and complexity of biomodels makes automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for derivation of valuable information. In formalisms like Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. Results: First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations drastically changes when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT is able to identify those species and reactions more accurately [in 26 cases (14%)] that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. Availability: All data and a JAVA applet to check SBML-models is available from http://www.minet.uni-jena.de/csb/prj/ot/tools Contact: dittrich@minet.uni-jena.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19468053

  8. Future Air Traffic Growth and Schedule Model User's Guide

    NASA Technical Reports Server (NTRS)

    Kimmel, William M. (Technical Monitor); Smith, Jeremy C.; Dollyhigh, Samuel M.

    2004-01-01

    The Future Air Traffic Growth and Schedule Model was developed as an implementation of the Fratar algorithm to project future traffic flow between airports in a system and then to schedule the additional flights to reflect current passenger time-of-travel preferences. The methodology produces an unconstrained future schedule from a current (or baseline) schedule and the airport operations growth rates. As an example of the use of the model, future schedules are projected for 2010 and 2022 for all flights arriving at, departing from, or flying between all continental United States airports that had commercial scheduled service on May 17, 2002. Intercontinental US traffic and airports are included, and that traffic is also grown with the Fratar methodology to account for its arrivals at and departures from the continental US airports. Input data sets derived from the Official Airline Guide (OAG) data and the FAA Terminal Area Forecast (TAF) are included in the examples of the computer code execution.
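
    The Fratar step amounts to iteratively rescaling an origin-destination flight table until each airport's departures and arrivals match its forecast growth. A minimal sketch of that balancing, with a hypothetical three-airport table and growth factors:

        import numpy as np

        # Hypothetical baseline flights (rows = origin airport, columns = destination airport).
        T = np.array([[  0., 100.,  50.],
                      [100.,   0.,  80.],
                      [ 50.,  80.,   0.]])
        growth = np.array([1.3, 1.1, 1.5])     # forecast operations growth per airport

        target_out = T.sum(axis=1) * growth    # desired future departures per airport
        target_in = T.sum(axis=0) * growth     # desired future arrivals per airport

        for _ in range(50):                    # Fratar-style iterative proportional fitting
            T *= (target_out / T.sum(axis=1))[:, None]   # scale rows to match departures
            T *= (target_in / T.sum(axis=0))[None, :]    # scale columns to match arrivals

        print(np.round(T, 1))
        print("row totals:", np.round(T.sum(axis=1), 1), "targets:", np.round(target_out, 1))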

  9. Application of Chaos Theory to Psychological Models

    NASA Astrophysics Data System (ADS)

    Blackerby, Rae Fortunato

    This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. In some cases, these fluctuations converge to one value; in other cases, they diverge in
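
    The nonlinear feedback described above can be illustrated with a simple iterated map in which the next value of "moral behavior" depends on its previous value plus an external influence term. The logistic form, parameter values, and interpretation below are hypothetical, chosen only to show how such feedback can either converge or fluctuate chaotically.

        import numpy as np

        def iterate(r, influence, x0=0.5, steps=200):
            """Iterate x_{t+1} = r * x_t * (1 - x_t) + influence, clipped to [0, 1].
            'influence' stands in for aggregated family/community factors."""
            x, trace = x0, []
            for _ in range(steps):
                x = float(np.clip(r * x * (1 - x) + influence, 0.0, 1.0))
                trace.append(x)
            return trace

        print(iterate(2.8, 0.0)[-3:])   # fluctuations converge to a single value
        print(iterate(3.9, 0.0)[-3:])   # fluctuations remain chaotic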

  10. Location contexts of user check-ins to model urban geo life-style patterns.

    PubMed

    Hasan, Samiul; Ukkusuri, Satish V

    2015-01-01

    Geo-location data from social media offers us information, in new ways, to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e.g., location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) to characterize the patterns of user interests to different types of places and ii) to characterize the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins of the users from New York City. The co-existence of several location contexts and the corresponding probabilities in a given pattern provide useful information about user interests and choices. It is found that geo life-style patterns have similar items: either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior. PMID:25970430
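
    A probabilistic topic model over check-in contexts can be sketched with off-the-shelf LDA by treating each user as a "document" whose "words" are the location categories (or neighborhoods) of their check-ins. The category strings below are hypothetical; this illustrates the modeling idea, not the authors' implementation.

        from sklearn.decomposition import LatentDirichletAllocation
        from sklearn.feature_extraction.text import CountVectorizer

        # Each string is one user's check-in history reduced to location categories.
        users = [
            "coffee_shop office office gym coffee_shop",
            "bar nightclub bar late_night_food nightclub",
            "office coffee_shop gym office gym",
            "nightclub bar bar coffee_shop late_night_food",
        ]

        vectorizer = CountVectorizer()
        X = vectorizer.fit_transform(users)

        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

        # Each topic is a distribution over categories: an inferred geo life-style pattern.
        vocab = vectorizer.get_feature_names_out()
        for k, topic in enumerate(lda.components_):
            top = topic.argsort()[::-1][:3]
            print(f"pattern {k}:", [vocab[i] for i in top])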

  13. User's instructions for the 41-node thermoregulatory model (steady state version)

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1974-01-01

    A user's guide for the steady-state thermoregulatory model is presented. The model was modified to provide conversational interaction on a remote terminal, greater flexibility for parameter estimation, increased efficiency of convergence, a greater choice of output variables, and more realistic equations for respiratory and skin diffusion water losses.

  14. Introducing a new open source GIS user interface for the SWAT model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Soil and Water Assessment Tool (SWAT) model is a robust watershed modelling tool. It typically uses the ArcSWAT interface to create its inputs. ArcSWAT is public domain software which works in the licensed ArcGIS environment. The aim of this paper was to develop an open source user interface ...

  15. Cross-Cultural Teamwork in End User Computing: A Theoretical Model.

    ERIC Educational Resources Information Center

    Bento, Regina F.

    1995-01-01

    Presents a theoretical model explaining how cultural influences may affect the open, dynamic system of a cross-cultural, end-user computing team. Discusses the relationship between cross-cultural factors and various parts of the model such as: input variables, the system itself, outputs, and implications for the management of such teams. (JKP)

  16. ESPVI 4.0 ELECTROSTATIC PRECIPITATOR V-1 AND PERFORMANCE MODEL: USER'S MANUAL

    EPA Science Inventory

    The manual is the companion document for the microcomputer program ESPVI 4.0, Electrostatic Precipitation VI and Performance Model. The program was developed to provide a user-friendly interface to an advanced model of electrostatic precipitation (ESP) performance. The program i...

  17. Digital Avionics Information System (DAIS): Training Requirements Analysis Model Users Guide. Final Report.

    ERIC Educational Resources Information Center

    Czuchry, Andrew J.; And Others

    This user's guide describes the functions, logical operations and subroutines, input data requirements, and available outputs of the Training Requirements Analysis Model (TRAMOD), a computerized analytical life cycle cost modeling system for use in the early stages of system design. Operable in a stand-alone mode, TRAMOD can be used for the…

  18. Disconfirming User Expectations of the Online Service Experience: Inferred versus Direct Disconfirmation Modeling.

    ERIC Educational Resources Information Center

    O'Neill, Martin; Palmer, Adrian; Wright, Christine

    2003-01-01

    Disconfirmation models of online service measurement seek to define service quality as the difference between user expectations of the service to be received and perceptions of the service actually received. Two such models, inferred and direct disconfirmation, for measuring quality of the online experience are compared (WebQUAL, SERVQUAL). Findings…

  19. PARFUME Theory and Model basis Report

    SciTech Connect

    Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  20. A power information user (PIU) model to promote information integration in Tennessee's public health community.

    PubMed

    Sathe, Nila A; Lee, Patricia; Giuse, Nunzia Bettinsoli

    2004-10-01

    Observation and immersion in the user community are critical factors in designing and implementing informatics solutions; such practices ensure relevant interventions and promote user acceptance. Libraries can adapt these strategies to developing instruction and outreach. While needs assessment is typically a core facet of library instruction, sustained, iterative assessment underlying the development of user-centered instruction is key to integrating resource use into the workflow. This paper describes the Eskind Biomedical Library's (EBL's) recent work with the Tennessee public health community to articulate a training model centered around developing power information users (PIUs). PIUs are community-based individuals with an advanced understanding of information seeking and resource use and are committed to championing information integration. As model development was informed by observation of PIU workflow and information needs, it also allowed for informal testing of the applicability of assessment via domain immersion in library outreach. Though the number of PIUs involved in the project was small, evaluation indicated that the model was useful for promoting information use in PIU workgroups and that the concept of domain immersion was relevant to library-related projects. Moreover, EBL continues to employ principles of domain understanding inherent in the PIU model to develop further interventions for the public health community and library users.

  1. UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.

    PubMed

    Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun

    2013-12-01

    Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely-used methods based on probabilistic modeling have drawbacks in terms of consistency from multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.
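
    The factorization at the core of UTOPIAN decomposes a term-document matrix into nonnegative topic and document factors; user feedback enters as semi-supervision on those factors. A plain, unsupervised NMF sketch on hypothetical documents (scikit-learn's solver, not the authors' semi-supervised formulation):

        from sklearn.decomposition import NMF
        from sklearn.feature_extraction.text import TfidfVectorizer

        docs = [
            "visual analytics of document collections",
            "interactive topic steering by the user",
            "matrix factorization for topic modeling",
            "user feedback refines topic keywords",
        ]

        tfidf = TfidfVectorizer()
        X = tfidf.fit_transform(docs)          # documents x terms matrix

        model = NMF(n_components=2, init="nndsvd", random_state=0)
        W = model.fit_transform(X)             # document-topic weights
        H = model.components_                  # topic-term weights

        terms = tfidf.get_feature_names_out()
        for k, topic in enumerate(H):
            print(f"topic {k}:", [terms[i] for i in topic.argsort()[::-1][:4]])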

  2. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. The tool is programmed in Python and provides a simple graphical user interface, which facilitates improving model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
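
    Frequency ratio, the simplest of the three BSA techniques, compares how often hazard occurrences fall in a factor class with how much of the study area that class covers. A minimal numpy sketch on hypothetical raster arrays:

        import numpy as np

        # Hypothetical rasters: slope class per cell (1-3) and hazard inventory (1 = event).
        slope_class = np.array([[1, 1, 2, 3],
                                [2, 2, 3, 3],
                                [1, 2, 2, 3]])
        hazard = np.array([[0, 0, 1, 1],
                           [0, 1, 1, 0],
                           [0, 0, 1, 1]])

        for c in np.unique(slope_class):
            in_class = slope_class == c
            # share of hazard cells in the class divided by the class's share of the area
            fr = (hazard[in_class].sum() / hazard.sum()) / (in_class.sum() / slope_class.size)
            print(f"class {c}: frequency ratio = {fr:.2f}")   # > 1 suggests higher susceptibility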

  4. Targeting Parents for Childhood Weight Management: Development of a Theory-Driven and User-Centered Healthy Eating App

    PubMed Central

    Lahiri, Sudakshina; Brown, Katherine Elizabeth

    2015-01-01

    Background The proliferation of health promotion apps along with mobile phones' array of features supporting health behavior change offers a new and innovative approach to childhood weight management. However, despite the critical role parents play in children’s weight related behaviors, few industry-led apps aimed at childhood weight management target parents. Furthermore, industry-led apps have been shown to lack a basis in behavior change theory and evidence. Equally important remains the issue of how to maximize users’ engagement with mobile health (mHealth) interventions where there is growing consensus that inputs from the commercial app industry and the target population should be an integral part of the development process. Objective The aim of this study is to systematically design and develop a theory and evidence-driven, user-centered healthy eating app targeting parents for childhood weight management, and clearly document this for the research and app development community. Methods The Behavior Change Wheel (BCW) framework, a theoretically-based approach for intervention development, along with a user-centered design (UCD) philosophy and collaboration with the commercial app industry, guided the development process. Current evidence, along with a series of 9 focus groups (total of 46 participants) comprised of family weight management case workers, parents with overweight and healthy weight children aged 5-11 years, and consultation with experts, provided data to inform the app development. Thematic analysis of focus groups helped to extract information related to relevant theoretical, user-centered, and technological components to underpin the design and development of the app. Results Inputs from parents and experts working in the area of childhood weight management helped to identify the main target behavior: to help parents provide appropriate food portion sizes for their children. To achieve this target behavior, the behavioral diagnosis

  5. Measurement of Multiple Nicotine Dependence Domains Among Cigarette, Non-cigarette and Poly-tobacco Users: Insights from Item Response Theory*

    PubMed Central

    Strong, David R; Messer, Karen; Hartman, Sheri J.; Conway, Kevin P.; Hoffman, Allison; Pharris-Ciurej, Nikolas; White, Martha; Green, Victoria R.; Compton, Wilson M.; Pierce, John

    2015-01-01

    Background Nicotine dependence (ND) is a key construct that organizes physiological and behavioral symptoms associated with persistent nicotine intake. Measurement of ND has focused primarily on cigarette smokers. Thus, validation of brief instruments that apply to a broad spectrum of tobacco product users is needed. Methods We examined multiple domains of ND in a longitudinal national study of the United States population, the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC). We used methods based in item response theory to identify and validate increasingly brief measures of ND that included symptoms to assess ND similarly among cigarette, cigar, smokeless, and poly tobacco users. Results Confirmatory factor analytic models supported a single, primary dimension underlying symptoms of ND across tobacco use groups. Differential Item Functioning (DIF) analysis generated little support for systematic differences in response to symptoms of ND across tobacco use groups. We established significant concurrent and predictive validity of brief 3- and 5-symptom indices for measuring ND. Conclusions Measuring ND across tobacco use groups with a common set of symptoms facilitates evaluation of tobacco use in an evolving marketplace of tobacco and nicotine products. PMID:26005043
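
    Item response theory models of the kind used here relate a latent dependence trait to the probability of endorsing each symptom. A two-parameter logistic (2PL) item response curve, with hypothetical discrimination and difficulty values, looks like this:

        import numpy as np

        def irt_2pl(theta, a, b):
            """P(endorse symptom | trait level theta) for discrimination a and difficulty b."""
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        theta = np.linspace(-3, 3, 7)                       # latent nicotine-dependence level
        print(np.round(irt_2pl(theta, a=1.8, b=0.5), 2))    # discriminating, harder-to-endorse symptom
        print(np.round(irt_2pl(theta, a=0.7, b=-1.0), 2))   # flatter, easier-to-endorse symptom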

  6. Toward a user's toolkit for modeling scintillator proportionality and light yield

    NASA Astrophysics Data System (ADS)

    Li, Qi

    Intrinsic nonproportionality is a material-dependent phenomenon that sets an ultimate limit on the energy resolution of radiation detectors. In general, anything that causes light yield to change along the particle track (e.g., the primary electron track in gamma-ray detectors) contributes to nonproportionality. Most of the physics of nonproportionality lies in the host-transport and transfer-to-activator term. The main physical phenomena involved are carrier diffusion, trapping, drift in internal electric fields, and nonlinear rates of radiative and nonradiative recombination. Some complexity is added by the now well-established fact that the electron temperature is changing during important parts of the physical processes listed above. It has consequences, but is tractable by application of electron-phonon interaction theory and first-principles calculation of trap structures checked by experiment. Determination of coefficients and rate "constants" as functions of electron temperature Te(t) -- for diffusion, D(Te(t)); capture on multiple (i) radiative and nonradiative centers, A1i(Te(t)); bimolecular exciton formation, B2(Te(t)); and nonlinear quenching, K2(Te(t)) and K3(Te(t)) -- in specific scintillator materials will enable computational prediction of energy-dependent response from standard rate equations solved in the electron track for initial excitation distributions calculated by standard methods such as Geant4. Te(t) is itself a function of time. Determination of these parameters, combined with models describing carrier transport in scintillators, makes it possible to build a user's toolkit for analyzing existing and potential scintillators. The dissertation describes progress in calculating the electronic structure of traps and activators, computing diffusion coefficients and rate functions, and testing the model.
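
    The rate-equation picture sketched above can be reduced, for illustration, to a single ODE for the excitation density n(t) with a linear radiative-capture channel competing against nonlinear quenching; higher track densities then yield proportionally less light. All coefficients below are hypothetical placeholders, not fitted values for any real scintillator.

        import numpy as np
        from scipy.integrate import solve_ivp

        A1 = 0.5     # linear capture on activators (1/ns), hypothetical
        K2 = 0.05    # second-order (bimolecular) quenching coefficient, hypothetical
        K3 = 0.001   # third-order quenching coefficient, hypothetical

        def rates(t, n):
            """dn/dt = -A1*n - K2*n^2 - K3*n^3; the A1*n channel feeds light emission."""
            return [-A1 * n[0] - K2 * n[0] ** 2 - K3 * n[0] ** 3]

        for n0 in (1.0, 10.0):   # low vs high excitation density along the track
            sol = solve_ivp(rates, (0.0, 40.0), [n0], dense_output=True, rtol=1e-8)
            t = np.linspace(0.0, 40.0, 4001)
            n = sol.sol(t)[0]
            light = np.sum(A1 * n) * (t[1] - t[0])   # integral of the radiative channel
            print(f"n0 = {n0:4.1f}: light yield per excitation = {light / n0:.2f}")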

  7. Surface matching for correlation of virtual models: Theory and application

    NASA Technical Reports Server (NTRS)

    Caracciolo, Roberto; Fanton, Francesco; Gasparetto, Alessandro

    1994-01-01

    Virtual reality can enable a robot user to generate and test off line, in a virtual environment, a sequence of operations to be executed by the robot in an assembly cell. Virtual models of objects must be correlated to the real entities they represent by means of a suitable transformation. A solution to the correlation problem, which is basically a problem of three-dimensional adjustment, has been found by exploiting surface matching theory. An iterative algorithm has been developed that matches the geometric surface representing the shape of the virtual model of an object with a set of points measured on the surface in the real world. A notable feature of the algorithm is that it also works when there is no one-to-one correspondence between the measured points and those representing the surface model. Furthermore, the problem of convergence to local minima is addressed by defining a starting state that ensures convergence to the global minimum. The developed algorithm has been tested by simulation. Finally, this paper proposes a specific application, i.e., correlating a robot cell equipped for biomedical use with its virtual representation.
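
    The correlation step is essentially rigid registration: find the rotation and translation that best align measured points with the virtual model. Below is a minimal sketch of one least-squares alignment step (the SVD/Kabsch solution) on hypothetical point sets; the full surface-matching algorithm would additionally re-estimate point-to-surface correspondences and iterate.

        import numpy as np

        def best_rigid_transform(P, Q):
            """Rotation R and translation t minimizing ||R @ P_i + t - Q_i|| over paired points."""
            cP, cQ = P.mean(axis=0), Q.mean(axis=0)
            H = (P - cP).T @ (Q - cQ)          # cross-covariance of centered point sets
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:           # guard against reflections
                Vt[-1] *= -1
                R = Vt.T @ U.T
            return R, cQ - R @ cP

        # Hypothetical case: measured points are a rotated and shifted copy of model points.
        model = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
        angle = np.deg2rad(30)
        Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
                       [np.sin(angle),  np.cos(angle), 0],
                       [0, 0, 1]])
        measured = model @ Rz.T + np.array([0.5, -0.2, 1.0])

        R, t = best_rigid_transform(model, measured)
        print(np.allclose(model @ R.T + t, measured))   # True: transformation recovered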

  8. PRay - A graphical user interface for interactive visualization and modification of rayinvr models

    NASA Astrophysics Data System (ADS)

    Fromm, T.

    2016-01-01

    PRay is a graphical user interface for interactive displaying and editing of velocity models for seismic refraction. It is optimized for editing rayinvr models but can also be used as a dynamic viewer for ray tracing results from other software. The main features are the graphical editing of nodes and fast adjusting of the display (stations and phases). It can be extended by user-defined shell scripts and links to phase picking software. PRay is open source software written in the scripting language Perl, runs on Unix-like operating systems including Mac OS X and provides a version controlled source code repository for community development.

  9. Emergent User Behavior on Twitter Modelled by a Stochastic Differential Equation

    PubMed Central

    Mollgaard, Anders; Mathiesen, Joachim

    2015-01-01

    Data from the social-media site, Twitter, is used to study the fluctuations in tweet rates of brand names. The tweet rates are the result of a strongly correlated user behavior, which leads to bursty collective dynamics with a characteristic 1/f noise. Here we use the aggregated "user interest" in a brand name to model collective human dynamics by a stochastic differential equation with multiplicative noise. The model is supported by a detailed analysis of the tweet rate fluctuations and it reproduces both the exact bursty dynamics found in the data and the 1/f noise. PMID:25955783
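
    A stochastic differential equation with multiplicative noise, of the generic form dx = a(x) dt + b x dW, can be simulated with the Euler-Maruyama scheme. The drift toward a baseline interest level and the parameter values below are hypothetical; the sketch only illustrates how multiplicative noise produces bursty trajectories of the kind described above.

        import numpy as np

        rng = np.random.default_rng(1)

        a, b = 0.1, 0.4           # relaxation rate and multiplicative-noise amplitude (hypothetical)
        dt, steps = 0.01, 10_000
        x = np.empty(steps)
        x[0] = 1.0                # initial "user interest" level

        for i in range(1, steps):
            dW = rng.normal(scale=np.sqrt(dt))                               # Wiener increment
            x[i] = x[i - 1] - a * (x[i - 1] - 1.0) * dt + b * x[i - 1] * dW  # Euler-Maruyama step
            x[i] = max(x[i], 1e-6)                                           # keep interest positive

        print(f"mean = {x.mean():.2f}, burstiness (std/mean) = {x.std() / x.mean():.2f}")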

  10. User-defined Material Model for Thermo-mechanical Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2008-01-01

    Previously a user-defined material model for orthotropic bimodulus materials was developed for linear and nonlinear stress analysis of composite structures using either shell or solid finite elements within a nonlinear finite element analysis tool. Extensions of this user-defined material model to thermo-mechanical progressive failure analysis are described, and the required input data are documented. The extensions include providing for temperature-dependent material properties, archival of the elastic strains, and a thermal strain calculation for materials exhibiting a stress-free temperature.
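
    The thermo-mechanical extension amounts to interpolating temperature-dependent properties and removing a thermal strain before the constitutive update. A one-dimensional sketch of that bookkeeping, with hypothetical property tables (not the actual user-defined subroutine):

        import numpy as np

        # Hypothetical temperature-dependent property tables (temperature in K).
        T_table = np.array([300.0, 600.0, 900.0])
        E_table = np.array([70e9, 60e9, 45e9])        # Young's modulus, Pa
        alpha_table = np.array([8e-6, 10e-6, 13e-6])  # thermal expansion coefficient, 1/K

        def stress_update(total_strain, T, T_stress_free=300.0):
            """Interpolate properties at T, subtract thermal strain, return 1-D stress."""
            E = np.interp(T, T_table, E_table)
            alpha = np.interp(T, T_table, alpha_table)
            mechanical_strain = total_strain - alpha * (T - T_stress_free)
            return E * mechanical_strain

        print(f"{stress_update(total_strain=0.004, T=750.0):.3e} Pa")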

  12. Reverberation noise modeling using extreme value theory

    NASA Astrophysics Data System (ADS)

    La Cour, Brian; Luter, Robert

    2002-05-01

    Normalized matched filter output forms the basis of target detection in active sonar. In a target-free environment, the central limit theorem, if valid, predicts that the statistics of the envelope follow a Rayleigh distribution, and, to first approximation, this is indeed observed. However, well-known departures from the Rayleigh model are found in the tail end of observed distributions. Traditional approaches to this problem have focused on constructing a simple, parameterized, non-Rayleigh distribution which more closely models observations. This paper suggests a novel alternative which focuses on a robust method of modeling only the tails of the distribution in favor of the less important body. Results from extreme-value theory are used to fit a generalized Pareto distribution (GPD) to the empirical cumulative distribution function, conditioned on a large threshold value. [A random variable X has a GPD if P(X<=x) = 1 - (1 + γx/σ)^(-1/γ) for x>=0, σ>0, and γ real; the limit γ=0 gives the exponential distribution.] Estimates of γ and σ are discussed for a broad range of active sonar data, and the results are compared with fits to other popular non-Rayleigh models. The origins of non-Rayleighness are also considered, including finite-size effects, spatial and temporal correlations, and nonuniformity.
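
    Fitting the generalized Pareto distribution to exceedances over a high threshold (the peaks-over-threshold approach) can be done directly with scipy. The synthetic "matched-filter envelope" data below is hypothetical and merely stands in for real sonar returns.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        # Hypothetical envelope samples: a Rayleigh body with a heavier-than-Rayleigh tail mixed in.
        envelope = np.concatenate([rng.rayleigh(1.0, 20_000), 3.0 * rng.weibull(0.7, 500)])

        u = np.quantile(envelope, 0.99)            # high threshold: model only the tail
        exceedances = envelope[envelope > u] - u

        # Fit the GPD to the exceedances with the location parameter fixed at zero.
        gamma, _, sigma = stats.genpareto.fit(exceedances, floc=0)
        print(f"threshold u = {u:.2f}, shape gamma = {gamma:.2f}, scale sigma = {sigma:.2f}")
        # gamma > 0 indicates a heavier-than-exponential (non-Rayleigh) tail.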

  13. SB3D User Manual, Santa Barbara 3D Radiative Transfer Model

    SciTech Connect

    O'Hirok, William

    1999-01-01

    SB3D is a three-dimensional atmospheric and oceanic radiative transfer model for the Solar spectrum. The microphysics employed in the model are the same as used in the model SBDART. It is assumed that the user of SB3D is familiar with SBDART and IDL. SB3D differs from SBDART in that computations are conducted on media in three dimensions rather than a single column (i.e. plane-parallel), and a stochastic method (Monte Carlo) is employed instead of a numerical approach (Discrete Ordinates) for estimating a solution to the radiative transfer equation. Because of these two differences between SB3D and SBDART, the input and running of SB3D are more unwieldy and require compromises between model performance and computational expense. Hence, there is no one correct method for running the model, and the user must develop a sense of the proper input and configuration of the model.

  14. Theory and modelling of nanocarbon phase stability.

    SciTech Connect

    Barnard, A. S.

    2006-01-01

    The transformation of nanodiamonds into carbon-onions (and vice versa) has been observed experimentally and has been modeled computationally at various levels of sophistication. Also, several analytical theories have been derived to describe the size, temperature and pressure dependence of this phase transition. However, in most cases a pure carbon-onion or nanodiamond is not the final product. More often than not an intermediary is formed, known as a bucky-diamond, with a diamond-like core encased in an onion-like shell. This has prompted a number of studies investigating the relative stability of nanodiamonds, bucky-diamonds, carbon-onions and fullerenes, in various size regimes. Presented here is a review outlining results of numerous theoretical studies examining the phase diagrams and phase stability of carbon nanoparticles, to clarify the complicated relationship between fullerenic and diamond structures at the nanoscale.

  15. The NATA code: Theory and analysis, volume 1. [user manuals (computer programming) - gas dynamics, wind tunnels

    NASA Technical Reports Server (NTRS)

    Bade, W. L.; Yos, J. M.

    1975-01-01

    A computer program for calculating quasi-one-dimensional gas flow in axisymmetric and two-dimensional nozzles and rectangular channels is presented. Flow is assumed to start from a state of thermochemical equilibrium at a high temperature in an upstream reservoir. The program provides solutions based on frozen chemistry, chemical equilibrium, and nonequilibrium flow with finite reaction rates. Electronic nonequilibrium effects can be included using a two-temperature model. An approximate laminar boundary layer calculation is given for the shear and heat flux on the nozzle wall. Boundary layer displacement effects on the inviscid flow are considered also. Chemical equilibrium and transport property calculations are provided by subroutines. The code contains precoded thermochemical, chemical kinetic, and transport cross section data for high-temperature air, CO2-N2-Ar mixtures, helium, and argon. It provides calculations of the stagnation conditions on axisymmetric or two-dimensional models, and of the conditions on the flat surface of a blunt wedge. The primary purpose of the code is to describe the flow conditions and test conditions in electric arc heated wind tunnels.
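
    In the frozen, perfect-gas limit, the quasi-one-dimensional flow that NATA computes reduces to the isentropic area-Mach relation, which can be solved numerically for the supersonic branch of a nozzle. The gamma and area ratio below are arbitrary example values, and this sketch ignores the equilibrium and finite-rate chemistry that the code itself handles.

        import numpy as np
        from scipy.optimize import brentq

        gamma = 1.4        # ratio of specific heats (example value)
        area_ratio = 4.0   # local area divided by throat area (example value)

        def area_mach(M):
            """Isentropic A/A* as a function of Mach number, minus the target area ratio."""
            term = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * M ** 2)
            return term ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))) / M - area_ratio

        M_supersonic = brentq(area_mach, 1.0001, 50.0)   # supersonic root downstream of the throat
        print(f"Mach number at A/A* = {area_ratio}: {M_supersonic:.3f}")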

  16. Gravothermal Star Clusters - Theory and Computer Modelling

    NASA Astrophysics Data System (ADS)

    Spurzem, Rainer

    2010-11-01

    In the George Darwin Lecture, delivered to the Royal Astronomical Society in 1960, Viktor A. Ambartsumian wrote that the evolution of stellar systems can be described by the "dynamic evolution of a gravitating gas" complemented by "a statistical description of the changes in the physical states of stars". This talk will show how this physical concept has inspired theoretical modeling of star clusters in the following decades up to the present day. The application of principles of thermodynamics shows, as Ambartsumian argued in his 1960 lecture, that there is no stable state of equilibrium of a gravitating star cluster. The trend to local thermodynamic equilibrium is always disturbed by escaping stars (Ambartsumian), as well as by gravothermal and gravogyro instabilities, as was discovered later. Here the state of the art of modeling the evolution of dense stellar systems based on principles of thermodynamics and statistical mechanics (Fokker-Planck approximation) will be reviewed. Recent progress including rotation and internal correlations (primordial binaries) is presented. The models have also very successfully been used to study dense star clusters around massive black holes in galactic nuclei and even (in a few cases) relativistic supermassive dense objects in centres of galaxies (here again briefly touching one of the many research fields of V.A. Ambartsumian). For the present era of high-speed supercomputing, in which direct N-body simulations of star clusters are tractable, we will show that such direct modeling supports and proves the concept of the statistical models based on Fokker-Planck theory, and that both theoretical concepts and direct computer simulations are necessary to support each other and make scientific progress in the study of star cluster evolution.

  17. Modeling the heterogeneity of human dynamics based on the measurements of influential users in Sina Microblog

    NASA Astrophysics Data System (ADS)

    Wang, Chenxu; Guan, Xiaohong; Qin, Tao; Yang, Tao

    2015-06-01

    Online social network has become an indispensable communication tool in the information age. The development of microblog also provides us a great opportunity to study human dynamics that play a crucial role in the design of efficient communication systems. In this paper we study the characteristics of the tweeting behavior based on the data collected from Sina Microblog. The user activity level is measured to characterize how often a user posts a tweet. We find that the user activity level follows a bimodal distribution. That is, the microblog users tend to be either active or inactive. The inter-tweeting time distribution is then measured at both the aggregate and individual levels. We find that the inter-tweeting time follows a piecewise power law distribution of two tails. Furthermore, the exponents of the two tails have different correlations with the user activity level. These findings demonstrate that the dynamics of the tweeting behavior are heterogeneous in different time scales. We then develop a dynamic model co-driven by the memory and the interest mechanism to characterize the heterogeneity. The numerical simulations validate the model and verify that the short time interval tweeting behavior is driven by the memory mechanism while the long time interval behavior by the interest mechanism.
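
    A power-law tail of the kind reported for the inter-tweeting times can be characterized with the standard maximum-likelihood (Hill) estimator for the tail exponent. The synthetic inter-event times below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical inter-tweeting times (seconds) with a power-law tail, density ~ t^(-2.5).
        times = (rng.pareto(1.5, 50_000) + 1.0) * 60.0

        t_min = 600.0                         # fit only the tail beyond 10 minutes
        tail = times[times >= t_min]
        alpha = 1.0 + len(tail) / np.sum(np.log(tail / t_min))   # Hill / MLE tail exponent
        print(f"estimated tail exponent: {alpha:.2f} (data generated with 2.5)")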

  18. Using the Theory of Planned Behavior to predict implementation of harm reduction strategies among MDMA/ecstasy users.

    PubMed

    Davis, Alan K; Rosenberg, Harold

    2016-06-01

    This prospective study was designed to test whether the variables proposed by the Theory of Planned Behavior (TPB) were associated with baseline intention to implement and subsequent use of 2 MDMA/ecstasy-specific harm reduction interventions: preloading/postloading and pill testing/pill checking. Using targeted Facebook advertisements, an international sample of 391 recreational ecstasy users was recruited to complete questionnaires assessing their ecstasy consumption history, and their attitudes, subjective norms, perceived behavioral control, habit strength (past strategy use), and intention to use these two strategies. Attitudes, subjective norms, and perceived behavioral control were significantly associated with baseline intention to preload/postload and pill test/pill check. Out of the 391 baseline participants, 100 completed the two-month follow-up assessment. Baseline habit strength and frequency of ecstasy consumption during the three months prior to baseline were the only significant predictors of how often participants used the preloading/postloading strategy during the follow-up. Baseline intention to pill test/pill check was the only significant predictor of how often participants used this strategy during the follow-up. These findings provide partial support for TPB variables as both correlates of baseline intention to implement and predictors of subsequent use of these two strategies. Future investigations could assess whether factors related to ecstasy consumption (e.g., subjective level of intoxication, craving, negative consequences following consumption) and environmental factors (e.g., accessibility and availability of harm reduction resources) improve the prediction of how often ecstasy users employ these and other harm reduction strategies. PMID:27322805

  19. TERSSE: Definition of the Total Earth Resources System for the Shuttle Era. Volume 7: User Models: A System Assessment

    NASA Technical Reports Server (NTRS)

    1974-01-01

    User models, defined as any explicit process or procedure used to transform information extracted from remotely sensed data into a form useful as a resource-management information input, are discussed. The role of user models as information, technological, and operations interfaces between the TERSSE and the resource managers is emphasized. It is recommended that guidelines and management strategies be developed for a systems approach to user model development.

  20. ModelMuse: A U.S. Geological Survey Open-Source, Graphical User Interface for Groundwater Models

    NASA Astrophysics Data System (ADS)

    Winston, R. B.

    2013-12-01

    ModelMuse is a free publicly-available graphical preprocessor used to generate the input and display the output for several groundwater models. It is written in Object Pascal and the source code is available on the USGS software web site. Supported models include the MODFLOW family of models, PHAST (version 1), and SUTRA version 2.2. With MODFLOW and PHAST, the user generates a grid and uses 'objects' (points, lines, and polygons) to define boundary conditions and the spatial variation in aquifer properties. Because the objects define the spatial variation, the grid can be changed without the user needing to re-enter spatial data. The same paradigm is used with SUTRA except that the user generates a quadrilateral finite-element mesh instead of a rectangular grid. The user interacts with the model in a top view and in a vertical cross section. The cross section can be at any angle or location. There is also a three-dimensional view of the model. For SUTRA, a new method of visualizing the permeability and related properties has been introduced. In three dimensional SUTRA models, the user specifies the permeability tensor by specifying permeability in three mutually orthogonal directions that can be oriented in space in any direction. Because it is important for the user to be able to check both the magnitudes and directions of the permeabilities, ModelMuse displays the permeabilities as either a two-dimensional or a three-dimensional vector plot. Color is used to differentiate the maximum, middle, and minimum permeability vectors. The magnitude of the permeability is shown by the vector length. The vector angle shows the direction of the maximum, middle, or minimum permeability. Contour and color plots can also be used to display model input and output data.
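
    Checking both the magnitudes and the directions of an anisotropic permeability field, as the vector plots described above do, amounts to an eigen-decomposition of the permeability tensor: eigenvalues give the maximum, middle, and minimum permeabilities and eigenvectors give their directions. A small numpy sketch with a hypothetical tensor:

        import numpy as np

        # Hypothetical symmetric permeability tensor (m^2) in the model's x, y, z axes.
        K = np.array([[4.0e-12, 1.0e-12, 0.0],
                      [1.0e-12, 2.0e-12, 0.0],
                      [0.0,     0.0,     5.0e-13]])

        values, vectors = np.linalg.eigh(K)    # ascending eigenvalues, orthonormal eigenvectors
        order = np.argsort(values)[::-1]       # maximum, middle, minimum
        for name, i in zip(("maximum", "middle", "minimum"), order):
            print(f"{name} permeability {values[i]:.2e} m^2 along {np.round(vectors[:, i], 3)}")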

  1. Transmission Line Jobs and Economic Development Impact (JEDI) Model User Reference Guide

    SciTech Connect

    Goldberg, M.; Keyser, D.

    2013-10-01

    The Jobs and Economic Development Impact (JEDI) models, developed through the National Renewable Energy Laboratory (NREL), are freely available, user-friendly tools that estimate the potential economic impacts of constructing and operating power generation projects for a range of conventional and renewable energy technologies. The Transmission Line JEDI model can be used to field questions about the economic impacts of transmission lines in a given state, region, or local community. This Transmission Line JEDI User Reference Guide was developed to provide basic instruction on operating the model and understanding the results. This guide also provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data contained in the model.

  2. A Monthly Water-Balance Model Driven By a Graphical User Interface

    USGS Publications Warehouse

    McCabe, Gregory J.; Markstrom, Steven L.

    2007-01-01

    This report describes a monthly water-balance model driven by a graphical user interface, referred to as the Thornthwaite monthly water-balance program. Computations of monthly water-balance components of the hydrologic cycle are made for a specified location. The program can be used as a research tool, an assessment tool, and a tool for classroom instruction.
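
    The Thornthwaite approach estimates monthly potential evapotranspiration from temperature and then runs a simple soil-moisture accounting. The sketch below shows only the core monthly bookkeeping with hypothetical numbers; it is an illustration of the method, not the USGS program itself.

        # Simple monthly water-balance step: precipitation, potential ET, and soil storage (mm).
        SOIL_CAPACITY = 150.0   # hypothetical available water capacity, mm

        def monthly_step(storage, precip, pet):
            """Return (new_storage, actual_et, surplus) for one month."""
            water = storage + precip
            actual_et = min(pet, water)              # ET limited by available water
            water -= actual_et
            surplus = max(water - SOIL_CAPACITY, 0)  # runoff generated once storage is full
            return min(water, SOIL_CAPACITY), actual_et, surplus

        storage = 75.0
        for month, (p, pet) in enumerate([(90, 20), (60, 70), (20, 110), (100, 40)], start=1):
            storage, aet, runoff = monthly_step(storage, p, pet)
            print(f"month {month}: storage = {storage:.0f} mm, AET = {aet:.0f} mm, runoff = {runoff:.0f} mm")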

  3. A Multilayer Naïve Bayes Model for Analyzing User's Retweeting Sentiment Tendency.

    PubMed

    Wang, Mengmeng; Zuo, Wanli; Wang, Ying

    2015-01-01

    Today microblogging has increasingly become a means of information diffusion via users' retweeting behavior. Since retweeting content, as context information of a microblog, reflects an understanding of that microblog, users' retweeting sentiment tendency analysis has gradually become a hot research topic. Targeting online microblogging, a dynamic social network, we investigate how to exploit dynamic retweeting sentiment features in retweeting sentiment tendency analysis. On the basis of time series of users' network structure information and published text information, we first model dynamic retweeting sentiment features. Then we build Naïve Bayes models from profile-, relationship-, and emotion-based dimensions, respectively. Finally, we build a multilayer Naïve Bayes model based on the multidimensional Naïve Bayes models to analyze a user's retweeting sentiment tendency towards a microblog. Experiments on a real-world dataset demonstrate the effectiveness of the proposed framework. Further experiments are conducted to understand the importance of dynamic retweeting sentiment features and temporal information in retweeting sentiment tendency analysis. Moreover, we provide a new perspective on retweeting sentiment tendency analysis in dynamic social networks. PMID:26417367
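
    The layered idea can be sketched by training one Naïve Bayes classifier per feature dimension (profile, relationship, emotion) and then combining their per-class probabilities in a second-stage classifier. The feature blocks and labels below are randomly generated placeholders; this outlines the architecture only, not the paper's feature engineering.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        rng = np.random.default_rng(4)
        n = 200
        y = rng.integers(0, 2, n)   # retweeting sentiment tendency label (hypothetical)

        # Hypothetical feature blocks for the three dimensions.
        profile = rng.normal(y[:, None], 1.0, (n, 3))
        relationship = rng.normal(y[:, None], 1.5, (n, 4))
        emotion = rng.normal(y[:, None], 0.8, (n, 2))

        # First layer: one Naive Bayes model per dimension.
        blocks = (profile, relationship, emotion)
        layer1 = [GaussianNB().fit(block, y) for block in blocks]

        # Second layer: Naive Bayes over the first layer's positive-class probabilities.
        meta_features = np.column_stack([m.predict_proba(b)[:, 1] for m, b in zip(layer1, blocks)])
        meta = GaussianNB().fit(meta_features, y)
        print(f"training accuracy of the layered model: {meta.score(meta_features, y):.2f}")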

  4. USER'S GUIDE TO GEOSYNTHETIC MODELING SYSTEM: GM SYSTEM VERSION 1.1

    EPA Science Inventory

    The document is a user manual for the Geosynthetic Modeling System. The menu-driven analytical system performs design calculations for 28 different landfill design applications that incorporate geosynthetic materials. The results of each set of design calculations are compared wi...

  5. Visual imagery and the user model applied to fuel handling at EBR-II

    SciTech Connect

    Brown-VanHoozer, S.A.

    1995-06-01

    The material presented in this paper is based on two studies involving visual display designs and the user's perspective model of a system. The studies involved a methodology known as Neuro-Linguistic Programming (NLP) and its use in expanding design choices, which included the "comfort parameters" and "perspective reality" of the user's model of the world. In developing visual displays for the EBR-II fuel handling system, the focus would be to first incorporate the comfort parameters that overlap across the three representation systems (visual, auditory, and kinesthetic), then incorporate the comfort parameters of the most prominent group of the population, and last, blend in the comfort parameters of the other two representational systems. The focus of this informal study was to use the techniques of meta-modeling and synesthesia to develop a virtual environment that closely resembled the operator's perspective of the fuel handling system of Argonne's Experimental Breeder Reactor-II. An informal study was conducted using NLP as the behavioral model in a virtual reality (VR) setting.

  7. Cross's Nigrescence Model: From Theory to Scale to Theory.

    ERIC Educational Resources Information Center

    Vandiver, Beverly J.; Fhagen-Smith, Peony E.; Cokley, Kevin O.; Cross, William E., Jr.; Worrell, Frank C.

    2001-01-01

    Describes the theoretical and empirical evolution of the revised Cross nigrescence identity model (W. E. Cross, 1991) in the context of developing a new multidimensional measure, the Cross Racial Identity Scale. The research resulted in an expanded nigrescence model, and preliminary factor analytic strategies support the existence of 6 subscales.…

  8. A unified timeline model and user interface for multimedia medical databases.

    PubMed

    Dionisio, J D; Cárdenas, A F; Taira, R K; Aberle, D R; Chu, W W; McNitt-Gray, M F; Goldin, J; Lufkin, R B

    1996-01-01

    A multimedia medical database model and prototype is described for supporting a timeline-based presentation of information. The database links image and text data in a way that permits users to look at medical information in a single unified view. Various visualization programs permit the user to view data in various ways, including full image views, graphs, and tables. Our technology is applied for proof-of-concept to two areas: thoracic oncology and thermal tumor ablation therapy of the brain. This effort is part of the multidisciplinary KMeD project in collaboration with medical research and clinical treatment projects at UCLA.

  9. How service users become empowered in human service organizations: the empowerment model.

    PubMed

    Holosko, M J; Leslie, D R; Cassano, D R

    2001-01-01

    This article presents an empowerment model (EM) to be used by service users in human service organizations (HSOs). The EM is a structure for service user input to be integrated within the HSO at various administrative levels through a four-step sequential process. The article fills a distinct void in the literature as there are numerous accounts about the importance of empowerment, but few on processes that need to be defined to operationalize the concept. Implications are directed toward administrators as they need to take leadership in implementing the EM in order to deliver more efficient and relevant services to their clients.

  10. Users of withdrawal method in the Islamic Republic of Iran: are they intending to use oral contraceptives? Applying the theory of planned behaviour.

    PubMed

    Rahnama, P; Hidarnia, A; Shokravi, F A; Kazemnejad, A; Montazeri, A; Najorkolaei, F R; Saburi, A

    2013-09-01

    Many couples in the Islamic Republic of Iran rely on coital withdrawal for contraception. The purpose of this cross-sectional study was to use the theory of planned behaviour to explore factors that influence withdrawal users' intent to switch to oral contraception (OC). Participants were 336 sexually active, married women who were current users of withdrawal and were recruited from 5 public family planning clinics in Tehran. A questionnaire included measures of the theory of planned behaviour: attitude (behavioural beliefs, outcome evaluations), subjective norms (normative beliefs, motivation to comply), perceived behavioural control, past behaviour and behavioural intention. Linear regression analyses showed that past behaviour, perceived behavioural control, attitude and subjective norms accounted for the highest percentage of total variance observed for intention to use OC (36%). Beliefs-based family planning education and counselling should be designed for users of the withdrawal method.

  12. Jobs and Economic Development Impact (JEDI) User Reference Guide: Fast Pyrolysis Biorefinery Model

    SciTech Connect

    Zhang, Y.; Goldberg, M.

    2015-02-01

    This guide -- the JEDI Fast Pyrolysis Biorefinery Model User Reference Guide -- was developed to assist users in operating and understanding the JEDI Fast Pyrolysis Biorefinery Model. The guide provides information on the model's underlying methodology, as well as the parameters and data sources used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features and a discussion of how the results should be interpreted. Based on project-specific inputs from the user, the JEDI Fast Pyrolysis Biorefinery Model estimates local (e.g., county- or state-level) job creation, earnings, and output from total economic activity for a given fast pyrolysis biorefinery. These estimates include the direct, indirect and induced economic impacts to the local economy associated with the construction and operation phases of biorefinery projects. Local revenue and supply chain impacts as well as induced impacts are estimated using economic multipliers derived from the IMPLAN software program. By determining the local economic impacts and job creation for a proposed biorefinery, the JEDI Fast Pyrolysis Biorefinery Model can be used to field questions about the added value biorefineries might bring to a local community.

  13. Planning a port interface for an ocean incineration system: computer-model user's manual. Final report

    SciTech Connect

    Glucksman, M.A.; Marcus, H.S.

    1986-06-01

    The User's Manual is written to accompany the computer model developed in the report, Planning a Port Interface For An Ocean Incineration System. The model is based on SYMPHONY (TM), a Lotus Development Corp. product. Apart from the requirement for the software, the model needs an IBM PC compatible personal computer with at least 576 kilobytes of RAM. The model assumes the viewpoint of a planner who has yet to choose a particular type of vessel and port technology. The model contains four types of information: physical parameters of system alternatives, government regulations, risks associated with different system alternatives, and relevant background information.

  14. Support for significant evolutions of the user data model in ROOT files

    NASA Astrophysics Data System (ADS)

    Canal, Ph; Brun, R.; Fine, V.; Janyst, L.; Lauret, J.; Russo, P.

    2010-04-01

    One of the main strengths of ROOT input and output (I/O) is its inherent support for schema evolution. Two distinct modes are supported, one manual via a hand-coded streamer function and one fully automatic via the ROOT StreamerInfo. One drawback of the streamer functions is that they are not usable by TTree objects in split mode. Until now, the user could not customize the automatic schema evolution mechanism, and the only way to go beyond the default rules was to revert to using the streamer function. In ROOT 5.22/00, we introduced a new mechanism which allows user-provided extensions of the automatic schema evolution that can be used in object-wise, member-wise and split modes. This paper will describe the many possibilities, ranging from the simple assignment of transient members to the complex reorganization of the user's object model.

  15. The European ALMA Regional Centre Network: A Geographically Distributed User Support Model

    NASA Astrophysics Data System (ADS)

    Hatziminaoglou, E.; Zwaan, M.; Andreani, P.; Barta, M.; Bertoldi, F.; Brand, J.; Gueth, F.; Hogerheijde, M.; Maercker, M.; Massardi, M.; Muehle, S.; Muxlow, Th.; Richards, A.; Schilke, P.; Tilanus, R.; Vlemmings, W.; Afonso, J.; Messias, H.

    2015-12-01

    In recent years there has been a paradigm shift from centralised to geographically distributed resources. Individual entities are no longer able to host or afford the necessary expertise in-house, and, as a consequence, society increasingly relies on widespread collaborations. Although such collaborations are now the norm for scientific projects, more technical structures providing support to a distributed scientific community without direct financial or other material benefits are scarce. The network of European ALMA Regional Centre (ARC) nodes is an example of such an internationally distributed user support network. It is an organised effort to provide the European ALMA user community with uniform expert support to enable optimal usage and scientific output of the ALMA facility. The network model for the European ARC nodes is described in terms of its organisation, communication strategies and user support.

  16. Support for significant evolutions of the user data model in ROOT files

    SciTech Connect

    Canal, P.; Brun, R.; Fine, V.; Janyst, L.; Lauret, J.; Russo, P.; /Fermilab

    2010-01-01

    One of the main strengths of ROOT input and output (I/O) is its inherent support for schema evolution. Two distinct modes are supported, one manual via a hand-coded streamer function and one fully automatic via the ROOT StreamerInfo. One drawback of the streamer functions is that they are not usable by TTree objects in split mode. Until now, the user could not customize the automatic schema evolution mechanism, and the only way to go beyond the default rules was to revert to using the streamer function. In ROOT 5.22/00, we introduced a new mechanism which allows user-provided extensions of the automatic schema evolution that can be used in object-wise, member-wise, and split modes. This paper describes the many possibilities, ranging from the simple assignment of transient members to the complex reorganization of the user's object model.

  17. Development and implementation of (Q)SAR modeling within the CHARMMing web-user interface.

    PubMed

    Weidlich, Iwona E; Pevzner, Yuri; Miller, Benjamin T; Filippov, Igor V; Woodcock, H Lee; Brooks, Bernard R

    2015-01-01

    Recent availability of large publicly accessible databases of chemical compounds and their biological activities (PubChem, ChEMBL) has inspired us to develop a web-based tool for structure activity relationship and quantitative structure activity relationship modeling to add to the services provided by CHARMMing (www.charmming.org). This new module implements some of the most recent advances in modern machine learning algorithms, such as Random Forest, Support Vector Machine, Stochastic Gradient Descent, and Gradient Tree Boosting. A user can import training data from PubChem BioAssay data collections directly from our interface or upload his or her own SD files that contain structures and activity information to create new models (either categorical or numerical). A user can then track the model generation process and run models on new data to predict activity.
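
    As a rough illustration of the kind of categorical (Q)SAR workflow described above, the sketch below trains and compares the four named classifiers with scikit-learn. The descriptor matrix X and activity labels y are randomly generated placeholders, not CHARMMing code or PubChem data.

      # Illustrative (Q)SAR classification sketch with scikit-learn.
      # X and y are placeholder descriptors and activity labels, not real data.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
      from sklearn.svm import SVC
      from sklearn.linear_model import SGDClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.random((200, 1024))          # stand-in for structural fingerprints
      y = rng.integers(0, 2, size=200)     # stand-in binary activity labels

      models = {
          "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
          "svm": SVC(kernel="rbf", C=1.0),
          "sgd": SGDClassifier(max_iter=1000, random_state=0),
          "gradient boosting": GradientBoostingClassifier(n_estimators=100, random_state=0),
      }

      for name, model in models.items():
          scores = cross_val_score(model, X, y, cv=5)
          print(f"{name}: mean cross-validated accuracy = {scores.mean():.3f}")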

  18. Development and implementation of (Q)SAR modeling within the CHARMMing web-user interface.

    PubMed

    Weidlich, Iwona E; Pevzner, Yuri; Miller, Benjamin T; Filippov, Igor V; Woodcock, H Lee; Brooks, Bernard R

    2015-01-01

    Recent availability of large publicly accessible databases of chemical compounds and their biological activities (PubChem, ChEMBL) has inspired us to develop a web-based tool for structure activity relationship and quantitative structure activity relationship modeling to add to the services provided by CHARMMing (www.charmming.org). This new module implements some of the most recent advances in modern machine learning algorithms, such as Random Forest, Support Vector Machine, Stochastic Gradient Descent, and Gradient Tree Boosting. A user can import training data from PubChem BioAssay data collections directly from our interface or upload his or her own SD files that contain structures and activity information to create new models (either categorical or numerical). A user can then track the model generation process and run models on new data to predict activity. PMID:25362883

  19. Development and implementation of (Q)SAR modeling within the CHARMMing Web-user interface

    PubMed Central

    Weidlich, Iwona E.; Pevzner, Yuri; Miller, Benjamin T.; Filippov, Igor V.; Woodcock, H. Lee; Brooks, Bernard R.

    2014-01-01

    Recent availability of large publicly accessible databases of chemical compounds and their biological activities (PubChem, ChEMBL) has inspired us to develop a Web-based tool for SAR and QSAR modeling to add to the services provided by CHARMMing (www.charmming.org). This new module implements some of the most recent advances in modern machine learning algorithms – Random Forest, Support Vector Machine (SVM), Stochastic Gradient Descent, Gradient Tree Boosting etc. A user can import training data from Pubchem Bioassay data collections directly from our interface or upload his or her own SD files which contain structures and activity information to create new models (either categorical or numerical). A user can then track the model generation process and run models on new data to predict activity. PMID:25362883

  20. Theory and Modeling in Support of Tether

    NASA Technical Reports Server (NTRS)

    Chang, C. L.; Bergeron, G.; Drobot, A. D.; Papadopoulos, K.; Riyopoulos, S.; Szuszczewicz, E.

    1999-01-01

    This final report summarizes the work performed by SAIC's Applied Physics Operation on the modeling and support of the Tethered Satellite System missions (TSS-1 and TSS-1R). The SAIC team, known as the Theory and Modeling in Support of Tether (TMST) investigation, was one of the original twelve teams selected in July 1985 for the first TSS mission. The accomplishments described in this report cover the period December 19, 1985 to September 31, 1999 and are the result of a continuous effort aimed at supporting the TSS missions in the following major areas. During the contract period, SAIC's TMST investigation acted to: Participate in the planning and the execution of both TSS missions; Provide scientific understanding of the issues involved in the electrodynamic tether system operation prior to the TSS missions; Predict ionospheric conditions encountered during the re-flight mission (TSS-1R) based on real-time global ionosonde data; Perform post-mission analyses to enhance our understanding of the TSS results. Specifically, we have 1) constructed and improved current collection models and enhanced our understanding of the current-voltage data; 2) investigated the effects of neutral gas on the current collection processes; 3) conducted laboratory experiments to study the discharge phenomena during and after tether break; and 4) performed numerical simulations to understand data collected by the plasma instrument SPES onboard the TSS satellite; Design and produce a multimedia CD that highlights TSS mission achievements and conveys knowledge of the tether technology to the general public. Along with discussions of this work, a list of publications and presentations derived from the TMST investigation spanning the reporting period is compiled.

  1. Theory and Modeling of Asymmetric Catalytic Reactions.

    PubMed

    Lam, Yu-Hong; Grayson, Matthew N; Holland, Mareike C; Simon, Adam; Houk, K N

    2016-04-19

    Modern density functional theory and powerful contemporary computers have made it possible to explore complex reactions of value in organic synthesis. We describe recent explorations of mechanisms and origins of stereoselectivities with density functional theory calculations. The specific functionals and basis sets that are routinely used in computational studies of stereoselectivities of organic and organometallic reactions in our group are described, followed by our recent studies that uncovered the origins of stereocontrol in reactions catalyzed by (1) vicinal diamines, including cinchona alkaloid-derived primary amines, (2) vicinal amidophosphines, and (3) organo-transition-metal complexes. Two common cyclic models account for the stereoselectivity of aldol reactions of metal enolates (Zimmerman-Traxler) or those catalyzed by the organocatalyst proline (Houk-List). Three other models were derived from computational studies described in this Account. Cinchona alkaloid-derived primary amines and other vicinal diamines are venerable asymmetric organocatalysts. For α-fluorinations and a variety of aldol reactions, vicinal diamines form enamines at one terminal amine and activate electrophilically with NH(+) or NF(+) at the other. We found that the stereocontrolling transition states are cyclic and that their conformational preferences are responsible for the observed stereoselectivity. In fluorinations, the chair seven-membered cyclic transition state is highly favored, just as the Zimmerman-Traxler chair six-membered aldol transition state controls stereoselectivity. In aldol reactions with vicinal diamine catalysts, the crown transition states are favored, both in the prototype and in an experimental example, shown in the graphic. We found that low-energy conformations of cyclic transition states occur and control stereoselectivities in these reactions. Another class of bifunctional organocatalysts, the vicinal amidophosphines, catalyzes the (3 + 2) annulation

  2. Jobs and Economic Development Impact (JEDI) Model: Offshore Wind User Reference Guide

    SciTech Connect

    Lantz, E.; Goldberg, M.; Keyser, D.

    2013-06-01

    The Offshore Wind Jobs and Economic Development Impact (JEDI) model, developed by NREL and MRG & Associates, is a spreadsheet-based input-output tool. JEDI is meant to be a user-friendly and transparent tool to estimate potential economic impacts supported by the development and operation of offshore wind projects. This guide describes how to use the model as well as technical information such as methodology, limitations, and data sources.

  3. Can behavioral theory inform the understanding of depression and medication nonadherence among HIV-positive substance users?

    PubMed

    Magidson, Jessica F; Listhaus, Alyson; Seitz-Brown, C J; Safren, Steven A; Lejuez, C W; Daughters, Stacey B

    2015-04-01

    Medication adherence is highly predictive of health outcomes across chronic conditions, particularly HIV/AIDS. Depression is consistently associated with worse adherence, yet few studies have sought to understand how depression relates to adherence. This study tested three components of behavioral depression theory--goal-directed activation, positive reinforcement, and environmental punishment--as potential indirect effects in the relation between depressive symptoms and medication nonadherence among low-income, predominantly African American substance users (n = 83). Medication nonadherence was assessed as frequency of doses missed across common reasons for nonadherence. Non-parametric bootstrapping was used to evaluate the indirect effects. Of the three intermediary variables, there was only an indirect effect of environmental punishment; depressive symptoms were associated with greater nonadherence through greater environmental punishment. Goal-directed activation and positive reinforcement were unrelated to adherence. Findings suggest the importance of environmental punishment in the relation between depression and medication adherence and may inform future intervention efforts for this population. PMID:25381605

  4. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly ArcMAP tool for the BSA technique, the bivariate statistical modeler (BSM), is proposed. Three popular BSA techniques, the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are implemented in the newly proposed tool. The tool is programmed in Python and provided with a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
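
    The frequency ratio technique named above reduces, for each class of a conditioning factor, to the ratio of the hazard density inside the class to the class's areal share. The sketch below is a minimal NumPy illustration of that calculation using randomly generated stand-in rasters; it is not the code of the BSM ArcMAP tool, and the class and hazard arrays are invented.

      # Frequency-ratio step of bivariate statistical analysis (illustrative only).
      # factor_classes and hazard_mask stand in for rasters exported from a GIS.
      import numpy as np

      def frequency_ratio(factor_classes, hazard_mask):
          """FR per class = (hazard pixels in class / all hazard pixels)
                            / (class pixels / all pixels)."""
          total_pixels = factor_classes.size
          total_hazard = hazard_mask.sum()
          ratios = {}
          for cls in np.unique(factor_classes):
              in_class = factor_classes == cls
              class_frac = in_class.sum() / total_pixels
              hazard_frac = hazard_mask[in_class].sum() / total_hazard
              ratios[int(cls)] = hazard_frac / class_frac
          return ratios

      factor_classes = np.random.randint(1, 5, size=(100, 100))   # e.g., slope classes
      hazard_mask = np.random.rand(100, 100) < 0.05                # e.g., event inventory
      print(frequency_ratio(factor_classes, hazard_mask))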

  5. Chaos Theory as a Model for Managing Issues and Crises.

    ERIC Educational Resources Information Center

    Murphy, Priscilla

    1996-01-01

    Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…

  6. Theory and modeling of active brazing.

    SciTech Connect

    van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.

    2013-09-01

    Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex, nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, the reactant and product diffusion rates, the nonequilibrium composition-dependent surface tension, and the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.

  7. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    ERIC Educational Resources Information Center

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  8. Object relations theory and activity theory: a proposed link by way of the procedural sequence model.

    PubMed

    Ryle, A

    1991-12-01

    An account of object relations theory (ORT), represented in terms of the procedural sequence model (PSM), is compared to the ideas of Vygotsky and activity theory (AT). The two models are seen to be compatible and complementary and their combination offers a satisfactory account of human psychology, appropriate for the understanding and integration of psychotherapy. PMID:1786224

  9. Quantitative agent based model of user behavior in an Internet discussion forum.

    PubMed

    Sobkowicz, Pawel

    2013-01-01

    The paper applies an agent-based simulation of opinion evolution, based on nonlinear emotion/information/opinion (E/I/O) individual dynamics, to an actual Internet discussion forum. The goal is to reproduce, via simulations using an agent-based model, the results of two-year-long observations and analyses of user communication behavior and of the expressed opinions and emotions. The model allowed various characteristics of the forum to be derived, including the distribution of user activity and popularity (outdegree and indegree), the distribution of the length of dialogs between participants, their political sympathies, and the emotional content and purpose of the comments. The parameters used in the model have intuitive meanings and can be translated into psychological observables.
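
    As a rough illustration of the agent-based approach, and not the E/I/O equations of the cited paper, whose update rules and parameters are its own, a toy opinion-dynamics loop might look like the following; the coupling constants and agent state variables are invented.

      # Toy agent-based opinion loop, illustrative only; not the cited E/I/O model.
      import random

      class Agent:
          def __init__(self):
              self.emotion = random.uniform(0, 1)     # arousal level
              self.opinion = random.uniform(-1, 1)    # stance on the topic

          def read(self, other):
              # Reading a comment nudges opinion toward the author and raises arousal.
              self.opinion += 0.1 * (other.opinion - self.opinion)
              self.emotion = min(1.0, self.emotion + 0.05 * abs(other.opinion))

      agents = [Agent() for _ in range(100)]
      for step in range(1000):
          reader, author = random.sample(agents, 2)
          reader.read(author)

      print("mean opinion:", sum(a.opinion for a in agents) / len(agents))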

  10. ModelMuse - A Graphical User Interface for MODFLOW-2005 and PHAST

    USGS Publications Warehouse

    Winston, Richard B.

    2009-01-01

    ModelMuse is a graphical user interface (GUI) for the U.S. Geological Survey (USGS) models MODFLOW-2005 and PHAST. This software package provides a GUI for creating the flow and transport input file for PHAST and the input files for MODFLOW-2005. In ModelMuse, the spatial data for the model is independent of the grid, and the temporal data is independent of the stress periods. Being able to input these data independently allows the user to redefine the spatial and temporal discretization at will. This report describes the basic concepts required to work with ModelMuse. These basic concepts include the model grid, data sets, formulas, objects, the method used to assign values to data sets, and model features. The ModelMuse main window has a top, front, and side view of the model that can be used for editing the model, and a 3-D view of the model that can be used to display properties of the model. ModelMuse has tools to generate and edit the model grid. It also has a variety of interpolation methods and geographic functions that can be used to help define the spatial variability of the model. ModelMuse can be used to execute both MODFLOW-2005 and PHAST and can also display the results of MODFLOW-2005 models. An example of using ModelMuse with MODFLOW-2005 is included in this report. Several additional examples are described in the help system for ModelMuse, which can be accessed from the Help menu.

  11. End users transforming experiences into formal information and process models for personalised health interventions.

    PubMed

    Lindgren, Helena; Lundin-Olsson, Lillemor; Pohl, Petra; Sandlund, Marlene

    2014-01-01

    Five physiotherapists organised a user-centric design process of a knowledge-based support system for promoting exercise and preventing falls. The process integrated focus group studies with 17 older adults and prototyping. The transformation of informal medical and rehabilitation expertise and older adults' experiences into formal information and process models during the development was studied. As a tool they used ACKTUS, a development platform for knowledge-based applications. The process became agile and incremental, partly due to the diversity of expectations and preferences among both older adults and physiotherapists, and the participatory approach to design and development. In addition, there was a need to develop the knowledge content alongside the formal models and their presentations, which allowed the participants to test hands-on and evaluate the ideas, content and design. The resulting application is modular, extendable, flexible and adaptable to the individual end user. Moreover, the physiotherapists are able to modify the information and process models, and in this way further develop the application. The main constraint was found to be the lack of support for the initial phase of concept modelling, which led to a redesigned user interface and functionality of ACKTUS.

  12. Hiding the system from the user: Moving from complex mental models to elegant metaphors

    SciTech Connect

    Curtis W. Nielsen; David J. Bruemmer

    2007-08-01

    In previous work, increased complexity of robot behaviors and the accompanying interface design often led to operator confusion and/or a fight for control between the robot and operator. We believe the reason for the conflict was that the design of the interface and interactions presented too much of the underlying robot design model to the operator. Since the design model includes the implementation of sensors, behaviors, and sophisticated algorithms, the result was that the operator’s cognitive efforts were focused on understanding the design of the robot system as opposed to focusing on the task at hand. This paper illustrates how this very problem emerged at the INL and how the implementation of new metaphors for interaction has allowed us to hide the design model from the user and allow the user to focus more on the task at hand. Supporting the user’s focus on the task rather than on the design model allows increased use of the system and significant performance improvement in a search task with novice users.

  13. Reliability and Maintainability Model (RAM): User and Maintenance Manual. Part 2; Improved Supportability Analysis

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1996-01-01

    This report documents the procedures for utilizing and maintaining the Reliability & Maintainability Model (RAM) developed by the University of Dayton for the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). The purpose of the grant is to provide support to NASA in establishing operational and support parameters and costs of proposed space systems. As part of this research objective, the model described here was developed. This Manual updates and supersedes the 1995 RAM User and Maintenance Manual. Changes and enhancements from the 1995 version of the model are primarily a result of the addition of more recent aircraft and shuttle R&M data.

  14. NETPATH-WIN: an interactive user version of the mass-balance model, NETPATH

    USGS Publications Warehouse

    El-Kadi, A. I.; Plummer, L.N.; Aggarwal, P.

    2011-01-01

    NETPATH-WIN is an interactive user version of NETPATH, an inverse geochemical modeling code used to find mass-balance reaction models that are consistent with the observed chemical and isotopic composition of waters from aquatic systems. NETPATH-WIN was constructed to migrate NETPATH applications into the Microsoft WINDOWS® environment. The new version facilitates model utilization by eliminating difficulties in data preparation and results analysis of the DOS version of NETPATH, while preserving all of the capabilities of the original version. Through example applications, the note describes some of the features of NETPATH-WIN as applied to adjustment of radiocarbon data for geochemical reactions in groundwater systems.

  15. User's Manual for Data for Validating Models for PV Module Performance

    SciTech Connect

    Marion, W.; Anderberg, A.; Deline, C.; Glick, S.; Muller, M.; Perrin, G.; Rodriguez, J.; Rummel, S.; Terwilliger, K.; Silverman, T. J.

    2014-04-01

    This user's manual describes performance data measured for flat-plate photovoltaic (PV) modules installed in Cocoa, Florida, Eugene, Oregon, and Golden, Colorado. The data include PV module current-voltage curves and associated meteorological data for approximately one-year periods. These publicly available data are intended to facilitate the validation of existing models for predicting the performance of PV modules, and for the development of new and improved models. For comparing different modeling approaches, using these public data will provide transparency and more meaningful comparisons of the relative benefits.

  16. User's manual for a parameter identification technique. [with options for model simulation for fixed input forcing functions and identification from wind tunnel and flight measurements

    NASA Technical Reports Server (NTRS)

    Kanning, G.

    1975-01-01

    A digital computer program written in FORTRAN is presented that implements the system identification theory for deterministic systems using input-output measurements. The user supplies programs simulating the mathematical model of the physical plant whose parameters are to be identified. The user may choose any one of three options. The first option allows for a complete model simulation for fixed input forcing functions. The second option identifies up to 36 parameters of the model from wind tunnel or flight measurements. The third option performs a sensitivity analysis for up to 36 parameters. The use of each option is illustrated with an example using input-output measurements for a helicopter rotor tested in a wind tunnel.
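
    Identification of model parameters from input-output measurements is commonly posed as a least-squares fit of simulated outputs to the measured ones. The sketch below illustrates that general idea for an assumed first-order plant using SciPy; it is not the FORTRAN program documented in the manual, and the plant model, noise level, and starting guess are invented for the example.

      # Illustrative least-squares parameter identification from input-output data.
      import numpy as np
      from scipy.optimize import least_squares

      def simulate(params, u, dt=0.1):
          """First-order plant x' = -a*x + b*u, returning the sampled output."""
          a, b = params
          x, out = 0.0, []
          for uk in u:
              x += dt * (-a * x + b * uk)
              out.append(x)
          return np.array(out)

      true_params = (0.8, 2.0)
      u = np.ones(100)                                   # fixed input forcing function
      y_meas = simulate(true_params, u) + 0.01 * np.random.randn(100)

      residual = lambda p: simulate(p, u) - y_meas       # misfit to measurements
      fit = least_squares(residual, x0=[0.1, 0.1])
      print("identified parameters:", fit.x)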

  17. Micromechanics of metal matrix composites using the Generalized Method of Cells model (GMC) user's guide

    NASA Technical Reports Server (NTRS)

    Aboudi, Jacob; Pindera, Marek-Jerzy

    1992-01-01

    A user's guide for the program gmc.f is presented. The program is based on the generalized method of cells model (GMC), which is capable, via a micromechanical analysis, of predicting the overall inelastic behavior of unidirectional, multi-phase composites from knowledge of the properties of the viscoplastic constituents. In particular, the program is sufficiently general to predict the response of unidirectional composites having variable fiber shapes and arrays.

  18. A Quantitative Causal Model Theory of Conditional Reasoning

    ERIC Educational Resources Information Center

    Fernbach, Philip M.; Erb, Christopher D.

    2013-01-01

    The authors propose and test a causal model theory of reasoning about conditional arguments with causal content. According to the theory, the acceptability of modus ponens (MP) and affirming the consequent (AC) reflect the conditional likelihood of causes and effects based on a probabilistic causal model of the scenario being judged. Acceptability…

  19. Streamflow forecasting using the modular modeling system and an object-user interface

    USGS Publications Warehouse

    Jeton, A.E.

    2001-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Bureau of Reclamation (BOR), developed a computer program to provide a general framework needed to couple disparate environmental resource models and to manage the necessary data. The Object-User Interface (OUI) is a map-based interface for models and modeling data. It provides a common interface to run hydrologic models and acquire, browse, organize, and select spatial and temporal data. One application is to assist river managers in utilizing streamflow forecasts generated with the Precipitation-Runoff Modeling System running in the Modular Modeling System (MMS), a distributed-parameter watershed model, and the National Weather Service Extended Streamflow Prediction (ESP) methodology.

  20. Program evaluation models and related theories: AMEE guide no. 67.

    PubMed

    Frye, Ann W; Hemmer, Paul A

    2012-01-01

    This Guide reviews theories of science that have influenced the development of common educational evaluation models. Educators can be more confident when choosing an appropriate evaluation model if they first consider the model's theoretical basis against their program's complexity and their own evaluation needs. Reductionism, system theory, and (most recently) complexity theory have inspired the development of models commonly applied in evaluation studies today. This Guide describes experimental and quasi-experimental models, Kirkpatrick's four-level model, the Logic Model, and the CIPP (Context/Input/Process/Product) model in the context of the theories that influenced their development and that limit or support their ability to do what educators need. The goal of this Guide is for educators to become more competent and confident in being able to design educational program evaluations that support intentional program improvement while adequately documenting or describing the changes and outcomes-intended and unintended-associated with their programs.

  1. Program evaluation models and related theories: AMEE guide no. 67.

    PubMed

    Frye, Ann W; Hemmer, Paul A

    2012-01-01

    This Guide reviews theories of science that have influenced the development of common educational evaluation models. Educators can be more confident when choosing an appropriate evaluation model if they first consider the model's theoretical basis against their program's complexity and their own evaluation needs. Reductionism, system theory, and (most recently) complexity theory have inspired the development of models commonly applied in evaluation studies today. This Guide describes experimental and quasi-experimental models, Kirkpatrick's four-level model, the Logic Model, and the CIPP (Context/Input/Process/Product) model in the context of the theories that influenced their development and that limit or support their ability to do what educators need. The goal of this Guide is for educators to become more competent and confident in being able to design educational program evaluations that support intentional program improvement while adequately documenting or describing the changes and outcomes-intended and unintended-associated with their programs. PMID:22515309

  2. The Woodcock-Johnson Tests of Cognitive Abilities III's Cognitive Performance Model: Empirical Support for Intermediate Factors within CHC Theory

    ERIC Educational Resources Information Center

    Taub, Gordon E.; McGrew, Kevin S.

    2014-01-01

    The Woodcock-Johnson Tests of Cognitive Ability Third Edition is developed using the Cattell-Horn-Carroll (CHC) measurement-theory test design as the instrument's theoretical blueprint. The instrument provides users with cognitive scores based on the Cognitive Performance Model (CPM); however, the CPM is not a part of CHC theory. Within the…

  3. AMME: an Automatic Mental Model Evaluation to analyse user behaviour traced in a finite, discrete state space.

    PubMed

    Rauterberg, M

    1993-11-01

    To support the human factors engineer in designing a good user interface, a method has been developed to analyse the empirical data of the interactive user behaviour traced in a finite discrete state space. The sequences of actions produced by the user contain valuable information about the mental model of this user, the individual problem-solving strategies for a given task, and the hierarchical structure of the task-subtask relationships. The presented method, AMME, can analyse the action sequences and automatically generate (1) a net description of the task-dependent model of the user, (2) a complete state transition matrix, and (3) various quantitative measures of the user's task-solving process. The behavioural complexity of task-solving processes carried out by novices has been found to be significantly larger than the complexity of task-solving processes carried out by experts.
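
    One of the outputs named above, the state transition matrix, can be derived directly from logged action sequences by counting consecutive state pairs and normalising each row. The sketch below illustrates this with invented log data; the state labels and log format are assumptions, not AMME's actual input format.

      # Build a state transition matrix from logged user action sequences.
      from collections import defaultdict

      def transition_matrix(sequences):
          counts = defaultdict(lambda: defaultdict(int))
          for seq in sequences:
              for src, dst in zip(seq, seq[1:]):
                  counts[src][dst] += 1
          # Normalise each row of counts into transition probabilities.
          matrix = {}
          for src, row in counts.items():
              total = sum(row.values())
              matrix[src] = {dst: n / total for dst, n in row.items()}
          return matrix

      logs = [
          ["menu", "edit", "save", "menu"],
          ["menu", "edit", "edit", "save"],
          ["menu", "help", "menu", "edit"],
      ]
      for src, row in transition_matrix(logs).items():
          print(src, row)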

  4. DRAC: a user-friendly computer code for modeling transient thermohydraulic phenomena in solar-receiver tubing

    SciTech Connect

    Winters, W.S.

    1983-01-01

    This document is intended to familiarize potential users with the capabilities of DRAC (Dynamic Receiver Analysis Code). DRAC is the first in a series of user-friendly driver programs for the more general code, TOPAZ (Transient One-Dimensional Pipe Flow Analyzer). DRAC is a relatively easy-to-use code which permits the user to model both transient and steady-state thermohydraulic phenomena in solar receiver tubing. Users may specify arbitrary, time-dependent, incident heat flux profiles and/or flow rate changes, and DRAC will calculate the resulting transient excursions in tube wall temperature and fluid properties. Radiative and convective losses are accounted for, and the user may model any receiver fluid (compressible or incompressible) for which thermodynamic data exist. A description of the DRAC code, a comprehensive set of steady-state validation calculations, and detailed user instructions are presented.

  5. Integrating HCV services for drug users: a model to improve engagement and outcomes.

    PubMed

    Sylvestre, Diana L; Zweben, Joan E

    2007-10-01

    Although the majority of prevalent and incident cases of hepatitis C are related to injection drug use, drug users often find it difficult to access treatment services because of concerns about adherence and treatment candidacy. In response to the growing epidemic, OASIS, a nonprofit community clinic, developed a successful peer-based HCV group that allowed us to engage, educate, test, and treat hepatitis C in large numbers of drug users, the majority of whom have multiple potential barriers to intervention. Integrating services for hepatitis C, addiction, mental health, and psychosocial problems, the model involves a collaboration of medical providers and peer educators and incorporates elements of other proven behavioural models, including self-help groups, therapeutic communities, and peer interventions. Our results indicate that this peer-based model is successful at engaging, educating, and treating a diverse spectrum of chaotic drug users. We conclude that an integrated, peer-based approach to intervention can engage even the most challenging addicted patients with hepatitis C, and can facilitate their successful screening and treatment.

  6. Large field inflation models from higher-dimensional gauge theories

    NASA Astrophysics Data System (ADS)

    Furuuchi, Kazuyuki; Koyama, Yoji

    2015-02-01

    Motivated by the recent detection of B-mode polarization of the CMB by BICEP2, which is possibly of primordial origin, we study large field inflation models which can be obtained from higher-dimensional gauge theories. The constraints from CMB observations on the gauge theory parameters are given, and their naturalness is discussed. Among the models analyzed, Dante's Inferno model turns out to be the most preferred model in this framework.

  7. Large field inflation models from higher-dimensional gauge theories

    SciTech Connect

    Furuuchi, Kazuyuki; Koyama, Yoji

    2015-02-23

    Motivated by the recent detection of B-mode polarization of the CMB by BICEP2, which is possibly of primordial origin, we study large field inflation models which can be obtained from higher-dimensional gauge theories. The constraints from CMB observations on the gauge theory parameters are given, and their naturalness is discussed. Among the models analyzed, Dante’s Inferno model turns out to be the most preferred model in this framework.

  8. Theory of stellar convection - II. First stellar models

    NASA Astrophysics Data System (ADS)

    Pasetto, S.; Chiosi, C.; Chiosi, E.; Cropper, M.; Weiss, A.

    2016-07-01

    We present here the first stellar models on the Hertzsprung-Russell diagram, in which convection is treated according to the new scale-free convection theory (SFC theory) by Pasetto et al. The aim is to compare the results of the new theory with those from the classical, calibrated mixing-length (ML) theory to examine differences and similarities. We integrate the equations describing the structure of the atmosphere from the stellar surface down to a few per cent of the stellar mass using both ML theory and SFC theory. The key temperature over pressure gradients, the energy fluxes, and the extension of the convective zones are compared in both theories. The analysis is first made for the Sun and then extended to other stars of different mass and evolutionary stage. The results are adequate: the SFC theory yields convective zones, temperature gradients ∇ and ∇e, and energy fluxes that are very similar to those derived from the 'calibrated' ML theory for main-sequence stars. We conclude that the old scale-dependent ML theory can now be replaced with a self-consistent scale-free theory able to predict correct results, as it is more physically grounded than the ML theory. Fundamentally, the SFC theory offers a deeper insight into the underlying physics than numerical simulations.

  9. Prediction of User's Web-Browsing Behavior: Application of Markov Model.

    PubMed

    Awad, M A; Khalil, I

    2012-08-01

    Web prediction is a classification problem in which we attempt to predict the next set of Web pages that a user may visit based on knowledge of the previously visited pages. Predicting users' behavior while surfing the Internet can be applied effectively in various critical applications. Such applications involve a traditional tradeoff between modeling complexity and prediction accuracy. In this paper, we analyze and study the Markov model and the all-Kth Markov model in Web prediction. We propose a new modified Markov model to alleviate the issue of scalability in the number of paths. In addition, we present a new two-tier prediction framework that creates an example classifier EC, based on the training examples and the generated classifiers. We show that such a framework can improve the prediction time without compromising prediction accuracy. We have used standard benchmark data sets to analyze, compare, and demonstrate the effectiveness of our techniques using variations of Markov models and association rule mining. Our experiments show the effectiveness of our modified Markov model in reducing the number of paths without compromising accuracy. Additionally, the results support our analysis conclusion that accuracy improves with higher orders of the all-Kth model. PMID:22394580
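
    A first-order Markov predictor of the kind analysed above can be built by counting page-to-page transitions in training sessions and returning the most frequent successor of the current page. The sketch below is an illustrative baseline with made-up sessions; the paper's modified and all-Kth models go well beyond this.

      # Minimal first-order Markov predictor for the next page in a browsing session.
      from collections import Counter, defaultdict

      def train(sessions):
          model = defaultdict(Counter)
          for session in sessions:
              for current, nxt in zip(session, session[1:]):
                  model[current][nxt] += 1
          return model

      def predict(model, current_page):
          if current_page not in model:
              return None
          return model[current_page].most_common(1)[0][0]

      sessions = [
          ["home", "products", "cart", "checkout"],
          ["home", "products", "reviews", "products", "cart"],
          ["home", "blog", "products", "cart"],
      ]
      model = train(sessions)
      print(predict(model, "products"))   # most likely next page after "products"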

  10. Labor Market Projections Model: a user's guide to the population, labor force, and unemployment projections model at Lawrence Berkeley Laboratory

    SciTech Connect

    Schroeder, E.

    1980-08-01

    In an effort to assist SESA analysts and CETA prime sponsor planners in the development of labor-market information suitable to their annual plans, the Labor Market Projections Model (LMPM) was initiated. The purpose of LMPM is to provide timely information on the demographic characteristics of local populations, labor supply, and unemployment. In particular, the model produces short-term projections of the distributions of population, labor force, and unemployment by age, sex, and race. LMPM was designed to carry out these projections at various geographic levels - counties, prime-sponsor areas, SMSAs, and states. While LMPM can project population distributions for areas without user input, the labor force and unemployment projections rely upon inputs from analysts or planners familiar with the economy of the area of interest. Thus, LMPM utilizes input from the SESA analysts. This User's Guide to LMPM was specifically written as an aid to SESA analysts and other users in improving their understanding of LMPM. The basic method of LMPM is a demographic cohort aging model that relies upon 1970 Census data. LMPM integrates data from several sources in order to produce current projections from the 1970 baseline for all the local areas of the nation. This User's Guide documents the procedures, data, and output of LMPM. 11 references.
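
    A demographic cohort-aging model of the kind LMPM implements advances each age group forward one period, applying survival rates and adding new births to the youngest group. The sketch below shows one such step; the age bands, survival rates, and birth count are invented for illustration and are not LMPM's data or full method.

      # One step of a simple cohort-aging projection (illustrative values only).
      population = [1200, 1100, 950, 700, 400]       # counts in five age bands
      survival = [0.999, 0.998, 0.995, 0.98, 0.90]   # probability of reaching next band
      births = 1150

      def age_forward(population, survival, births):
          aged = [births]                            # new youngest cohort
          for count, s in zip(population[:-1], survival[:-1]):
              aged.append(count * s)                 # survivors move up one band
          return aged

      print(age_forward(population, survival, births))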

  11. Using the Theory of Planned Behavior to investigate condom use behaviors among female injecting drug users who are also sex workers in China.

    PubMed

    Gu, Jing; Lau, Joseph T F; Chen, Xi; Liu, Chuliang; Liu, Jun; Chen, Hongyao; Wang, Renfan; Lei, Zhangquan; Li, Zhenglin

    2009-08-01

    Female injecting drug users who are sex workers (IDUFSWs) is a strategic "bridge population" for HIV transmission. Goals of the study were to investigate condom use behaviors during commercial sex among IDUFSWs using the Theory of Planned Behavior (TPB), and to investigate moderating effects that modify the strength of associations between the TPB-related variables and inconsistent condom use during commercial sex. A total of 281 non-institutionalized IDUFSWs were recruited using snowball sampling method. Anonymous face-to-face interviews were administered by trained doctors. The results showed that the prevalence of inconsistent condom use during commercial sex in the last six months was 64%. After adjusting for some significant background variables (e.g. main venue of sex work), all associations between the five TPB-related variables and the studied condom use variable were statistically significant (Odds Ratio (OR) = 0.43-0.68, p<0.001). In the hierarchical nested models, three background variables (age, venue of sex work, and ever used HIV-related interventions) entered in the first step (-2LL = 294.98, p<0.001) and the Social Norm Scale, the Perceived Behavioral Control Scale and the Behavioral Intention Scale were selected by the second step (OR = 0.67 - 0.72, p<0.01; -2LL = 160.99, p<0.001). Significant moderating effects between some TPB-related variables (Positive Condom use Attitude Scale and Behavioral Intention Scale) and duration of sex work and duration of drug use were also reported. The results highlighted the potential of using the TPB to better understand condom use behaviors in IDUFSWs in China. Theory-based research and intervention work should be developed in China in the future. PMID:20024752

  12. Mars Global Reference Atmospheric Model 2000 Version (Mars-GRAM 2000): Users Guide

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; James, B. F.

    2000-01-01

    This report presents Mars Global Reference Atmospheric Model 2000 Version (Mars-GRAM 2000) and its new features. All parameterizations for temperature, pressure, density, and winds versus height, latitude, longitude, time of day, and L(sub s) have been replaced by input data tables from NASA Ames Mars General Circulation Model (MGCM) for the surface through 80-km altitude and the University of Arizona Mars Thermospheric General Circulation Model (MTGCM) for 80 to 170 km. A modified Stewart thermospheric model is still used for higher altitudes and for dependence on solar activity. "Climate factors" to tune for agreement with GCM data are no longer needed. Adjustment of exospheric temperature is still an option. Consistent with observations from Mars Global Surveyor, a new longitude-dependent wave model is included with user input to specify waves having 1 to 3 wavelengths around the planet. A simplified perturbation model has been substituted for the earlier one. An input switch allows users to select either East or West longitude positive. This memorandum includes instructions on obtaining Mars-GRAM source code and data files and for running the program. It also provides sample input and output and an example for incorporating Mars-GRAM as an atmospheric subroutine in a trajectory code.

  13. Non-linear sigma-models and string theories

    SciTech Connect

    Sen, A.

    1986-10-01

    The connection between sigma-models and string theories is discussed, as well as how the sigma-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs. (LEW)

  14. User Guide for VISION 3.4.7 (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Wendell D. Hintze

    2011-07-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters and options; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating 'what if' scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., 'reactor types' not individual reactors and 'separation types' not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separation or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. VISION is comprised of several Microsoft Excel input files, a Powersim Studio core, and several Microsoft Excel output files. All must be co-located in the same folder on a PC to function. You must use Powersim Studio 8 or better. We have tested VISION with the Studio 8 Expert, Executive, and Education versions. The Expert and Education

  15. User-driven Cloud Implementation of environmental models and data for all

    NASA Astrophysics Data System (ADS)

    Gurney, R. J.; Percy, B. J.; Elkhatib, Y.; Blair, G. S.

    2014-12-01

    Environmental data and models come from disparate sources over a variety of geographical and temporal scales with different resolutions and data standards, often including terabytes of data and model simulations. Unfortunately, these data and models tend to remain solely within the custody of the private and public organisations which create the data, and the scientists who build models and generate results. Although many models and datasets are theoretically available to others, the lack of ease of access tends to keep them out of reach of many. We have developed an intuitive web-based tool that utilises environmental models and datasets located in a cloud to produce results that are appropriate to the user. Storyboards showing the interfaces and visualisations have been created for each of several exemplars. A library of virtual machine images has been prepared to serve these exemplars. Each virtual machine image has been tailored to run computer models appropriate to the end user. Two approaches have been used; first as RESTful web services conforming to the Open Geospatial Consortium (OGC) Web Processing Service (WPS) interface standard using the Python-based PyWPS; second, a MySQL database interrogated using PHP code. In all cases, the web client sends the server an HTTP GET request to execute the process with a number of parameter values and, once execution terminates, an XML or JSON response is sent back and parsed at the client side to extract the results. All web services are stateless, i.e. application state is not maintained by the server, reducing its operational overheads and simplifying infrastructure management tasks such as load balancing and failure recovery. A hybrid cloud solution has been used with models and data sited on both private and public clouds. The storyboards have been transformed into intuitive web interfaces at the client side using HTML, CSS and JavaScript, utilising plug-ins such as jQuery and Flot (for graphics), and Google Maps
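
    The stateless request/response pattern described above, an HTTP GET with query parameters answered by JSON or XML that is parsed at the client, can be illustrated in a few lines of Python. The endpoint URL, parameter names, and response fields below are hypothetical placeholders, not the project's actual API.

      # Minimal client-side sketch of calling a stateless model service over HTTP GET.
      import json
      import urllib.parse
      import urllib.request

      def run_model(base_url, params):
          query = urllib.parse.urlencode(params)
          with urllib.request.urlopen(f"{base_url}?{query}") as response:
              return json.loads(response.read().decode("utf-8"))

      if __name__ == "__main__":
          result = run_model(
              "https://example.org/wps/execute",     # hypothetical endpoint
              {"model": "catchment_runoff", "start": "2014-01-01", "end": "2014-12-31"},
          )
          print(result)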

  16. The integration of a novice user interface into a professional modeling tool.

    PubMed Central

    Ramakrishnan, S.; Hmelo, C. E.; Day, R. S.; Shirey, W. E.; Huang, Q.

    1998-01-01

    This paper describes a software tool, the Oncology Thinking Cap (OncoTCAP), and reports on our efforts to develop a novice user interface to simplify the task of describing biological models of cancer and its treatment. Oncology Thinking Cap includes a modeling tool for making relationships explicit and providing dynamic feedback about the interaction between cancer cell kinetics, treatments, and patient outcomes. OncoTCAP supports student learning by making normally invisible processes visible and providing a representational tool that can be used to conduct thought experiments. We also describe our novice interface and report the results of initial usability testing. PMID:9929305

  17. A users manual for the method of moments Aircraft Modeling Code (AMC), version 2

    NASA Technical Reports Server (NTRS)

    Peters, M. E.; Newman, E. H.

    1994-01-01

    This report serves as a user's manual for Version 2 of the 'Aircraft Modeling Code' or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. This report describes the input command language and also includes several examples which illustrate typical code inputs and outputs.

  18. A user's manual for the method of moments Aircraft Modeling Code (AMC)

    NASA Technical Reports Server (NTRS)

    Peters, M. E.; Newman, E. H.

    1989-01-01

    This report serves as a user's manual for the Aircraft Modeling Code or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. The input command language is described and several examples which illustrate typical code inputs and outputs are also included.

  19. Family Theory According to the Cambridge Model

    ERIC Educational Resources Information Center

    White, Stephen L.

    1978-01-01

    This paper is a summary of and an introduction to the family systems theory of David Kantor and William Lehr as expressed in their book, Inside the Family. Their concepts of family boundaries, dimensions, typal arrangements, and the four player system are presented and discussed. (Author)

  20. A Comprehensive and Systematic Model of User Evaluation of Web Search Engines: II. An Evaluation by Undergraduates.

    ERIC Educational Resources Information Center

    Su, Louise T.

    2003-01-01

    Presents an application of a model of user evaluation of four major Web search engines (Alta Vista, Excite, Infoseek, and Lycos) by undergraduates. Evaluation was based on 16 performance measures representing five evaluation criteria: relevance, efficiency, utility, user satisfaction, and connectivity. Content analysis of verbal data identified a…

  1. Modeling transonic aerodynamic response using nonlinear systems theory for use with modern control theory

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    1993-01-01

    The presentation begins with a brief description of the motivation and approach that has been taken for this research. This will be followed by a description of the Volterra Theory of Nonlinear Systems and the CAP-TSD code which is an aeroelastic, transonic CFD (Computational Fluid Dynamics) code. The application of the Volterra theory to a CFD model and, more specifically, to a CAP-TSD model of a rectangular wing with a NACA 0012 airfoil section will be presented.
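
    For reference, the Volterra theory mentioned above represents the response of a weakly nonlinear, time-invariant system as a series of convolution integrals over identified kernels; truncated at second order, the standard form is written below. The specific kernels extracted from the CAP-TSD model are not reproduced here.

      y(t) = h_0 + \int_{0}^{t} h_1(t-\tau)\, u(\tau)\, d\tau
                 + \int_{0}^{t}\!\int_{0}^{t} h_2(t-\tau_1,\, t-\tau_2)\, u(\tau_1)\, u(\tau_2)\, d\tau_1\, d\tau_2 + \cdots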

  2. Gutzwiller variational theory for the Hubbard model with attractive interaction.

    PubMed

    Bünemann, Jörg; Gebhard, Florian; Radnóczi, Katalin; Fazekas, Patrik

    2005-06-29

    We investigate the electronic and superconducting properties of a negative-U Hubbard model. For this purpose we evaluate a recently introduced variational theory based on Gutzwiller-correlated BCS wavefunctions. We find significant differences between our approach and standard BCS theory, especially for the superconducting gap. For small values of |U|, we derive analytical expressions for the order parameter and the superconducting gap which we compare to exact results from perturbation theory.
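
    For reference, the attractive (negative-U) Hubbard Hamiltonian underlying the study has the standard form below; the Gutzwiller-correlated BCS variational expressions of the paper are not reproduced here.

      H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
          - |U| \sum_{i} n_{i\uparrow} n_{i\downarrow}, \qquad U < 0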

  3. Psycholinguistic Theory of Learning to Read Compared to the Traditional Theory Model.

    ERIC Educational Resources Information Center

    Murphy, Robert F.

    A comparison of two models of the reading process--the psycholinguistic model, in which learning to read is seen as a top-down, holistic procedure, and the traditional theory model, in which learning to read is seen as a bottom-up, atomistic procedure--is provided in this paper. The first part of the paper provides brief overviews of the following…

  4. Hawaii demand-side management resource assessment. Final report, Reference Volume 4: The DBEDT DSM assessment model user's manual

    SciTech Connect

    1995-04-01

    The DBEDT DSM Assessment Model (DSAM) is a spreadsheet model developed in Quattro Pro for Windows that is based on the integration of the DBEDT energy forecasting model, ENERGY 2020, with the output from the building energy use simulation model, DOE-2. DOE-2 provides DSM impact estimates for both energy and peak demand. The "User's Guide" is designed to assist DBEDT staff in the operation of DSAM. Supporting information on model structure and data inputs is provided in Volumes 2 and 3 of the Final Report. DSAM is designed to provide DBEDT estimates of the potential DSM resource for each county in Hawaii by measure, program, sector, year, and levelized cost category. The results are provided for gas and electric and for both energy and peak demand. There are two main portions of DSAM, the residential sector and the commercial sector. The basic underlying logic for both sectors is the same. However, there are some modeling differences between the two sectors. The differences are primarily the result of (1) the more complex nature of the commercial sector, (2) memory limitations within Quattro Pro, and (3) the fact that the commercial sector portion of the model was written four months after the residential sector portion. The structure for both sectors essentially consists of a series of input spreadsheets, the portion of the model where the calculations are performed, and a series of output spreadsheets. The output spreadsheets contain both detailed and summary tables and graphs.

  5. The danger model: questioning an unconvincing theory.

    PubMed

    Józefowski, Szczepan

    2016-02-01

    Janeway's pattern recognition theory holds that the immune system detects infection through a limited number of so-called pattern recognition receptors (PRRs). These receptors bind specific chemical compounds expressed by entire groups of related pathogens, but not by host cells (pathogen-associated molecular patterns, PAMPs). In contrast, Matzinger's danger hypothesis postulates that products released from stressed or damaged cells have a more important role in the activation of the immune system than the recognition of nonself. These products, named by analogy to PAMPs as danger-associated molecular patterns (DAMPs), are proposed to act through the same receptors (PRRs) as PAMPs and, consequently, to stimulate largely similar responses. Herein, I review direct and indirect evidence that contradicts the widely accepted danger theory, and suggest that it may be false.

  6. [Models of economic theory of population growth].

    PubMed

    Von Zameck, W

    1987-01-01

    "The economic theory of population growth applies the opportunity cost approach to the fertility decision. Variations and differentials in fertility are caused by the available resources and relative prices or by the relative production costs of child services. Pure changes in real income raise the demand for children or the total amount spent on children. If relative prices or production costs and real income are affected together the effect on fertility requires separate consideration." (SUMMARY IN ENG)

  7. An information model to support user-centered design of medical devices.

    PubMed

    Hagedorn, Thomas J; Krishnamurty, Sundar; Grosse, Ian R

    2016-08-01

    The process of engineering design requires the product development team to balance the needs and limitations of many stakeholders, including those of the user, regulatory organizations, and the designing institution. This is particularly true in medical device design, where additional consideration must be given to a much more complex user-base that can only be accessed on a limited basis. Given this inherent challenge, few projects exist that simultaneously consider design domain concepts, such as aspects of a detailed design, a detailed view of various stakeholders and their capabilities, along with the user needs. In this paper, we present a novel information model approach that combines a detailed model of design elements with a model of the design itself, customer requirements, and the capabilities of the customers themselves. The information model is used to facilitate knowledge capture and automated reasoning across domains with a minimal set of rules by adopting a terminology that treats customer-specific and design-specific factors identically, thus enabling straightforward assessments. A unique aspect of this approach is that it systematically provides an integrated perspective on the key usability information that drives design decisions towards more universal or effective outcomes, together with the design information affected by that usability information. This can lead to cost-efficient optimal designs based on a direct inclusion of the needs of customers alongside business, marketing, and engineering requirements. Two case studies are presented to show the method's potential as a more effective knowledge management tool with built-in automated inferences that provide design insight, as well as its overall effectiveness as a platform to develop and execute medical device design from a holistic perspective. PMID:27401857

  9. Theory and model use in social marketing health interventions.

    PubMed

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmarks criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.

  11. Lectures on F-theory compactifications and model building

    NASA Astrophysics Data System (ADS)

    Weigand, Timo

    2010-11-01

    These lecture notes are devoted to formal and phenomenological aspects of F-theory. We begin with a pedagogical introduction to the general concepts of F-theory, covering classic topics such as the connection to type IIB orientifolds, the geometry of elliptic fibrations and the emergence of gauge groups, matter and Yukawa couplings. As a suitable framework for the construction of compact F-theory vacua we describe a special class of Weierstrass models called Tate models, whose local properties are captured by the spectral cover construction. Armed with this technology we proceed with a survey of F-theory grand unified theory (GUT) models, aiming at an overview of basic conceptual and phenomenological aspects, in particular in connection with GUT breaking via hypercharge flux.
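
    For orientation, the Weierstrass and Tate forms of an elliptically fibered model referred to above are conventionally written as (generic textbook expressions, not specific constructions from these lectures):

        y^{2} = x^{3} + f\, x\, z^{4} + g\, z^{6}, \qquad \Delta = 4 f^{3} + 27 g^{2}

        y^{2} + a_{1}\, x y z + a_{3}\, y z^{3} = x^{3} + a_{2}\, x^{2} z^{2} + a_{4}\, x z^{4} + a_{6}\, z^{6}

    The elliptic fiber degenerates over the discriminant locus Δ = 0, where 7-branes are located, and in the Tate form the vanishing orders of the coefficients a_i along a divisor encode the non-abelian gauge group via Tate's algorithm.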

  12. Optimal water allocation in small hydropower plants between traditional and non-traditional water users: merging theory and existing practices.

    NASA Astrophysics Data System (ADS)

    Gorla, Lorenzo; Crouzy, Benoît; Perona, Paolo

    2014-05-01

    Water demand for hydropower production is increasing, together with awareness of the importance of riparian ecosystems and biodiversity. Some cantons in Switzerland and other alpine regions in Austria and Südtirol (Italy) have started replacing the inadequate concept of the Minimum Flow Requirement (MFR) with a dynamic one, releasing a fixed percentage of the total inflow (e.g. 25%) to the environment. Starting from a model proposed by Perona et al. (2013) and the need to include the environment as an actual water user, we arrive at similar qualitative results and better quantitative performance. In this paper we explore the space of non-proportional water repartition rules analysed by Gorla and Perona (2013), and we propose new ecological indicators that are directly derived from current ecological evaluation practices (fish habitat modelling and hydrological alteration). We demonstrate that both the MFR redistribution policy and proportional repartition rules can be improved using nothing but available information. Furthermore, all water redistribution policies can be described by the model proposed by Perona et al. (2013) in terms of the Principle of Equal Marginal Utility (PEMU) and a suitable class of nonlinear functions. This is particularly useful for highlighting implicit assumptions and choosing best-compromise solutions, and it provides analytical reasons why efficiency cannot be attained by classic repartition rules. Each water repartition policy implies a monetization of the ecosystem, and a political choice always has to be made. We make explicit the value of ecosystem health underlying each policy by means of the PEMU under a few assumptions, and discuss how the theoretically efficient redistribution law obtained by our approach is feasible and does not imply high costs or advanced management tools. For small run-of-river power plants, this methodology answers the question "how much water should be left to the river?" and is therefore a
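
    Written generically (the specific benefit functions of the cited works are not reproduced here), the Principle of Equal Marginal Utility for splitting an instantaneous inflow Q(t) between a hydropower withdrawal q_h and an environmental release q_e reads:

        \max_{q_h,\, q_e} \; U_h(q_h) + U_e(q_e) \quad \text{subject to} \quad q_h + q_e = Q(t) \qquad \Longrightarrow \qquad \frac{\partial U_h}{\partial q_h} = \frac{\partial U_e}{\partial q_e}

    At the optimum no marginal reallocation of water between the plant and the river can raise the total benefit, which is why fixed-percentage or minimum-flow rules are generally suboptimal except at particular inflows.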

  13. The HADDOCK2.2 Web Server: User-Friendly Integrative Modeling of Biomolecular Complexes.

    PubMed

    van Zundert, G C P; Rodrigues, J P G L M; Trellet, M; Schmitz, C; Kastritis, P L; Karaca, E; Melquiond, A S J; van Dijk, M; de Vries, S J; Bonvin, A M J J

    2016-02-22

    The prediction of the quaternary structure of biomolecular macromolecules is of paramount importance for fundamental understanding of cellular processes and drug design. In the era of integrative structural biology, one way of increasing the accuracy of modeling methods used to predict the structure of biomolecular complexes is to include as much experimental or predictive information as possible in the process. This has been at the core of our information-driven docking approach HADDOCK. We present here the updated version 2.2 of the HADDOCK portal, which offers new features such as support for mixed molecule types, additional experimental restraints and improved protocols, all of this in a user-friendly interface. With well over 6000 registered users and 108,000 jobs served, an increasing fraction of which on grid resources, we hope that this timely upgrade will help the community to solve important biological questions and further advance the field. The HADDOCK2.2 Web server is freely accessible to non-profit users at http://haddock.science.uu.nl/services/HADDOCK2.2.

  15. Using the NIATx Model to Implement User-Centered Design of Technology for Older Adults

    PubMed Central

    Maus, Adam; Judkins, Julianne; Dinauer, Susan; Isham, Andrew; Johnson, Roberta; Landucci, Gina; Atwood, Amy K

    2016-01-01

    What models can effectively guide the creation of eHealth and mHealth technologies? This paper describes the use of the NIATx model as a framework for the user-centered design of a new technology for older adults. The NIATx model is a simple framework of process improvement based on the following principles derived from an analysis of decades of research from various industries about why some projects fail and others succeed: (1) Understand and involve the customer; (2) fix key problems; (3) pick an influential change leader; (4) get ideas from outside the field; (5) use rapid-cycle testing. This paper describes the use of these principles in technology development, the strengths and challenges of using this approach in this context, and lessons learned from the process. Overall, the NIATx model enabled us to produce a user-focused technology that the anecdotal evidence available so far suggests is engaging and useful to older adults. The first and fourth principles were especially important in developing the technology; the fourth proved the most challenging to use. PMID:27025985

  16. A user-oriented and computerized model for estimating vehicle ride quality

    NASA Technical Reports Server (NTRS)

    Leatherwood, J. D.; Barker, L. M.

    1984-01-01

    A simplified empirical model and computer program for estimating passenger ride comfort within air and surface transportation systems are described. The model is based on subjective ratings from more than 3000 persons who were exposed to controlled combinations of noise and vibration in the passenger ride quality apparatus. This model has the capability of transforming individual elements of a vehicle's noise and vibration environment into subjective discomfort units and then combining the subjective units to produce a single discomfort index typifying passenger acceptance of the environment. The computational procedures required to obtain discomfort estimates are discussed, and a user oriented ride comfort computer program is described. Examples illustrating application of the simplified model to helicopter and automobile ride environments are presented.

  17. An evolving user-oriented model of Internet health information seeking.

    PubMed

    Gaie, Martha J

    2006-01-01

    This paper presents an evolving user-oriented model of Internet health information seeking (IS) based on qualitative data collected from 22 lung cancer (LC) patients and caregivers. This evolving model represents information search behavior as more highly individualized, complex, and dynamic than previous models, including pre-search psychological activity, use of multiple heuristics throughout the process, and cost-benefit evaluation of search results. This study's findings suggest that IS occurs in four distinct phases: search initiation/continuation, selective exposure, message processing, and message evaluation. The identification of these phases and the heuristics used within them suggests a higher order of complexity in the decision-making processes that underlie IS, which could lead to the development of a conceptual framework that more closely reflects the complex nature of contextualized IS. It also illustrates the advantages of using qualitative methods to extract more subtle details of the IS process and fill in the gaps in existing models.

  18. Four Philosophical Models of the Relation Between Theory and Practice

    ERIC Educational Resources Information Center

    Jorgensen, Estelle R.

    2005-01-01

    Since music education straddles theory and practice, the author's purpose is to sketch the strengths and weaknesses of four philosophical models of the relationship between theory and practice. She demonstrates that none of them suffices when taken alone; each has something to offer and its own detractions. She then concludes with four suggested…

  19. Scaling theory of depinning in the Sneppen model

    SciTech Connect

    Maslov, S.; Paczuski, M. (Department of Physics, State University of New York at Stony Brook, Stony Brook, New York 11790; The Isaac Newton Institute for Mathematical Sciences, 20 Clarkson Road, Cambridge CB4 0EH)

    1994-08-01

    We develop a scaling theory for the critical depinning behavior of the Sneppen interface model [Phys. Rev. Lett. 69, 3539 (1992)]. This theory is based on a "gap" equation that describes the self-organization process to a critical state of the depinning transition. All of the critical exponents can be expressed in terms of two independent exponents, ν_∥
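
    As a concrete illustration of the extremal dynamics behind such depinning models, the sketch below implements a minimal slope-constrained interface in the spirit of the Sneppen model; the update rule (advance the site with the weakest random pinning force, then restore the slope constraint) follows the commonly cited formulation and is an assumption here, not code from the paper.

        import numpy as np

        def sneppen_step(h, f, rng):
            """One extremal update: advance the weakest site, then enforce |h[j] - h[j+1]| <= 1."""
            i = int(np.argmin(f))      # site with the smallest random pinning force
            h[i] += 1
            f[i] = rng.random()        # new pinning force at the new height
            L = len(h)
            moved = True
            while moved:               # "avalanche": lift lagging neighbours until the slope constraint holds
                moved = False
                for j in range(L):
                    if h[(j - 1) % L] - h[j] > 1 or h[(j + 1) % L] - h[j] > 1:
                        h[j] += 1
                        f[j] = rng.random()
                        moved = True

        rng = np.random.default_rng(0)
        L = 64
        h = np.zeros(L, dtype=int)     # interface heights
        f = rng.random(L)              # random pinning forces in [0, 1)
        for _ in range(2000):
            sneppen_step(h, f, rng)
        print("mean interface height after 2000 updates:", h.mean())

    The distribution of the smallest selected force converges towards the critical "gap" as the interface self-organizes, which is the quantity the scaling theory above is built around.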

  20. User-Friendly Predictive Modeling of Greenhouse Gas (GHG) Fluxes and Carbon Storage in Tidal Wetlands

    NASA Astrophysics Data System (ADS)

    Ishtiaq, K. S.; Abdul-Aziz, O. I.

    2015-12-01

    We developed user-friendly empirical models to predict instantaneous fluxes of CO2 and CH4 from coastal wetlands based on a small set of dominant hydro-climatic and environmental drivers (e.g., photosynthetically active radiation, soil temperature, water depth, and soil salinity). The dominant predictor variables were systematically identified by applying a robust data-analytics framework to a wide range of possible environmental variables driving wetland greenhouse gas (GHG) fluxes. The method comprised a multi-layered data-analytics framework, including Pearson correlation analysis, explanatory principal component and factor analyses, and partial least squares regression modeling. The identified dominant predictors were finally utilized to develop power-law based non-linear regression models to predict CO2 and CH4 fluxes under different climatic, land use (nitrogen gradient), tidal hydrology and salinity conditions. Four different tidal wetlands of Waquoit Bay, MA were considered as the case study sites to identify the dominant drivers and evaluate model performance. The study sites were dominated by native Spartina alterniflora and characterized by frequent flooding and highly saline conditions. The model estimated the potential net ecosystem carbon balance (NECB) both in g C/m2 and metric ton C/hectare by up-scaling the instantaneous predicted fluxes to the growing season and accounting for the lateral C flux exchanges between the wetlands and the estuary. The entire model was presented in a single Excel spreadsheet as a user-friendly ecological engineering tool. The model can aid the development of appropriate GHG offset protocols for setting monitoring plans for tidal wetland restoration and maintenance projects. The model can also be used to estimate wetland GHG fluxes and potential carbon storage under various IPCC climate change and sea level rise scenarios, facilitating an appropriate management of carbon stocks in tidal wetlands and their incorporation into a
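
    The power-law regression structure described above can be illustrated with a short fitting sketch; the functional form, the predictor names, and the synthetic data are assumptions for illustration, and scipy's curve_fit stands in for whatever fitting routine the study actually used.

        import numpy as np
        from scipy.optimize import curve_fit

        def ghg_flux(X, a, b, c):
            """Illustrative power-law model: flux = a * PAR^b * Tsoil^c."""
            par, tsoil = X
            return a * np.power(par, b) * np.power(tsoil, c)

        # hypothetical observations (PAR in umol m-2 s-1, soil temperature in deg C, CO2 flux in umol m-2 s-1)
        par = np.array([200.0, 400.0, 800.0, 1200.0, 1600.0])
        tsoil = np.array([12.0, 15.0, 18.0, 21.0, 24.0])
        flux = np.array([1.1, 2.0, 3.6, 5.1, 6.8])

        popt, pcov = curve_fit(ghg_flux, (par, tsoil), flux, p0=[0.05, 0.5, 0.5])
        a, b, c = popt
        print(f"fitted coefficients: a={a:.3g}, b={b:.3g}, c={c:.3g}")

    Fitted coefficients of this kind can then be embedded in a spreadsheet, as the abstract describes, so that end users only supply the driver values.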

  1. SALMOD: a population model for salmonids: user's manual. Version W3

    USGS Publications Warehouse

    Bartholow, John; Heasley, John; Laake, Jeff; Sandelin, Jeff; Coughlan, Beth A.K.; Moos, Alan

    2002-01-01

    SALMOD is a computer model that simulates the dynamics of freshwater salmonid populations, both anadromous and resident. The conceptual model was developed in a workshop setting (Williamson et al. 1993) with fish experts concerned with Trinity River chinook restoration. The model builds on the foundation laid by similar models (see Cheslak and Jacobson 1990). The model's premise is that egg and fish mortality are directly related to spatially and temporally variable micro- and macrohabitat limitations, which themselves are related to the timing and amount of streamflow and other meteorological variables. Habitat quality and capacity are characterized by the hydraulic and thermal properties of individual mesohabitats, which we use as spatial "computation units" in the model. The model tracks a population of spatially distinct cohorts that originate as eggs and grow from one life stage to another as a function of local water temperature. Individual cohorts either remain in the computational unit in which they emerged or move, in whole or in part, to nearby units (see McCormick et al. 1998). Model processes include spawning (with redd superimposition and incubation losses), growth (including egg maturation), mortality, and movement (freshet-induced, habitat-induced, and seasonal). Model processes are implemented such that the user (modeler) has the ability to more or less program the model on the fly to create the dynamics thought to animate the population. SALMOD then tabulates the various causes of mortality and the whereabouts of fish.

  2. Documentation, User Support, and Verification of Wind Turbine and Plant Models

    SciTech Connect

    Robert Zavadil; Vadim Zheglov; Yuriy Kazachkov; Bo Gong; Juan Sanchez; Jun Li

    2012-09-18

    As part of the Utility Wind Energy Integration Group (UWIG) and EnerNex's Wind Turbine Modeling Project, EnerNex has received ARRA (federal stimulus) funding through the Department of Energy (DOE) to further the progress of wind turbine and wind plant models. Despite the large existing and planned wind generation deployment, industry-standard models for wind generation have not been formally adopted. Models commonly provided for interconnection studies are not adequate for use in general transmission planning studies, where public, non-proprietary, documented and validated models are needed. NERC MOD (North American Electric Reliability Corporation) reliability standards require that power flow and dynamics models be provided, in accordance with regional requirements and procedures. The goal of this project is to accelerate the appropriate use of generic wind turbine models for transmission network analysis by: (1) Defining proposed enhancements to the generic wind turbine model structures that would allow representation of more advanced; (2) Comparative testing of the generic models against more detailed (and sometimes proprietary) versions developed by turbine vendors; (3) Developing recommended parameters for the generic models to best mimic the performance of specific commercial wind turbines; (4) Documenting results of the comparative simulations in an application guide for users; (5) Conducting technology transfer activities in regional workshops for dissemination of knowledge and information gained, and to engage electric power and wind industry personnel in the project while underway; (6) Designing of a "living" homepage to establish an online resource for transmission planners.

  3. Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1994-01-01

    An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.

  4. AIRCL: A programmed system for generating NC tapes for airplane models. User's manual

    NASA Technical Reports Server (NTRS)

    Akgerman, N.; Billhardt, C. F.

    1981-01-01

    A computer program is presented which calculates the cutter location file needed to machine models of airplane wings or wing-fuselage combinations on numerically controlled machine tools. Input to the program is a data file consisting of coordinates on the fuselage and wing. From this data file, the program calculates tool offsets, determines the intersection between wing and fuselage tool paths, and generates additional information needed to machine the fuselage and/or wing. Output from the program can be post processed for use on a variety of milling machines. Information on program structure and methodology is given as well as the user's manual for implementation of the program.

  5. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) coding has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (JAVA, C++, Python, etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools, etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps that are carried out repetitively for the same study region. As such, the developed tool is user friendly and can be used efficiently for these repetitive processes, reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
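
    A minimal sketch of the kind of geospatial pre-processing such a QGIS/Python plugin automates is shown below; the file name, units, and the grid-wide mean statistic are hypothetical, while the GDAL and NumPy calls themselves are standard library usage.

        import numpy as np
        from osgeo import gdal

        # "daily_rainfall.tif" is a placeholder path for a national gridded rainfall raster
        ds = gdal.Open("daily_rainfall.tif")
        band = ds.GetRasterBand(1)
        rain = band.ReadAsArray().astype(float)

        # mask the band's no-data value before computing simple grid statistics
        nodata = band.GetNoDataValue()
        if nodata is not None:
            rain[rain == nodata] = np.nan

        print("grid mean rainfall:", np.nanmean(rain))
        print("grid cells with rain > 0:", int(np.sum(rain > 0)))

    In a plugin, steps like this would be wrapped behind the QGIS interface so that the repetitive calibration and validation runs mentioned above do not require manual data handling.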

  6. A User-Friendly Model for Spray Drying to Aid Pharmaceutical Product Development

    PubMed Central

    Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L. J.; Frijlink, Henderik W.

    2013-01-01

    The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach. PMID:24040240
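
    Although the cited spreadsheet model is not reproduced here, a simplified adiabatic energy balance of the kind often used as a first estimate of the spray-dryer outlet temperature is:

        \dot m_{gas}\, c_{p,gas}\, (T_{in} - T_{out}) \;\approx\; \dot m_{feed}\, x_{w}\, \Delta H_{vap}

    Here the sensible heat given up by the drying gas evaporates the water fraction x_w of the feed; real dryers deviate from this ideal balance, which is presumably why the model's output was first fitted to experimental outlet temperatures before being used predictively.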

  7. Functional volumes modeling: theory and preliminary assessment.

    PubMed

    Fox, P T; Lancaster, J L; Parsons, L M; Xiong, J H; Zamarripa, F

    1997-01-01

    A construct for metanalytic modeling of the functional organization of the human brain, termed functional volumes modeling (FVM), is presented and preliminarily tested. FVM uses the published literature to model brain functional areas as spatial probability distributions. The FVM statistical model estimates population variance (i.e., among individuals) from the variance observed among group-mean studies, these being the most prevalent type of study in the functional imaging literature. The FVM modeling strategy is tested by: (1) constructing an FVM of the mouth region of primary motor cortex using published, group-mean, functional imaging reports as input, and (2) comparing the confidence bounds predicted by that FVM with those observed in 10 normal subjects performing overt-speech tasks. The FVM model correctly predicted the mean location and spatial distribution of per-subject functional responses. FVM has a wide range of applications, including hypothesis testing for statistical parametric images.

  8. Convergent perturbation theory for lattice models with fermions

    NASA Astrophysics Data System (ADS)

    Sazonov, V. K.

    2016-05-01

    The standard perturbation theory in QFT and lattice models leads to the asymptotic expansions. However, an appropriate regularization of the path or lattice integrals allows one to construct convergent series with an infinite radius of the convergence. In the earlier studies, this approach was applied to the purely bosonic systems. Here, using bosonization, we develop the convergent perturbation theory for a toy lattice model with interacting fermionic and bosonic fields.

  9. Consumer preference models: fuzzy theory approach

    NASA Astrophysics Data System (ADS)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).

  10. User's guide for a personal computer model of turbulence at a wind turbine rotor

    NASA Astrophysics Data System (ADS)

    Connell, J. R.; Powell, D. C.; Gower, G. L.

    1989-08-01

    This document is primarily: (1) a user's guide for the personal computer (PC) version of the code for the PNL computational model of the rotationally sampled wind speed (RODASIM11), and (2) a brief guide to the growing literature on the subject of rotationally sampled turbulence, from which the model is derived. The model generates values of turbulence experienced by single points fixed in the rotating frame of reference of an arbitrary wind turbine blade. The character of the turbulence depends on the specification of mean wind speed, the variance of turbulence, the crosswind and along-wind integral scales of turbulence, mean wind shear, and the hub height, radius, and angular speed of rotation of any point at which wind fluctuation is to be calculated.

  11. Bianchi class A models in Sàez-Ballester's theory

    NASA Astrophysics Data System (ADS)

    Socorro, J.; Espinoza-García, Abraham

    2012-08-01

    We apply the Sàez-Ballester (SB) theory to Bianchi class A models, with a barotropic perfect fluid in a stiff matter epoch. We obtain exact classical solutions à la Hamilton for the Bianchi type I, II and VI_{h=-1} models. We also find exact quantum solutions to all Bianchi class A models employing a particular ansatz for the wave function of the universe.

  12. A Dynamic Systems Theory Model of Visual Perception Development

    ERIC Educational Resources Information Center

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  13. A Sharing Item Response Theory Model for Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Segall, Daniel O.

    2004-01-01

    A new sharing item response theory (SIRT) model is presented that explicitly models the effects of sharing item content between informants and test takers. This model is used to construct adaptive item selection and scoring rules that provide increased precision and reduced score gains in instances where sharing occurs. The adaptive item selection…
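
    For context, the conventional three-parameter logistic item response function that models of this kind typically build on is (the sharing-specific terms of the SIRT model are not reproduced here):

        P_i(\theta) = c_i + (1 - c_i)\, \frac{1}{1 + e^{-a_i (\theta - b_i)}}

    Here θ is the test taker's ability and a_i, b_i, c_i are the discrimination, difficulty, and pseudo-guessing parameters of item i; the SIRT model described above extends such a framework so that prior exposure to shared item content is modeled explicitly in item selection and scoring.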

  14. The monster sporadic group and a theory underlying superstring models

    SciTech Connect

    Chapline, G.

    1996-09-01

    The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13 dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs.

  15. Baldrige Theory into Practice: A Generic Model

    ERIC Educational Resources Information Center

    Arif, Mohammed

    2007-01-01

    Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer centric system. Malcolm Baldrige Quality Award (MBQA) model happens to be one of the latest additions to the pull based models. The purpose of this paper is to develop a generic framework for MBQA that can be used by…

  16. The theory research of multi-user quantum access network with Measurement Device Independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Ji, Yi-Ming; Li, Yun-Xia; Shi, Lei; Meng, Wen; Cui, Shu-Min; Xu, Zhen-Yu

    2015-10-01

    A quantum access network cannot guarantee the absolute security of the multi-user detector, and an eavesdropper can gain access to key information through time-shift attacks and other means. Measurement-device-independent quantum key distribution is immune to all detection attacks and accomplishes secure sharing of the quantum key. In this paper, we study the application of measurement-device-independent quantum key distribution to a multi-user quantum access network. By adopting time-division multiplexing to share the detector among multiple users, the system structure is simplified and the security of quantum key sharing is ensured.

  17. Measurement Models for Reasoned Action Theory.

    PubMed

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  18. Technical documentation and user's guide for City-County Allocation Model (CCAM). Version 1. 0

    SciTech Connect

    Clark, L.T. Jr.; Scott, M.J.; Hammer, P.

    1986-05-01

    The City-County Allocation Model (CCAM) was developed as part of the Monitored Retrievable Storage (MRS) Program. The CCAM model was designed to allocate population changes forecasted by the MASTER model to specific local communities within commuting distance of the MRS facility. The CCAM model was designed to then forecast the potential changes in demand for key community services such as housing, police protection, and utilities for these communities. The CCAM model uses a flexible on-line database on demand for community services that is based on a combination of local service levels and state and national service standards. The CCAM model can be used to quickly forecast the potential community service consequences of economic development for local communities anywhere in the country. The remainder of this document is organized as follows. The purpose of this manual is to assist the user in understanding and operating the City-County Allocation Model (CCAM). The manual explains the data sources for the model and code modifications as well as the operational procedures.

  19. The Family FIRO Model: The Integration of Group Theory and Family Theory.

    ERIC Educational Resources Information Center

    Colangelo, Nicholas; Doherty, William J.

    1988-01-01

    Presents the Family Fundamental Interpersonal Relations Orientation (Family FIRO) Model, an integration of small-group theory and family therapy. The model is offered as a framework for organizing family issues. Discusses three fundamental issues of human relatedness and their applicability to group dynamics. (Author/NB)

  20. Incentivizing biodiversity conservation in artisanal fishing communities through territorial user rights and business model innovation.

    PubMed

    Gelcich, Stefan; Donlan, C Josh

    2015-08-01

    Territorial user rights for fisheries are being promoted to enhance the sustainability of small-scale fisheries. Using Chile as a case study, we designed a market-based program aimed at improving fishers' livelihoods while incentivizing the establishment and enforcement of no-take areas within areas managed with territorial user right regimes. Building on explicit enabling conditions (i.e., high levels of governance, participation, and empowerment), we used a place-based, human-centered approach to design a program that will have the necessary support and buy-in from local fishers to result in landscape-scale biodiversity benefits. Transactional infrastructure must be complex enough to capture the biodiversity benefits being created, but simple enough so that the program can be scaled up and is attractive to potential financiers. Biodiversity benefits created must be commoditized, and desired behavioral changes must be verified within a transactional context. Demand must be generated for fisher-created biodiversity benefits in order to attract financing and to scale the market model. Important design decisions around these 3 components (supply, transactional infrastructure, and demand) must be made based on local social-ecological conditions. Our market model, which is being piloted in Chile, is a flexible foundation on which to base scalable opportunities to operationalize a scheme that incentivizes local, verifiable biodiversity benefits via conservation behaviors by fishers that could likely result in significant marine conservation gains and novel cross-sector alliances.

  2. A model for developing outcome measures from the perspectives of mental health service users.

    PubMed

    Rose, Diana; Evans, Jo; Sweeney, Angela; Wykes, Til

    2011-01-01

    It is becoming increasingly recognized that conventionally derived outcome measures in mental health research are problematic. This is both because of the methodology used and because a 'good' outcome is framed from the perspective of clinicians and researchers. This paper describes a methodology for developing outcome measures for use in large studies entirely from the perspective of mental health service users. It is a mixed methods model starting with a participatory and qualitative methodology and proceeding to psychometric testing. At all stages, the researchers are themselves mental health service users. In the first phase of the model, focus groups are convened comprising people who have received the treatment or service being measured. The focus groups meet twice resulting in a draft mixed-methods questionnaire devised from thematic analysis of the focus group data. This is then taken to expert panels, again comprising individuals who have received the treatment or service being evaluated for refinement. Following this, a feasibility study is conducted with N ∼ 50 participants and changes made iteratively to the questionnaire in light of feedback. The final measure is subject to psychometric testing both to ensure it is robust and to explore similarities and differences with conventionally derived measures. PMID:21338297

  3. Homogeneous cosmological models in Yang's gravitation theory

    NASA Technical Reports Server (NTRS)

    Fennelly, A. J.; Pavelle, R.

    1979-01-01

    We present a dynamic, spatially homogeneous solution of Yang's pure space gravitational field equations which is non-Einsteinian. The predictions of this cosmological model seem to be at variance with observations.

  4. Coordinating the Complexity of Tools, Tasks, and Users: On Theory-Based Approaches to Authoring Tool Usability

    ERIC Educational Resources Information Center

    Murray, Tom

    2016-01-01

    Intelligent Tutoring Systems authoring tools are highly complex educational software applications used to produce highly complex software applications (i.e. ITSs). How should our assumptions about the target users (authors) impact the design of authoring tools? In this article I first reflect on the factors leading to my original 1999 article on…

  5. User's manual for heat-pump seasonal-performance model (SPM) with selected parametric examples

    SciTech Connect

    Not Available

    1982-06-30

    The Seasonal Performance Model (SPM) was developed to provide an accurate source of seasonal energy consumption and cost predictions for the evaluation of heat pump design options. The program uses steady state heat pump performance data obtained from manufacturers' or Computer Simulation Model runs. The SPM was originally developed in two forms - a cooling model for central air conditioners and heat pumps and a heating model for heat pumps. The original models have undergone many modifications, which are described, to improve the accuracy of predictions and to increase flexibility for use in parametric evaluations. Insights are provided into the theory and construction of the major options, and into the use of the available options and output variables. Specific investigations provide examples of the possible applications of the model. (LEW)

  6. BIGFLOW: A numerical code for simulating flow in variably saturated, heterogeneous geologic media. Theory and user's manual, Version 1.1

    SciTech Connect

    Ababou, R.; Bagtzoglou, A.C.

    1993-06-01

    This report documents BIGFLOW 1.1, a numerical code for simulating flow in variably saturated heterogeneous geologic media. It contains the underlying mathematical and numerical models, test problems, benchmarks, and applications of the BIGFLOW code. The BIGFLOW software package is composed of a simulation code and an interactive data processing code (DATAFLOW). The simulation code solves linear and nonlinear porous media flow equations based on Darcy's law, appropriately generalized to account for 3D, deterministic, or random heterogeneity. A modified Picard scheme is used for linearizing the unsaturated flow equations, and preconditioned iterative methods are used for solving the resulting matrix systems. The data processor (DATAFLOW) allows interactive data entry, manipulation, and analysis of 3D datasets. The report contains analyses of computational performance carried out using Cray-2 and Cray-Y/MP8 supercomputers. Benchmark tests include comparisons with other independently developed codes, such as PORFLOW and CMVSFS, and with analytical or semi-analytical solutions.
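
    Written generically (BIGFLOW's specific 3D, heterogeneous generalizations are documented in the report itself), the variably saturated flow equations based on Darcy's law take the mixed form:

        \mathbf{q} = -K(\psi, \mathbf{x})\, \nabla (\psi + z), \qquad \frac{\partial \theta(\psi)}{\partial t} = -\nabla \cdot \mathbf{q}

    The nonlinearity of the conductivity K(ψ) and moisture content θ(ψ) under unsaturated conditions is what the modified Picard scheme linearizes at each iteration, before the preconditioned iterative solver is applied to the resulting matrix system.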

  7. Scaling Users' Perceptions of Library Service Quality Using Item Response Theory: A LibQUAL+ [TM] Study

    ERIC Educational Resources Information Center

    Wei, Youhua; Thompson, Bruce; Cook, C. Colleen

    2005-01-01

    LibQUAL+[TM] data to date have not been subjected to the modern measurement theory called polytomous item response theory (IRT). The data interpreted here were collected from 42,090 participants who completed the "American English" version of the 22 core LibQUAL+[TM] items, and 12,552 participants from Australia and Europe who completed the…

  8. Evaluation of custom energy expenditure models for SenseWear armband in manual wheelchair users.

    PubMed

    Tsang, KaLai; Hiremath, Shivayogi V; Cooper, Rory A; Ding, Dan

    2015-01-01

    Physical activity monitors are increasingly used to help the general population lead a healthy lifestyle by keeping track of their daily physical activity (PA) and energy expenditure (EE). However, none of the commercially available activity monitors can accurately estimate PA and EE in people who use wheelchairs as their primary means of mobility. Researchers have recently developed custom EE prediction models for manual wheelchair users (MWUs) with spinal cord injuries (SCIs) based on a commercial activity monitor--the SenseWear armband. This study evaluated the performance of two custom EE prediction models, including a general model and a set of activity-specific models among 45 MWUs with SCI. The estimated EE was obtained by using the two custom models and the default manufacturer's model, and it was compared with the gold standard measured by the K4b2 portable metabolic cart. The general, activity-specific, and default models had a mean signed percent error (mean +/- standard deviation) of -2.8 +/- 26.1%, -4.8 +/- 25.4%, and -39.6 +/- 37.8%, respectively. The intraclass correlation coefficient was 0.86 (95% confidence interval [CI] = 0.82 to 0.89) for the general model, 0.83 (95% CI = 0.79 to 0.87) for the activity-specific model, and 0.62 (95% CI = 0.16 to 0.81) for the default model. The custom models for the SenseWear armband significantly improved the EE estimation accuracy for MWUs with SCI. PMID:26745837
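
    A plausible reading of the mean signed percent error reported above (the exact formula is not stated in the abstract) is:

        \mathrm{MSPE} = \frac{100}{N} \sum_{i=1}^{N} \frac{EE^{\,est}_{i} - EE^{\,ref}_{i}}{EE^{\,ref}_{i}}

    Under this reading, the negative values for all three models indicate that energy expenditure is, on average, underestimated relative to the K4b2 criterion measure, with the default manufacturer's model underestimating most strongly.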

  9. Modeling Developmental Transitions in Adaptive Resonance Theory

    ERIC Educational Resources Information Center

    Raijmakers, Maartje E. J.; Molenaar, Peter C. M.

    2004-01-01

    Neural networks are applied to a theoretical subject in developmental psychology: modeling developmental transitions. Two issues that are involved will be discussed: discontinuities and acquiring qualitatively new knowledge. We will argue that by the appearance of a bifurcation, a neural network can show discontinuities and may acquire…

  10. Microbial community modeling using reliability theory.

    PubMed

    Zilles, Julie L; Rodríguez, Luis F; Bartolerio, Nicholas A; Kent, Angela D

    2016-08-01

    Linking microbial community composition with the corresponding ecosystem functions remains challenging. Because microbial communities can differ in their functional responses, this knowledge gap limits ecosystem assessment, design and management. To develop models that explicitly incorporate microbial populations and guide efforts to characterize their functional differences, we propose a novel approach derived from reliability engineering. This reliability modeling approach is illustrated here using a microbial ecology dataset from denitrifying bioreactors. Reliability modeling is well-suited for analyzing the stability of complex networks composed of many microbial populations. It could also be applied to evaluate the redundancy within a particular biochemical pathway in a microbial community. Reliability modeling allows characterization of the system's resilience and identification of failure-prone functional groups or biochemical steps, which can then be targeted for monitoring or enhancement. The reliability engineering approach provides a new perspective for unraveling the interactions between microbial community diversity, functional redundancy and ecosystem services, as well as practical tools for the design and management of engineered ecosystems.

  11. A Graphical User Interface for Parameterizing Biochemical Models of Photosynthesis and Chlorophyll Fluorescence

    NASA Astrophysics Data System (ADS)

    Kornfeld, A.; Van der Tol, C.; Berry, J. A.

    2015-12-01

    Recent advances in optical remote sensing of photosynthesis offer great promise for estimating gross primary productivity (GPP) at leaf, canopy and even global scale. These methods - including solar-induced chlorophyll fluorescence (SIF) emission, fluorescence spectra, and hyperspectral features such as the red edge and the photochemical reflectance index (PRI) - can be used to greatly enhance the predictive power of global circulation models (GCMs) by providing better constraints on GPP. The way to use measured optical data to parameterize existing models such as SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes) is not trivial, however. We have therefore extended a biochemical model to include fluorescence and other parameters in a coupled treatment. To help parameterize the model, we then use nonlinear curve-fitting routines to determine the parameter set that enables model results to best fit leaf-level gas exchange and optical data measurements. To make the tool more accessible to all practitioners, we have further designed a graphical user interface (GUI) based front-end to allow researchers to analyze data with a minimum of effort while, at the same time, allowing them to change parameters interactively to visualize how variation in model parameters affects predicted outcomes such as photosynthetic rates, electron transport, and chlorophyll fluorescence. Here we discuss the tool and its effectiveness, using recently-gathered leaf-level data.
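
    A minimal sketch of the kind of nonlinear curve fitting described above, assuming a Rubisco-limited Farquhar-type assimilation model and synthetic A-Ci gas-exchange data; the kinetic constants, data values, and the use of scipy's least_squares are illustrative assumptions, not the SCOPE or GUI implementation.

        import numpy as np
        from scipy.optimize import least_squares

        def a_net(ci, vcmax, rd, kc=405.0, ko=278.0, o=210.0, gamma_star=42.75):
            """Rubisco-limited net assimilation (umol m-2 s-1) versus intercellular CO2 ci (umol mol-1).
            The kinetic constants are typical 25 C values and are assumptions, not fitted here."""
            return vcmax * (ci - gamma_star) / (ci + kc * (1.0 + o / ko)) - rd

        # hypothetical A-Ci gas-exchange measurements
        ci_obs = np.array([100.0, 200.0, 300.0, 400.0, 600.0])
        a_obs = np.array([5.2, 12.4, 16.9, 19.8, 23.1])

        def residuals(params):
            vcmax, rd = params
            return a_net(ci_obs, vcmax, rd) - a_obs

        fit = least_squares(residuals, x0=[60.0, 1.0])
        vcmax_hat, rd_hat = fit.x
        print(f"fitted Vcmax = {vcmax_hat:.1f}, Rd = {rd_hat:.2f}")

    A GUI front-end of the kind described above essentially wraps a fit like this, letting the user adjust starting values and immediately see how the modeled assimilation and fluorescence curves respond.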

  12. Users Manual for the Geospatial Stream Flow Model (GeoSFM)

    USGS Publications Warehouse

    Artan, Guleid A.; Asante, Kwabena; Smith, Jodie; Pervez, Md Shahriar; Entenmann, Debbie; Verdin, James P.; Rowland, James

    2008-01-01

    The monitoring of wide-area hydrologic events requires the manipulation of large amounts of geospatial and time series data into concise information products that characterize the location and magnitude of the event. To perform these manipulations, scientists at the U.S. Geological Survey Center for Earth Resources Observation and Science (EROS), with the cooperation of the U.S. Agency for International Development, Office of Foreign Disaster Assistance (USAID/OFDA), have implemented a hydrologic modeling system. The system includes a data assimilation component to generate data for a Geospatial Stream Flow Model (GeoSFM) that can be run operationally to identify and map wide-area streamflow anomalies. GeoSFM integrates a geographical information system (GIS) for geospatial preprocessing and postprocessing tasks and hydrologic modeling routines implemented as dynamically linked libraries (DLLs) for time series manipulations. Model results include maps depicting the status of streamflow and soil water conditions. This Users Manual provides step-by-step instructions for running the model and for downloading and processing the input data required for initial model parameterization and daily operation.

  13. Effect of Human Model Height and Sex on Induced Current Dosimetry in Household Induction Heater Users

    NASA Astrophysics Data System (ADS)

    Tarao, Hiroo; Hayashi, Noriyuki; Isaka, Katsuo

    Induced currents in high-resolution, anatomical human models are numerically calculated by the impedance method. The human models are assumed to be exposed to highly inhomogeneous 20.9 kHz magnetic fields from a household induction heater (IH). In the case of the adult models, currents ranging from 5 to 19 mA/m2 are induced between the shoulder and lower abdomen. Meanwhile, in the case of the child models, currents ranging from 5 to 21 mA/m2 are induced between the head and abdomen. In particular, the induced currents near the brain tissue are almost the same as those near the abdomen. When the induced currents in the central nervous system tissues are considered, the induced currents in the child model are 2.1 to 6.9 times as large as those in the adult model under the same B-field exposure environment. These results suggest the importance of further investigation of a pregnant female who uses the IH as well as of a child (or IH users of small standing height).

  14. Time dependent turbulence modeling and analytical theories of turbulence

    NASA Technical Reports Server (NTRS)

    Rubinstein, R.

    1993-01-01

    By simplifying the direct interaction approximation (DIA) for turbulent shear flow, time dependent formulas are derived for the Reynolds stresses which can be included in two equation models. The Green's function is treated phenomenologically, however, following Smith and Yakhot, we insist on the short and long time limits required by DIA. For small strain rates, perturbative evaluation of the correlation function yields a time dependent theory which includes normal stress effects in simple shear flows. From this standpoint, the phenomenological Launder-Reece-Rodi model is obtained by replacing the Green's function by its long time limit. Eddy damping corrections to short time behavior initiate too quickly in this model; in contrast, the present theory exhibits strong suppression of eddy damping at short times. A time dependent theory for large strain rates is proposed in which large scales are governed by rapid distortion theory while small scales are governed by Kolmogorov inertial range dynamics. At short times and large strain rates, the theory closely matches rapid distortion theory, but at long times it relaxes to an eddy damping model.

  15. TIME Impact - a new user-friendly tuberculosis (TB) model to inform TB policy decisions.

    PubMed

    Houben, R M G J; Lalli, M; Sumner, T; Hamilton, M; Pedrazzoli, D; Bonsu, F; Hippner, P; Pillay, Y; Kimerling, M; Ahmedov, S; Pretorius, C; White, R G

    2016-01-01

    Tuberculosis (TB) is the leading cause of death from infectious disease worldwide, predominantly affecting low- and middle-income countries (LMICs), where resources are limited. As such, countries need to be able to choose the most efficient interventions for their respective setting. Mathematical models can be valuable tools to inform rational policy decisions and improve resource allocation, but are often unavailable or inaccessible for LMICs, particularly in TB. We developed TIME Impact, a user-friendly TB model that enables local capacity building and strengthens country-specific policy discussions to inform support and funding applications at the (sub-)national level (e.g. Ministry of Finance) or to international donors (e.g. the Global Fund to Fight AIDS, Tuberculosis and Malaria). TIME Impact is an epidemiological transmission model nested in TIME, a set of TB modelling tools available for free download within the widely-used Spectrum software. The TIME Impact model reflects key aspects of the natural history of TB, with additional structure for HIV/ART, drug resistance, treatment history and age. TIME Impact enables national TB programmes (NTPs) and other TB policymakers to better understand their own TB epidemic, plan their response, apply for funding and evaluate the implementation of the response. The explicit aim of TIME Impact's user-friendly interface is to enable training of local and international TB experts towards independent use. During application of TIME Impact, close involvement of the NTPs and other local partners also builds critical understanding of the modelling methods, assumptions and limitations inherent to modelling. This is essential to generate broad country-level ownership of the modelling data inputs and results. In turn, it stimulates discussions and a review of the current evidence and assumptions, strengthening the decision-making process in general. TIME Impact has been effectively applied in a variety of settings. In South Africa, it

  16. TIME Impact - a new user-friendly tuberculosis (TB) model to inform TB policy decisions.

    PubMed

    Houben, R M G J; Lalli, M; Sumner, T; Hamilton, M; Pedrazzoli, D; Bonsu, F; Hippner, P; Pillay, Y; Kimerling, M; Ahmedov, S; Pretorius, C; White, R G

    2016-03-24

    Tuberculosis (TB) is the leading cause of death from infectious disease worldwide, predominantly affecting low- and middle-income countries (LMICs), where resources are limited. As such, countries need to be able to choose the most efficient interventions for their respective setting. Mathematical models can be valuable tools to inform rational policy decisions and improve resource allocation, but are often unavailable or inaccessible for LMICs, particularly in TB. We developed TIME Impact, a user-friendly TB model that enables local capacity building and strengthens country-specific policy discussions to inform support and funding applications at the (sub-)national level (e.g. Ministry of Finance) or to international donors (e.g. the Global Fund to Fight AIDS, Tuberculosis and Malaria). TIME Impact is an epidemiological transmission model nested in TIME, a set of TB modelling tools available for free download within the widely-used Spectrum software. The TIME Impact model reflects key aspects of the natural history of TB, with additional structure for HIV/ART, drug resistance, treatment history and age. TIME Impact enables national TB programmes (NTPs) and other TB policymakers to better understand their own TB epidemic, plan their response, apply for funding and evaluate the implementation of the response. The explicit aim of TIME Impact's user-friendly interface is to enable training of local and international TB experts towards independent use. During application of TIME Impact, close involvement of the NTPs and other local partners also builds critical understanding of the modelling methods, assumptions and limitations inherent to modelling. This is essential to generate broad country-level ownership of the modelling data inputs and results. In turn, it stimulates discussions and a review of the current evidence and assumptions, strengthening the decision-making process in general. TIME Impact has been effectively applied in a variety of settings. In South Africa, it

  17. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility theory in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  18. CIRCE2/DEKGEN2: A software package for facilitated optical analysis of 3-D distributed solar energy concentrators. Theory and user manual

    SciTech Connect

    Romero, V.J.

    1994-03-01

    CIRCE2 is a computer code for modeling the optical performance of three-dimensional dish-type solar energy concentrators. Statistical methods are used to evaluate the directional distribution of reflected rays from any given point on the concentrator. Given concentrator and receiver geometries, sunshape (angular distribution of incident rays from the sun), and concentrator imperfections such as surface roughness and random deviation in slope, the code predicts the flux distribution and total power incident upon the target. Great freedom exists in the variety of concentrator and receiver configurations that can be modeled. Additionally, provisions for shading and receiver aperturing are included. DEKGEN2 is a preprocessor designed to facilitate input of geometry, error distributions, and sun models. This manual describes the optical model, user inputs, code outputs, and operation of the software package. A user tutorial is included in which several collectors are built and analyzed in step-by-step examples.

  19. Dust in fusion plasmas: theory and modeling

    SciTech Connect

    Smirnov, R. D.; Pigarov, A. Yu.; Krasheninnikov, S. I.; Mendis, D. A.; Rosenberg, M.; Rudakov, D.; Tanaka, Y.; Rognlien, T. D.; Soboleva, T. K.; Shukla, P. K.; Bray, B. D.; West, W. P.; Roquemore, A. L.; Skinner, C. H.

    2008-09-07

    Dust may have a large impact on ITER-scale plasma experiments including both safety and performance issues. However, the physics of dust in fusion plasmas is very complex and multifaceted. Here, we discuss different aspects of dust dynamics including dust-plasma, and dust-surface interactions. We consider the models of dust charging, heating, evaporation/sublimation, dust collision with material walls, etc., which are suitable for the conditions of fusion plasmas. The physical models of all these processes have been incorporated into the DUST Transport (DUSTT) code. Numerical simulations demonstrate that dust particles are very mobile and accelerate to large velocities due to the ion drag force (cruise speed >100 m/s). Deep penetration of dust particles toward the plasma core is predicted. It is shown that DUSTT is capable of reproducing many features of recent dust-related experiments, but much more work is still needed.

  20. INTERLINE 5.0 -- An expanded railroad routing model: Program description, methodology, and revised user's manual

    SciTech Connect

    Johnson, P.E.; Joy, D.S.; Clarke, D.B.; Jacobi, J.M.

    1993-03-01

    A rail routing model, INTERLINE, has been developed at the Oak Ridge National Laboratory to investigate potential routes for transporting radioactive materials. In Version 5.0, the INTERLINE routing algorithms have been enhanced to include the ability to predict alternative routes, barge routes, and population statistics for any route. The INTERLINE railroad network is essentially a computerized rail atlas describing the US railroad system. All rail lines, with the exception of industrial spurs, are included in the network. Inland waterways and deep water routes along with their interchange points with the US railroad system are also included. The network contains over 15,000 rail and barge segments (links) and over 13,000 stations, interchange points, ports, and other locations (nodes). The INTERLINE model has been converted to operate on an IBM-compatible personal computer. At least a 286 computer with a hard disk containing approximately 6 MB of free space is recommended. Enhanced program performance will be obtained by using a random-access memory drive on a 386 or 486 computer.

  1. Attachment theory and theory of planned behavior: an integrative model predicting underage drinking.

    PubMed

    Lac, Andrew; Crano, William D; Berger, Dale E; Alvaro, Eusebio M

    2013-08-01

    Research indicates that peer and maternal bonds play important but sometimes contrasting roles in the outcomes of children. Less is known about attachment bonds to these 2 reference groups in young adults. Using a sample of 351 participants (18 to 20 years of age), the research integrated two theoretical traditions: attachment theory and theory of planned behavior (TPB). The predictive contribution of both theories was examined in the context of underage adult alcohol use. Using full structural equation modeling, results substantiated the hypotheses that secure peer attachment positively predicted norms and behavioral control toward alcohol, but secure maternal attachment inversely predicted attitudes and behavioral control toward alcohol. Alcohol attitudes, norms, and behavioral control each uniquely explained alcohol intentions, which anticipated an increase in alcohol behavior 1 month later. The hypothesized processes were statistically corroborated by tests of indirect and total effects. These findings support recommendations for programs designed to curtail risky levels of underage drinking using the tenets of attachment theory and TPB.

  2. Group theory and biomolecular conformation: I. Mathematical and computational models

    PubMed Central

    Chirikjian, Gregory S

    2010-01-01

    Biological macromolecules, and the complexes that they form, can be described in a variety of ways ranging from quantum mechanical and atomic chemical models, to coarser grained models of secondary structure and domains, to continuum models. At each of these levels, group theory can be used to describe both geometric symmetries and conformational motion. In this survey, a detailed account is provided of how group theory has been applied across computational structural biology to analyze the conformational shape and motion of macromolecules and complexes. PMID:20827378

  3. L∞-algebra models and higher Chern-Simons theories

    NASA Astrophysics Data System (ADS)

    Ritter, Patricia; Sämann, Christian

    2016-10-01

    We continue our study of zero-dimensional field theories in which the fields take values in a strong homotopy Lie algebra. In the first part, we review in detail how higher Chern-Simons theories arise in the AKSZ-formalism. These theories form a universal starting point for the construction of L∞-algebra models. We then show how to describe superconformal field theories and how to perform dimensional reductions in this context. In the second part, we demonstrate that Nambu-Poisson and multisymplectic manifolds are closely related via their Heisenberg algebras. As a byproduct of our discussion, we find central Lie p-algebra extensions of 𝔰𝔬(p + 2). Finally, we study a number of L∞-algebra models which are physically interesting and which exhibit quantized multisymplectic manifolds as vacuum solutions.

  4. Applying learning theories and instructional design models for effective instruction.

    PubMed

    Khalil, Mohammed K; Elkhider, Ihsan A

    2016-06-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory.

  5. Automated Physico-Chemical Cell Model Development through Information Theory

    SciTech Connect

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is the optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow that we have implemented, consisting of a set of interoperable systems biology computational modules.

  6. User's guide to Model Viewer, a program for three-dimensional visualization of ground-water model results

    USGS Publications Warehouse

    Hsieh, Paul A.; Winston, Richard B.

    2002-01-01

    Model Viewer is a computer program that displays the results of three-dimensional groundwater models. Scalar data (such as hydraulic head or solute concentration) may be displayed as a solid or a set of isosurfaces, using a red-to-blue color spectrum to represent a range of scalar values. Vector data (such as velocity or specific discharge) are represented by lines oriented to the vector direction and scaled to the vector magnitude. Model Viewer can also display pathlines, cells or nodes that represent model features such as streams and wells, and auxiliary graphic objects such as grid lines and coordinate axes. Users may crop the model grid in different orientations to examine the interior structure of the data. For transient simulations, Model Viewer can animate the time evolution of the simulated quantities. The current version (1.0) of Model Viewer runs on Microsoft Windows 95, 98, NT and 2000 operating systems, and supports the following models: MODFLOW-2000, MODFLOW-2000 with the Ground-Water Transport Process, MODFLOW-96, MOC3D (Version 3.5), MODPATH, MT3DMS, and SUTRA (Version 2D3D.1). Model Viewer is designed to directly read input and output files from these models, thus minimizing the need for additional postprocessing. This report provides an overview of Model Viewer. Complete instructions on how to use the software are provided in the on-line help pages.

  7. FINDING POTENTIALLY UNSAFE NUTRITIONAL SUPPLEMENTS FROM USER REVIEWS WITH TOPIC MODELING.

    PubMed

    Sullivan, Ryan; Sarker, Abeed; O'Connor, Karen; Goodin, Amanda; Karlsrud, Mark; Gonzalez, Graciela

    2016-01-01

    Although dietary supplements are widely used and generally are considered safe, some supplements have been identified as causative agents for adverse reactions, some of which may even be fatal. The Food and Drug Administration (FDA) is responsible for monitoring supplements and ensuring that supplements are safe. However, current surveillance protocols are not always effective. Leveraging user-generated textual data, in the form of Amazon.com reviews for nutritional supplements, we use natural language processing techniques to develop a system for the monitoring of dietary supplements. We use topic modeling techniques, specifically a variation of Latent Dirichlet Allocation (LDA), and background knowledge in the form of an adverse reaction dictionary to score products based on their potential danger to the public. Our approach generates topics that semantically capture adverse reactions from a document set consisting of reviews posted by users of specific products, and based on these topics, we propose a scoring mechanism to categorize products as "high potential danger", "average potential danger" and "low potential danger." We evaluate our system by comparing the system categorization with human annotators, and we find that our system agrees with the annotators 69.4% of the time. With these results, we demonstrate that our methods show promise and that our system represents a proof of concept as a viable low-cost, active approach for dietary supplement monitoring.
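
    The scoring idea can be illustrated with a small sketch. The following is not the authors' system: it uses off-the-shelf LDA from scikit-learn, a toy adverse-reaction lexicon, and invented reviews, and simply weights each topic's overlap with the lexicon by that topic's prevalence across a product's reviews.

```python
# Minimal sketch (not the authors' system): score a product by how strongly
# the LDA topics of its reviews overlap an adverse-reaction dictionary.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical inputs: reviews grouped by product, and a small AE lexicon.
reviews_by_product = {
    "product_a": ["gave me a headache and nausea", "works fine for energy"],
    "product_b": ["great taste", "no issues after a month of daily use"],
}
adverse_terms = {"headache", "nausea", "rash", "dizziness", "palpitations"}

def danger_score(reviews, n_topics=2, top_n=10):
    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(reviews)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(X)
    vocab = np.array(vec.get_feature_names_out())
    doc_topic = lda.transform(X)                   # per-review topic weights
    score = 0.0
    for k, topic in enumerate(lda.components_):
        top_words = set(vocab[topic.argsort()[::-1][:top_n]])
        overlap = len(top_words & adverse_terms) / top_n
        score += overlap * doc_topic[:, k].mean()  # weight by topic prevalence
    return score

for product, reviews in reviews_by_product.items():
    print(product, round(danger_score(reviews), 3))
```

    A real pipeline would then bin the resulting scores into the "high", "average" and "low potential danger" categories described above using thresholds chosen against annotated data.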

  8. FINDING POTENTIALLY UNSAFE NUTRITIONAL SUPPLEMENTS FROM USER REVIEWS WITH TOPIC MODELING.

    PubMed

    Sullivan, Ryan; Sarker, Abeed; O'Connor, Karen; Goodin, Amanda; Karlsrud, Mark; Gonzalez, Graciela

    2016-01-01

    Although dietary supplements are widely used and generally are considered safe, some supplements have been identified as causative agents for adverse reactions, some of which may even be fatal. The Food and Drug Administration (FDA) is responsible for monitoring supplements and ensuring that supplements are safe. However, current surveillance protocols are not always effective. Leveraging user-generated textual data, in the form of Amazon.com reviews for nutritional supplements, we use natural language processing techniques to develop a system for the monitoring of dietary supplements. We use topic modeling techniques, specifically a variation of Latent Dirichlet Allocation (LDA), and background knowledge in the form of an adverse reaction dictionary to score products based on their potential danger to the public. Our approach generates topics that semantically capture adverse reactions from a document set consisting of reviews posted by users of specific products, and based on these topics, we propose a scoring mechanism to categorize products as "high potential danger", "average potential danger" and "low potential danger." We evaluate our system by comparing the system categorization with human annotators, and we find that our system agrees with the annotators 69.4% of the time. With these results, we demonstrate that our methods show promise and that our system represents a proof of concept as a viable low-cost, active approach for dietary supplement monitoring. PMID:26776215

  9. New theories of root growth modelling

    NASA Astrophysics Data System (ADS)

    Landl, Magdalena; Schnepf, Andrea; Vanderborght, Jan; Huber, Katrin; Javaux, Mathieu; Bengough, A. Glyn; Vereecken, Harry

    2016-04-01

    In dynamic root architecture models, root growth is represented by moving root tips whose line trajectory results in the creation of new root segments. Typically, the direction of root growth is calculated as the vector sum of various direction-affecting components. However, in our simulations this did not reproduce experimental observations of root growth in structured soil. We therefore developed a new approach to predict the root growth direction. In this approach we distinguish between, firstly, driving forces for root growth, i.e. the force exerted by the root, which points in the direction of the previous root segment, and gravitropism, and, secondly, the soil mechanical resistance to root growth or penetration resistance. The latter can be anisotropic, i.e. depending on the direction of growth, which leads to a difference between the direction of the driving force and the direction of the root tip movement. Anisotropy of penetration resistance can be caused either by microscale differences in soil structure or by macroscale features, including macropores. Anisotropy at the microscale is not resolved explicitly in our model. To account for it, we add to the force pointing in the direction of the previous root segment a normally distributed random deflection angle α with zero mean and standard deviation σ. The standard deviation σ is scaled so that the deflection from the original root tip location does not depend on the spatial resolution of the root system model. Similarly to the water flow equation, the direction of the root tip movement corresponds to the water flux vector, while the driving forces are related to the water potential gradient. The analogue of the hydraulic conductivity tensor is the root penetrability tensor. It is determined by the inverse of soil penetration resistance and describes the ease with which a root can penetrate the soil. By adapting the three dimensional soil and root water uptake model R-SWMS (Javaux et al., 2008) in this way
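
    As a rough illustration of this growth rule (not the R-SWMS implementation), the sketch below combines the previous-heading force, a gravitropism term, an anisotropic penetrability tensor, and a normally distributed deflection angle; the square-root scaling of σ with segment length is an assumption standing in for the resolution-independent scaling described above, and all parameter values are invented.

```python
# Illustrative 2-D root-tip step: penetrability tensor applied to a driving
# force (previous heading + gravitropism) with a random deflection angle.
import numpy as np

def next_segment(prev_dir, seg_len, gravity_weight=0.3,
                 sigma_per_unit_length=0.2, penetrability=np.eye(2),
                 rng=np.random.default_rng(0)):
    prev_dir = prev_dir / np.linalg.norm(prev_dir)
    # Deflection angle alpha ~ N(0, sigma); sigma grows with the square root
    # of the segment length so that refining the discretization does not
    # change the overall spread of the tip position (assumed scaling).
    sigma = sigma_per_unit_length * np.sqrt(seg_len)
    alpha = rng.normal(0.0, sigma)
    c, s = np.cos(alpha), np.sin(alpha)
    deflected = np.array([[c, -s], [s, c]]) @ prev_dir
    gravity = np.array([0.0, -1.0])                  # downward gravitropism
    force = (1 - gravity_weight) * deflected + gravity_weight * gravity
    velocity = penetrability @ force                 # Darcy-like flux analogue
    return velocity / np.linalg.norm(velocity) * seg_len

# Example: soil in which horizontal penetration is twice as easy as vertical.
aniso = np.diag([1.0, 0.5])
print(next_segment(np.array([0.0, -1.0]), seg_len=0.5, penetrability=aniso))
```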

  10. The Use of Modelling for Theory Building in Qualitative Analysis

    ERIC Educational Resources Information Center

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  11. Goodness-of-Fit Assessment of Item Response Theory Models

    ERIC Educational Resources Information Center

    Maydeu-Olivares, Alberto

    2013-01-01

    The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…

  12. Theory and Practice: An Integrative Model Linking Class and Field

    ERIC Educational Resources Information Center

    Lesser, Joan Granucci; Cooper, Marlene

    2006-01-01

    Social work has evolved over the years taking on the challenges of the times. The profession now espouses a breadth of theoretical approaches and treatment modalities. We have developed a model to help graduate social work students master the skill of integrating theory and social work practice. The Integrative Model has five components: (l) The…

  13. Modeling pyramidal sensors in ray-tracing software by a suitable user-defined surface

    NASA Astrophysics Data System (ADS)

    Antichi, Jacopo; Munari, Matteo; Magrin, Demetrio; Riccardi, Armando

    2016-04-01

    Following the unprecedented results in terms of performance delivered by the first light adaptive optics system at the Large Binocular Telescope, there has been a widespread and increasing interest in the pyramid wavefront sensor (PWFS), which is the key component, together with the adaptive secondary mirror, of the adaptive optics (AO) module. Currently, there is no straightforward way to model a PWFS in standard sequential ray-tracing software. Common modeling strategies tend to be user-specific and, in general, are unsatisfactory for general applications. To address this problem, we have developed an approach to PWFS modeling based on a user-defined surface (UDS), whose properties reside in a specific code written in the C language, for the ray-tracing software ZEMAX™. With our approach, the pyramid optical component is implemented as a standard surface in ZEMAX™, exploiting its dynamic link library (DLL) conversion and thus greatly simplifying ray tracing and analysis. We have utilized the pyramid UDS DLL surface, referred to as PAM2R (for "pyramidal acronyms may be too risky"), in order to design the current PWFS-based AO system for the Giant Magellan Telescope, evaluating tolerances, with particular attention to the angular sensitivities, by means of sequential ray-tracing tools only, thus verifying PAM2R's reliability and robustness. This work indicates that PAM2R makes the design of a PWFS as simple as that of other standard optical components. This is particularly suitable with the advent of the extremely large telescope era, for which complexity is definitely one of the main challenges.

  14. Kinetic theories for spin models for cooperative relaxation dynamics

    NASA Astrophysics Data System (ADS)

    Pitts, Steven Jerome

    The facilitated kinetic Ising models with asymmetric spin flip constraints introduced by Jackle and co-workers [J. Jackle, S. Eisinger, Z. Phys. B 84, 115 (1991); J. Reiter, F. Mauch, J. Jackle, Physica A 184, 458 (1992)] exhibit complex relaxation behavior in their associated spin density time correlation functions. This includes the growth of relaxation times over many orders of magnitude when the thermodynamic control parameter is varied, and, in some cases, ergodic-nonergodic transitions. Relaxation equations for the time dependence of the spin density autocorrelation function for a set of these models are developed that relate this autocorrelation function to the irreducible memory function of Kawasaki [K. Kawasaki, Physica A 215, 61 (1995)] using a novel diagrammatic series approach. It is shown that the irreducible memory function in a theory of the relaxation of an autocorrelation function in a Markov model with detailed balance plays the same role as the part of the memory function approximated by a polynomial function of the autocorrelation function with positive coefficients in schematic simple mode coupling theories for supercooled liquids [W. Gotze, in Liquids, Freezing and the Glass Transition, D. Levesque, J. P. Hansen, J. Zinn-Justin eds., 287 (North Holland, New York, 1991)]. Sets of diagrams in the series for the irreducible memory function are summed which lead to approximations of this type. The behavior of these approximations is compared with known results from previous analytical calculations and from numerical simulations. For the simplest one dimensional model, relaxation equations that are closely related to schematic extended mode coupling theories [W. Gotze, ibid] are also derived using the diagrammatic series. Comparison of the results of these approximate theories with simulation data shows that these theories improve significantly on the results of the theories of the simple schematic mode coupling theory type. The potential

  15. Requirements Modeling with the Aspect-oriented User Requirements Notation (AoURN): A Case Study

    NASA Astrophysics Data System (ADS)

    Mussbacher, Gunter; Amyot, Daniel; Araújo, João; Moreira, Ana

    The User Requirements Notation (URN) is a recent ITU-T standard that supports requirements engineering activities. The Aspect-oriented URN (AoURN) adds aspect-oriented concepts to URN, creating a unified framework that allows for scenario-based, goal-oriented, and aspect-oriented modeling. AoURN is applied to the car crash crisis management system (CCCMS), modeling its functional and non-functional requirements (NFRs). AoURN generally models all use cases, NFRs, and stakeholders as individual concerns and provides general guidelines for concern identification. AoURN handles interactions between concerns, capturing their dependencies and conflicts as well as the resolutions. We present a qualitative comparison of aspect-oriented techniques for scenario-based and goal-oriented requirements engineering. An evaluation carried out based on the metrics adapted from literature and a task-based evaluation suggest that AoURN models are more scalable than URN models and exhibit better modularity, reusability, and maintainability.

  16. Three studies testing the effects of role models on product users' safety behavior.

    PubMed

    deTurck, M A; Chih, I H; Hsu, Y P

    1999-09-01

    Three studies were conducted to determine the effect of a role model's safety behavior on observers' safety behavior. In Studies 1 and 2, role models (confederates) used a cleaning product requiring them to wear safety gloves. Study 1 examined observers' safety behavior after they witnessed a friendly (unfriendly) role model's safety behavior in one of four conditions: 1) wearing rubber gloves, 2) not wearing rubber gloves and experiencing no chemical burn, 3) not wearing rubber gloves and experiencing a mild chemical burn, and 4) not wearing rubber gloves and experiencing a severe chemical burn. In Study 2, participants tested a cleaning product with a warning message (low hazard vs. high hazard) after observing a role model first test the cleaning product in one of the four conditions specified above. As predicted, in Studies 1 and 2, observers were influenced by the role model's safety behavior. However, the friendliness of the role model (Study 1) and level of hazard (Study 2) communicated in the warning message did not influence participants' safety behavior. Using an over-the-counter pain reliever, Study 3 tested the joint effects of: 1) the level of hazard communicated in the warning, 2) observers' outcome-relevant involvement, and 3) role model's compliance. Although the level of hazard communicated in the warning exerted no impact on observers' safety compliance, the role model's safety behavior and level of involvement jointly influenced observers' safety behavior. The implications of the findings and future research directions are discussed. Actual or potential applications of the research include, but are not limited to, using role models in warning messages and safety training programs to demonstrate the proper use of safety gear so as to enhance product users' compliance with safety recommendations. PMID:10665208

  17. Nanofluid Drop Evaporation: Experiment, Theory, and Modeling

    NASA Astrophysics Data System (ADS)

    Gerken, William James

    Nanofluids, stable colloidal suspensions of nanoparticles in a base fluid, have potential applications in the heat transfer, combustion and propulsion, manufacturing, and medical fields. Experiments were conducted to determine the evaporation rate of room temperature, millimeter-sized pendant drops of ethanol laden with varying amounts (0-3% by weight) of 40-60 nm aluminum nanoparticles (nAl). Time-resolved high-resolution drop images were collected for the determination of early-time evaporation rate (D²/D₀² > 0.75), shown to exhibit D-square law behavior, and surface tension. Results show an asymptotic decrease in pendant drop evaporation rate with increasing nAl loading. The evaporation rate decreases by approximately 15% at around 1% to 3% nAl loading relative to the evaporation rate of pure ethanol. Surface tension was observed to be unaffected by nAl loading up to 3% by weight. A model was developed to describe the evaporation of the nanofluid pendant drops based on D-square law analysis for the gas domain and a description of the reduction in liquid fraction available for evaporation due to nanoparticle agglomerate packing near the evaporating drop surface. Model predictions are in relatively good agreement with experiment, within a few percent of measured nanofluid pendant drop evaporation rate. The evaporation of pinned nanofluid sessile drops was also considered via modeling. It was found that the same mechanism for nanofluid evaporation rate reduction used to explain pendant drops could be used for sessile drops. That mechanism is a reduction in evaporation rate due to a reduction in available ethanol for evaporation at the drop surface caused by the packing of nanoparticle agglomerates near the drop surface. Comparisons of the present modeling predictions with sessile drop evaporation rate measurements reported for nAl/ethanol nanofluids by Sefiane and Bennacer [11] are in fairly good agreement. Portions of this abstract previously appeared as: W. J
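
    A minimal numerical sketch of the d-square law invoked above is given below; the initial diameter, the evaporation constant of pure ethanol, and the flat 15% reduction applied for the nanofluid are illustrative assumptions, not the measured values.

```python
# d-square law sketch: D(t)^2 = D0^2 - K*t, with the evaporation constant K
# reduced by ~15% to mimic the reported slowdown for nanoparticle-laden drops.
import numpy as np

def diameter(t, d0, k):
    """Drop diameter under the d-square law; clipped to zero after full evaporation."""
    return np.sqrt(np.clip(d0**2 - k * t, 0.0, None))

d0 = 1.0e-3                       # 1 mm initial diameter (assumed)
k_ethanol = 1.0e-8                # m^2/s, assumed constant for pure ethanol
k_nanofluid = 0.85 * k_ethanol    # ~15% lower evaporation rate (as reported)

for t in (0.0, 20.0, 40.0, 60.0, 80.0):
    print(f"t={t:5.1f} s  pure={diameter(t, d0, k_ethanol)*1e3:.3f} mm  "
          f"nAl-laden={diameter(t, d0, k_nanofluid)*1e3:.3f} mm")
```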

  18. A class of effective field theory models of cosmic acceleration

    SciTech Connect

    Bloomfield, Jolyon K.; Flanagan, Éanna É. E-mail: eef3@cornell.edu

    2012-10-01

    We explore a class of effective field theory models of cosmic acceleration involving a metric and a single scalar field. These models can be obtained by starting with a set of ultralight pseudo-Nambu-Goldstone bosons whose couplings to matter satisfy the weak equivalence principle, assuming that one boson is lighter than all the others, and integrating out the heavier fields. The result is a quintessence model with matter coupling, together with a series of correction terms in the action in a covariant derivative expansion, with specific scalings for the coefficients. After eliminating higher derivative terms and exploiting the field redefinition freedom, we show that the resulting theory contains nine independent free functions of the scalar field when truncated at four derivatives. This is in contrast to the four free functions found in similar theories of single-field inflation, where matter is not present. We discuss several different representations of the theory that can be obtained using the field redefinition freedom. For perturbations to the quintessence field today on subhorizon lengthscales larger than the Compton wavelength of the heavy fields, the theory is weakly coupled and natural in the sense of 't Hooft. The theory admits a regime where the perturbations become modestly nonlinear, but very strong nonlinearities lie outside its domain of validity.

  19. Integrated Modeling Program, Applied Chemical Theory (IMPACT)

    PubMed Central

    BANKS, JAY L.; BEARD, HEGE S.; CAO, YIXIANG; CHO, ART E.; DAMM, WOLFGANG; FARID, RAMY; FELTS, ANTHONY K.; HALGREN, THOMAS A.; MAINZ, DANIEL T.; MAPLE, JON R.; MURPHY, ROBERT; PHILIPP, DEAN M.; REPASKY, MATTHEW P.; ZHANG, LINDA Y.; BERNE, BRUCE J.; FRIESNER, RICHARD A.; GALLICCHIO, EMILIO; LEVY, RONALD M.

    2009-01-01

    We provide an overview of the IMPACT molecular mechanics program with an emphasis on recent developments and a description of its current functionality. With respect to core molecular mechanics technologies we include a status report for the fixed charge and polarizable force fields that can be used with the program and illustrate how the force fields, when used together with new atom typing and parameter assignment modules, have greatly expanded the coverage of organic compounds and medicinally relevant ligands. As we discuss in this review, explicit solvent simulations have been used to guide our design of implicit solvent models based on the generalized Born framework and a novel nonpolar estimator that have recently been incorporated into the program. With IMPACT it is possible to use several different advanced conformational sampling algorithms based on combining features of molecular dynamics and Monte Carlo simulations. The program includes two specialized molecular mechanics modules: Glide, a high-throughput docking program, and QSite, a mixed quantum mechanics/molecular mechanics module. These modules employ the IMPACT infrastructure as a starting point for the construction of the protein model and assignment of molecular mechanics parameters, but have then been developed to meet specialized objectives with respect to sampling and the energy function. PMID:16211539

  20. Integrated Modeling Program, Applied Chemical Theory (IMPACT).

    PubMed

    Banks, Jay L; Beard, Hege S; Cao, Yixiang; Cho, Art E; Damm, Wolfgang; Farid, Ramy; Felts, Anthony K; Halgren, Thomas A; Mainz, Daniel T; Maple, Jon R; Murphy, Robert; Philipp, Dean M; Repasky, Matthew P; Zhang, Linda Y; Berne, Bruce J; Friesner, Richard A; Gallicchio, Emilio; Levy, Ronald M

    2005-12-01

    We provide an overview of the IMPACT molecular mechanics program with an emphasis on recent developments and a description of its current functionality. With respect to core molecular mechanics technologies we include a status report for the fixed charge and polarizable force fields that can be used with the program and illustrate how the force fields, when used together with new atom typing and parameter assignment modules, have greatly expanded the coverage of organic compounds and medicinally relevant ligands. As we discuss in this review, explicit solvent simulations have been used to guide our design of implicit solvent models based on the generalized Born framework and a novel nonpolar estimator that have recently been incorporated into the program. With IMPACT it is possible to use several different advanced conformational sampling algorithms based on combining features of molecular dynamics and Monte Carlo simulations. The program includes two specialized molecular mechanics modules: Glide, a high-throughput docking program, and QSite, a mixed quantum mechanics/molecular mechanics module. These modules employ the IMPACT infrastructure as a starting point for the construction of the protein model and assignment of molecular mechanics parameters, but have then been developed to meet specialized objectives with respect to sampling and the energy function.

  1. Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation.

    NASA Astrophysics Data System (ADS)

    Lawley, Russell; Barron, Mark; Lark, Murray

    2015-04-01

    At BGS, expert elicitation has been used to evaluate uncertainty of surveyed boundaries in several common geological scenarios. As a result, a 'collective' understanding of the issues surrounding each scenario has emerged. The work has provoked wider debate in three key areas: a) what can we do to resolve those scenarios where a 'consensus' of understanding cannot be achieved; b) what does it mean for survey practices and subsequent use of maps in 3D models; and c) how do we communicate the 'collective' understanding of geological mapping (with or without consensus for specific scenarios). Previous work elicited expert judgement for uncertainty in six contrasting mapping scenarios. In five cases it was possible to arrive at a consensus model; in a sixth case, experts with different experience (length of service, academic background) took very different views of the nature of the mapping problem. The scenario concerned identification of the boundary between two contrasting tills (one, derived from Triassic source materials, being red in colour; the other, derived from Jurassic materials, being grey in colour). Initial debate during the elicitation identified that the colour contrast should provide some degree of confidence in locating the boundary via traditional auger-traverse survey methods. However, as the elicitation progressed, it became clear that the complexities of the relationship between the two tills were not uniformly understood across the experts and the panel could not agree a consensus regarding the spatial uncertainty of the boundary. The elicitation process allowed a significant degree of structured knowledge-exchange between experts of differing backgrounds and was successful in identifying a measure of uncertainty for what was considered a contentious scenario. However, the findings have significant implications for a boundary-scenario that is widely mapped across the central regions of Great Britain. We will discuss our experience of the use of

  2. A Brinkmanship Game Theory Model of Terrorism

    NASA Astrophysics Data System (ADS)

    Melese, Francois

    This study reveals conditions under which a world leader might credibly issue a brinkmanship threat of preemptive action to deter sovereign states or transnational terrorist organizations from acquiring weapons of mass destruction (WMD). The model consists of two players: the United Nations (UN) “Principal,” and a terrorist organization “Agent.” The challenge in issuing a brinkmanship threat is that it needs to be sufficiently unpleasant to deter terrorists from acquiring WMD, while not being so repugnant to those that must carry it out that they would refuse to do so. Two “credibility constraints” are derived. The first relates to the unknown terrorist type (Hard or Soft), and the second to acceptable risks (“blowback”) to the World community. Graphing the incentive-compatible Nash equilibrium solutions reveals when a brinkmanship threat is credible, and when it is not - either too weak to be effective, or unacceptably dangerous to the World community.

  3. Scaling Theory and Modeling of DNA Evolution

    NASA Astrophysics Data System (ADS)

    Buldyrev, Sergey V.

    1998-03-01

    We present evidence supporting the possibility that the nucleotide sequence in noncoding DNA is power-law correlated. We do not find such long-range correlation in the coding regions of the gene, so we build a "coding sequence finder" to locate the coding regions of an unknown DNA sequence. We also propose a different coding sequence finding algorithm, based on the concept of mutual information (I. Große, S. V. Buldyrev, H. Herzel, H. E. Stanley, preprint). We describe our recent work on quantification of DNA patchiness, using long-range correlation measures (G. M. Viswanathan, S. V. Buldyrev, S. Havlin, and H. E. Stanley, Biophysical Journal 72, 866-875 (1997)). We also present our recent study of the simple repeat length distributions. We find that the distributions of some simple repeats in noncoding DNA have long power-law tails, while in coding DNA all simple repeat distributions decay exponentially (N. V. Dokholyan, S. V. Buldyrev, S. Havlin, and H. E. Stanley, Phys. Rev. Lett., in press). We discuss several models based on insertion-deletion and mutation-duplication mechanisms that relate long-range correlations in non-coding DNA to DNA evolution. Specifically, we relate long-range correlations in non-coding DNA to simple repeat expansion, and propose an evolutionary model that reproduces the power law distribution of simple repeat lengths. We argue that the absence of long-range correlations in protein coding sequences is related to their highly conserved primary structure which is necessary to ensure protein folding.

  4. User's Guide for the Agricultural Non-Point Source (AGNPS) Pollution Model Data Generator

    USGS Publications Warehouse

    Finn, Michael P.; Scheidt, Douglas J.; Jaromack, Gregory M.

    2003-01-01

    BACKGROUND: Throughout this user guide, we refer to datasets that we used in conjunction with the development of this software for supporting cartographic research and producing the datasets to conduct research. However, this software can be used with these datasets or with more 'generic' versions of data of the appropriate type. For example, throughout the guide, we refer to national land cover data (NLCD) and digital elevation model (DEM) data from the U.S. Geological Survey (USGS) at a 30-m resolution, but any digital terrain model or land cover data at any appropriate resolution will produce results. Another key point to keep in mind is to use a consistent data resolution for all the datasets per model run. The U.S. Department of Agriculture (USDA) developed the Agricultural Nonpoint Source (AGNPS) pollution model of watershed hydrology in response to the complex problem of managing nonpoint sources of pollution. AGNPS simulates the behavior of runoff, sediment, and nutrient transport from watersheds that have agriculture as their prime use. The model operates on a cell basis and is a distributed parameter, event-based model. The model requires 22 input parameters. Output parameters are grouped primarily by hydrology, sediment, and chemical output (Young and others, 1995). Elevation, land cover, and soil are the base data from which to extract the 22 input parameters required by the AGNPS. For automatic parameter extraction, follow the general process described in this guide of extraction from the geospatial data through the AGNPS Data Generator to generate input parameters required by the pollution model (Finn and others, 2002).

  5. User's guide for the stock-recruitment model validation program. Environmental Sciences Division Publication No. 1985

    SciTech Connect

    Christensen, S.W.; Kirk, B.L.; Goodyear, C.P.

    1982-06-01

    SRVAL is a FORTRAN IV computer code designed to aid in assessing the validity of curve-fits of the linearized Ricker stock-recruitment model, modified to incorporate multiple-age spawners and to include an environmental variable, to variously processed annual catch-per-unit effort statistics for a fish population. It is sometimes asserted that curve-fits of this kind can be used to determine the sensitivity of fish populations to such man-induced stresses as entrainment and impingement at power plants. The SRVAL code was developed to test such assertions. It was utilized in testimony written in connection with the Hudson River Power Case (US Environmental Protection Agency, Region II). This testimony was recently published as a NUREG report. Here, a user's guide for SRVAL is presented.
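
    For orientation, the sketch below fits the linearized Ricker relation ln(R/S) = ln(a) - b*S + c*E by ordinary least squares to synthetic stock (S), recruitment (R) and environmental (E) data; it illustrates the kind of curve-fit SRVAL is meant to scrutinize, not the SRVAL code itself, and all numbers are invented.

```python
# Linearized Ricker fit sketch: ln(R/S) = ln(a) - b*S + c*E on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
S = rng.uniform(50, 500, size=30)        # spawning stock index (assumed units)
E = rng.normal(0, 1, size=30)            # standardized environmental variable
a_true, b_true, c_true = 4.0, 0.004, 0.3
R = a_true * S * np.exp(-b_true * S + c_true * E + rng.normal(0, 0.2, 30))

y = np.log(R / S)                        # linearized response
X = np.column_stack([np.ones_like(S), -S, E])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
ln_a, b_hat, c_hat = coef
print(f"a = {np.exp(ln_a):.2f}, b = {b_hat:.4f}, c = {c_hat:.2f}")
```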

  6. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    SciTech Connect

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  7. A user-friendly one-dimensional model for wet volcanic plumes

    USGS Publications Warehouse

    Mastin, Larry G.

    2007-01-01

    This paper presents a user-friendly graphically based numerical model of one-dimensional steady state homogeneous volcanic plumes that calculates and plots profiles of upward velocity, plume density, radius, temperature, and other parameters as a function of height. The model considers effects of water condensation and ice formation on plume dynamics as well as the effect of water added to the plume at the vent. Atmospheric conditions may be specified through input parameters of constant lapse rates and relative humidity, or by loading profiles of actual atmospheric soundings. To illustrate the utility of the model, we compare calculations with field-based estimates of plume height (∼9 km) and eruption rate (>∼4 × 10⁵ kg/s) during a brief tephra eruption at Mount St. Helens on 8 March 2005. Results show that the atmospheric conditions on that day boosted plume height by 1–3 km over that in a standard dry atmosphere. Although the eruption temperature was unknown, model calculations most closely match the observations for a temperature that is below magmatic but above 100°C.
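
    To make the idea of a one-dimensional steady-state plume calculation concrete, the sketch below integrates a generic dry, Boussinesq plume in the Morton-Taylor-Turner spirit through a uniformly stratified atmosphere. It is not the author's wet model (no condensation, ice, or added vent water), and the entrainment coefficient, stratification, and vent conditions are assumed values.

```python
# Generic dry plume sketch (Morton-Taylor-Turner style, top-hat profiles):
# integrate volume, momentum and buoyancy fluxes upward until momentum dies.
import numpy as np

ALPHA = 0.09   # entrainment coefficient (assumed)
N = 0.02       # atmospheric buoyancy frequency, 1/s (assumed)

def rise_height(b0, u0, gprime0, dz=1.0, zmax=40000.0):
    Q = b0**2 * u0            # specific volume flux
    M = b0**2 * u0**2         # specific momentum flux
    F = b0**2 * u0 * gprime0  # specific buoyancy flux
    z = 0.0
    while z < zmax and M > 0.0:
        u, b, gp = M / Q, Q / np.sqrt(M), F / Q
        dQ = 2.0 * ALPHA * b * u * dz      # entrainment of ambient air
        dM = b**2 * gp * dz                # buoyancy drives/retards the rise
        dF = -(N**2) * b**2 * u * dz       # stratification erodes buoyancy
        Q, M, F = Q + dQ, M + dM, F + dF
        z += dz
    return z                               # momentum exhausted: approx. top

# Assumed vent: radius 50 m, exit velocity 100 m/s, reduced gravity 1 m/s^2.
print(f"approximate rise height: {rise_height(50.0, 100.0, 1.0) / 1000:.1f} km")
```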

  8. User's manual for the Sandia Waste-Isolation Flow and Transport model (SWIFT).

    SciTech Connect

    Reeves, Mark; Cranwell, Robert M.

    1981-11-01

    This report describes a three-dimensional finite-difference model (SWIFT) which is used to simulate flow and transport processes in geologic media. The model was developed for use by the Nuclear Regulatory Commission in the analysis of deep geologic nuclear waste-disposal facilities. This document, as indicated by the title, is a user's manual and is intended to facilitate the use of the SWIFT simulator. Mathematical equations, submodels, application notes, and a description of the program itself are given herein. In addition, a complete input data guide is given along with several appendices which are helpful in setting up a data-input deck. Computer code SWIFT (Sandia Waste Isolation, Flow and Transport Model) is a fully transient, three-dimensional model which solves the coupled equations for transport in geologic media. The processes considered are: (1) fluid flow; (2) heat transport; (3) dominant-species miscible displacement; and (4) trace-species miscible displacement. The first three processes are coupled via fluid density and viscosity. Together they provide the velocity field on which the fourth process depends.

  9. An Evolving User-oriented Model of Internet Health Information Seeking

    PubMed Central

    Gaie, Martha J.

    2006-01-01

    This paper presents an evolving user-oriented model of Internet health information seeking (IS) based on qualitative data collected from 22 lung cancer (LC) patients and caregivers. This evolving model represents information search behavior as more highly individualized, complex, and dynamic than previous models, including pre-search psychological activity, use of multiple heuristics throughout the process, and cost-benefit evaluation of search results. This study’s findings suggest that IS occurs in four distinct phases: search initiation/continuation, selective exposure, message processing, and message evaluation. The identification of these phases and the heuristics used within them suggests a higher order of complexity in the decision-making processes that underlie IS, which could lead to the development of a conceptual framework that more closely reflects the complex nature of contextualized IS. It also illustrates the advantages of using qualitative methods to extract more subtle details of the IS process and fill in the gaps in existing models. PMID:17238347

  10. HIGHWAY 3.1: An enhanced HIGHWAY routing model: Program description, methodology, and revised user's manual

    SciTech Connect

    Johnson, P.E.; Joy, D.S.; Clarke, D.B.; Jacobi, J.M.

    1993-03-01

    The HIGHWAY program provides a flexible tool for predicting highway routes for transporting radioactive materials in the United States. The HIGHWAY data base is essentially a computerized road atlas that currently describes over 240,000 miles of highways. Complete descriptions of all Interstate System and most US highways (except those that parallel a nearby Interstate highway) are included in the data base. Many of the principal state highways and a number of local and county highways are also identified. The data base also includes locations of nuclear facilities and major airports. Several different types of routes may be calculated, depending on a set of user-supplied constraints. Routes are calculated by minimizing the total impedance between the origin and the destination. Basically, the impedance is defined as a function of distance and driving time along a particular highway segment. Several routing constraints can be imposed during the calculations. One of the special features of the HIGHWAY model is its ability to calculate routes that maximize use of Interstate System highways. This feature allows the user to predict routes for shipments of radioactive materials that conform to the US Department of Transportation routing regulations. Other features of the model include the ability to predict routes that bypass a specific state, city, town, or highway segment. Two special features have been incorporated in HIGHWAY, version 3.1. The first is the ability to automatically calculate alternative routes. Frequently, there are a number of routes between the source and destination that vary slightly in distance and estimated driving time. The HIGHWAY program offers a selection of different but nearly equal routes. The second special feature is the capability to calculate route-specific population density statistics. The population density distribution is calculated for each highway segment in the route and is reported on a state-by-state basis.
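
    The impedance-minimization step can be pictured with the toy sketch below. It is not the HIGHWAY code; the segment data, weights, and Interstate discount are invented, but it shows a shortest-path search over impedances built from distance and driving time.

```python
# Toy minimum-impedance routing: Dijkstra over segments whose impedance is a
# weighted combination of distance and driving time, discounted on Interstates.
import heapq

# (from, to, miles, hours, is_interstate) -- hypothetical highway segments
segments = [
    ("A", "B", 120, 2.0, True),
    ("B", "C", 100, 1.7, True),
    ("A", "C", 180, 3.5, False),
    ("C", "D", 60, 1.2, False),
]

def impedance(miles, hours, interstate, w_dist=1.0, w_time=30.0, i_factor=0.8):
    base = w_dist * miles + w_time * hours
    return base * (i_factor if interstate else 1.0)   # favor Interstate links

graph = {}
for a, b, mi, hr, inter in segments:
    w = impedance(mi, hr, inter)
    graph.setdefault(a, []).append((b, w))
    graph.setdefault(b, []).append((a, w))            # segments are two-way

def min_impedance_route(src, dst):
    heap, best = [(0.0, src, [src])], {}
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, path
        if node in best and best[node] <= cost:
            continue
        best[node] = cost
        for nxt, w in graph.get(node, []):
            heapq.heappush(heap, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

print(min_impedance_route("A", "D"))   # picks the Interstate-heavy route here
```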

  11. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    ERIC Educational Resources Information Center

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  12. Integrating Developmental Theory and Methodology: Using Derivatives to Articulate Change Theories, Models, and Inferences

    ERIC Educational Resources Information Center

    Deboeck, Pascal R.; Nicholson, Jody; Kouros, Chrystyna; Little, Todd D.; Garber, Judy

    2015-01-01

    Matching theories about growth, development, and change to appropriate statistical models can present a challenge, which can result in misuse, misinterpretation, and underutilization of different analytical approaches. We discuss the use of "derivatives": the change of a construct with respect to the change in another construct.…

  13. M-theory model-building and proton stability

    SciTech Connect

    Ellis, J.; Faraggi, A.E.; Nanopoulos, D.V.

    1997-09-01

    The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z₂ × Z₂ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  14. A theory of exchange rate modeling

    SciTech Connect

    Alekseev, A.A.

    1995-09-01

    The article examines exchange rate modeling for two cases: (a) when the trading partners have mutual interests and (b) when the trading partners have antagonistic interests. Exchange rates in world markets are determined by supply and demand for the currency of each state, and states may control the exchange rate of their currency by changing the interest rate, the volume of credit, and product prices in both domestic and export markets. Abstracting from issues of production and technology in different countries and also ignoring various trade, institutional, and other barriers, we consider in this article only the effect of export and import prices on the exchange rate. We propose a new criterion of external trade activity: each trading partner earns a profit which is proportional to the volume of benefits enjoyed by the other partner. We consider a trading cycle that consists of four stages: (a) purchase of goods in the domestic market with the object of selling them abroad; (b) sale of the goods in foreign markets; (c) purchase of goods abroad with the object of selling them in the domestic market; (d) sale of the goods domestically.

  15. A simplistic model for identifying prominent web users in directed multiplex social networks: a case study using Twitter networks

    NASA Astrophysics Data System (ADS)

    Loucif, Hemza; Boubetra, Abdelhak; Akrouf, Samir

    2016-10-01

    This paper aims to describe a new simplistic model dedicated to gauging the online influence of Twitter users based on a mixture of structural and interactional features. The model is an additive mathematical formulation involving two main parts. The first part measures the influence of the Twitter user on just his immediate neighbourhood, i.e. his followers. The second part evaluates the potential influence of the Twitter user beyond the circle of his followers; in particular, it measures the likelihood that the tweets of the Twitter user will spread further within the social graph through the retweeting process. The model is tested on a data set involving four kinds of real-world egocentric networks. The empirical results reveal that an active ordinary user is more prominent than a non-active celebrity one. A simple comparison is conducted between the proposed model and two existing simplistic approaches. The results show that our model generates the most realistic influence scores due to its dealing with both explicit (structural and interactional) and implicit features.
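
    The abstract does not give the formula itself, so the sketch below only illustrates the additive two-part idea: one term for engagement within the follower neighbourhood and one for likely spread beyond it through retweets. The features, weights, and functional form are assumptions.

```python
# Illustrative additive two-part influence score (not the paper's formula).
def influence_score(followers, replies, retweets, tweets,
                    w_local=1.0, w_spread=2.0):
    if tweets == 0:
        return 0.0
    local = (replies + retweets) / tweets              # engagement by followers
    spread = (retweets / tweets) * followers ** 0.5    # proxy for retweet reach
    return w_local * local + w_spread * spread

print(influence_score(followers=800, replies=600, retweets=900, tweets=400))
```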

  16. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    PubMed

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
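
    As a concrete illustration of the kind of Bayesian learning mechanism described here, the toy Python sketch below updates a prior over two candidate causal hypotheses from a handful of observations. The hypotheses, noise levels, and data are invented for illustration and are not taken from the reviewed studies.

    # Toy Bayesian comparison of two causal hypotheses about a detector:
    # H1: placing block A activates the detector (with 10% noise);
    # H2: the detector turns on at random. All numbers are illustrative.

    data = [(True, True), (True, True), (False, False), (True, True)]  # (A placed, detector on)

    def likelihood(hypothesis, observation):
        a_placed, detector_on = observation
        if hypothesis == "H1":
            p_on = 0.9 if a_placed else 0.1
        else:  # "H2"
            p_on = 0.5
        return p_on if detector_on else 1.0 - p_on

    posterior = {"H1": 0.5, "H2": 0.5}  # uniform prior over the two hypotheses
    for obs in data:
        posterior = {h: p * likelihood(h, obs) for h, p in posterior.items()}
        total = sum(posterior.values())
        posterior = {h: p / total for h, p in posterior.items()}

    print(posterior)  # posterior mass shifts toward the causal hypothesis H1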

  17. Theory of compressive modeling and simulation

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Cha, Jae; Espinola, Richard L.; Krapels, Keith

    2013-05-01

    Modeling and Simulation (M&S) has evolved along two general directions: (i) a data-rich approach that suffers from the curse of dimensionality, and (ii) an equation-rich approach that suffers from limits on computing power and turnaround time. We suggest a third approach, (iii) compressive M&S (CM&S), because the underlying Minimum Free-Helmholtz Energy (MFE) principle that facilitates CM&S can reproduce and generalize the Candes, Romberg, Tao and Donoho (CRT&D) Compressive Sensing (CS) paradigm as a linear Lagrange Constraint Neural Network (LCNN) algorithm. The MFE principle also generalizes LCNN to second order as a nonlinear augmented LCNN. For example, at sunset we can avoid the reddish bias of sunlight illumination caused by long-range Rayleigh scattering over the horizon by using a night-vision camera instead of a day camera. We decomposed the long-wave infrared (LWIR) band with filters into two vector components (8-10 μm and 10-12 μm) and used LCNN to find, pixel by pixel, a map of emissive-equivalent Planck radiation sources (EPRS). We then up-shifted consistently, according to the de-mixed source map, to a sub-micron RGB color image. Moreover, night-vision imaging can also be down-shifted to passive millimeter wave (PMMW) imaging, which suffers less blur from scattering by dust and smoke and enjoys apparent smoothness of the surface reflectivity of man-made objects at the Rayleigh resolution. One loses three orders of magnitude in spatial Rayleigh resolution but gains two orders of magnitude in reflectivity and another two orders in propagation without obscuring smog. Since CM&S can generate missing data and hard-to-measure dynamic transients, it can reduce unnecessary measurements and their associated cost and computation, in the super-saving sense of CS: measure one and get its neighborhood free.

  18. Theory to practice: the humanbecoming leading-following model.

    PubMed

    Ursel, Karen L

    2015-01-01

    Guided by the humanbecoming leading-following model, the author designed a nursing theories course with the intention of creating a meaningful nursing theory-to-practice link. The author perceived that with the implementation of Situation-Background-Assessment-Recommendations (SBAR) communication, nursing staff had drifted away from using the Kardex™ in shift-to-shift reporting. Nursing students, faculty, and staff members supported the creation of a theories project which would engage nursing students in the pursuit of clinical excellence. The project chosen was to revise the existing Kardex™ (the predominant nursing communication tool). In the project, guided by a nursing theory, nursing students focused on the patient's unique experience, depicting the specific role of nursing knowledge and the contributions of the registered nurse to the patient's healthcare journey. The emphasis of this theoretical learning was the application of a nursing theory to real-life clinical challenges with communication of relevant, timely, and accurate patient information, recognizing that real problems are often complex and require multi-perspective approaches. This project created learning opportunities where a nursing theory would be chosen by the nursing student clinical group and applied in their clinical specialty area. This practice activity served to broaden students' understanding of the role of nursing knowledge and nursing theories in their professional practice. PMID:25520461

  19. Supersymmetry and String Theory: Beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Dine, Michael

    2007-01-01

    The past decade has witnessed dramatic developments in the field of theoretical physics. This book is a comprehensive introduction to these recent developments. It contains a review of the Standard Model, covering non-perturbative topics, and a discussion of grand unified theories and magnetic monopoles. It introduces the basics of supersymmetry and its phenomenology, and includes dynamics, dynamical supersymmetry breaking, and electric-magnetic duality. The book then covers general relativity and the big bang theory, and the basic issues in inflationary cosmologies before discussing the spectra of known string theories and the features of their interactions. The book also includes brief introductions to technicolor, large extra dimensions, and the Randall-Sundrum theory of warped spaces. This will be of great interest to graduates and researchers in the fields of particle theory, string theory, astrophysics and cosmology. The book contains several problems, and password protected solutions will be available to lecturers at www.cambridge.org/9780521858410. Provides reader with tools to confront limitations of the Standard Model Includes several exercises and problems Solutions are available to lecturers at www.cambridge.org/9780521858410

  20. Building Models in the Classroom: Taking Advantage of Sophisticated Geomorphic Numerical Tools Using a Simple Graphical User Interface

    NASA Astrophysics Data System (ADS)

    Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.

    2014-12-01

    Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphics user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.

  1. Cosmological models in Weyl geometrical scalar-tensor theory

    NASA Astrophysics Data System (ADS)

    Pucheu, M. L.; Alves Junior, F. A. P.; Barreto, A. B.; Romero, C.

    2016-09-01

    We investigate cosmological models in a recently proposed geometrical theory of gravity, in which the scalar field appears as part of the spacetime geometry. We extend the previous theory to include a scalar potential in the action. We solve the vacuum field equations for different choices of the scalar potential and give a detailed analysis of the solutions. We show that, in some cases, a cosmological scenario is found that seems to suggest the appearance of a geometric phase transition. We build a toy model, in which the accelerated expansion of the early Universe is driven by pure geometry.

  2. Consistent constraints on the Standard Model Effective Field Theory

    NASA Astrophysics Data System (ADS)

    Berthier, Laure; Trott, Michael

    2016-02-01

    We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEPI and LEP II, as well as low energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut off scale is assumed to be large, Λ ≳ 3 TeV. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S, T analysis is modified by the theory errors we include as an illustrative example.

  3. User's guide to revised method-of-characteristics solute-transport model (MOC--version 31)

    USGS Publications Warehouse

    Konikow, L.F.; Granato, G.E.; Hornberger, G.Z.

    1994-01-01

    The U.S. Geological Survey computer model to simulate two-dimensional solute transport and dispersion in ground water (Konikow and Bredehoeft, 1978; Goode and Konikow, 1989) has been modified to improve management of input and output data and to provide progressive run-time information. All opening and closing of files are now done automatically by the program. Names of input data files are entered either interactively or using a batch-mode script file. Names of output files, created automatically by the program, are based on the name of the input file. In the interactive mode, messages are written to the screen during execution to allow the user to monitor the status and progress of the simulation and to anticipate total running time. Information reported and updated during a simulation includes the current pumping period and time step, number of particle moves, and percentage completion of the current time step. The batch mode enables a user to run a series of simulations consecutively, without additional control. A report of the model's activity in the batch mode is written to a separate output file, allowing later review. The user has several options for creating separate output files for different types of data. The formats are compatible with many commercially available applications, which facilitates graphical postprocessing of model results.

  4. User's guide for the Urban Airshed Model. Volume 5. Description and operation of the ROM - UAM interface program system

    SciTech Connect

    Tang, R.T.; Gerry, S.C.; Newsome, J.S.; Van Meter, A.R.; Wayland, R.A.

    1990-06-01

    The user's guide for the Urban Airshed Model (UAM) is divided into five volumes. Volume V describes the ROM-UAM interface program system, a software package that can be used to generate UAM input files from inputs and outputs provided by the EPA Regional Oxidant Model (ROM).

  5. PREDICTING ATTENUATION OF VIRUSES DURING PERCOLATION IN SOILS: 2. USER'S GUIDE TO THE VIRULO 1.0 COMPUTER MODEL

    EPA Science Inventory

    In the EPA document Predicting Attenuation of Viruses During Percolation in Soils 1. Probabilistic Model the conceptual, theoretical, and mathematical foundations for a predictive screening model were presented. In this current volume we present a User's Guide for the computer mo...

  6. An analytical approach to thermal modeling of Bridgman type crystal growth: One dimensional analysis. Computer program users manual

    NASA Technical Reports Server (NTRS)

    Cothran, E. K.

    1982-01-01

    The computer program written in support of one dimensional analytical approach to thermal modeling of Bridgman type crystal growth is presented. The program listing and flow charts are included, along with the complete thermal model. Sample problems include detailed comments on input and output to aid the first time user.

  7. Unifying all classical spin models in a lattice gauge theory.

    PubMed

    De las Cuevas, G; Dür, W; Briegel, H J; Martin-Delgado, M A

    2009-06-12

    The partition function of all classical spin models, including all discrete standard statistical models and all Abelian discrete lattice gauge theories (LGTs), is expressed as a special instance of the partition function of the 4D Z2 LGT. This unifies all classical spin models with apparently very different features in a single complete model. This result is applied to establish a new method to compute the mean-field theory of Abelian discrete LGTs with d ≥ 4, and to show that computing the partition function of the 4D Z2 LGT is computationally hard (#P hard). The 4D Z2 LGT is also proved to be approximately complete for Abelian continuous models. The proof uses techniques from quantum information.
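
    For orientation, the object being unified here, the partition function of a classical spin model, is simply a sum of Boltzmann weights over all spin configurations. The brute-force enumeration below for a tiny periodic 2D Ising lattice only fixes that definition; it does not illustrate the 4D Z2 LGT completeness construction itself, and the lattice size and coupling are arbitrary choices.

    # Brute-force partition function Z = sum_{configs} exp(-beta * H) for a
    # 3x3 periodic Ising model. Purely illustrative of the definition.
    from itertools import product
    from math import exp

    L, beta, J = 3, 0.4, 1.0

    def energy(spins):
        e = 0.0
        for i in range(L):
            for j in range(L):
                s = spins[i][j]
                e -= J * s * spins[(i + 1) % L][j]   # vertical bond
                e -= J * s * spins[i][(j + 1) % L]   # horizontal bond
        return e

    Z = 0.0
    for conf in product([-1, 1], repeat=L * L):
        spins = [conf[i * L:(i + 1) * L] for i in range(L)]
        Z += exp(-beta * energy(spins))
    print(Z)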

  8. Conceptual models for implementing biopsychosocial theory in clinical practice.

    PubMed

    Jones, M; Edwards, I; Gifford, L

    2002-02-01

    The integration of the biopsychosocial model into manual therapy practice is challenging for clinicians, especially for those who have not received formal training in biopsychosocial theory or its application. In this masterclass two contemporary models of health and disability are presented along with a model for organizing clinical knowledge, and a model of reasoning strategies that will assist clinicians in their understanding and application of biopsychosocial theory. All four models emphasise the importance of understanding and managing both the psychosocial and the biomedical aspects of patients' problems. Facilitating change in patients' (and clinicians') perspectives on pain and its biopsychosocial influences requires them to reflect on their underlying assumptions and the basis of those beliefs. Through this reflective process perspectives will be transformed, and for clinicians, in time, different management practices will emerge.

  9. A New User-Friendly Model to Reduce Cost for Headwater Benefits Assessment

    SciTech Connect

    Bao, Y.S.; Cover, C.K.; Perlack, R.D.; Sale, M.J.; Sarma, V.

    1999-07-07

    Headwater benefits at a downstream hydropower project are energy gains that are derived from the installation of upstream reservoirs. The Federal Energy Regulatory Commission is required by law to assess charges for such energy gains to downstream owners of non-federal hydropower projects. The high costs of determining headwater benefits prohibit the use of a complicated model in basins where the magnitude of the benefits is expected to be small. This paper presents a new user-friendly computer model, EFDAM (Enhanced Flow Duration Analysis Method), that not only improves the accuracy of the standard flow duration method but also reduces costs for determining headwater benefits. The EFDAM model includes an MS Windows-based interface module to provide tools for automating input data file preparation, linking and executing a generic program, editing/viewing of input/output files, and application guidance. EFDAM was applied to various river basins. An example is given to illustrate the main features of an EFDAM application for creating input files and assessing headwater benefits at the Tulloch Hydropower Plant in the Stanislaus River Basin, California.

  10. Main problems in the theory of modeling of catalytic processes

    SciTech Connect

    Pisarenko, V.N.

    1994-09-01

    This paper formulates the main problems in the theory of modeling of catalytic processes yet to be solved and describes the stages of modeling. Fundamental problems of model construction for the physico-chemical phenomena and processes taking place in a catalytic reactor are considered. New methods for determining the mechanism of a catalytic reaction and selecting a kinetic model for it are analyzed. The use of the results of specially controlled experiments for the construction of models of a catalyst grain and a catalytic reactor is discussed. Algorithms are presented for determining the multiplicity of stationary states in the operation of a catalyst grain and a catalytic reactor.

  11. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks under a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. From the perspective of DEDS theory, a local model is described by the following: a set of system and transition states, an event alphabet that portrays the actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for each event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models so that they can operate in parallel under the additional logistic and physical constraints arising from submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or a feedback DEDS controller (closed-loop control).
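
    The ingredients listed in this abstract (a state set, an event alphabet, a partial transition function, an initial state, and per-event durations) translate directly into a small data structure. The Python sketch below, with invented states and events for a hypothetical gripper submachine, is a generic illustration rather than code from the cited report.

    # Minimal sketch of a DEDS "local model" for one submachine: states,
    # an event alphabet, a partial transition map, an initial state, and a
    # duration for each event. All names and numbers are illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class LocalModel:
        states: set
        events: set
        initial: str
        transitions: dict          # (state, event) -> next state (partial map)
        durations: dict            # event -> time required for the event
        state: str = field(init=False)
        clock: float = field(init=False, default=0.0)

        def __post_init__(self):
            self.state = self.initial

        def fire(self, event):
            key = (self.state, event)
            if key not in self.transitions:
                raise ValueError(f"event {event!r} not enabled in state {self.state!r}")
            self.state = self.transitions[key]
            self.clock += self.durations[event]

    arm = LocalModel(
        states={"idle", "gripping", "moving"},
        events={"grip", "move", "release"},
        initial="idle",
        transitions={("idle", "grip"): "gripping",
                     ("gripping", "move"): "moving",
                     ("moving", "release"): "idle"},
        durations={"grip": 2.0, "move": 5.0, "release": 1.0},
    )
    arm.fire("grip"); arm.fire("move")
    print(arm.state, arm.clock)   # moving 7.0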

  12. Downsizer - A Graphical User Interface-Based Application for Browsing, Acquiring, and Formatting Time-Series Data for Hydrologic Modeling

    USGS Publications Warehouse

    Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.

    2009-01-01

    The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.

  13. PARFUME User's Guide

    SciTech Connect

    Kurt Hamman

    2010-09-01

    PARFUME, a fuel performance analysis and modeling code, is being developed at the Idaho National Laboratory for evaluating gas reactor coated particle fuel assemblies for prismatic, pebble bed, and plate type fuel geometries. The code is an integrated mechanistic analysis tool that evaluates the thermal, mechanical, and physico-chemical behavior of coated fuel particles (TRISO) and the probability for fuel failure given the particle-to-particle statistical variations in physical dimensions and material properties that arise during the fuel fabrication process. Using a robust finite difference numerical scheme, PARFUME is capable of performing steady state and transient heat transfer and fission product diffusion analyses for the fuel. Written in FORTRAN 90, PARFUME is easy to read, maintain, and modify. Currently, PARFUME is supported only on MS Windows platforms. This document represents the initial version of the PARFUME User Guide, a supplement to the PARFUME Theory and Model Basis Report which describes the theoretical aspects of the code. User information is provided including: 1) code development, 2) capabilities and limitations, 3) installation and execution, 4) user input and output, 5) sample problems, and 6) error messages. In the near future, the INL plans to release a fully benchmarked and validated beta version of PARFUME.

  14. Coarse-grained theory of a realistic tetrahedral liquid model

    NASA Astrophysics Data System (ADS)

    Procaccia, I.; Regev, I.

    2012-02-01

    Tetrahedral liquids such as water and silica-melt show unusual thermodynamic behavior such as a density maximum and an increase in specific heat when cooled to low temperatures. Previous work had shown that Monte Carlo and mean-field solutions of a lattice model can exhibit these anomalous properties with or without a phase transition, depending on the values of the different terms in the Hamiltonian. Here we use a somewhat different approach, where we start from a very popular empirical model of tetrahedral liquids —the Stillinger-Weber model— and construct a coarse-grained theory which directly quantifies the local structure of the liquid as a function of volume and temperature. We compare the theory to molecular-dynamics simulations and show that the theory can rationalize the simulation results and the anomalous behavior.

  15. Integrating social capital theory, social cognitive theory, and the technology acceptance model to explore a behavioral model of telehealth systems.

    PubMed

    Tsai, Chung-Hung

    2014-05-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities. PMID:24810577
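
    The hypothesized paths can be summarized as a set of structural equations. The schematic below restates the relationships reported in the abstract in generic notation; the path coefficients (gamma, beta) and disturbance terms (zeta) are placeholder symbols, not estimates from the study, and the grouping of predictors follows the abstract's wording.

    \begin{aligned}
    \mathrm{PEOU} &= \gamma_{1}\,\mathrm{SocialTrust} + \gamma_{2}\,\mathrm{InstitutionalTrust} + \gamma_{3}\,\mathrm{SocialParticipation} + \gamma_{4}\,\mathrm{SelfEfficacy} + \zeta_{1}\\
    \mathrm{PU} &= \gamma_{5}\,\mathrm{SocialTrust} + \gamma_{6}\,\mathrm{InstitutionalTrust} + \gamma_{7}\,\mathrm{SocialParticipation} + \zeta_{2}\\
    \mathrm{UsageIntention} &= \beta_{1}\,\mathrm{PEOU} + \beta_{2}\,\mathrm{PU} + \zeta_{3}
    \end{aligned}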

  16. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    PubMed Central

    Tsai, Chung-Hung

    2014-01-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities. PMID:24810577

  17. Integrating social capital theory, social cognitive theory, and the technology acceptance model to explore a behavioral model of telehealth systems.

    PubMed

    Tsai, Chung-Hung

    2014-05-07

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  18. Teaching Model Building to High School Students: Theory and Reality.

    ERIC Educational Resources Information Center

    Roberts, Nancy; Barclay, Tim

    1988-01-01

    Builds on a National Science Foundation (NSF) microcomputer based laboratory project to introduce system dynamics into the precollege setting. Focuses on providing students with powerful and investigatory theory building tools. Discusses developed hardware, software, and curriculum materials used to introduce model building and simulations into…

  19. A Model to Demonstrate the Place Theory of Hearing

    ERIC Educational Resources Information Center

    Ganesh, Gnanasenthil; Srinivasan, Venkata Subramanian; Krishnamurthi, Sarayu

    2016-01-01

    In this brief article, the authors discuss Georg von Békésy's experiments showing the existence of traveling waves in the basilar membrane and that maximal displacement of the traveling wave was determined by the frequency of the sound. The place theory of hearing equates the basilar membrane to a frequency analyzer. The model described in this…

  20. Multilevel Higher-Order Item Response Theory Models

    ERIC Educational Resources Information Center

    Huang, Hung-Yu; Wang, Wen-Chung

    2014-01-01

    In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…

  1. Medical Specialty Decision Model: Utilizing Social Cognitive Career Theory

    ERIC Educational Resources Information Center

    Gibson, Denise D.; Borges, Nicole J.

    2004-01-01

    Objectives: The purpose of this study was to develop a working model to explain medical specialty decision-making. Using Social Cognitive Career Theory, we examined personality, medical specialty preferences, job satisfaction, and expectations about specialty choice to create a conceptual framework to guide specialty choice decision-making.…

  2. Using SAS PROC MCMC for Item Response Theory Models

    ERIC Educational Resources Information Center

    Ames, Allison J.; Samonte, Kelli

    2015-01-01

    Interest in using Bayesian methods for estimating item response theory models has grown at a remarkable rate in recent years. This attentiveness to Bayesian estimation has also inspired a growth in available software such as WinBUGS, R packages, BMIRT, MPLUS, and SAS PROC MCMC. This article intends to provide an accessible overview of Bayesian…
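
    To make the class of models concrete, the two-parameter logistic (2PL) item response function gives the probability of a correct response as a function of person ability and item parameters. The numpy sketch below evaluates its log-likelihood for a tiny invented response matrix; it is an illustration of the model form, not a translation of PROC MCMC code.

    # 2PL item response model: P(correct) = 1 / (1 + exp(-a_j * (theta_i - b_j))).
    # The toy data and parameter values are illustrative assumptions.
    import numpy as np

    responses = np.array([[1, 0, 1],
                          [1, 1, 1],
                          [0, 0, 1]])          # persons x items (0/1 scored)
    theta = np.array([0.2, 1.1, -0.7])         # person abilities
    a = np.array([1.0, 1.5, 0.8])              # item discriminations
    b = np.array([0.0, 0.5, -1.0])             # item difficulties

    p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))   # response probabilities
    loglik = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
    print(round(float(loglik), 3))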

  3. Conceptualizations of Creativity: Comparing Theories and Models of Giftedness

    ERIC Educational Resources Information Center

    Miller, Angie L.

    2012-01-01

    This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…

  4. Dimensions of Genocide: The Circumplex Model Meets Violentization Theory

    ERIC Educational Resources Information Center

    Winton, Mark A.

    2008-01-01

    The purpose of this study is to examine the use of Olson's (1995, 2000) family therapy based circumplex model and Athens' (1992, 1997, 2003) violentization theory in explaining genocide. The Rwandan genocide of 1994 is used as a case study. Published texts, including interviews with perpetrators, research reports, human rights reports, and court…

  5. Application of Health Promotion Theories and Models for Environmental Health

    ERIC Educational Resources Information Center

    Parker, Edith A.; Baldwin, Grant T.; Israel, Barbara; Salinas, Maria A.

    2004-01-01

    The field of environmental health promotion gained new prominence in recent years as awareness of physical environmental stressors and exposures increased in communities across the country and the world. Although many theories and conceptual models are used routinely to guide health promotion and health education interventions, they are rarely…

  6. A Proposed Model of Jazz Theory Knowledge Acquisition

    ERIC Educational Resources Information Center

    Ciorba, Charles R.; Russell, Brian E.

    2014-01-01

    The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…

  7. F-theory duals of singular heterotic K3 models

    NASA Astrophysics Data System (ADS)

    Lüdeling, Christoph; Ruehle, Fabian

    2015-01-01

    We study F-theory duals of singular heterotic K3 models that correspond to Abelian toroidal orbifolds T^4/Z_N. While our focus is on the standard embedding, we also comment on models with Wilson lines and more general gauge embeddings. In the process of constructing the duals, we work out a Weierstrass description of the heterotic toroidal orbifold models, which exhibit singularities of Kodaira type I_0^*, IV^*, III^*, and II^*. This construction unveils properties like the instanton number per fixed point and a correlation between the orbifold order and the multiplicities in the Dynkin diagram. The results from the Weierstrass description are then used to restrict the complex structure of the F-theory Calabi-Yau threefold such that the gauge group and the matter spectrum of the heterotic theories are reproduced. We also comment on previous approaches that have been employed to construct the duality and point out the differences and limitations in our case. Our results show explicitly how the various orbifold models are connected and described in F-theory.

  8. Evaluating hydrological model performance using information theory-based metrics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
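
    One common information theory-based quantity for comparing flow regimes is the Kullback-Leibler divergence between the distributions of observed and simulated streamflow. The Python sketch below is a generic illustration with synthetic series and an arbitrary binning choice; it is not necessarily the metric used in the cited work.

    # KL divergence between histograms of observed and simulated daily flows.
    # Synthetic series and bin choices are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    observed = rng.lognormal(mean=2.0, sigma=0.6, size=365)
    simulated = rng.lognormal(mean=2.1, sigma=0.5, size=365)

    bins = np.histogram_bin_edges(np.concatenate([observed, simulated]), bins=20)
    p, _ = np.histogram(observed, bins=bins)
    q, _ = np.histogram(simulated, bins=bins)
    p = p.astype(float) + 1e-9      # small constant avoids log(0) in empty bins
    q = q.astype(float) + 1e-9
    p /= p.sum()
    q /= q.sum()

    kl = float(np.sum(p * np.log(p / q)))
    print(round(kl, 4))             # 0 would mean identical flow distributions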

  9. Item Response Theory Models for Performance Decline during Testing

    ERIC Educational Resources Information Center

    Jin, Kuan-Yu; Wang, Wen-Chung

    2014-01-01

    Sometimes, test-takers may not be able to attempt all items to the best of their ability (with full effort) due to personal factors (e.g., low motivation) or testing conditions (e.g., time limit), resulting in poor performances on certain items, especially those located toward the end of a test. Standard item response theory (IRT) models fail to…

  10. Item Response Theory Modeling of the Philadelphia Naming Test

    ERIC Educational Resources Information Center

    Fergadiotis, Gerasimos; Kellough, Stacey; Hula, William D.

    2015-01-01

    Purpose: In this study, we investigated the fit of the Philadelphia Naming Test (PNT; Roach, Schwartz, Martin, Grewal, & Brecher, 1996) to an item-response-theory measurement model, estimated the precision of the resulting scores and item parameters, and provided a theoretical rationale for the interpretation of PNT overall scores by relating…

  11. An NCME Instructional Module on Polytomous Item Response Theory Models

    ERIC Educational Resources Information Center

    Penfield, Randall David

    2014-01-01

    A polytomous item is one for which the responses are scored according to three or more categories. Given the increasing use of polytomous items in assessment practices, item response theory (IRT) models specialized for polytomous items are becoming increasingly common. The purpose of this ITEMS module is to provide an accessible overview of…

  12. The adhesion model as a field theory for cosmological clustering

    SciTech Connect

    Rigopoulos, Gerasimos

    2015-01-01

    The adhesion model has been proposed in the past as an improvement of the Zel'dovich approximation, providing a good description of the formation of the cosmic web. We recast the model as a field theory for cosmological large scale structure, adding a stochastic force to account for power generated from very short, highly non-linear scales that is uncorrelated with the initial power spectrum. The dynamics of this Stochastic Adhesion Model (SAM) is reminiscent of the well known Kardar-Parisi-Zhang equation, with the difference that the viscosity and the noise spectrum are time dependent. Choosing the viscosity proportional to the growth factor D restricts the form of the noise spectrum through a 1-loop renormalization argument. For this choice, the SAM field theory is renormalizable to one loop. We comment on the suitability of this model for describing the non-linear regime of the CDM power spectrum and its utility as a relatively simple approach to cosmological clustering.
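
    Schematically, the adhesion picture evolves the peculiar velocity field with a viscous Burgers-type equation, and the stochastic variant described here adds a forcing term. The display below is a generic statement of that structure, with a time-dependent viscosity nu(tau) and a noise term eta as in the abstract; it should not be read as the paper's exact equations or normalization.

    \partial_{\tau}\mathbf{v} + (\mathbf{v}\cdot\nabla)\mathbf{v} = \nu(\tau)\,\nabla^{2}\mathbf{v} + \boldsymbol{\eta}(\mathbf{x},\tau)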

  13. A novel application of theory refinement to student modeling

    SciTech Connect

    Baffes, P.T.; Mooney, R.J.

    1996-12-31

    Theory refinement systems developed in machine learning automatically modify a knowledge base to render it consistent with a set of classified training examples. We illustrate a novel application of these techniques to the problem of constructing a student model for an intelligent tutoring system (ITS). Our approach is implemented in an ITS authoring system called ASSERT which uses theory refinement to introduce errors into an initially correct knowledge base so that it models incorrect student behavior. The efficacy of the approach has been demonstrated by evaluating a tutor developed with ASSERT with 75 students tested on a classification task covering concepts from an introductory course on the C++ programming language. The system produced reasonably accurate models and students who received feedback based on these models performed significantly better on a post test than students who received simple reteaching.

  14. On matrix model formulations of noncommutative Yang-Mills theories

    SciTech Connect

    Azeyanagi, Tatsuo; Hirata, Tomoyoshi; Hanada, Masanori

    2008-11-15

    We study the stability of noncommutative spaces in matrix models and discuss the continuum limit which leads to the noncommutative Yang-Mills theories. It turns out that most noncommutative spaces in bosonic models are unstable. This indicates that the perturbative instability of fuzzy R^D pointed out by Van Raamsdonk and by Armoni et al. persists at the nonperturbative level in these cases. In this sense, these bosonic noncommutative Yang-Mills theories are not well-defined, or at least their matrix model formulations studied in this paper do not work. We also show that noncommutative backgrounds are stable in a supersymmetric matrix model deformed by a cubic Myers term, though the deformation itself breaks supersymmetry.

  15. Model Based User's Access Requirement Analysis of E-Governance Systems

    NASA Astrophysics Data System (ADS)

    Saha, Shilpi; Jeon, Seung-Hwan; Robles, Rosslin John; Kim, Tai-Hoon; Bandyopadhyay, Samir Kumar

    The strategic and contemporary importance of e-governance has been recognized across the world. In India too, various ministries of the Government of India and the State Governments have taken e-governance initiatives to provide e-services to citizens and the businesses they serve. To achieve the mission objectives and make such e-governance initiatives successful, it is necessary to improve the trust and confidence of the stakeholders. It is assumed that the delivery of government services will share the same public network information that is used in the community at large; in particular, the Internet will be the principal means by which public access to government and government services is achieved. To provide security measures, the main aim is to identify the users' access requirements for the stakeholders and then analyze them according to the models of Nath's approach. Based on this analysis, the Government can also set security standards based on the e-governance models, resulting in fewer human errors and less bias. This analysis leads to the security architecture of the specific G2C application.

  16. Columbia River Statistical Update Model, Version 4. 0 (COLSTAT4): Background documentation and user's guide

    SciTech Connect

    Whelan, G.; Damschen, D.W.; Brockhaus, R.D.

    1987-08-01

    Daily-averaged temperature and flow information on the Columbia River just downstream of Priest Rapids Dam and upstream of river mile 380 were collected and stored in a data base. The flow information corresponds to discharges that were collected daily from October 1, 1959, through July 28, 1986. The temperature information corresponds to values that were collected daily from January 1, 1965, through May 27, 1986. The computer model, COLSTAT4 (Columbia River Statistical Update - Version 4.0 model), uses the temperature-discharge data base to statistically analyze temperature and flow conditions by computing the frequency of occurrence and duration of selected temperatures and flow rates for the Columbia River. The COLSTAT4 code analyzes the flow and temperature information in a sequential time frame (i.e., a continuous analysis over a given time period); it also analyzes this information in a seasonal time frame (i.e., a periodic analysis over a specific season from year to year). A provision is included to enable the user to edit and/or extend the data base of temperature and flow information. This report describes the COLSTAT4 code and the information contained in its data base.
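
    The frequency-of-occurrence and duration statistics described here amount to counting exceedances and run lengths in a daily series. The Python sketch below is a generic illustration with synthetic temperatures and an arbitrary threshold; it is not the COLSTAT4 source code.

    # Frequency of occurrence and duration of runs above a temperature threshold
    # in a daily series. Synthetic data and threshold are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    daily_temp = 12 + 6 * np.sin(np.linspace(0, 2 * np.pi, 365)) + rng.normal(0, 1.5, 365)
    threshold = 16.0

    above = daily_temp > threshold
    frequency = above.mean()                  # fraction of days exceeding the threshold

    durations, run = [], 0
    for flag in above:
        if flag:
            run += 1
        elif run:
            durations.append(run)
            run = 0
    if run:
        durations.append(run)

    print(f"exceeded on {frequency:.1%} of days; longest run {max(durations)} days")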

  17. Configuring a Graphical User Interface for Managing Local HYSPLIT Model Runs Through AWIPS

    NASA Technical Reports Server (NTRS)

    Wheeler, mark M.; Blottman, Peter F.; Sharp, David W.; Hoeth, Brian; VanSpeybroeck, Kurt M.

    2009-01-01

    Responding to incidents involving the release of harmful airborne pollutants is a continual challenge for Weather Forecast Offices in the National Weather Service. When such incidents occur, current protocol recommends forecaster-initiated requests of NOAA's Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model output through the National Centers for Environmental Prediction to obtain critical dispersion guidance. Individual requests are submitted manually through a secured web site, with multiple requests submitted in sequence, to obtain useful trajectory and concentration forecasts associated with the significant release of harmful chemical gases, radiation, wildfire smoke, etc., into the local atmosphere. To help manage local HYSPLIT runs for both routine and emergency use, a graphical user interface was designed for operational efficiency. The interface allows forecasters to quickly determine the current HYSPLIT configuration for the list of predefined sites (e.g., fixed sites and floating sites) and to make any necessary adjustments to key parameters such as Input Model, Number of Forecast Hours, etc. When using the interface, forecasters will obtain desired output more confidently and without the danger of corrupting essential configuration files.

  18. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    PubMed Central

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739

  19. Husain-Kuchar Model as a Constrained BF Theory

    NASA Astrophysics Data System (ADS)

    Montesinos, Merced; Velázquez, Mercedes

    2009-12-01

    The Husain-Kuchar theory is a four-dimensional background-independent model that has long been viewed as a useful model for addressing several conceptual and technical problems appearing in the quantization of general relativity mainly in the loop quantum gravity approach. The model was defined at the Lagrangian level in terms of a su(2)-valued connection one-form A coupled through its curvature to a su(2)-valued one-form field e. We address here the problem of writing a Lagrangian formulation for the Husain-Kuchar model as a constrained BF theory motivated by the fact that spin foam models for quantum gravity are related to action principles of the BF type. The Lagrangian action principle for the Husain-Kuchar model reported here differs from a previous one found by Barbero et al. in that this description involves a single constrained BF theory rather than two interacting BF theories. It is, essentially, the Plebański action with the condition on the trace of the Lagrange multipliers removed. Moreover, it can be stated that the relationship between our BF-like action and the original one for the Husain-Kuchar model is the same relationship that exists between the Plebański action and the self-dual Palatini action for complex general relativity, first because the solution to the constraint on the two-forms Σ^i coming from the BF-like action leads to the Husain-Kuchar action, and second because the Hamiltonian analysis of the Husain-Kuchar model is straightforward starting from the BF-like action principle.

  20. Chaos and order in non-integrable model field theories

    SciTech Connect

    Campbell, D.K.; Peyrard, M.

    1989-01-01

    We illustrate the presence of chaos and order in non-integrable, classical field theories, which we view as many-degree-of-freedom Hamiltonian nonlinear dynamical systems. For definiteness, we focus on the χ^4 theory and compare and contrast it with the celebrated integrable sine-Gordon equation. We introduce and investigate two specific problems: the interactions of solitary "kink"-like waves in non-integrable theories, and the existence of stable "breather" solutions -- spatially localized, time-periodic nonlinear waves -- in the χ^4 theory. For the former problem we review the rather well developed understanding, based on a combination of computational simulations and heuristic analytic models, of the presence of a sequence of resonances in the kink-antikink interactions as a function of the relative velocity of the interaction. For the latter problem we discuss first the case of the continuum χ^4 theory. We discuss the multiple-scale asymptotic perturbation theory arguments which first suggested the existence of χ^4 breathers, then the subsequent discovery of terms "beyond all orders" in the perturbation expansion which destroy the putative breather, and finally the recent rigorous proofs of the non-existence of breathers in the continuum theory. We then present some very recent numerical results on the existence of breathers in discrete χ^4 theories which show an intricate interweaving of stable and unstable breather solutions on finite discrete lattices. We develop a heuristic theoretical explanation of the regions of stability and instability.

  1. Theory and application of experimental model analysis in earthquake engineering

    NASA Astrophysics Data System (ADS)

    Moncarz, P. D.

    The feasibility and limitations of small-scale model studies in earthquake engineering research and practice are considered, with emphasis on dynamic modeling theory, a study of the mechanical properties of model materials, the development of suitable model construction techniques, and an evaluation of the accuracy of prototype response prediction through model case studies on components and simple steel and reinforced concrete structures. It is demonstrated that model analysis can be used in many cases to obtain quantitative information on the seismic behavior of complex structures which cannot be analyzed confidently by conventional techniques. Methodologies for model testing and response evaluation are developed in the project, and applications of model analysis in seismic response studies on various types of civil engineering structures (buildings, bridges, dams, etc.) are evaluated.

  2. A model of resurgence based on behavioral momentum theory.

    PubMed

    Shahan, Timothy A; Sweeney, Mary M

    2011-01-01

    Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforcement during extinction of a target response acts as both an additional source of disruption during extinction and as a source of reinforcement in the context that increases the future strength of the target response. The model does a good job accounting for existing data in the resurgence literature and makes novel and testable predictions. Thus, the model appears to provide a framework for understanding resurgence and serves to integrate the phenomenon into the existing theoretical account of persistence provided by behavioral momentum theory. In addition, we discuss some potential implications of the model for further development of behavioral momentum theory. PMID:21541118

  3. A model of resurgence based on behavioral momentum theory.

    PubMed

    Shahan, Timothy A; Sweeney, Mary M

    2011-01-01

    Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforcement during extinction of a target response acts as both an additional source of disruption during extinction and as a source of reinforcement in the context that increases the future strength of the target response. The model does a good job accounting for existing data in the resurgence literature and makes novel and testable predictions. Thus, the model appears to provide a framework for understanding resurgence and serves to integrate the phenomenon into the existing theoretical account of persistence provided by behavioral momentum theory. In addition, we discuss some potential implications of the model for further development of behavioral momentum theory.

  4. A Domain Specific Modeling Approach for Coordinating User-Centric Communication Services

    ERIC Educational Resources Information Center

    Wu, Yali

    2011-01-01

    Rapid advances in electronic communication devices and technologies have resulted in a shift in the way communication applications are being developed. These new development strategies provide abstract views of the underlying communication technologies and lead to the so-called "user-centric communication applications." One user-centric…

  5. Dynamic Modelling of User Decision-Making in Selecting Information Services at a University Research Center.

    ERIC Educational Resources Information Center

    Evans, John E.

    This research is concerned with the pragmatic performance characteristics of competing information technologies (ITs) and services in the university research center, as measured by user demand and choice. Technologies and services studied include: (1) mediated search service operating at cost recovery, open to all; (2) end-user service collecting…

  6. Models for User Access Patterns on the Web: Semantic Content versus Access History.

    ERIC Educational Resources Information Center

    Ross, Arun; Owen, Charles B.; Vailaya, Aditya

    This paper focuses on clustering a World Wide Web site (i.e., the 1998 World Cup Soccer site) into groups of documents that are predictive of future user accesses. Two approaches were developed and tested. The first approach uses semantic information inherent in the documents to facilitate the clustering process. User access history is then used…

  7. Does the theory-driven program affect the risky behavior of drug injecting users in a healthy city? A quasi-experimental study

    PubMed Central

    Karimy, Mahmood; Abedi, Ahmad Reza; Abredari, Hamid; Taher, Mohammad; Zarei, Fatemeh; Rezaie Shahsavarloo, Zahra

    2016-01-01

    Background: The horror of HIV/AIDS as an incurable, grueling disease is a destructive issue for every country. Drug use, shared needles, and unsafe sex are closely linked to the transmission of HIV/AIDS. Modifying or changing unhealthy behavior through educational programs can lead to HIV prevention. The aim of this study was to evaluate the efficacy of a theory-based educational intervention on the prevention of HIV transmission among drug addicts. Methods: In this quasi-experimental study, 69 male drug injecting users entered the theory-based educational intervention. Data were collected using a questionnaire before and 3 months after four sessions (group discussions, lecture, film displaying, and role play) of educational intervention. Results: The findings indicated that the mean scores of the constructs (self-efficacy, susceptibility, severity, and benefit) significantly increased after the educational intervention, and the perceived barriers decreased (p < 0.001). Also, the history of HIV testing was reported to be 9% before the intervention, while the rate increased to 88% after the intervention. Conclusion: The present research offers a preliminary foundation for planning and implementing a theory-based educational program to prevent HIV/AIDS transmission in drug injecting addicts. This research revealed that the health educational intervention improved participants' preventive behaviors and knowledge of HIV/AIDS. PMID:27390684

  8. User's Manual for HPTAM: a Two-Dimensional Heat Pipe Transient Analysis Model, Including the Startup from a Frozen State

    NASA Technical Reports Server (NTRS)

    Tournier, Jean-Michel; El-Genk, Mohamed S.

    1995-01-01

    This report describes the user's manual for 'HPTAM,' a two-dimensional Heat Pipe Transient Analysis Model. HPTAM is described in detail in the UNM-ISNPS-3-1995 report which accompanies the present manual. The model offers a menu that lists a number of working fluids and wall and wick materials from which the user can choose. HPTAM is capable of simulating the startup of heat pipes from either a fully-thawed or frozen condition of the working fluid in the wick structure. The manual includes instructions for installing and running HPTAM on either a UNIX, MS-DOS or VMS operating system. Samples for input and output files are also provided to help the user with the code.

  9. User's manual for HPTAM: A two-dimensional Heat Pipe Transient Analysis Model, including the startup from a frozen state

    NASA Astrophysics Data System (ADS)

    Tournier, Jean-Michel; El-Genk, Mohamed S.

    1995-09-01

    This report describes the user's manual for 'HPTAM,' a two-dimensional Heat Pipe Transient Analysis Model. HPTAM is described in detail in the UNM-ISNPS-3-1995 report which accompanies the present manual. The model offers a menu that lists a number of working fluids and wall and wick materials from which the user can choose. HPTAM is capable of simulating the startup of heat pipes from either a fully-thawed or frozen condition of the working fluid in the wick structure. The manual includes instructions for installing and running HPTAM on either a UNIX, MS-DOS or VMS operating system. Samples for input and output files are also provided to help the user with the code.

  10. Looking for a Matrix model for ABJM theory

    SciTech Connect

    Mohammed, Asadig; Murugan, Jeff; Nastase, Horatiu

    2010-10-15

    Encouraged by the recent construction of fuzzy sphere solutions in the Aharony, Bergman, Jafferis, and Maldacena (ABJM) theory, we re-analyze the latter from the perspective of a Matrix-like model. In particular, we argue that a vortex solution exhibits properties of a supergraviton, while a kink represents a 2-brane. Other solutions are also consistent with the Matrix-type interpretation. We study vortex scattering and compare it with graviton scattering in the massive ABJM background; however, our results are inconclusive. We speculate on how to extend our results to construct a Matrix theory of ABJM.

  11. Toward a utility theory foundation for health status index models.

    PubMed Central

    Torrance, G W

    1976-01-01

    The axioms of utility theory are restated in terms of health outcomes, and some additional assumptions, consistent with the assumptions implicit in health status index models, are adduced to develop a consistent theory of the utility of health states. On the basis of the axioms and specific assumptions, techniques for measuring the health utility functions of individuals are described, and it is shown how these axioms and assumptions may be used to determine the utility to the individual of health programs that will affect him in various ways. PMID:1025050
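
    One concrete measurement technique consistent with the utility axioms described is the standard gamble: the individual states the probability p* at which a given health state h, received for certain, is indifferent to a gamble between full health and death. With the common anchoring u(full health) = 1 and u(death) = 0 (whether the paper adopts exactly this normalization is an assumption here), expected-utility indifference fixes the state's utility:

      u(h) \;=\; p^{*}\,u(\text{full health}) + (1 - p^{*})\,u(\text{death}) \;=\; p^{*}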

  12. The NASA/MSFC global reference atmospheric model: 1990 version (GRAM-90). Part 1: Technical/users manual

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Alyea, F. N.; Cunnold, D. M.; Jeffries, W. R., III; Johnson, D. L.

    1991-01-01

    A technical description of the NASA/MSFC Global Reference Atmospheric Model 1990 version (GRAM-90) is presented with emphasis on the additions and new user's manual descriptions of the program operation aspects of the revised model. Some sample results for the new middle atmosphere section and comparisons with results from a three dimensional circulation model are provided. A programmer's manual with more details for those wishing to make their own GRAM program adaptations is also presented.

  13. Effects of User Puff Topography, Device Voltage, and Liquid Nicotine Concentration on Electronic Cigarette Nicotine Yield: Measurements and Model Predictions

    PubMed Central

    Talih, Soha; Balhas, Zainab; Eissenberg, Thomas; Salman, Rola; Karaoghlanian, Nareg; El Hellani, Ahmad; Baalbaki, Rima; Saliba, Najat

    2015-01-01

    Introduction: Some electronic cigarette (ECIG) users attain tobacco cigarette–like plasma nicotine concentrations while others do not. Understanding the factors that influence ECIG aerosol nicotine delivery is relevant to regulation, including product labeling and abuse liability. These factors may include user puff topography, ECIG liquid composition, and ECIG design features. This study addresses how these factors can influence ECIG nicotine yield. Methods: Aerosols were machine generated with 1 type of ECIG cartridge (V4L CoolCart) using 5 distinct puff profiles representing a tobacco cigarette smoker (2-s puff duration, 33-ml/s puff velocity), a slow average ECIG user (4 s, 17 ml/s), a fast average user (4 s, 33 ml/s), a slow extreme user (8 s, 17 ml/s), and a fast extreme user (8 s, 33 ml/s). Output voltage (3.3–5.2 V or 3.0–7.5 W) and e-liquid nicotine concentration (18–36 mg/ml labeled concentration) were varied. A theoretical model was also developed to simulate the ECIG aerosol production process and to provide insight into the empirical observations. Results: Nicotine yields from 15 puffs varied by more than 50-fold across conditions. Experienced ECIG user profiles (longer puffs) resulted in higher nicotine yields relative to the tobacco smoker (shorter puffs). Puff velocity had no effect on nicotine yield. Higher nicotine concentration and higher voltages resulted in higher nicotine yields. These results were predicted well by the theoretical model (R² = 0.99). Conclusions: Depending on puff conditions and product features, 15 puffs from an ECIG can provide far less or far more nicotine than a single tobacco cigarette. ECIG emissions can be predicted using physical principles, with knowledge of puff topography and a few ECIG device design parameters. PMID:25187061
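
    The claim that emissions follow from physical principles can be illustrated with a toy energy balance: coil power times puff duration gives the energy available to vaporize liquid, and the vaporized volume times the liquid nicotine concentration gives a yield estimate. This is a hedged sketch, not the authors' model; the effective heat of vaporization, liquid density, and efficiency factor below are placeholder assumptions.

      # Toy energy-balance estimate of nicotine yield; constants are placeholders, not the paper's values.
      def nicotine_yield_mg(power_w, puff_s, n_puffs, conc_mg_per_ml,
                            h_eff_j_per_g=800.0,   # assumed effective energy to vaporize 1 g of e-liquid
                            density_g_per_ml=1.1,  # assumed e-liquid density
                            efficiency=0.7):       # assumed fraction of coil power that vaporizes liquid
          liquid_g = efficiency * power_w * puff_s * n_puffs / h_eff_j_per_g
          liquid_ml = liquid_g / density_g_per_ml
          return liquid_ml * conc_mg_per_ml

      # Longer puffs and higher power or concentration raise the estimate, consistent with the study's trends.
      print(nicotine_yield_mg(power_w=4.0, puff_s=2, n_puffs=15, conc_mg_per_ml=18))
      print(nicotine_yield_mg(power_w=7.5, puff_s=8, n_puffs=15, conc_mg_per_ml=36))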

  14. sigma model approach to the heterotic string theory

    SciTech Connect

    Sen, A.

    1985-09-01

    The relation between the equations of motion for the massless fields in heterotic string theory and the conformal invariance of the sigma model describing the propagation of the heterotic string in arbitrary background massless fields is discussed. It is emphasized that this sigma model contains complete information about the string theory. Finally, we discuss the extension of the Hull-Witten proof of local gauge and Lorentz invariance of the sigma model to higher order in α', and the modification of the transformation laws of the antisymmetric tensor field under these symmetries. The presence of an anomaly in the naive N = 1/2 supersymmetry transformation is also pointed out in this context. 12 refs.

  15. Finite Element and Plate Theory Modeling of Acoustic Emission Waveforms

    NASA Technical Reports Server (NTRS)

    Prosser, W. H.; Hamstad, M. A.; Gary, J.; OGallagher, A.

    1998-01-01

    A comparison was made between two approaches to predict acoustic emission waveforms in thin plates. A normal mode solution method for Mindlin plate theory was used to predict the response of the flexural plate mode to a point source, step-function load, applied on the plate surface. The second approach used a dynamic finite element method to model the problem using equations of motion based on exact linear elasticity. Calculations were made using properties for both isotropic (aluminum) and anisotropic (unidirectional graphite/epoxy composite) materials. For simulations of anisotropic plates, propagation along multiple directions was evaluated. In general, agreement between the two theoretical approaches was good. Discrepancies in the waveforms at longer times were caused by differences in reflections from the lateral plate boundaries. These differences resulted from the fact that the two methods used different boundary conditions. At shorter times in the signals, before reflections, the slight discrepancies in the waveforms were attributed to limitations of Mindlin plate theory, which is an approximate plate theory. The advantages of the finite element method are that it used the exact linear elasticity solutions, and that it can be used to model real source conditions and complicated, finite specimen geometries as well as thick plates. These advantages come at a cost of increased computational difficulty, requiring lengthy calculations on workstations or supercomputers. The Mindlin plate theory solutions, meanwhile, can be quickly generated on personal computers. Specimens with finite geometry can also be modeled. However, only limited simple geometries such as circular or rectangular plates can easily be accommodated with the normal mode solution technique. Likewise, very limited source configurations can be modeled and plate theory is applicable only to thin plates.
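
    A rough sense of why thin-plate theories are only approximate comes from the flexural dispersion relation: phase velocity grows with the square root of frequency, so at shorter wavelengths the thin-plate assumptions break down, which is what Mindlin theory partially corrects. The sketch below uses classical (Kirchhoff) plate theory with illustrative aluminum-plate values; it is not the Mindlin normal-mode solution or the finite element model of the paper.

      # Classical (Kirchhoff) flexural phase velocity for a thin plate; values are illustrative only.
      import math

      E, nu, rho, h = 70e9, 0.33, 2700.0, 0.003      # assumed aluminum plate, 3 mm thick
      D = E * h**3 / (12 * (1 - nu**2))              # bending stiffness

      def flexural_phase_velocity(freq_hz):
          omega = 2 * math.pi * freq_hz
          # Kirchhoff dispersion: omega = sqrt(D / (rho * h)) * k**2, so c = omega / k
          return math.sqrt(omega) * (D / (rho * h)) ** 0.25

      for f in (50e3, 100e3, 500e3):
          print(f"{f / 1e3:6.0f} kHz -> {flexural_phase_velocity(f):7.1f} m/s")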

  16. Logic models: a useful way to study theories of evaluation practice?

    PubMed

    Miller, Robin Lin

    2013-06-01

    This paper comments on the papers in the special volume on logic modeling and evaluation theory. Logic modeling offers a potentially useful approach to learning about the assumptions, activities, and consequences described in an evaluation theory and may facilitate comparative analysis of evaluation theories. However, logic models are imperfect vehicles for depicting the contingent and dynamic nature of evaluation theories. Alternative approaches to studying theories are necessary to capture the essence of theories as they may work in actual practice.

  17. Forewarning model for water pollution risk based on Bayes theory.

    PubMed

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes' theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. Principal components analysis is used to screen the index system. A hydrological model is employed to simulate index values according to the prediction principle. Bayes' theorem is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the whole. The forewarning level is judged by the maximum-probability rule, and local conditions are then considered to propose management strategies that reduce heavy warnings to a lesser degree. This study takes the Taihu Basin as an example. After applying and verifying the forewarning model for water pollution risk against actual and simulated data from 2000 to 2009, the forewarning level for 2010 is given as a severe warning, which coincides well with the logistic curve. The model is shown to be theoretically rigorous yet methodologically flexible, with reasonable results and a simple structure, and it has strong logical consistency and regional adaptability, providing a new way to warn of water pollution risk.
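
    A minimal sketch of the maximum-probability rule referred to above: given a prior over warning levels and a likelihood for the observed pollution index under each level, the posterior is proportional to prior times likelihood, and the level with the largest posterior is issued. The Gaussian likelihoods and the numbers are illustrative assumptions, not the paper's calibration for the Taihu Basin.

      # Illustrative Bayes forewarning: posterior ∝ prior × likelihood, issue the most probable level.
      import math

      levels = ["light", "moderate", "severe"]
      priors = {"light": 0.5, "moderate": 0.3, "severe": 0.2}       # assumed prior distribution
      likelihood_params = {"light": (0.3, 0.1),                     # assumed (mean, std) of the risk
                           "moderate": (0.6, 0.1),                  # index under each warning level
                           "severe": (0.9, 0.1)}

      def gaussian_pdf(x, mean, std):
          return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

      def forewarn(observed_index):
          unnorm = {lv: priors[lv] * gaussian_pdf(observed_index, *likelihood_params[lv]) for lv in levels}
          total = sum(unnorm.values())
          posterior = {lv: p / total for lv, p in unnorm.items()}
          return max(posterior, key=posterior.get), posterior

      print(forewarn(0.85))   # with these assumptions, a high observed index yields a "severe" warning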

  18. Matrix models and stochastic growth in Donaldson-Thomas theory

    NASA Astrophysics Data System (ADS)

    Szabo, Richard J.; Tierz, Miguel

    2012-10-01

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.
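
    For orientation, the Schur measure referred to assigns to a partition λ the weight below (standard definition, normalized via the Cauchy identity); the particular specializations of the variables relevant to Donaldson-Thomas theory are those given in the paper and are not reproduced here.

      \mathbb{P}(\lambda) \;=\; \frac{1}{Z}\, s_{\lambda}(x)\, s_{\lambda}(y),
      \qquad
      Z \;=\; \prod_{i,j} \frac{1}{1 - x_i y_j},

    where s_λ denotes a Schur function.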

  19. Mathematical modeling of vowel perception by users of analog multichannel cochlear implants: temporal and channel-amplitude cues.

    PubMed

    Svirsky, M A

    2000-03-01

    A "multidimensional phoneme identification" (MPI) model is proposed to account for vowel perception by cochlear implant users. A multidimensional extension of the Durlach-Braida model of intensity perception, this model incorporates an internal noise model and a decision model to account separately for errors due to poor sensitivity and response bias. The MPI model provides a complete quantitative description of how listeners encode and combine acoustic cues, and how they use this information to determine which sound they heard. Thus, it allows for testing specific hypotheses about phoneme identification in a very stringent fashion. As an example of the model's application, vowel identification matrices obtained with synthetic speech stimuli (including "conflicting cue" conditions [Dorman et al., J. Acoust. Soc. Am. 92, 3428-3432 (1992)] were examined. The listeners were users of the "compressed-analog" stimulation strategy, which filters the speech spectrum into four partly overlapping frequency bands and delivers each signal to one of four electrodes in the cochlea. It was found that a simple model incorporating one temporal cue (i.e., an acoustic cue based only on the time waveforms delivered to the most basal channel) and spectral cues (based on the distribution of amplitudes among channels) can be quite successful in explaining listener responses. The new approach represented by the MPI model may be used to obtain useful insights about speech perception by cochlear implant users in particular, and by all kinds of listeners in general.

  20. Nonequilibrium Dynamical Mean-Field Theory for Bosonic Lattice Models

    NASA Astrophysics Data System (ADS)

    Strand, Hugo U. R.; Eckstein, Martin; Werner, Philipp

    2015-01-01

    We develop the nonequilibrium extension of bosonic dynamical mean-field theory and a Nambu real-time strong-coupling perturbative impurity solver. In contrast to Gutzwiller mean-field theory and strong-coupling perturbative approaches, nonequilibrium bosonic dynamical mean-field theory captures not only dynamical transitions but also damping and thermalization effects at finite temperature. We apply the formalism to quenches in the Bose-Hubbard model, starting from both the normal and the Bose-condensed phases. Depending on the parameter regime, one observes qualitatively different dynamical properties, such as rapid thermalization, trapping in metastable superfluid or normal states, as well as long-lived or strongly damped amplitude oscillations. We summarize our results in nonequilibrium "phase diagrams" that map out the different dynamical regimes.
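
    For reference, the quenches studied are in the Bose-Hubbard model, whose Hamiltonian in its standard form (hopping J, on-site interaction U, chemical potential μ) is

      H \;=\; -J \sum_{\langle i,j\rangle} \left( b_i^{\dagger} b_j + \mathrm{h.c.} \right)
      \;+\; \frac{U}{2} \sum_i n_i \left( n_i - 1 \right) \;-\; \mu \sum_i n_i ;

    the specific quench protocols and parameter regimes are those of the paper and are not reproduced here.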