Science.gov

Sample records for model theory user

  1. User Modeling and Register Theory: A Congruence of Concerns

    DTIC Science & Technology

    1990-11-01

    increasingly varied user community, across an ever more extensive range of situations. Just as for human-human interaction, no single style of generated text...and situation. Importantly, this paper shows how relevant linguistic studies can be brought to bear on the problem of user modeling and tailoring. In...theory can guide us in studies in user modeling. Based on this specific linguistic theory, we propose a methodology to systematically study the problem of

  2. The Sandia GeoModel: theory and user's guide.

    SciTech Connect

    Brannon, Rebecca Moss; Fossum, Arlo Frederick

    2004-08-01

    The mathematical and physical foundations and domain of applicability of Sandia's GeoModel are presented along with descriptions of the source code and user instructions. The model is designed to be used in conventional finite element architectures, and (to date) it has been installed in five host codes without requiring customization of the model subroutines for any of these installations. Although developed for application to geological materials, the GeoModel actually applies to a much broader class of materials, including rock-like engineered materials (such as concretes and ceramics) and even metals when simplified parameters are used. Nonlinear elasticity is supported through an empirically fitted function that has been found to be well suited to a wide variety of materials. Fundamentally, the GeoModel is a generalized plasticity model. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response, including microcrack growth and pore collapse. The GeoModel supports deformation-induced anisotropy in a limited capacity through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). Aside from kinematic hardening, however, the governing equations are isotropic. The GeoModel is a genuine unification and generalization of simpler models. It can employ up to 40 material input and control parameters in the rare case when all features are used; simpler idealizations (such as linear elasticity, von Mises yield, or Mohr-Coulomb failure) can be replicated by simply using fewer parameters. For high-strain-rate applications, the GeoModel supports rate dependence through an overstress model.
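
    As a minimal illustration of the kinematic-hardening idea described above (a sketch of the general technique, not the GeoModel's actual equations or parameter set), a von Mises-type yield check in which a backstress translates the initially isotropic yield surface in deviatoric stress space can be written as:

        import numpy as np

        def deviator(sigma):
            """Deviatoric part of a 3x3 stress tensor."""
            return sigma - np.trace(sigma) / 3.0 * np.eye(3)

        def yield_function(sigma, backstress, yield_strength):
            """Von Mises-type yield function with kinematic hardening.

            The backstress shifts the yield surface in deviatoric stress
            space (Bauschinger effect).  f < 0: elastic; f >= 0: inelastic.
            """
            s = deviator(sigma) - deviator(backstress)
            von_mises = np.sqrt(1.5 * np.tensordot(s, s))
            return von_mises - yield_strength

        # Illustrative check: uniaxial stress of 80 MPa against a 100 MPa
        # yield strength, with an assumed 30 MPa backstress component.
        sigma = np.diag([80.0e6, 0.0, 0.0])
        alpha = np.diag([30.0e6, 0.0, 0.0])
        print(yield_function(sigma, alpha, yield_strength=100.0e6))  # negative, so elastic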

  3. WASP7 Stream Transport - Model Theory and User's Guide: Supplement to Water Quality Analysis Simulation Program (WASP) User Documentation

    EPA Science Inventory

    The standard WASP7 stream transport model calculates water flow through a branching stream network that may include both free-flowing and ponded segments. This supplemental user manual documents the hydraulic algorithms, including the transport and hydrogeometry equations, the m...

  4. Repository Integration Program: RIP performance assessment and strategy evaluation model theory manual and user's guide

    SciTech Connect

    1995-11-01

    This report describes the theory and capabilities of RIP (Repository Integration Program). RIP is a powerful and flexible computational tool for carrying out probabilistic integrated total system performance assessments for geologic repositories. The primary purpose of RIP is to provide a management tool for guiding system design and site characterization. In addition, the performance assessment model (and the process of eliciting model input) can act as a mechanism for integrating the large amount of available information into a meaningful whole (in a sense, allowing one to keep the "big picture" and the ultimate aims of the project clearly in focus). Such an integration is useful both for project managers and project scientists. RIP is based on a "top down" approach to performance assessment that concentrates on the integration of the entire system and utilizes relatively high-level descriptive models and parameters. The key point in the application of such a "top down" approach is that the simplified models and associated high-level parameters must incorporate an accurate representation of their uncertainty. RIP is designed in a very flexible manner so that details can be readily added to various components of the model without modifying the computer code. Uncertainty is also handled flexibly: both parameter and model (process) uncertainty can be explicitly considered. Uncertainty is propagated through the integrated PA model using an enhanced Monte Carlo method. RIP must rely heavily on subjective assessment (expert opinion) for much of its input. The process of eliciting the high-level input parameters required for RIP is critical to its successful application. As a result, in order for any project to successfully apply a tool such as RIP, a great deal of communication and cooperation must exist among the data collectors, the process modelers, and the performance assessment modelers.
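
    The Monte Carlo propagation of parameter uncertainty described above can be sketched generically (illustrative distributions and a placeholder response function only; none of these values or names come from RIP):

        import numpy as np

        rng = np.random.default_rng(seed=1)
        n = 10_000  # number of Monte Carlo realizations

        # Hypothetical high-level parameters with assumed uncertainty:
        release_rate = rng.lognormal(mean=np.log(1e-5), sigma=0.5, size=n)   # fraction/yr
        travel_time = rng.triangular(left=1e3, mode=5e3, right=2e4, size=n)  # yr
        dilution = rng.uniform(1e2, 1e4, size=n)                             # dimensionless

        # Simplified stand-in for an integrated total-system response:
        decay_const = np.log(2) / 1.6e7                                      # 1/yr
        performance_measure = release_rate * np.exp(-decay_const * travel_time) / dilution

        print("mean:", performance_measure.mean())
        print("95th percentile:", np.quantile(performance_measure, 0.95))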

  5. Section 3. The SPARROW Surface Water-Quality Model: Theory, Application and User Documentation

    USGS Publications Warehouse

    Schwarz, G.E.; Hoos, A.B.; Alexander, R.B.; Smith, R.A.

    2006-01-01

    SPARROW (SPAtially Referenced Regressions On Watershed attributes) is a watershed modeling technique for relating water-quality measurements made at a network of monitoring stations to attributes of the watersheds containing the stations. The core of the model consists of a nonlinear regression equation describing the non-conservative transport of contaminants from point and diffuse sources on land to rivers and through the stream and river network. The model predicts contaminant flux, concentration, and yield in streams and has been used to evaluate alternative hypotheses about the important contaminant sources and watershed properties that control transport over large spatial scales. This report provides documentation for the SPARROW modeling technique and computer software to guide users in constructing and applying basic SPARROW models. The documentation gives details of the SPARROW software, including the input data and installation requirements, and guidance in the specification, calibration, and application of basic SPARROW models, as well as descriptions of the model output and its interpretation. The documentation is intended for both researchers and water-resource managers with interest in using the results of existing models and developing and applying new SPARROW models. The documentation of the model is presented in two parts. Part 1 provides a theoretical and practical introduction to SPARROW modeling techniques, which includes a discussion of the objectives, conceptual attributes, and model infrastructure of SPARROW. Part 1 also includes background on the commonly used model specifications and the methods for estimating and evaluating parameters, evaluating model fit, and generating water-quality predictions and measures of uncertainty. Part 2 provides a user's guide to SPARROW, which includes a discussion of the software architecture and details of the model input requirements and output files, graphs, and maps. The text documentation and computer
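
    In schematic form (a simplified rendering for orientation only; the full SPARROW specification, error structure, and estimation details are given in Part 1 of the documentation), such a source-and-attenuation regression predicts the flux at monitoring station $i$ as

        \hat{L}_i = \left[ \sum_{n=1}^{N} \beta_n \, S_{n,i} \, \exp\!\left(-\boldsymbol{\alpha}'\mathbf{Z}_i\right) \right] \exp\!\left(-\boldsymbol{\delta}'\mathbf{T}_i\right)

    where $S_{n,i}$ is the amount of source $n$ draining to station $i$, $\beta_n$ a source coefficient, $\mathbf{Z}_i$ land-to-water delivery variables with coefficients $\boldsymbol{\alpha}$, and $\mathbf{T}_i$ stream and reservoir travel-time (attenuation) variables with coefficients $\boldsymbol{\delta}$; the coefficients are estimated by nonlinear regression against monitored loads.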

  6. WASP4, a hydrodynamic and water-quality model - model theory, user's manual, and programmer's guide

    SciTech Connect

    Ambrose, R.B.; Wool, T.A.; Connolly, J.P.; Schanz, R.W.

    1988-01-01

    The Water Quality Analysis Simulation Program Version 4 (WASP4) is a dynamic compartment-modeling system that can be used to analyze a variety of water-quality problems in a diverse set of water bodies. WASP4 simulates the transport and transformation of conventional and toxic pollutants in the water column and benthos of ponds, streams, lakes, reservoirs, rivers, estuaries, and coastal waters. The WASP4 modeling system covers four major subjects: hydrodynamics, conservative mass transport, eutrophication-dissolved oxygen kinetics, and toxic chemical-sediment dynamics. The WASP4 modeling system consists of two stand-alone computer programs, DYNHYD4 and WASP4, that can be run in conjunction or separately. The hydrodynamic program, DYNHYD4, simulates the movement of water, and the water quality program, WASP4, simulates the movement and interaction of pollutants within the water. The latter program is supplied with two kinetic submodels to simulate two of the major classes of water-quality problems: conventional pollution (dissolved oxygen, biochemical oxygen demand, nutrients, and eutrophication) and toxic pollution (organic chemicals, heavy metals, and sediment). Substituting one or the other kinetic submodel yields the models EUTRO4 and TOXI4, respectively.
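
    As a toy illustration of the compartment (box) mass-balance approach underlying such models (not WASP4's actual kinetics or numerics), a two-segment system with advective exchange and first-order decay can be stepped forward explicitly:

        import numpy as np

        def step(c, q, v, k, c_in, dt):
            """One explicit Euler step of a two-segment box model.
            c: segment concentrations (mg/L), q: flow (m3/d), v: volumes (m3),
            k: first-order decay rate (1/d), c_in: inflow concentration (mg/L)."""
            dc1 = q * (c_in - c[0]) / v[0] - k * c[0]
            dc2 = q * (c[0] - c[1]) / v[1] - k * c[1]
            return c + dt * np.array([dc1, dc2])

        c = np.array([0.0, 0.0])
        for _ in range(2000):  # 2000 steps of 0.01 d = 20 days
            c = step(c, q=5.0e4, v=np.array([1.0e5, 2.0e5]), k=0.2, c_in=10.0, dt=0.01)
        print(c)  # approaches a steady state below the inflow concentration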

  7. KAYENTA: theory and user's guide.

    SciTech Connect

    Brannon, Rebecca Moss; Fossum, Arlo Frederick; Strack, Otto Eric

    2009-03-01

    The physical foundations and domain of applicability of the Kayenta constitutive model are presented along with descriptions of the source code and user instructions. Kayenta, which is an outgrowth of the Sandia GeoModel, includes features and fitting functions appropriate to a broad class of materials including rocks, rock-like engineered materials (such as concretes and ceramics), and metals. Fundamentally, Kayenta is a computational framework for generalized plasticity models. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response including microcrack growth and pore collapse. Kayenta supports optional anisotropic elasticity associated with ubiquitous joint sets. Kayenta supports optional deformation-induced anisotropy through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). The governing equations are otherwise isotropic. Because Kayenta is a unification and generalization of simpler models, it can be run using as few as 2 parameters (for linear elasticity) to as many as 40 material and control parameters in the exceptionally rare case when all features are used. For high-strain-rate applications, Kayenta supports rate dependence through an overstress model. Isotropic damage is modeled through loss of stiffness and strength.

  8. KAYENTA: Theory and User's Guide

    SciTech Connect

    Brannon, Rebecca Moss; Fuller, Timothy Jesse; Strack, Otto Eric; Fossum, Arlo Frederick; Sanchez, Jason James

    2015-02-01

    The physical foundations and domain of applicability of the Kayenta constitutive model are presented along with descriptions of the source code and user instructions. Kayenta, which is an outgrowth of the Sandia GeoModel, includes features and fitting functions appropriate to a broad class of materials including rocks, rock-like engineered materials (such as concretes and ceramics), and metals. Fundamentally, Kayenta is a computational framework for generalized plasticity models. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response (including microcrack growth and pore collapse) that can result in non-recovered strain upon removal of loads on a material element. Kayenta supports optional anisotropic elasticity associated with joint sets, as well as optional deformation-induced anisotropy through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). The governing equations are otherwise isotropic. Because Kayenta is a unification and generalization of simpler models, it can be run using as few as 2 parameters (for linear elasticity) to as many as 40 material and control parameters in the exceptionally rare case when all features are used. For high-strain-rate applications, Kayenta supports rate dependence through an overstress model. Isotropic damage is modeled through loss of stiffness and strength.

  9. GenCade Version 1 Model Theory and User’s Guide

    DTIC Science & Technology

    2012-12-01

    scale processes to the smaller ultra scale processes as shown in Figure 1. GenCade is developed and maintained by the U.S. Army Engineer Research and...wavelength in deep water (m). The deepwater wavelength is calculated from linear wave theory as Lo = gT^2/(2*pi), in which g is the acceleration...littoral zone under extreme waves. In the framework of GenCade, DLTo is calculated at each time step from the deepwater wave data and is assumed to
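
    The deepwater wavelength relation quoted above can be checked numerically in a few lines (g and T are the only inputs):

        import math

        def deepwater_wavelength(T, g=9.81):
            """Deepwater wavelength Lo = g*T**2 / (2*pi) from linear wave theory (m)."""
            return g * T**2 / (2.0 * math.pi)

        print(deepwater_wavelength(T=10.0))  # about 156 m for a 10-second wave period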

  10. Applying the Technology Acceptance Model and flow theory to Cyworld user behavior: implication of the Web2.0 user acceptance.

    PubMed

    Shin, Dong-Hee; Kim, Won-Yong; Kim, Won-Young

    2008-06-01

    This study explores attitudinal and behavioral patterns when using Cyworld by adopting an expanded Technology Acceptance Model (TAM). A model for Cyworld acceptance is used to examine how various factors modified from the TAM influence acceptance and its antecedents. This model is examined through an empirical study of Cyworld users using structural equation modeling techniques. The model shows reasonably good measurement properties and the constructs are validated. The results not only confirm the model but also reveal general factors applicable to Web2.0. A set of constructs in the model can be regarded as Web2.0-specific factors that act to enhance attitudes and intention.

  11. Outside users payload model

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The outside users payload model, which continues a series of documents and replaces and supersedes the July 1984 edition, is presented. The time period covered by this model is 1985 through 2000. The following sections are included: (1) definition of the scope of the model; (2) discussion of the methodology used; (3) overview of total demand; (4) summary of the estimated market segmentation by launch vehicle; (5) summary of the estimated market segmentation by user type; (6) details of the STS market forecast; (7) summary of transponder trends; (8) model overview by mission category; and (9) detailed mission models. All known non-NASA, non-DOD reimbursable payloads forecast to be flown by non-Soviet-bloc countries are included in this model, with the exception of Spacelab payloads and small self-contained payloads. Certain DOD-sponsored or cosponsored payloads are included if they are reimbursable launches.

  12. Personalized image retrieval with user's preference model

    NASA Astrophysics Data System (ADS)

    Kim, Yong-Hwan; Lee, K. E.; Choi, K. S.; Yoo, Ji-Beom; Rhee, Phill-Kyu; Park, Youngchoon

    1998-10-01

    Recently, the information resources available in various media have increased rapidly. Many retrieval systems for multimedia information resources have been developed with a focus only on efficiency and performance; therefore, they cannot handle users' preferences and interests well. In this paper, we present the framework design of a personalized image retrieval system (PIRS) which can reflect a user's preferences and interests incrementally. The prototype of PIRS consists of two major parts: the user's preference model (UPM) and the retrieval module (RM). The UPM refines the user's query to match the user's needs. The RM retrieves the appropriate images for the refined query by computing the similarities between each image and the refined query, and the retrieved images are ordered by these similarities. In this paper, we mainly discuss the UPM. Incremental machine learning techniques are employed to make the system user-adaptable and intelligent. The UPM is implemented with a decision tree based on incremental tree induction and an adaptive resonance theory network. The user's feedback is returned to the UPM and modifies its internal structure. The user's iterative retrieval activities with PIRS cause the UPM to be revised toward the user's preferences and interests; therefore, the PIRS adapts to the user's preferences and interests. We have achieved encouraging results through experiments.

  13. Diffusion of Innovation Theory and End-User Searching.

    ERIC Educational Resources Information Center

    Marshall, Joanne Gard

    1990-01-01

    Discussion of the value of diffusion of innovation theory for predicting the implementation of end-user online searching highlights a study of Canadian health professionals who were early adopters of end-user searching. User perceptions are emphasized, and the use of diffusion of innovation theory in information science research is recommended.…

  14. GWSCREEN: A semi-analytical model for assessment of the groundwater pathway from surface or buried contamination: Version 2.0 theory and user's manual

    SciTech Connect

    Rood, A.S.

    1993-06-01

    GWSCREEN was developed for assessment of the groundwater pathway from leaching of radioactive and nonradioactive substances from surface or buried sources. The code was designed for implementation in the Track I and Track II assessment of CERCLA (Comprehensive Environmental Response, Compensation and Liability Act) sites identified as low-probability hazards at the Idaho National Engineering Laboratory (DOE, 1992). The code calculates the limiting soil concentration such that, after leaching and transport to the aquifer, regulatory contaminant levels in groundwater are not exceeded. The code uses a mass conservation approach to model three processes: contaminant release from a source volume, contaminant transport in the unsaturated zone, and contaminant transport in the saturated zone. The source model considers the sorptive properties and solubility of the contaminant. Transport in the unsaturated zone is described by a plug flow model. Transport in the saturated zone is calculated with a semi-analytical solution to the advection-dispersion equation in groundwater. In Version 2.0, GWSCREEN incorporates an additional source model to calculate the impacts to groundwater resulting from releases to percolation ponds. In addition, transport of radioactive progeny has also been incorporated. GWSCREEN has shown comparable results when compared against other codes using similar algorithms and techniques. The code was designed for assessment and screening of the groundwater pathway when field data are limited; it is not intended to be a predictive tool.
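
    As an illustration of the kind of semi-analytical transport solution referred to above (the classical Ogata-Banks result for 1-D advection-dispersion from a continuous source, not GWSCREEN's own implementation), the downgradient concentration can be evaluated directly:

        import math

        def ogata_banks(x, t, v, D, c0):
            """C(x,t) for a continuous source at x = 0 (Ogata & Banks, 1961):
            C/C0 = 0.5*[erfc((x - v t)/(2 sqrt(D t)))
                        + exp(v x / D) * erfc((x + v t)/(2 sqrt(D t)))]
            v: seepage velocity, D: dispersion coefficient."""
            a = (x - v * t) / (2.0 * math.sqrt(D * t))
            b = (x + v * t) / (2.0 * math.sqrt(D * t))
            return 0.5 * c0 * (math.erfc(a) + math.exp(v * x / D) * math.erfc(b))

        # Relative concentration 50 m downgradient after 10 years (illustrative values)
        print(ogata_banks(x=50.0, t=10.0, v=10.0, D=50.0, c0=1.0))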

  15. HTGR Cost Model Users' Manual

    SciTech Connect

    A.M. Gandrik

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Cost Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Cost Model calculates an estimate of the capital costs, annual operating and maintenance costs, and decommissioning costs for a high-temperature gas-cooled reactor. The user can generate these costs for multiple reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first-of-a-kind, or nth-of-a-kind project phases; for a single or four-pack configuration; and for a reactor size of 350 or 600 MWt. This user's manual contains the mathematical models and operating instructions for the HTGR Cost Model. Instructions, screenshots, and examples are provided to guide the user through the HTGR Cost Model. The model was designed for users who are familiar with the HTGR design and Excel. Modification of the HTGR Cost Model should only be performed by users familiar with Excel and Visual Basic.

  16. FORSPAN Model Users Guide

    USGS Publications Warehouse

    Klett, T.R.; Charpentier, Ronald R.

    2003-01-01

    The USGS FORSPAN model is designed for the assessment of continuous accumulations of crude oil, natural gas, and natural gas liquids (collectively called petroleum). Continuous (also called "unconventional") accumulations have large spatial dimensions and lack well-defined down-dip petroleum/water contacts. Oil and natural gas therefore are not localized by buoyancy in water in these accumulations. Continuous accumulations include "tight gas reservoirs," coalbed gas, oil and gas in shale, oil and gas in chalk, and shallow biogenic gas. The FORSPAN model treats a continuous accumulation as a collection of petroleum-containing cells for assessment purposes. Each cell is capable of producing oil or gas, but the cells may vary significantly from one another in their production (and thus economic) characteristics. The potential additions to reserves from continuous petroleum resources are calculated by statistically combining probability distributions of the estimated number of untested cells having the potential for additions to reserves with the estimated volume of oil and natural gas that each of the untested cells may potentially produce (total recovery). One such statistical method for combination of number of cells with total recovery, used by the USGS, is called ACCESS.
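
    A minimal sketch of the statistical combination described above, with purely illustrative distributions (these are assumed values, not USGS assessment inputs):

        import numpy as np

        rng = np.random.default_rng(seed=42)
        n_trials = 100_000

        # Assumed illustrative inputs: number of untested cells with potential
        # additions to reserves, and mean recovery per cell (in BCF of gas).
        n_cells = rng.triangular(left=100, mode=400, right=1200, size=n_trials)
        recovery_per_cell = rng.lognormal(mean=np.log(0.2), sigma=0.6, size=n_trials)

        total_recovery = n_cells * recovery_per_cell  # potential additions to reserves

        for p in (5, 50, 95):
            print(f"{p}th percentile: {np.percentile(total_recovery, p):.0f} BCF")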

  17. FRAC-UNIX theory and user's manual

    SciTech Connect

    Clemo, T.M.; Miller, J.D.; Hull, L.C.; Magnuson, S.O.

    1990-05-01

    The FRAC-UNIX computer code provides a two-dimensional simulation of saturated flow and transport in a fractured porous medium. The code incorporates a dual-permeability approach in which the rock matrix is modeled as rectangular cells and the fractures are represented as discrete elements on the edges and diagonals of the matrix cells. A single head distribution drives otherwise independent flows in the matrix and in the fractures. Steady-state or transient flow of a single-phase fluid may be simulated. Solute or heat transport is simulated by moving imaginary marker particles in the velocity field established by the flow model, under the additional influence of dispersive and diffusive processes. Sparse-matrix techniques are utilized along with a specially developed user interface. The code is installed on a CRAY XMP24 computer using the UNICOS operating system. The initial version of this code, entitled FRACSL, incorporated the same flow and transport models but used a commercial software package for the numerics and user interface. This report describes the theoretical basis, approach, and implementation incorporated in the code; the mechanics of operating the code; several sample problems; and the integration of code development with physical modeling and field testing. The code is fully functional for most purposes, as shown by the results of an extensive code verification effort. Work remaining consists of refining and adding capabilities needed for several of the code verification problems; relatively simple modifications to extend its application and improve its ease of use; an improvement in the treatment of fracture junctions; and correction of an error in calculating buoyancy and concentration for diagonal fractures on a rectangular grid. 42 refs., 28 figs., 5 tabs.
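
    The marker-particle transport idea mentioned above can be sketched as a simple random-walk particle-tracking loop (illustrative only; FRAC-UNIX's dual-permeability bookkeeping and fracture geometry are not reproduced):

        import numpy as np

        rng = np.random.default_rng(7)
        n_particles, n_steps, dt = 5000, 200, 0.1   # illustrative units (m, d)
        v, D = 0.5, 0.05                            # velocity (m/d), dispersion (m2/d)

        x = np.zeros(n_particles)                   # all particles start at the inlet
        for _ in range(n_steps):
            # advection plus a random dispersive/diffusive displacement
            x += v * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n_particles)

        print(x.mean(), x.std())  # mean near v*t = 10 m, spread near sqrt(2*D*t) = 1.4 m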

  18. PESTAN: Pesticide Analytical Model Version 4.0 User's Guide

    EPA Pesticide Factsheets

    The principal objective of this User's Guide is to provide essential information on aspects such as model conceptualization, model theory, assumptions and limitations, determination of input parameters, analysis of results, and sensitivity analysis.

  19. Cognitive Behavioral Theories Used to Explain Injection Risk Behavior among Injection Drug Users: A Review and Suggestions for the Integration of Cognitive and Environmental Models

    ERIC Educational Resources Information Center

    Wagner, Karla Dawn; Unger, Jennifer B.; Bluthenthal, Ricky N.; Andreeva, Valentina A.; Pentz, Mary Ann

    2010-01-01

    Injection drug users (IDUs) are at risk for HIV and viral hepatitis, and risky injection behavior persists despite decades of intervention. Cognitive behavioral theories (CBTs) are commonly used to help understand risky injection behavior. The authors review findings from CBT-based studies of injection risk behavior among IDUs. An extensive…

  20. Cohesive Zone Model User Element

    SciTech Connect

    Tippetts, Trevor

    2007-04-17

    Cohesive Zone Model User Element (CZM UEL) is an implementation of a Cohesive Zone Model as an element for use in finite element simulations. CZM UEL computes a nodal force vector and stiffness matrix from a vector of nodal displacements. It is designed for structural analysts using finite element software to predict crack initiation, crack propagation, and the effect of a crack on the rest of a structure.
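
    A one-dimensional sketch of what such a user element computes (a bilinear traction-separation law; this is a generic illustration, not the CZM UEL source): given an opening displacement it returns the traction and the tangent stiffness.

        def cohesive_1d(delta, k0=1.0e6, delta0=1.0e-4, delta_f=1.0e-3):
            """Bilinear traction-separation law for a 1-D cohesive 'element'.
            delta: opening displacement, k0: initial stiffness,
            delta0: opening at damage initiation, delta_f: opening at failure.
            Returns (traction, tangent_stiffness)."""
            if delta <= delta0:              # undamaged, linear branch
                return k0 * delta, k0
            if delta >= delta_f:             # fully failed, no load transfer
                return 0.0, 0.0
            t_max = k0 * delta0              # linear softening branch
            traction = t_max * (delta_f - delta) / (delta_f - delta0)
            tangent = -t_max / (delta_f - delta0)
            return traction, tangent

        print(cohesive_1d(5.0e-5))   # still elastic
        print(cohesive_1d(5.0e-4))   # softening: traction drops, tangent is negative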

  1. UNSAT-H Version 3.0: Unsaturated Soil Water and Heat Flow Model Theory, User Manual, and Examples

    SciTech Connect

    MJ Fayer

    2000-06-12

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. During the last 4 years, the UNSAT-H model received support from the Immobilized Waste Program (IWP) of the Hanford Site's River Protection Project. This program is designing and assessing the performance of on-site disposal facilities to receive radioactive wastes that are currently stored in single- and double-shell tanks at the Hanford Site (LMHC 1999). The IWP is interested in estimates of recharge rates for current conditions and long-term scenarios involving the vadose zone disposal of tank wastes. Simulation modeling with UNSAT-H is one of the methods being used to provide those estimates (e.g., Rockhold et al. 1995; Fayer et al. 1999). To achieve the above goals for assessing water dynamics and estimating recharge rates, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow as one-dimensional processes. The UNSAT-H model simulates liquid water flow using Richards' equation (Richards 1931), water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements.
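
    For reference, standard one-dimensional forms of the three governing relations named above are (written here from general principles; the exact UNSAT-H formulation is given in the report):

        \frac{\partial \theta}{\partial t} = \frac{\partial}{\partial z}\left[ K(h)\left( \frac{\partial h}{\partial z} + 1 \right) \right] - S(z,t)   (Richards' equation, liquid flow)

        q_v = -D_v \frac{\partial \rho_v}{\partial z}   (Fick's law, water-vapor diffusion)

        q_h = -k_T \frac{\partial T}{\partial z}   (Fourier's law, conductive heat flow)

    where \theta is volumetric water content, h matric head, K(h) unsaturated hydraulic conductivity, S a sink term (e.g., root uptake), D_v vapor diffusivity, \rho_v vapor density, k_T thermal conductivity, and T temperature.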

  2. User Modeling in Adaptive Hypermedia Educational Systems

    ERIC Educational Resources Information Center

    Martins, Antonio Constantino; Faria, Luiz; Vaz de Carvalho, Carlos; Carrapatoso, Eurico

    2008-01-01

    This document is a survey of the research area of User Modeling (UM) for the specific field of Adaptive Learning. The aims of this document are: to define what a User Model is; to present existing and well-known User Models; to analyze the existing standards related to UM; and to compare existing systems. In the scientific area of User Modeling…

  3. UNSAT-H Version 3.0: Unsaturated Soil Water and Heat Flow Model: Theory, User Manual, and Examples

    SciTech Connect

    Fayer, Michael J.

    2000-06-15

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. To achieve these goals, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow. The UNSAT-H model simulates liquid water flow using the Richards equation, water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements. This report includes eight example problems. The first four are verification tests of UNSAT-H capabilities; the second four are demonstrations of real-world situations.

  4. Artifacts as Theories: Convergence through User-Centered Design.

    ERIC Educational Resources Information Center

    Dillon, Andrew

    1995-01-01

    Discussion of information system design proposes the artifact as theory perspective and suggests that information system design is best tackled by user-centered theories and methods. Topics include the software development process, human-computer interaction, and implications for information science. (LRW)

  5. Coastal Modeling System (CMS) User's Manual

    DTIC Science & Technology

    1992-08-01

    AD-A268 830: Instruction Report CERC-91-1, Coastal Modeling System (CMS) User's Manual, by Mary A. Cialone, David J. Mark, Lucia W. Chou, David A. ... Supplement 1, issued August 1992, provides additions and corrections to the Coastal Modeling System (CMS) User's Manual (September 1991).

  6. User's manual for GILDA: An infinite lattice diffusion theory calculation

    SciTech Connect

    Le, T.T.

    1991-11-01

    GILDA is a static two-dimensional diffusion theory code that performs either buckling (B²) or k-effective (k_eff) calculations for an infinite hexagonal lattice which is constructed by repeating identical seven-cell zones (one cell is one or seven identical homogenized hexes). GILDA was written by J. W. Stewart in 1973. This user's manual is intended to provide all of the information necessary to set up and execute a GILDA calculation and to interpret the output results. It is assumed that the user is familiar with the computer (VAX/VMS or IBM/MVS) and the JOSHUA system database on which the code is implemented. Users who are not familiar with the JOSHUA database are advised to consult additional references to understand the structure of JOSHUA records and data sets before turning to section 4 of this manual. Sections 2 and 3 of this manual serve as a theory document in which the basic diffusion theory and the numerical approximations behind the code are described. Section 4 describes the functions of the program's subroutines. Section 5 describes the input data and guides the user in setting up a problem. Section 6 describes the output results and the error messages which may be encountered during execution. Users who only wish to learn how to run the code without understanding the theory can start from section 4 and use sections 2 and 3 as references. Finally, the VAX/VMS and the IBM execution command files together with sample input records are provided in the appendices at the end of this manual.
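
    For orientation, the one-group diffusion-theory relation linking the two calculation modes mentioned above (a textbook simplification, not GILDA's actual lattice treatment) is

        k_{\mathrm{eff}} = \frac{\nu \Sigma_f}{\Sigma_a + D B^2}

    so a buckling calculation searches for the B² that yields k_eff = 1 with fixed cross sections, while a k-effective calculation evaluates the multiplication factor for a specified buckling.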

  7. EFDC1D - A ONE DIMENSIONAL HYDRODYNAMIC AND SEDIMENT TRANSPORT MODEL FOR RIVER AND STREAM NETWORKS: MODEL THEORY AND USERS GUIDE

    EPA Science Inventory

    This technical report describes the new one-dimensional (1D) hydrodynamic and sediment transport model EFDC1D. This model can be applied to stream networks. The model code and two sample data sets are included on the distribution CD. EFDC1D can simulate bi-directional unstea...

  8. Information filtering via collaborative user clustering modeling

    NASA Astrophysics Data System (ADS)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Yu, Lu; Liu, Chuang; Liu, Hao; Yan, Xiao-Yong

    2014-02-01

    The past few years have witnessed the great success of recommender systems, which can significantly help users find personalized items in the information era. One of the most widely applied recommendation methods is Matrix Factorization (MF). However, most research on this topic has focused on mining the direct relationships between users and items. In this paper, we optimize the standard MF by integrating a user clustering regularization term, so that the model considers not only the user-item rating information but also the user information. In addition, we compare the proposed model with three other typical methods: User-Mean (UM), Item-Mean (IM), and standard MF. Experimental results on two real-world datasets, MovieLens 1M and MovieLens 100k, show that our method outperforms the other three methods in recommendation accuracy.
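
    A compact sketch of matrix factorization with a user-clustering regularization term (illustrative only; the paper's exact objective, clustering method, and hyperparameters may differ): each user factor is pulled toward the centroid of its cluster in addition to the usual reconstruction and L2 terms.

        import numpy as np

        def train_mf_cluster(R, clusters, k=4, lam=0.05, beta=0.1, lr=0.01, epochs=50):
            """SGD for matrix factorization with a user-clustering regularizer.
            R: (user, item, rating) triples; clusters: cluster id per user.
            Illustrative objective: sum (r - p_u.q_i)^2 + lam*(|p_u|^2 + |q_i|^2)
                                    + beta*|p_u - centroid(cluster(u))|^2"""
            n_users = max(u for u, _, _ in R) + 1
            n_items = max(i for _, i, _ in R) + 1
            rng = np.random.default_rng(0)
            P = 0.1 * rng.standard_normal((n_users, k))
            Q = 0.1 * rng.standard_normal((n_items, k))
            labels = np.array(clusters)
            for _ in range(epochs):
                cent = {c: P[labels == c].mean(axis=0) for c in set(clusters)}
                for u, i, r in R:
                    err = r - P[u] @ Q[i]
                    grad_p = -2 * err * Q[i] + 2 * lam * P[u] + 2 * beta * (P[u] - cent[clusters[u]])
                    grad_q = -2 * err * P[u] + 2 * lam * Q[i]
                    P[u] -= lr * grad_p
                    Q[i] -= lr * grad_q
            return P, Q

        # Tiny toy data: 4 users in 2 clusters, 3 items
        R = [(0, 0, 5), (0, 1, 4), (1, 0, 5), (2, 2, 1), (3, 2, 2), (3, 1, 1)]
        P, Q = train_mf_cluster(R, clusters=[0, 0, 1, 1])
        print(P @ Q.T)  # reconstructed rating matrix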

  9. User's manual for GILDA: An infinite lattice diffusion theory calculation

    SciTech Connect

    Le, T.T.

    1991-11-01

    GILDA is a static two-dimensional diffusion theory code that performs either buckling (B²) or k-effective (k_eff) calculations for an infinite hexagonal lattice which is constructed by repeating identical seven-cell zones (one cell is one or seven identical homogenized hexes). GILDA was written by J. W. Stewart in 1973. This user's manual is intended to provide all of the information necessary to set up and execute a GILDA calculation and to interpret the output results. It is assumed that the user is familiar with the computer (VAX/VMS or IBM/MVS) and the JOSHUA system database on which the code is implemented. Users who are not familiar with the JOSHUA database are advised to consult additional references to understand the structure of JOSHUA records and data sets before turning to section 4 of this manual. Sections 2 and 3 of this manual serve as a theory document in which the basic diffusion theory and the numerical approximations behind the code are described. Section 4 describes the functions of the program's subroutines. Section 5 describes the input data and guides the user in setting up a problem. Section 6 describes the output results and the error messages which may be encountered during execution. Users who only wish to learn how to run the code without understanding the theory can start from section 4 and use sections 2 and 3 as references. Finally, the VAX/VMS and the IBM execution command files together with sample input records are provided in the appendices at the end of this manual.

  10. Cognitive Behavioral Theories Used to Explain Injection Risk Behavior Among Injection Drug Users: A Review and Suggestions for the Integration of Cognitive and Environmental Models

    PubMed Central

    Wagner, Karla D.; Unger, Jennifer B.; Bluthenthal, Ricky N.; Andreeva, Valentina A.; Pentz, Mary Ann

    2011-01-01

    Injection drug users (IDUs) are at risk for HIV and viral hepatitis, and risky injection behavior persists despite decades of intervention. Cognitive behavioral theories (CBT) are commonly used to help understand risky injection behavior. We review findings from CBT-based studies of injection risk behavior among IDUs. An extensive literature search was conducted in Spring 2007. In total 33 studies were reviewed—26 epidemiological and 7 intervention studies. Findings suggest that some theoretical constructs have received fairly consistent support (e.g., self-efficacy, social norms), while others have yielded inconsistent or null results (e.g., perceived susceptibility, knowledge, behavioral intentions, perceived barriers, perceived benefits, response efficacy, perceived severity). We offer some possible explanations for these inconsistent findings, including differences in theoretical constructs and measures across studies and a need to examine the environmental structures that influence risky behaviors. Greater integration of CBT with a risk environment perspective may yield more conclusive findings and more effective interventions in the future. PMID:20705809

  11. Predicting Facebook users' online privacy protection: risk, trust, norm focus theory, and the theory of planned behavior.

    PubMed

    Saeri, Alexander K; Ogilvie, Claudette; La Macchia, Stephen T; Smith, Joanne R; Louis, Winnifred R

    2014-01-01

    The present research adopts an extended theory of planned behavior model that included descriptive norms, risk, and trust to investigate online privacy protection in Facebook users. Facebook users (N = 119) completed a questionnaire assessing their attitude, subjective injunctive norm, subjective descriptive norm, perceived behavioral control, implicit perceived risk, trust of other Facebook users, and intentions toward protecting their privacy online. Behavior was measured indirectly 2 weeks after the study. The data show partial support for the theory of planned behavior and strong support for the independence of subjective injunctive and descriptive norms. Risk also uniquely predicted intentions over and above the theory of planned behavior, but there were no unique effects of trust on intentions, nor of risk or trust on behavior. Implications are discussed.

  12. Anaerobic digestion analysis model: User's manual

    SciTech Connect

    Ruth, M.; Landucci, R.

    1994-08-01

    The Anaerobic Digestion Analysis Model (ADAM) has been developed to assist investigators in performing preliminary economic analyses of anaerobic digestion processes. The model, which runs under Microsoft Excel™, is capable of estimating the economic performance of several different waste digestion process configurations that are defined by the user through a series of option selections. The model can be used to predict required feedstock tipping fees, product selling prices, utility rates, and raw material unit costs. The model is intended to be used as a tool to perform preliminary economic estimates that could be used to carry out simple screening analyses. The model's current parameters are based on engineering judgments and are not reflective of any existing process; therefore, they should be carefully evaluated and modified if necessary to reflect the process under consideration. The accuracy and level of uncertainty of the estimated capital investment and operating costs are dependent on the accuracy and level of uncertainty of the model's input parameters. The underlying methodology is capable of producing results accurate to within ±30% of actual costs.

  13. HTGR Application Economic Model Users' Manual

    SciTech Connect

    A.M. Gandrik

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Application Economic Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Application Economic Model calculates either the required selling price of power and/or heat for a given internal rate of return (IRR) or the IRR for power and/or heat being sold at the market price. The user can generate these economic results for a range of reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first-of-a-kind, or nth-of-a-kind project phases; for up to 16 reactor modules; and for module ratings of 200, 350, or 600 MWt. This user's manual contains the mathematical models and operating instructions for the HTGR Application Economic Model. Instructions, screenshots, and examples are provided to guide the user through the model. The model was designed for users who are familiar with the HTGR design, Excel, and engineering economics. Modification of the HTGR Application Economic Model should only be performed by users familiar with the HTGR and its applications, Excel, and Visual Basic.
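
    As a reminder of the quantity being solved for (with illustrative cash flows, not HTGR cost estimates), the IRR is the discount rate at which the net present value of the project cash flows is zero; a simple bisection solver suffices when there is a single sign change:

        def npv(rate, cashflows):
            """Net present value of a cash-flow series (year 0 first)."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

        def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-8):
            """Solve npv(rate) = 0 by bisection (assumes one sign change in [lo, hi])."""
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)

        # Illustrative: a plant costing 1000 that returns 120 per year for 30 years
        print(irr([-1000.0] + [120.0] * 30))  # roughly 0.115, i.e. about an 11.5% IRR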

  14. The Modular Modeling System (MMS): User's Manual

    USGS Publications Warehouse

    Leavesley, G.H.; Restrepo, P.J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.

    1996-01-01

    The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.

  15. Parallel community climate model: Description and user's guide

    SciTech Connect

    Drake, J.B.; Flanery, R.E.; Semeraro, B.D.; Worley, P.H.

    1996-07-15

    This report gives an overview of a parallel version of the NCAR Community Climate Model, CCM2, implemented for MIMD massively parallel computers using a message-passing programming paradigm. The parallel implementation was developed on an Intel iPSC/860 with 128 processors and on the Intel Delta with 512 processors, and the initial target platform for the production version of the code is the Intel Paragon with 2048 processors. Because the implementation uses standard, portable message-passing libraries, the code has been easily ported to other multiprocessors supporting a message-passing programming paradigm. The parallelization strategy used is to decompose the problem domain into geographical patches and assign each processor the computation associated with a distinct subset of the patches. With this decomposition, the physics calculations involve only grid points and data local to a processor and are performed in parallel. Using parallel algorithms developed for the semi-Lagrangian transport, the fast Fourier transform, and the Legendre transform, both physics and dynamics are computed in parallel with minimal data movement and modest change to the original CCM2 source code. Sequential or parallel history tapes are written, and input files (in history tape format) are read sequentially by the parallel code to promote compatibility with production use of the model on other computer systems. A validation exercise has been performed with the parallel code and is detailed along with some performance numbers on the Intel Paragon and the IBM SP2. A discussion of reproducibility of results is included. A user's guide for the PCCM2 version 2.1 on the various parallel machines completes the report. Procedures for compilation, setup, and execution are given. A discussion of code internals is included for those who may wish to modify and use the program in their own research.

  16. A method of designing smartphone interface based on the extended user's mental model

    NASA Astrophysics Data System (ADS)

    Zhao, Wei; Li, Fengmin; Bian, Jiali; Pan, Juchen; Song, Song

    2017-01-01

    The user's mental model is the core guiding theory of product design, especially for practical products. The essence of a practical product is a tool that users employ to meet their needs, and the most important feature of a tool is usability. The design method based on the user's mental model provides practical and feasible theoretical guidance for improving the usability of a product according to users' awareness of things. In this paper, we propose a method of designing smartphone interfaces based on an extended user's mental model, building on further research on user groups. This approach achieves personalized customization of the smartphone application interface and enhances the efficiency of application use.

  17. Theory Modeling and Simulation

    SciTech Connect

    Shlachter, Jack

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  18. GEOS-5 Chemistry Transport Model User's Guide

    NASA Technical Reports Server (NTRS)

    Kouatchou, J.; Molod, A.; Nielsen, J. E.; Auer, B.; Putman, W.; Clune, T.

    2015-01-01

    The Goddard Earth Observing System version 5 (GEOS-5) General Circulation Model (GCM) makes use of the Earth System Modeling Framework (ESMF) to enable model configurations with many functions. One of the options of the GEOS-5 GCM is the GEOS-5 Chemistry Transport Model (GEOS-5 CTM), which is an offline simulation of chemistry and constituent transport driven by a specified meteorology and other model output fields. This document describes the basic components of the GEOS-5 CTM and serves as a user's guide on how to obtain and run simulations on the NCCS Discover platform. In addition, we provide information on how to change the model configuration input files to meet users' needs.

  19. Business Modeling System (BMS) users manual

    SciTech Connect

    Pennewell, W.J.; White, S.A.; Mott, T.B.

    1988-08-01

    The Business Modeling System (BMS) Users Manual was produced for the NAVMIPPS Project Office (CODE 5N) of the Navy Finance Center (NAVFINCEN) by members of the Martin Marietta Energy Systems Navy Military Integrated Personnel and Pay Strategy (NAVMIPPS) Project Team. The manual was developed to aid users of the Business Modeling System, which was previously delivered to the NAVFINCEN Operations Directorate (CODE 6) by the NAVMIPPS Project Team. The BMS comprises CODE 6 data and a set of dBASE III PLUS programs which were developed to store the data in varying formats designed to aid in the formulation and evaluation of CODE 6 reorganization options. This user manual contains instructions for database installation, online query, and preformatted report program use. Appendix A is a list of the thirty-two CODE 6 suborganizations from which the BMS data were collected. Appendix B is the BMS menu/function hierarchy. Appendix C is a compilation of sample report output.

  20. User behavioral model in hypertext environment

    NASA Astrophysics Data System (ADS)

    Moskvin, Oleksii M.; Sailarbek, Saltanat; Gromaszek, Konrad

    2015-12-01

    Users traversing Internet resources, and their activities, play an important role that, in practice, resource owners do not usually take into account when adjusting and optimizing hypertext structure. An optimal hypertext structure allows users to locate pages of interest, which are the goals of their informational search, more quickly. This paper presents a model that analyzes the behavior of the user audience in order to determine users' goals in a particular hypertext segment and to find optimal routes for reaching those goals in terms of route length and informational value. A potential application of the proposed model is in systems that evaluate hypertext networks and optimize their referential structure for faster information retrieval.

  1. Wake Vortex Inverse Model User's Guide

    NASA Technical Reports Server (NTRS)

    Lai, David; Delisi, Donald

    2008-01-01

    NorthWest Research Associates (NWRA) has developed an inverse model for inverting landing aircraft vortex data. The data used for the inversion are the time evolution of the lateral transport position and vertical position of both the port and starboard vortices. The inverse model performs iterative forward model runs using various estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Forward model predictions of lateral transport and altitude are then compared with the observed data. Differences between the data and model predictions guide the choice of vortex parameter values, crosswind profile and circulation evolution in the next iteration. Iterations are performed until a user-defined criterion is satisfied. Currently, the inverse model is set to stop when the improvement in the rms deviation between the data and model predictions is less than 1 percent for two consecutive iterations. The forward model used in this inverse model is a modified version of the Shear-APA model. A detailed description of this forward model, the inverse model, and its validation are presented in a different report (Lai, Mellman, Robins, and Delisi, 2007). This document is a User's Guide for the Wake Vortex Inverse Model. Section 2 presents an overview of the inverse model program. Execution of the inverse model is described in Section 3. When executing the inverse model, a user is requested to provide the name of an input file which contains the inverse model parameters, the various datasets, and directories needed for the inversion. A detailed description of the list of parameters in the inversion input file is presented in Section 4. A user has an option to save the inversion results of each lidar track in a mat-file (a condensed data file in Matlab format). These saved mat-files can be used for post-inversion analysis. A description of the contents of the saved files is given in Section 5. An example of an inversion input
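
    The iterate-until-small-improvement loop described above can be sketched with a stand-in, one-parameter forward model (the real forward model is the modified Shear-APA code, which is not reproduced here):

        import numpy as np

        def rms(a, b):
            return float(np.sqrt(np.mean((a - b) ** 2)))

        # Synthetic "observed" lateral-transport data standing in for lidar tracks
        t = np.linspace(0.0, 10.0, 50)
        observed = 0.7 * t + 0.05 * np.sin(t)

        def forward_model(slope):
            """Hypothetical one-parameter forward model."""
            return slope * t

        slope, step = 0.3, 0.2
        prev_rms = rms(observed, forward_model(slope))
        small_improvements = 0
        while small_improvements < 2:            # stop after two consecutive <1% gains
            candidates = [slope - step, slope, slope + step]
            errors = [rms(observed, forward_model(s)) for s in candidates]
            slope = candidates[int(np.argmin(errors))]
            step *= 0.5                          # refine the search around the best value
            new_rms = min(errors)
            improvement = (prev_rms - new_rms) / prev_rms
            small_improvements = small_improvements + 1 if improvement < 0.01 else 0
            prev_rms = new_rms

        print(slope, prev_rms)                   # slope approaches 0.7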

  2. Pragmatic User Model Implementation in an Intelligent Help System.

    ERIC Educational Resources Information Center

    Fernandez-Manjon, Baltasar; Fernandez-Valmayor, Alfredo; Fernandez-Chamizo, Carmen

    1998-01-01

    Describes Aran, a knowledge-based system designed to help users deal with problems related to Unix operation. Highlights include adaptation to the individual user; user modeling knowledge; stereotypes; content of the individual user model; instantiation, acquisition, and maintenance of the individual model; dynamic acquisition of objective and…

  3. User's appraisal of yield model evaluation criteria

    NASA Technical Reports Server (NTRS)

    Warren, F. B. (Principal Investigator)

    1982-01-01

    The five major potential USDA users of AgRISTAR crop yield forecast models rated the Yield Model Development (YMD) project Test and Evaluation Criteria by the importance placed on them. These users agreed that the "TIMELINESS" and "RELIABILITY" of the forecast yields would be of major importance in determining whether a proposed yield model was worthy of adoption. Although there was considerable difference of opinion as to the relative importance of the other criteria, "COST", "OBJECTIVITY", "ADEQUACY", and "MEASURES OF ACCURACY" generally were felt to be more important than "SIMPLICITY" and "CONSISTENCY WITH SCIENTIFIC KNOWLEDGE". However, some of the comments which accompanied the ratings did indicate that several of the definitions and descriptions of the criteria were confusing.

  4. Data Mining for User Modeling and Personalization in Ubiquitous Spaces

    NASA Astrophysics Data System (ADS)

    Jaimes, Alejandro

    User modeling (UM) has traditionally been concerned with analyzing a user's interaction with a system and with developing cognitive models that aid in the design of user interfaces and interaction mechanisms. Elements of a user model may include representation of goals, plans, preferences, tasks, and/or abilities about one or more types of users, classification of a user into subgroups or stereotypes, the formation of assumptions about the user based on the interaction history, and the generalization of the interaction histories of many users into groups, among many others.

  5. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    USGS Publications Warehouse

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.

  6. CONSTRUCTION OF EDUCATIONAL THEORY MODELS.

    ERIC Educational Resources Information Center

    MACCIA, ELIZABETH S.; AND OTHERS

    THIS STUDY DELINEATED MODELS WHICH HAVE POTENTIAL USE IN GENERATING EDUCATIONAL THEORY. A THEORY MODELS METHOD WAS FORMULATED. BY SELECTING AND ORDERING CONCEPTS FROM OTHER DISCIPLINES, THE INVESTIGATORS FORMULATED SEVEN THEORY MODELS. THE FINAL STEP OF DEVISING EDUCATIONAL THEORY FROM THE THEORY MODELS WAS PERFORMED ONLY TO THE EXTENT REQUIRED TO…

  7. Modeling a Theory-Based Approach to Examine the Influence of Neurocognitive Impairment on HIV Risk Reduction Behaviors Among Drug Users in Treatment.

    PubMed

    Huedo-Medina, Tania B; Shrestha, Roman; Copenhaver, Michael

    2016-08-01

    Although it is well established that people who use drugs (PWUDs) are characterized by significant neurocognitive impairment (NCI), there has been no examination of how NCI may impede one's ability to accrue the expected HIV prevention benefits stemming from an otherwise efficacious intervention. This paper incorporated a theoretical Information-Motivation-Behavioral Skills (IMB) model of health behavior change to examine the potential influence of NCI on HIV prevention outcomes as significantly moderating the mediation defined in the original model. The analysis included 304 HIV-negative opioid-dependent individuals enrolled in a community-based methadone maintenance treatment program who reported drug- and/or sex-related HIV risk behaviors in the past 6 months. Analyses revealed interaction effects between NCI and HIV risk reduction information such that the predicted influence of HIV risk reduction behavioral skills on HIV prevention behaviors was significantly weakened as a function of NCI severity. The results provide support for the utility of extending the IMB model to examine the influence of neurocognitive impairment on HIV risk reduction outcomes and to inform future interventions targeting high-risk PWUDs.

  8. Implementation of a Nonisothermal Unified Inelastic-Strain Theory into ADINA6.0 for a Titanium Alloy - User Guide

    DTIC Science & Technology

    1993-01-01

    AD-A269 927, WL-TR-93-4005: Implementation of a Nonisothermal Unified Inelastic-Strain Theory into ADINA6.0 for a Titanium Alloy - User Guide, by Joseph L. Kroupa and Richard W. Neu. ... The model is a unified inelastic-strain theory which has been applied to capture the strain-rate sensitivity and time-dependent behavior of the titanium

  9. Modelling of User Preferences and Needs in Boolean Retrieval Systems.

    ERIC Educational Resources Information Center

    Danilowicz, Czeslaw

    1994-01-01

    Discusses end-user searching in Boolean information retrieval systems; considers the role of search intermediaries; and proposes a model of user preferences that incorporates a user's profile. Highlights include document representation; information queries; document output ranking; calculating user profiles; and selecting documents for a local…

  10. Stimulation model for lenticular sands: Volume 2, Users manual

    SciTech Connect

    Rybicki, E.F.; Luiskutty, C.T.; Sutrick, J.S.; Palmer, I.D.; Shah, G.H.; Tomutsa, L.

    1987-07-01

    This User's Manual contains information for four fracture/proppant models. TUPROP1 contains a Geertsma and de Klerk type fracture model. The section of the program utilizing the proppant fracture geometry data from the pseudo three-dimensional highly elongated fracture model is called TUPROPC. The analogous proppant section of the program that was modified to accept fracture shape data from SA3DFRAC is called TUPROPS. TUPROPS also includes fracture closure. Finally there is the penny fracture and its proppant model, PENNPROP. In the first three chapters, the proppant sections are based on the same theory for determining the proppant distribution but have modifications to support variable height fractures and modifications to accept fracture geometry from three different fracture models. Thus, information about each proppant model in the User's Manual builds on information supplied in the previous chapter. The exception to the development of combined treatment models is the penny fracture and its proppant model. In this case, a completely new proppant model was developed. A description of how to use the combined treatment model for the penny fracture is contained in Chapter 4. 2 refs.

  11. Integration of User Profiles: Models and Experiments in Information Retrieval.

    ERIC Educational Resources Information Center

    Myaeng, Sung H.; Korfhage, Robert R.

    1990-01-01

    Discussion of the interpretation of user queries in information retrieval highlights theoretical models that utilize user characteristics maintained in the form of a user profile. Various query/profile interaction models are identified, and an experiment is described that tested the relevance of retrieved documents based on various models. (29…

  12. Modeling users' activity on Twitter networks: validation of Dunbar's number

    NASA Astrophysics Data System (ADS)

    Goncalves, Bruno; Perra, Nicola; Vespignani, Alessandro

    2012-02-01

    Microblogging and mobile devices appear to augment human social capabilities, which raises the question whether they remove cognitive or biological constraints on human communication. In this paper we analyze a dataset of Twitter conversations collected across six months involving 1.7 million individuals and test the theoretical cognitive limit on the number of stable social relationships known as Dunbar's number. We find that the data are in agreement with Dunbar's result; users can entertain a maximum of 100-200 stable relationships. Thus, the "economy of attention" is limited in the online world by cognitive and biological constraints as predicted by Dunbar's theory. We propose a simple model for users' behavior that includes finite priority queuing and time resources that reproduces the observed social behavior.
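
    A minimal sketch of the finite-priority-queue idea described above: a user with a bounded contact queue and a fixed per-step message budget can only keep a limited set of relationships active. The queue size, reinforcement bias, and thresholds are illustrative assumptions, not parameters from the paper.

        # Toy sketch of a finite-priority-queue user: at each step the user has a limited
        # message budget and preferentially replies to contacts near the top of the queue,
        # so only a bounded set of relationships stays "active".
        import random
        from collections import Counter

        random.seed(1)
        QUEUE_SIZE, STEPS, MSGS_PER_STEP = 150, 5000, 3
        queue = []                       # most recently reinforced contacts at the front
        interactions = Counter()

        for _ in range(STEPS):
            for _ in range(MSGS_PER_STEP):
                if queue and random.random() < 0.8:
                    # Reinforce an existing contact, biased toward the front of the queue.
                    idx = min(int(random.expovariate(1 / 20)), len(queue) - 1)
                    contact = queue.pop(idx)
                else:
                    contact = f"user{random.randrange(10_000)}"   # meet someone new
                    if contact in queue:
                        queue.remove(contact)
                queue.insert(0, contact)
                interactions[contact] += 1
                if len(queue) > QUEUE_SIZE:
                    queue.pop()          # finite attention: oldest contact drops off

        stable = sum(1 for n in interactions.values() if n >= 10)
        print(f"contacts ever messaged: {len(interactions)}, 'stable' (>=10 msgs): {stable}")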

  13. Modeling users' activity on twitter networks: validation of Dunbar's number.

    PubMed

    Gonçalves, Bruno; Perra, Nicola; Vespignani, Alessandro

    2011-01-01

    Microblogging and mobile devices appear to augment human social capabilities, which raises the question whether they remove cognitive or biological constraints on human communication. In this paper we analyze a dataset of Twitter conversations collected across six months involving 1.7 million individuals and test the theoretical cognitive limit on the number of stable social relationships known as Dunbar's number. We find that the data are in agreement with Dunbar's result; users can entertain a maximum of 100-200 stable relationships. Thus, the 'economy of attention' is limited in the online world by cognitive and biological constraints as predicted by Dunbar's theory. We propose a simple model for users' behavior that includes finite priority queuing and time resources that reproduces the observed social behavior.

  14. The NATA code; theory and analysis. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Bade, W. L.; Yos, J. M.

    1975-01-01

    The NATA code is a computer program for calculating quasi-one-dimensional gas flow in axisymmetric nozzles and rectangular channels, primarily to describe conditions in electric arc-heated wind tunnels. The program provides solutions based on frozen chemistry, chemical equilibrium, and nonequilibrium flow with finite reaction rates. The shear and heat flux on the nozzle wall are calculated and boundary layer displacement effects on the inviscid flow are taken into account. The program contains compiled-in thermochemical, chemical kinetic and transport cross section data for high-temperature air, CO2-N2-Ar mixtures, helium, and argon. It calculates stagnation conditions on axisymmetric or two-dimensional models and conditions on the flat surface of a blunt wedge. Included in the report are: definitions of the inputs and outputs; precoded data on gas models, reactions, thermodynamic and transport properties of species, and nozzle geometries; explanations of diagnostic outputs and code abort conditions; test problems; and a user's manual for an auxiliary program (NOZFIT) used to set up analytical curvefits to nozzle profiles.

  15. Cognitive Modeling of Video Game Player User Experience

    NASA Technical Reports Server (NTRS)

    Bohil, Corey J.; Biocca, Frank A.

    2010-01-01

    This paper argues for the use of cognitive modeling to gain a detailed and dynamic look into user experience during game play. Applying cognitive models to game play data can help researchers understand a player's attentional focus, memory status, learning state, and decision strategies (among other things) as these cognitive processes occurred throughout game play. This is a stark contrast to the common approach of trying to assess the long-term impact of games on cognitive functioning after game play has ended. We describe what cognitive models are, what they can be used for and how game researchers could benefit by adopting these methods. We also provide details of a single model - based on decision field theory - that has been successfully applied to data sets from memory, perception, and decision making experiments, and has recently found application in real world scenarios. We examine possibilities for applying this model to game-play data.

  16. ModelMate - A graphical user interface for model analysis

    USGS Publications Warehouse

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  17. The capillary hysteresis model HYSTR: User's guide

    SciTech Connect

    Niemi, A.; Bodvarsson, G.S.

    1991-11-01

    The potential disposal of nuclear waste in the unsaturated zone at Yucca Mountain, Nevada, has generated increased interest in the study of fluid flow through unsaturated media. In the near future, large-scale field tests will be conducted at the Yucca Mountain site, and work is now being done to design and analyze these tests. As part of these efforts a capillary hysteresis model has been developed. A computer program to calculate the hysteretic relationship between capillary pressure (φ) and liquid saturation (S_l) has been written that is designed to be easily incorporated into any numerical unsaturated flow simulator that computes capillary pressure as a function of liquid saturation. This report gives a detailed description of the model along with information on how it can be interfaced with a transport code. Although the model was developed specifically for calculations related to nuclear waste disposal, it should be applicable to any capillary hysteresis problem for which the secondary and higher order scanning curves can be approximated from the first order scanning curves. HYSTR is a set of subroutines to calculate capillary pressure for a given liquid saturation under hysteretic conditions.
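
    The general idea of a hysteretic capillary pressure-saturation relationship can be illustrated with a small sketch: separate main drainage and main wetting curves, with a scanning curve interpolated between them from a reversal point. The van Genuchten-style curve shape and the linear blending below are illustrative assumptions and do not reproduce the HYSTR algorithm.

        # Toy illustration of hysteretic capillary pressure (not the HYSTR algorithm):
        # main drainage and main wetting curves in van Genuchten form, with a first-order
        # scanning curve obtained by rescaling between them from a reversal point.
        import numpy as np

        def vg_capillary_pressure(s_eff, alpha, n_vg):
            """Capillary pressure head from effective saturation (van Genuchten form)."""
            m = 1.0 - 1.0 / n_vg
            s_eff = np.clip(s_eff, 1e-6, 1 - 1e-9)
            return (s_eff ** (-1.0 / m) - 1.0) ** (1.0 / n_vg) / alpha

        def scanning_curve(s, s_rev, alpha_d=0.01, alpha_w=0.02, n_vg=2.0):
            """Wetting scanning curve: blend from the drainage value at the reversal
            saturation s_rev toward the main wetting curve as s approaches 1."""
            p_d_rev = vg_capillary_pressure(s_rev, alpha_d, n_vg)
            p_w = vg_capillary_pressure(s, alpha_w, n_vg)
            w = (s - s_rev) / (1.0 - s_rev)          # 0 at reversal, 1 at full saturation
            return (1.0 - w) * p_d_rev + w * p_w

        s = np.linspace(0.4, 0.99, 5)                # saturations after a reversal at 0.4
        print(scanning_curve(s, s_rev=0.4))          # capillary pressure along the scanning path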

  18. Designing with users to meet people needs: a teaching model.

    PubMed

    Anselmi, Laura; Canina, Marita; Coccioni, Elisabetta

    2012-01-01

    Against a background of great transformations in the whole company-product-market system, design becomes an interpreter of society and a strategic key point for production realities. Design must assume an ergonomic approach and a methodology oriented to product innovation in which people are the main focus of the system. Today there is a visible need for a methodological approach able to include the context of use by employing users' "creative skills". In this scenario, a design educational model based only on knowledge does not seem sufficient; the traditional "deductive" method does not meet the needs of new productive assets, hence the urgency to experiment with the "inductive" method and to develop an approach in which knowing and knowing how, theory and practice, act synergistically. The aim is to teach a method able to help a young designer understand people's needs and desires, considering both the concrete/cognitive level and the emotional level. The paper presents, through some case studies, an educational model developed by combining theoretical/conceptual and practical/applicatory aspects with user experiential aspects. The proposed approach to design enables students to investigate users' needs and desires and helps them propose innovative ideas and projects that better fit today's market realities.

  19. Multiple Concentric Cylinder Model (MCCM) user's guide

    NASA Technical Reports Server (NTRS)

    Williams, Todd O.; Pindera, Marek-Jerzy

    1994-01-01

    A user's guide for the computer program mccm.f is presented. The program is based on a recently developed solution methodology for the inelastic response of an arbitrarily layered, concentric cylinder assemblage under thermomechanical loading which is used to model the axisymmetric behavior of unidirectional metal matrix composites in the presence of various microstructural details. These details include the layered morphology of certain types of ceramic fibers, as well as multiple fiber/matrix interfacial layers recently proposed as a means of reducing fabrication-induced, and in-service, residual stress. The computer code allows efficient characterization and evaluation of new fibers and/or new coating systems on existing fibers with a minimum of effort, taking into account inelastic and temperature-dependent properties and different morphologies of the fiber and the interfacial region. It also facilitates efficient design of engineered interfaces for unidirectional metal matrix composites.

  20. Videogrammetric Model Deformation Measurement System User's Manual

    NASA Technical Reports Server (NTRS)

    Dismond, Harriett R.

    2002-01-01

    The purpose of this manual is to provide the user of the NASA VMD system, running the MDef software, Version 1.10, all information required to operate the system. The NASA Videogrammetric Model Deformation system consists of an automated videogrammetric technique used to measure the change in wing twist and bending under aerodynamic load in a wind tunnel. The basic instrumentation consists of a single CCD video camera and a frame grabber interfaced to a computer. The technique is based upon a single view photogrammetric determination of two-dimensional coordinates of wing targets with fixed (and known) third dimensional coordinate, namely the span-wise location. The major consideration in the development of the measurement system was that productivity must not be appreciably reduced.

  1. Theory and modeling group

    NASA Technical Reports Server (NTRS)

    Holman, Gordon D.

    1989-01-01

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  2. Predicting radiative heat transfer in thermochemical nonequilibrium flow fields. Theory and user's manual for the LORAN code

    NASA Technical Reports Server (NTRS)

    Chambers, Lin Hartung

    1994-01-01

    The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.

  3. Predicting radiative heat transfer in thermochemical nonequilibrium flow fields. Theory and user's manual for the LORAN code

    NASA Astrophysics Data System (ADS)

    Chambers, Lin Hartung

    1994-09-01

    The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.

  4. Multiple-User Quantum Information Theory for Optical Communication Channels

    DTIC Science & Technology

    2008-06-01

    recognized to be composed of special cases of quantum mechanics and/or relativity theory. Paul Dirac brought relativity theory to bear on quantum physics, so... Borade, S., Zheng, L., and Trott, M., "Multilevel broadcast networks," Proceedings of the IEEE International Symposium on Information Theory, Nice

  5. Prosthesis-User-in-the-Loop: a user-specific biomechanical modeling and simulation environment.

    PubMed

    Wojtusch, J; Beckerle, P; Christ, O; Wolff, K; von Stryk, O; Rinderknecht, S; Vogt, J

    2012-01-01

    In this paper, a novel biomechanical modeling and simulation environment with an emphasis on user-specific customization is presented. A modular modeling approach for multi-body systems allows a flexible extension by specific biomechanical modeling elements and enables an efficient application in dynamic simulation and optimization problems. A functional distribution of model description and model parameter data in combination with standardized interfaces enables a simple and reliable replacement or modification of specific functional components. The user-specific customization comprises the identification of anthropometric model parameters as well as the generation of a virtual three-dimensional character. The modeling and simulation environment is associated with Prosthesis-User-in-the-Loop, a hardware simulator concept for the design and optimization of lower limb prosthetic devices based on user experience and assessment. For a demonstration of the flexibility and capability of the modeling and simulation environment, an exemplary application in context of the hardware simulator is given.

  6. A user-friendly modified pore-solid fractal model

    PubMed Central

    Ding, Dian-yuan; Zhao, Ying; Feng, Hao; Si, Bing-cheng; Hill, Robert Lee

    2016-01-01

    The primary objective of this study was to evaluate a range of calculation points on water retention curves (WRC) instead of the singularity point at air-entry suction in the pore-solid fractal (PSF) model, which additionally considered the hysteresis effect based on the PSF theory. The modified pore-solid fractal (M-PSF) model was tested using 26 soil samples from Yangling on the Loess Plateau in China and 54 soil samples from the Unsaturated Soil Hydraulic Database. The derivation results showed that the M-PSF model is user-friendly and flexible for a wide range of calculation point options. This model theoretically describes the primary differences between the soil moisture desorption and the adsorption processes by the fractal dimensions. The M-PSF model demonstrated good performance particularly at the calculation points corresponding to the suctions from 100 cm to 1000 cm. Furthermore, the M-PSF model, using the fractal dimension of the particle size distribution, exhibited acceptable performance in WRC predictions for different textured soils when the suction values were ≥100 cm. To fully understand the function of hysteresis in the PSF theory, the role of allowable and accessible pores must be examined. PMID:27996013
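
    A generic fractal water-retention scaling, not the M-PSF equations themselves, can be sketched as follows: if effective saturation scales as a power of suction above the air-entry value, the fractal dimension can be estimated by a log-log fit to retention points. The data values, residual and saturated water contents, and air-entry suction below are illustrative assumptions.

        # Minimal sketch (generic fractal WRC scaling, not the M-PSF equations): under a
        # pore-solid fractal assumption, effective saturation follows Se ~ (h/h_a)^(D-3)
        # for suction h above the air-entry value h_a, so D can be estimated from a
        # log-log fit of measured retention points.
        import numpy as np

        h = np.array([100.0, 200.0, 400.0, 800.0, 1000.0])      # suction (cm), illustrative
        theta = np.array([0.32, 0.28, 0.24, 0.21, 0.20])         # water content, illustrative
        theta_r, theta_s, h_a = 0.05, 0.40, 30.0                 # assumed residual/saturated/air-entry

        se = (theta - theta_r) / (theta_s - theta_r)             # effective saturation
        slope, intercept = np.polyfit(np.log(h / h_a), np.log(se), 1)
        D = slope + 3.0
        print(f"estimated fractal dimension D ~ {D:.2f}")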

  7. A user-friendly modified pore-solid fractal model.

    PubMed

    Ding, Dian-Yuan; Zhao, Ying; Feng, Hao; Si, Bing-Cheng; Hill, Robert Lee

    2016-12-20

    The primary objective of this study was to evaluate a range of calculation points on water retention curves (WRC) instead of the singularity point at air-entry suction in the pore-solid fractal (PSF) model, which additionally considered the hysteresis effect based on the PSF theory. The modified pore-solid fractal (M-PSF) model was tested using 26 soil samples from Yangling on the Loess Plateau in China and 54 soil samples from the Unsaturated Soil Hydraulic Database. The derivation results showed that the M-PSF model is user-friendly and flexible for a wide range of calculation point options. This model theoretically describes the primary differences between the soil moisture desorption and the adsorption processes by the fractal dimensions. The M-PSF model demonstrated good performance particularly at the calculation points corresponding to the suctions from 100 cm to 1000 cm. Furthermore, the M-PSF model, using the fractal dimension of the particle size distribution, exhibited acceptable performance in WRC predictions for different textured soils when the suction values were ≥100 cm. To fully understand the function of hysteresis in the PSF theory, the role of allowable and accessible pores must be examined.

  8. A user-friendly modified pore-solid fractal model

    NASA Astrophysics Data System (ADS)

    Ding, Dian-Yuan; Zhao, Ying; Feng, Hao; Si, Bing-Cheng; Hill, Robert Lee

    2016-12-01

    The primary objective of this study was to evaluate a range of calculation points on water retention curves (WRC) instead of the singularity point at air-entry suction in the pore-solid fractal (PSF) model, which additionally considered the hysteresis effect based on the PSF theory. The modified pore-solid fractal (M-PSF) model was tested using 26 soil samples from Yangling on the Loess Plateau in China and 54 soil samples from the Unsaturated Soil Hydraulic Database. The derivation results showed that the M-PSF model is user-friendly and flexible for a wide range of calculation point options. This model theoretically describes the primary differences between the soil moisture desorption and the adsorption processes by the fractal dimensions. The M-PSF model demonstrated good performance particularly at the calculation points corresponding to the suctions from 100 cm to 1000 cm. Furthermore, the M-PSF model, using the fractal dimension of the particle size distribution, exhibited acceptable performance in WRC predictions for different textured soils when the suction values were ≥100 cm. To fully understand the function of hysteresis in the PSF theory, the role of allowable and accessible pores must be examined.

  9. SubDyn User's Guide and Theory Manual

    SciTech Connect

    Damiani, Rick; Jonkman, Jason; Hayman, Greg

    2015-09-01

    SubDyn is a time-domain structural-dynamics module for multimember fixed-bottom substructures created by the National Renewable Energy Laboratory (NREL) through U.S. Department of Energy Wind and Water Power Program support. The module has been coupled into the FAST aero-hydro-servo-elastic computer-aided engineering (CAE) tool. Substructure types supported by SubDyn include monopiles, tripods, jackets, and other lattice-type substructures common for offshore wind installations in shallow and transitional water depths. SubDyn can also be used to model lattice support structures for land-based wind turbines. This document is organized as follows. Section 1 details how to obtain the SubDyn and FAST software archives and run both the stand-alone SubDyn or SubDyn coupled to FAST. Section 2 describes the SubDyn input files. Section 3 discusses the output files generated by SubDyn; these include echo files, a summary file, and the results file. Section 4 provides modeling guidance when using SubDyn. The SubDyn theory is covered in Section 5. Section 6 outlines future work, and Section 7 contains a list of references. Example input files are shown in Appendixes A and B. A summary of available output channels are found in Appendix C. Instructions for compiling the stand-alone SubDyn program are detailed in Appendix D. Appendix E tracks the major changes we have made to SubDyn for each public release.

  10. T:XML: A Tool Supporting User Interface Model Transformation

    NASA Astrophysics Data System (ADS)

    López-Jaquero, Víctor; Montero, Francisco; González, Pascual

    Model driven development of user interfaces is based on the transformation of an abstract specification into the final user interface the user will interact with. The design of transformation rules to carry out this transformation process is a key issue in any model-driven user interface development approach. In this paper, we introduce T:XML, an integrated development environment for managing, creating and previewing transformation rules. The tool supports the specification of transformation rules by using a graphical notation that works on the basis of the transformation of the input model into a graph-based representation. T:XML allows the design and execution of transformation rules in an integrated development environment. Furthermore, the designer can also preview what the generated user interface will look like after the transformations have been applied. These previewing capabilities can be used to quickly create prototypes to discuss with the users in user-centered design methods.

  11. Modeling and flow theory

    SciTech Connect

    Not Available

    1981-10-01

    (1) We recommend the establishment of an experimental test facility, appropriately instrumented, dedicated to research on theoretical modeling concepts. Validation of models for the various flow regimes, and establishment of the limitations of concepts used in the construction of models, are sorely needed areas of research. There exists no mechanism currently for funding of such research on a systematic basis. Such a facility would provide information fundamental to progress in the physics of turbulent multi-phase flow, which would also have impact on the understanding of coal utilization processes; (2) combustion research appears to have special institutional barriers to information exchange because it is an established, commercial ongoing effort, with heavy reliance on empirical data for proprietary configurations; (3) for both gasification and combustion reactors, current models appear to handle adequately some, perhaps even most, gross aspects of the reactors such as overall efficiency and major chemical output constituents. However, new and more stringent requirements concerning NOX, SOX and POX (small particulate) production require greater understanding of process details and spatial inhomogeneities, hence refinement of current models to include some greater detail is necessary; (4) further progress in the theory of single-phase turbulent flow would benefit our understanding of both combustors and gasifiers; and (5) another area in which theoretical development would be extremely useful is multi-phase flow.

  12. The Personalized Information Retrieval Model Based on User Interest

    NASA Astrophysics Data System (ADS)

    Gong, Songjie

    Personalized information retrieval systems can help customers gain orientation amid information overload by determining which items are relevant to their interests. One type of information retrieval is content-based filtering, in which items contain words in natural language. Meanings of words in natural language are often ambiguous, and the problem of word-meaning disambiguation is often reduced to determining the semantic similarity of words. In this paper, an architecture for personalized information retrieval based on user interest is presented. The architecture includes a user interface model, a user interest model, an interest detection model, and an update model. It establishes a user model based on a user-interest keyword list on the client server, which can supply a personalized information retrieval service to the user through the communication and collaboration of all modules of the architecture.
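
    The content-based matching idea described above can be sketched by scoring documents against a user-interest keyword profile. TF-IDF weighting, cosine similarity, and the use of scikit-learn are illustrative choices here, not the paper's exact retrieval method.

        # Minimal content-based filtering sketch: rank documents by cosine similarity
        # between a user-interest keyword profile and document term vectors.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        documents = [
            "semantic similarity of words for query disambiguation",
            "stock market prices and trading volume analysis",
            "personalized retrieval driven by a user interest keyword list",
        ]
        user_profile = "user interest keywords personalized retrieval semantic similarity"

        vectorizer = TfidfVectorizer()
        doc_vectors = vectorizer.fit_transform(documents)
        profile_vector = vectorizer.transform([user_profile])

        scores = cosine_similarity(profile_vector, doc_vectors).ravel()
        for score, doc in sorted(zip(scores, documents), reverse=True):
            print(f"{score:.3f}  {doc}")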

  13. Theory, Modeling and Simulation Annual Report 2000

    SciTech Connect

    Dixon, David A.; Garrett, Bruce C.; Straatsma, Tp; Jones, Donald R.; Studham, Ronald S.; Harrison, Robert J.; Nichols, Jeffrey A.

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  14. Theory, Modeling and Simulation Annual Report 2000

    SciTech Connect

    Dixon, David A; Garrett, Bruce C; Straatsma, TP; Jones, Donald R; Studham, Scott; Harrison, Robert J; Nichols, Jeffrey A

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM and S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  15. Towards a Ubiquitous User Model for Profile Sharing and Reuse

    PubMed Central

    de Lourdes Martinez-Villaseñor, Maria; Gonzalez-Mendoza, Miguel; Hernandez-Gress, Neil

    2012-01-01

    People interact with systems and applications through several devices and are willing to share information about preferences, interests and characteristics. Social networking profiles, data from advanced sensors attached to personal gadgets, and semantic web technologies such as FOAF and microformats are valuable sources of personal information that could provide a fair understanding of the user, but profile information is scattered over different user models. Some researchers in the ubiquitous user modeling community envision the need to share user model's information from heterogeneous sources. In this paper, we address the syntactic and semantic heterogeneity of user models in order to enable user modeling interoperability. We present a dynamic user profile structure based in Simple Knowledge Organization for the Web (SKOS) to provide knowledge representation for ubiquitous user model. We propose a two-tier matching strategy for concept schemas alignment to enable user modeling interoperability. Our proposal is proved in the application scenario of sharing and reusing data in order to deal with overweight and obesity. PMID:23201995
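
    A toy sketch of a two-tier concept matching idea: compare concept labels syntactically first, then fall back to a small semantic equivalence map standing in for SKOS/ontology knowledge. The equivalence map, threshold, and matching order are assumptions for illustration, not the paper's strategy.

        # Toy two-tier concept matching sketch (illustrative, not the paper's strategy):
        # tier 1 compares concept labels syntactically; tier 2 falls back to a small
        # semantic equivalence map standing in for SKOS/ontology knowledge.
        from difflib import SequenceMatcher

        SEMANTIC_EQUIV = {                      # hypothetical, hand-crafted equivalences
            "body mass": {"weight", "bodyweight"},
            "date of birth": {"birthday", "dob"},
        }

        def match(concept_a: str, concept_b: str, threshold: float = 0.8) -> bool:
            a, b = concept_a.lower().strip(), concept_b.lower().strip()
            if SequenceMatcher(None, a, b).ratio() >= threshold:      # tier 1: syntactic
                return True
            equivalents = SEMANTIC_EQUIV.get(a, set()) | {a}
            return b in equivalents                                   # tier 2: semantic

        print(match("Date of Birth", "date of birth"))   # True (syntactic)
        print(match("body mass", "weight"))              # True (semantic)
        print(match("body mass", "height"))              # False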

  16. Towards a ubiquitous user model for profile sharing and reuse.

    PubMed

    Martinez-Villaseñor, Maria de Lourdes; Gonzalez-Mendoza, Miguel; Hernandez-Gress, Neil

    2012-09-28

    People interact with systems and applications through several devices and are willing to share information about preferences, interests and characteristics. Social networking profiles, data from advanced sensors attached to personal gadgets, and semantic web technologies such as FOAF and microformats are valuable sources of personal information that could provide a fair understanding of the user, but profile information is scattered over different user models. Some researchers in the ubiquitous user modeling community envision the need to share user model's information from heterogeneous sources. In this paper, we address the syntactic and semantic heterogeneity of user models in order to enable user modeling interoperability. We present a dynamic user profile structure based in Simple Knowledge Organization for the Web (SKOS) to provide knowledge representation for ubiquitous user model. We propose a two-tier matching strategy for concept schemas alignment to enable user modeling interoperability. Our proposal is proved in the application scenario of sharing and reusing data in order to deal with overweight and obesity.

  17. Macro System Model (MSM) User Guide, Version 1.3

    SciTech Connect

    Ruth, M.; Diakov, V.; Sa, T.; Goldsby, M.

    2011-09-01

    This user guide describes the macro system model (MSM). The MSM has been designed to allow users to analyze the financial, environmental, transitional, geographical, and R&D issues associated with the transition to a hydrogen economy. Basic end users can use the MSM to answer cross-cutting questions that were previously difficult to answer in a consistent and timely manner due to various assumptions and methodologies among different models.

  18. Probability state modeling theory.

    PubMed

    Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I

    2015-07-01

    As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.

  19. JEDI Marine and Hydrokinetic Model: User Reference Guide

    SciTech Connect

    Goldberg, M.; Previsic, M.

    2011-04-01

    The Jobs and Economic Development Impact Model (JEDI) for Marine and Hydrokinetics (MHK) is a user-friendly spreadsheet-based tool designed to demonstrate the economic impacts associated with developing and operating MHK power systems in the United States. The JEDI MHK User Reference Guide was developed to assist users in using and understanding the model. This guide provides information on the model's underlying methodology, as well as the sources and parameters used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted.

  20. Users matter: multi-agent systems model of high performance computing cluster users.

    SciTech Connect

    North, M. J.; Hood, C. S.; Decision and Information Sciences; IIT

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.
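
    A minimal sketch of user-level agents in a cluster simulation: each agent adapts its job-submission rate to the wait times it recently experienced, so aggregate load emerges from individual behavior. The adaptation rule, core count, and job sizes are illustrative assumptions, not the study's model.

        # Toy multi-agent sketch of cluster users (illustrative only): each user agent
        # adapts how many jobs it submits based on how long its last jobs waited,
        # producing aggregate load behavior that no single agent specifies.
        import random

        random.seed(7)
        CORES, STEPS, N_USERS = 64, 200, 10

        class UserAgent:
            def __init__(self):
                self.rate = random.randint(1, 5)       # jobs submitted per step
                self.last_wait = 0.0

            def submit(self):
                # Adapt: back off after long waits, ramp up after short ones.
                if self.last_wait > 2.0:
                    self.rate = max(1, self.rate - 1)
                elif self.last_wait < 0.5:
                    self.rate += 1
                return [random.randint(1, 8) for _ in range(self.rate)]   # core-hours per job

        users = [UserAgent() for _ in range(N_USERS)]
        backlog = 0.0
        for _ in range(STEPS):
            demand = 0.0
            for u in users:
                demand += sum(u.submit())
                u.last_wait = backlog / CORES          # crude wait estimate seen by the agent
            backlog = max(0.0, backlog + demand - CORES)

        print(f"final backlog (core-hours): {backlog:.1f}")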

  1. Modeling User Behavior and Attention in Search

    ERIC Educational Resources Information Center

    Huang, Jeff

    2013-01-01

    In Web search, query and click log data are easy to collect but they fail to capture user behaviors that do not lead to clicks. As search engines reach the limits inherent in click data and are hungry for more data in a competitive environment, mining cursor movements, hovering, and scrolling becomes important. This dissertation investigates how…

  2. Acquiring User Models to Test Automated Assistants

    DTIC Science & Technology

    2013-01-01

    vectors, but operates in a markedly different style from the user. Once we have extracted our traces and feature vectors from them, we will want to know... prior work focuses on performing well on a specific task, such as flying well (Šuc, Bratko, and Sammut 2004), (Bain and Sammut 1995), instead of

  3. AMEM-ADL Polymer Migration Estimation Model User's Guide

    EPA Pesticide Factsheets

    The user's guide of the Arthur D. Little Polymer Migration Estimation Model (AMEM) provides the information on how the model estimates the fraction of a chemical additive that diffuses through polymeric matrices.

  4. The 3DGRAPE book: Theory, users' manual, examples

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.

    1989-01-01

    A users' manual for a new three-dimensional grid generator called 3DGRAPE is presented. The program, written in FORTRAN, is capable of making zonal (blocked) computational grids in or about almost any shape. Grids are generated by the solution of Poisson's differential equations in three dimensions. The program automatically finds its own values for inhomogeneous terms which give near-orthogonality and controlled grid cell height at boundaries. Grids generated by 3DGRAPE have been applied to both viscous and inviscid aerodynamic problems, and to problems in other fluid-dynamic areas. The smoothness for which elliptic methods are known is seen here, including smoothness across zonal boundaries. An introduction giving the history, motivation, capabilities, and philosophy of 3DGRAPE is presented first. Then follows a chapter on the program itself. The input is then described in detail. A chapter on reading the output and debugging follows. Three examples are then described, including sample input data and plots of output. Last is a chapter on the theoretical development of the method.
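
    A two-dimensional sketch of the underlying elliptic grid-generation idea: hold boundary points fixed and iteratively smooth interior grid coordinates. 3DGRAPE solves the three-dimensional Poisson system with automatically chosen inhomogeneous (source) terms; the Laplace (zero-forcing) toy below omits those terms and one dimension, and its geometry is invented for illustration.

        # Minimal 2-D sketch of elliptic grid generation: hold boundary points fixed and
        # run point-Jacobi Laplace sweeps on the interior grid coordinates.
        import numpy as np

        ni, nj = 21, 11
        x = np.zeros((ni, nj))
        y = np.zeros((ni, nj))

        # Boundaries: a gently curved channel (illustrative geometry).
        s = np.linspace(0.0, 1.0, ni)
        x[:, 0], y[:, 0] = s, 0.2 * np.sin(np.pi * s)          # lower wall
        x[:, -1], y[:, -1] = s, 1.0 + 0.2 * np.sin(np.pi * s)  # upper wall
        for j in range(nj):                                    # left/right boundaries
            t = j / (nj - 1)
            x[0, j], y[0, j] = 0.0, t * y[0, -1]
            x[-1, j], y[-1, j] = 1.0, y[-1, 0] + t * (y[-1, -1] - y[-1, 0])

        # Initial guess: linear blend between walls, then iterative Laplace smoothing.
        for j in range(1, nj - 1):
            t = j / (nj - 1)
            x[1:-1, j] = (1 - t) * x[1:-1, 0] + t * x[1:-1, -1]
            y[1:-1, j] = (1 - t) * y[1:-1, 0] + t * y[1:-1, -1]

        for _ in range(500):
            x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] + x[1:-1, 2:] + x[1:-1, :-2])
            y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] + y[1:-1, 2:] + y[1:-1, :-2])

        print("interior point (10, 5):", x[10, 5], y[10, 5])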

  5. Artificial intelligence techniques for modeling database user behavior

    NASA Technical Reports Server (NTRS)

    Tanner, Steve; Graves, Sara J.

    1990-01-01

    The design and development of the adaptive modeling system is described. This system models how a user accesses a relational database management system in order to improve its performance by discovering user access patterns. In the current system, these patterns are used to improve the user interface and may be used to speed data retrieval, support query optimization and support a more flexible data representation. The system models both syntactic and semantic information about the user's access and employs both procedural and rule-based logic to manipulate the model.

  6. A Driving Behaviour Model of Electrical Wheelchair Users

    PubMed Central

    Hamam, Y.; Djouani, K.; Daachi, B.; Steyn, N.

    2016-01-01

    In spite of the presence of powered wheelchairs, some of the users still experience steering challenges and manoeuvring difficulties that limit their capacity of navigating effectively. For such users, steering support and assistive systems may be very necessary. For the assistance to be appreciated, the assistive control needs to be adaptable to the user's steering behaviour. This paper contributes to wheelchair steering improvement by modelling the steering behaviour of powered wheelchair users, for integration into the control system. More precisely, the modelling is based on the improved Directed Potential Field (DPF) method for trajectory planning. The method has facilitated the formulation of a simple behaviour model that is also linear in parameters. To obtain the steering data for parameter identification, seven individuals participated in driving the wheelchair in different virtual worlds on the augmented platform. The obtained data facilitated the estimation of user parameters, using the ordinary least square method, with satisfactory regression analysis results. PMID:27148362
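
    Because the behaviour model is linear in its parameters, ordinary least squares identification can be sketched in a few lines. The regressors below (heading error and an obstacle term driving an angular-velocity command) are illustrative stand-ins, not the paper's DPF formulation.

        # Minimal sketch of ordinary-least-squares identification for a model that is
        # linear in its parameters, as described in the abstract above. Regressors and
        # gains are invented for illustration.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 500
        heading_error = rng.uniform(-1.0, 1.0, n)        # rad, toward the goal
        obstacle_term = rng.uniform(0.0, 1.0, n)         # repulsive influence, normalized

        true_theta = np.array([1.8, -0.9])               # "user" gains (unknown in practice)
        omega = (true_theta[0] * heading_error + true_theta[1] * obstacle_term
                 + rng.normal(scale=0.05, size=n))       # measured angular-velocity command

        Phi = np.column_stack([heading_error, obstacle_term])      # regressor matrix
        theta_hat, *_ = np.linalg.lstsq(Phi, omega, rcond=None)
        print("estimated user parameters:", np.round(theta_hat, 3))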

  7. A Driving Behaviour Model of Electrical Wheelchair Users.

    PubMed

    Onyango, S O; Hamam, Y; Djouani, K; Daachi, B; Steyn, N

    2016-01-01

    In spite of the presence of powered wheelchairs, some of the users still experience steering challenges and manoeuvring difficulties that limit their capacity of navigating effectively. For such users, steering support and assistive systems may be very necessary. For the assistance to be appreciated, the assistive control needs to be adaptable to the user's steering behaviour. This paper contributes to wheelchair steering improvement by modelling the steering behaviour of powered wheelchair users, for integration into the control system. More precisely, the modelling is based on the improved Directed Potential Field (DPF) method for trajectory planning. The method has facilitated the formulation of a simple behaviour model that is also linear in parameters. To obtain the steering data for parameter identification, seven individuals participated in driving the wheelchair in different virtual worlds on the augmented platform. The obtained data facilitated the estimation of user parameters, using the ordinary least square method, with satisfactory regression analysis results.

  8. Characterizing Drug Non-Users as Distinctive in Prevention Messages: Implications of Optimal Distinctiveness Theory

    PubMed Central

    Comello, Maria Leonora G.

    2011-01-01

    Optimal Distinctiveness Theory posits that highly valued groups are those that can simultaneously satisfy needs to belong and to be different. The success of drug-prevention messages with a social-identity theme should therefore depend on the extent to which the group is portrayed as capable of meeting these needs. Specifically, messages that portray non-users as a large and undifferentiated majority may not be as successful as messages that emphasize uniqueness of non-users. This prediction was examined using marijuana prevention messages that depicted non-users as a distinctive or a majority group. Distinctiveness characterization lowered behavioral willingness to use marijuana among non-users (Experiment 1) and served as a source of identity threat (contingent on gender) among users (Experiment 2). PMID:21409672

  9. Evaluation Theory, Models, and Applications

    ERIC Educational Resources Information Center

    Stufflebeam, Daniel L.; Shinkfield, Anthony J.

    2007-01-01

    "Evaluation Theory, Models, and Applications" is designed for evaluators and students who need to develop a commanding knowledge of the evaluation field: its history, theory and standards, models and approaches, procedures, and inclusion of personnel as well as program evaluation. This important book shows how to choose from a growing…

  10. Treatment motivation in drug users: a theory-based analysis.

    PubMed

    Longshore, Douglas; Teruya, Cheryl

    2006-02-01

    Motivation for drug use treatment is widely regarded as crucial to a client's engagement in treatment and success in quitting drug use. Motivation is typically measured with items reflecting high treatment readiness (e.g., perceived need for treatment and commitment to participate) and low treatment resistance (e.g., skepticism regarding benefits of treatment). Building upon reactance theory and the psychotherapeutic construct of resistance, we conceptualized these two aspects of treatment motivation - readiness and resistance - as distinct constructs and examined their predictive power in a sample of 1295 drug-using offenders referred to treatment while on probation. The sample was 60.7% African Americans, 33.5% non-Hispanic Whites, and 21.2% women; their ages ranged from 16 to 63 years old. Interviews occurred at treatment entry and 6 months later. Readiness (but not resistance) predicted treatment retention during the 6-month period. Resistance (but not readiness) predicted drug use, especially among offenders for whom the treatment referral was coercive. These findings suggest that readiness and resistance should both be assessed among clients entering treatment, especially when the referral is coercive. Intake and counseling protocols should address readiness and resistance separately.

  11. Quantify uncertain emergency search techniques (QUEST) -- Theory and user's guide

    SciTech Connect

    Johnson, M.M.; Goldsby, M.E.; Plantenga, T.D.; Porter, T.L.; West, T.H.; Wilcox, W.B.; Hensley, W.K.

    1998-01-01

    As recent world events show, criminal and terrorist access to nuclear materials is a growing national concern. The national laboratories are taking the lead in developing technologies to counter these potential threats to national security. Sandia National Laboratories, with support from Pacific Northwest National Laboratory and the Bechtel Nevada Remote Sensing Laboratory, has developed QUEST (a model to Quantify Uncertain Emergency Search Techniques) to enhance the performance of organizations in the search for lost or stolen nuclear material. In addition, QUEST supports a wide range of other applications, such as environmental monitoring, nuclear facilities inspections, and searcher training. QUEST simulates the search for nuclear materials and calculates detector response for various source types and locations. The probability of detecting a radioactive source during a search is a function of many different variables, including source type, search location and structure geometry (including shielding), search dynamics (path and speed), and detector type and size. Through calculation of dynamic detector response, QUEST makes possible quantitative comparisons of various sensor technologies and search patterns. The QUEST model can be used as a tool to examine the impact of new detector technologies, explore alternative search concepts, and provide interactive search/inspector training.

  12. An Investigation of the Integrated Model of User Technology Acceptance: Internet User Samples in Four Countries

    ERIC Educational Resources Information Center

    Fusilier, Marcelline; Durlabhji, Subhash; Cucchi, Alain

    2008-01-01

    National background of users may influence the process of technology acceptance. The present study explored this issue with the new, integrated technology use model proposed by Sun and Zhang (2006). Data were collected from samples of college students in India, Mauritius, Reunion Island, and United States. Questionnaire methodology and…

  13. [Systematized care in cardiac preoperative: theory of human caring in the perspective of nurses and users].

    PubMed

    Amorim, Thais Vasconselos; Arreguy-Sena, Cristina; Alves, Marcelo da Silva; Salimena, Anna Maria de Oliveira

    2014-01-01

    This case study research aimed to understand, through the adoption of the Theory of Human Caring, the meanings of the therapeutic interpersonal relationship between nurse and user in the preoperative nursing visit after the surgical process had been experienced. The convenience sample was composed of three nurses and three users of an institution with updated records to perform highly complex cardiovascular surgery, comprising nine combinations of therapeutic interactions. Instruments structured according to the theory of Jean Watson and the North American Nursing Diagnosis Association, Nursing Interventions Classification, and Nursing Outcomes Classification taxonomies were used. The legal and ethical aspects of research involving human subjects were assured. The results revealed three clusters that capture the significance of the preoperative visit for users and five clusters that capture the perception of nurses who go through this clinical experience.

  14. How Homeless Sector Workers Deal with the Death of Service Users: A Grounded Theory Study

    ERIC Educational Resources Information Center

    Lakeman, Richard

    2011-01-01

    Homeless sector workers often encounter the deaths of service users. A modified grounded theory methodology project was used to explore how workers make sense of, respond to, and cope with sudden death. In-depth interviews were undertaken with 16 paid homeless sector workers who had experienced the death of someone with whom they worked.…

  15. Involving service users in interprofessional education narrowing the gap between theory and practice.

    PubMed

    Cooper, Helen; Spencer-Dawe, Eileen

    2006-12-01

    Calls for greater collaboration between professionals in health and social care have led to pressures to move toward interprofessional education (IPE) at both pre- and post-registration levels. Whilst this move has evolved out of "common sense" demands, such a multiple systems approach to education does not fit easily into existing traditional educational frameworks and there is, as yet, no proven theoretical framework to guide its development. A research study of an IPE intervention at the University of Liverpool in the UK drew on complexity theory to conceptualize the intervention and to evaluate its impact on a group of approximately 500 students studying physiotherapy, medicine, occupational therapy, nursing and social work. The intervention blended a multidisciplinary (non-interactive) plenary with self-directed e-learning and a series of interdisciplinary (interactive) workshops. Two evaluations took place: the first when the workshops were facilitated by trained practitioners; the second when the practitioners co-facilitated with trained service users. This paper reports findings from the second evaluation which focused on narrowing the gap between theory and practice. A multi-stakeholder evaluation was used including: students' reflective narratives, a focus group with practitioners and individual semi-structured interviews with service users. Findings showed that service users can make an important contribution to IPE for health and social care students in the early stages of their training. By exposure to a service user perspective, first year students can begin to learn and apply the principles of team work, to place the service user at the centre of the care process, to make connections between theory and "real life" experiences, and to narrow the gap between theory and practice. Findings also revealed benefits for facilitators and service users.

  16. Do recommender systems benefit users? a modeling approach

    NASA Astrophysics Data System (ADS)

    Yeung, Chi Ho

    2016-04-01

    Recommender systems are present in many web applications to guide purchase choices. They increase sales and benefit sellers, but whether they benefit customers by providing relevant products remains less explored. While in many cases the recommended products are relevant to users, in other cases customers may be tempted to purchase the products only because they are recommended. Here we introduce a model to examine the benefit of recommender systems for users, and find that recommendations from the system can be equivalent to random draws if one always follows the recommendations and seldom purchases according to his or her own preference. Nevertheless, with sufficient information about user preferences, recommendations become accurate and an abrupt transition to this accurate regime is observed for some of the studied algorithms. On the other hand, we find that high estimated accuracy indicated by common accuracy metrics is not necessarily equivalent to high real accuracy in matching users with products. This disagreement between estimated and real accuracy serves as an alarm for operators and researchers who evaluate recommender systems merely with accuracy metrics. We tested our model with a real dataset and observed similar behaviors. Finally, a recommendation approach with improved accuracy is suggested. These results imply that recommender systems can benefit users, but the more frequently a user purchases the recommended products, the less relevant the recommended products are in matching user taste.
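
    The question posed above can be sketched with a toy simulation: compare how often purchases match a user's hidden preferred category when the user always follows recommendations versus mostly buying by preference. The popularity-based recommender and all parameters are illustrative assumptions, not the paper's model.

        # Toy sketch: does following recommendations match user taste better than chance?
        # Users have hidden preferred categories; a naive popularity-based recommender
        # is assumed purely for illustration.
        import random

        random.seed(42)
        N_USERS, N_ITEMS, N_CATS, ROUNDS = 200, 500, 10, 30
        item_cat = [random.randrange(N_CATS) for _ in range(N_ITEMS)]
        user_cat = [random.randrange(N_CATS) for _ in range(N_USERS)]

        def run(p_follow):
            """Fraction of purchases matching the user's preferred category."""
            random.seed(0)
            pop = [1] * N_ITEMS
            hits = total = 0
            for _ in range(ROUNDS):
                for u in range(N_USERS):
                    if random.random() < p_follow:
                        item = max(range(N_ITEMS), key=lambda i: pop[i])   # top recommendation
                    else:
                        item = random.choice([i for i in range(N_ITEMS) if item_cat[i] == user_cat[u]])
                    pop[item] += 1
                    hits += item_cat[item] == user_cat[u]
                    total += 1
            return hits / total

        print("always follow recommendations:", round(run(1.0), 3))   # close to 1/N_CATS (random draws)
        print("mostly buy by own preference: ", round(run(0.2), 3))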

  17. HYDROCARBON SPILL SCREENING MODEL (HSSM) VOLUME 1: USER'S GUIDE

    EPA Science Inventory

    This users guide describes the Hydrocarbon Spill Screening Model (HSSM). The model is intended for simulation of subsurface releases of light nonaqueous phase liquids (LNAPLs). The model consists of separate modules for LNAPL flow through the vadose zone, spreading in the capil...

  18. USERS MANUAL: LANDFILL GAS EMISSIONS MODEL - VERSION 2.0

    EPA Science Inventory

    The document is a user's guide for a computer model, Version 2.0 of the Landfill Gas Emissions Model (LandGEM), for estimating air pollution emissions from municipal solid waste (MSW) landfills. The model can be used to estimate emission rates for methane, carbon dioxide, nonmet...

  19. Modeling User Behavior in Computer Learning Tasks.

    ERIC Educational Resources Information Center

    Mantei, Marilyn M.

    Model building techniques from Artificial Intelligence and Information-Processing Psychology are applied to human-computer interface tasks to evaluate existing interfaces and suggest new and better ones. The model is in the form of an augmented transition network (ATN) grammar which is built by applying grammar induction heuristics on a sequential…

  20. A Hybrid Tool for User Interface Modeling and Prototyping

    NASA Astrophysics Data System (ADS)

    Trætteberg, Hallvard

    Although many methods have been proposed, model-based development methods have only to some extent been adopted for UI design. In particular, they are not easy to combine with user-centered design methods. In this paper, we present a hybrid UI modeling and GUI prototyping tool, which is designed to fit better with IS development and UI design traditions. The tool includes a diagram editor for domain and UI models and an execution engine that integrates UI behavior, live UI components and sample data. Thus, both model-based user interface design and prototyping-based iterative design are supported.

  1. METAPHOR (version 1): Users guide. [performability modeling

    NASA Technical Reports Server (NTRS)

    Furchtgott, D. G.

    1979-01-01

    General information concerning METAPHOR, an interactive software package to facilitate performability modeling and evaluation, is presented. Example systems are studied and their performabilities are calculated. Each available METAPHOR command and array generator is described. Complete METAPHOR sessions are included.

  2. Users of middle atmosphere models remarks

    NASA Technical Reports Server (NTRS)

    Gamble, Joe

    1987-01-01

    The procedure followed for shuttle operations is to calculate descent trajectories for each potential shuttle landing site using the Global Reference Atmosphere Model (GRAM) to interactively compute density along the flight path 100 times to bound the statistics. The purpose is to analyze the flight dynamics, along with calculations of heat loads during reentry. The analysis program makes use of the modified version of the Jacchia-70 atmosphere, which includes He bulges over the poles and seasonal latitude variations at lower altitudes. For the troposphere, the 4-D Model is used up to 20 km and Groves from 30 km up to 90 km. It is extrapolated over the globe and faired into the Jacchia atmosphere between 90 and 115 km. Since data for the Southern Hemisphere were lacking, the Northern Hemisphere data were flipped over and lagged by six months. Sometimes when winds are calculated from pressure data in the model there appear to be discontinuities. Modelers indicated that the GRAM was not designed to produce winds, but good wind data is needed for the landing phase of shuttle operations. Use of atmospheric models during reentry is one application where it is obvious that a single integrated atmosphere model is required.

  3. User's instructions for the erythropoiesis regulatory model

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The purpose of the model is to provide a method for analyzing some of the events that could account for the decrease in red cell mass observed in crewmen returning from space missions. The model is based on the premise that erythrocyte production is governed by the balance between oxygen supply and demand at a renal sensing site. Oxygen supply is taken to be a function of arterial oxygen tension, mean corpuscular hemoglobin concentration, oxy-hemoglobin carrying capacity, hematocrit, and blood flow. Erythrocyte destruction is based on the law of mass action. The instantaneous hematocrit value is derived by integrating changes in production and destruction rates and accounting for the degree of plasma dilution.
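
    The feedback loop described above can be sketched as a simple balance equation: production rises when oxygen supply at the sensing site falls below demand, destruction is first order, and red cell volume follows from integrating the difference. All parameter values below are illustrative, not those of the documented model.

        # Minimal sketch of the described feedback loop: red-cell production rises when
        # oxygen supply falls short of demand at a renal sensing site, destruction is
        # first order (law of mass action), and red cell volume / hematocrit follow from
        # integrating production minus destruction. Parameter values are illustrative.
        def simulate(days=120, dt=0.1, o2_demand=1.0, plasma_volume=2.8):
            rbc_volume = 2.2                       # liters of red cells (illustrative)
            k_destroy = 1.0 / 120.0                # ~120-day mean red-cell lifetime
            base_production, gain = rbc_volume * k_destroy, 0.05
            for _ in range(int(days / dt)):
                hct = rbc_volume / (rbc_volume + plasma_volume)
                o2_supply = 2.3 * hct              # crude proxy: supply scales with hematocrit
                production = max(0.0, base_production + gain * (o2_demand - o2_supply))
                rbc_volume += dt * (production - k_destroy * rbc_volume)
            return rbc_volume, rbc_volume / (rbc_volume + plasma_volume)

        rbc0, hct0 = simulate()
        rbc1, hct1 = simulate(plasma_volume=2.4)   # reduced plasma volume, as in flight
        print(f"baseline:       red cell volume {rbc0:.2f} L, hematocrit {hct0:.3f}")
        print(f"reduced plasma: red cell volume {rbc1:.2f} L, hematocrit {hct1:.3f}")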

  4. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  5. Snowmelt Runoff Model (SRM) User's Manual

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This 2008 edition of the User’s Manual presents a new computer program, the Windows Version 1.11 of the Snowmelt Runoff Model (WinSRM). The popular Version 4 is also preserved in the Appendix because it is still in demand to be used within its limits. The Windows version adds new capabilities: it ac...

  6. Engineer Modeling Study. Volume II. Users Manual.

    DTIC Science & Technology

    1982-09-01

    Users are expected to have some experience with automated data processing (ADP) systems. Instructions are given for model operation on the Boeing Computer system.

  7. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE(TM) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  8. Supplement to wellbore models GWELL, GWNACL, and HOLA User's Guide

    SciTech Connect

    Hadgu, T.; Bodvarsson, G.S.

    1992-09-01

    A study was made on improving the applicability and ease of usage of the wellbore simulators HOLA, GWELL and GWNACL (Bjornsson, 1987; Aunzo et al., 1991). The study concentrated mainly on the usage of Option 2 (please refer to the User`s Guide; Aunzo et al., 1991) and modeling flow of superheated steam when using these computer codes. Amendments were made to the simulators to allow implementation of a variety of input data. A wide range of input data was used to test the modifications to the codes. The study did not attempt to modify or improve the physics or formulations which were used in the models. It showed that a careful check of the input data is required. This report addresses these two areas of interest: usage of Option 2, and simulation of wellbore flow of superheated steam.

  9. Simple Thermal Environment Model (STEM) User's Guide

    NASA Technical Reports Server (NTRS)

    Justus, C.G.; Batts, G. W.; Anderson, B. J.; James, B. F.

    2001-01-01

    This report presents a Simple Thermal Environment Model (STEM) for determining appropriate engineering design values to specify the thermal environment of Earth-orbiting satellites. The thermal environment of a satellite, consists of three components: (1) direct solar radiation, (2) Earth-atmosphere reflected shortwave radiation, as characterized by Earth's albedo, and (3) Earth-atmosphere-emitted outgoing longwave radiation (OLR). This report, together with a companion "guidelines" report provides methodology and guidelines for selecting "design points" for thermal environment parameters for satellites and spacecraft systems. The methods and models reported here are outgrowths of Earth Radiation Budget Experiment (ERBE) satellite data analysis and thermal environment specifications discussed by Anderson and Smith (1994). In large part, this report is intended to update (and supersede) those results.
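    As a minimal sketch of how the three named components combine, the snippet below sums direct solar, Earth-reflected shortwave, and Earth-emitted longwave loads on a simple surface. The solar constant, albedo, OLR, and view-factor values are placeholders for illustration, not STEM design points.

```python
# Minimal sketch: absorbed heat flux on a simple orbiting surface from the three
# components named in the abstract. All numbers are placeholders, not STEM design values.
SOLAR_CONSTANT = 1367.0   # W/m^2, nominal
ALBEDO = 0.3              # Earth albedo, placeholder
OLR = 240.0               # outgoing longwave radiation, W/m^2, placeholder
VIEW_FACTOR = 0.9         # Earth view factor for a low-orbit surface (assumed)

def absorbed_flux(absorptivity: float, emissivity: float, in_sunlight: bool) -> float:
    """Sum direct solar, Earth-reflected shortwave, and Earth-emitted longwave loads."""
    direct = absorptivity * SOLAR_CONSTANT if in_sunlight else 0.0
    reflected = absorptivity * ALBEDO * SOLAR_CONSTANT * VIEW_FACTOR if in_sunlight else 0.0
    longwave = emissivity * OLR * VIEW_FACTOR
    return direct + reflected + longwave

print(f"sunlit flux:  {absorbed_flux(0.3, 0.8, True):7.1f} W/m^2")
print(f"eclipse flux: {absorbed_flux(0.3, 0.8, False):7.1f} W/m^2")
```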

  10. C++ Model Developer (CMD) User Guide

    DTIC Science & Technology

    2005-04-01

    C++ Model Developer (CMD) is an open-source, C++ source code based environment for building simulations of systems described by time-based ... The report also includes a chapter on scalability: building a 6 degree-of-freedom (6DOF) simulation.

  11. Modeling mutual feedback between users and recommender systems

    NASA Astrophysics Data System (ADS)

    Zeng, An; Yeung, Chi Ho; Medo, Matúš; Zhang, Yi-Cheng

    2015-07-01

    Recommender systems daily influence our decisions on the Internet. While considerable attention has been given to issues such as recommendation accuracy and user privacy, the long-term mutual feedback between a recommender system and the decisions of its users has been neglected so far. We propose here a model of network evolution which allows us to study the complex dynamics induced by this feedback, including the hysteresis effect which is typical for systems with non-linear dynamics. Despite the popular belief that recommendation helps users to discover new things, we find that the long-term use of recommendation can contribute to the rise of extremely popular items and thus ultimately narrow the user choice. These results are supported by measurements of the time evolution of item popularity inequality in real systems. We show that this adverse effect of recommendation can be tamed by sacrificing part of short-term recommendation accuracy.

  12. Solid rocket booster performance evaluation model. Volume 2: Users manual

    NASA Technical Reports Server (NTRS)

    1974-01-01

    This users manual for the solid rocket booster performance evaluation model (SRB-II) contains descriptions of the model, the program options, the required program inputs, the program output format and the program error messages. SRB-II is written in FORTRAN and is operational on both the IBM 370/155 and the MSFC UNIVAC 1108 computers.

  13. Earthquake Early Warning Beta Users: Java, Modeling, and Mobile Apps

    NASA Astrophysics Data System (ADS)

    Strauss, J. A.; Vinci, M.; Steele, W. P.; Allen, R. M.; Hellweg, M.

    2014-12-01

    Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds of warning prior to ground shaking at a user's location. The purpose of such a system is to reduce or minimize the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, University of Washington, the USGS, and beta users in California and the Pacific Northwest. The beta users receive earthquake information very rapidly in real time and are providing feedback on their experiences of performance and potential uses within their organizations. Beta user interactions allow the ShakeAlert team to discern which alert delivery options are most effective, what changes would make the UserDisplay more useful in a pre-disaster situation, and most importantly, what actions users plan to take for various scenarios. Actions could include: personal safety approaches, such as drop, cover, and hold on; automated processes and procedures, such as opening elevator or fire station doors; or situational awareness. Users are beginning to determine which policy and technological changes may need to be enacted, and what funding is required to implement their automated controls. The use of models and mobile apps is beginning to augment the basic Java desktop applet. Modeling allows beta users to test their early warning responses against various scenarios without having to wait for a real event. Mobile apps are also changing the possible response landscape, providing other avenues for people to receive information. All of these combine to improve business continuity and resiliency.

  14. Modeling Users, Context and Devices for Ambient Assisted Living Environments

    PubMed Central

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-01-01

    The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, the significance of considering users' preferences, context conditions and devices' capabilities helps smart environments to personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give some meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices considering different approaches in the Ambient Assisted Living domain. We also note several ongoing standardization efforts in this area. We further discuss the techniques used, the characteristics modeled and the advantages and drawbacks of each approach, and finally draw several conclusions about the reviewed works. PMID:24643006

  15. Modeling users, context and devices for ambient assisted living environments.

    PubMed

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-03-17

    The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, the significance of considering users' preferences, context conditions and devices' capabilities helps smart environments to personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give some meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices considering different approaches in the Ambient Assisted Living domain. We also note several ongoing standardization efforts in this area. We further discuss the techniques used, the characteristics modeled and the advantages and drawbacks of each approach, and finally draw several conclusions about the reviewed works.

  16. Geothermal loan guaranty cash flow model: description and users' manual

    SciTech Connect

    Keimig, M.A.; Rosenberg, J.I.; Entingh, D.J.

    1980-11-01

    This is the users guide for the Geothermal Loan Guaranty Cash Flow Model (GCFM). GCFM is a Fortran code which designs and costs geothermal fields and electric power plants. It contains a financial analysis module which performs life cycle costing analysis taking into account various types of taxes, costs and financial structures. The financial module includes a discounted cash flow feature which calculates a levelized breakeven price for each run. The user's guide contains descriptions of the data requirements and instructions for using the model.

  17. Designing visual displays and system models for safe reactor operations based on the user`s perspective of the system

    SciTech Connect

    Brown-VanHoozer, S.A.

    1995-12-31

    Most designers are not schooled in the area of human-interaction psychology and therefore tend to rely on the traditional ergonomic aspects of human factors when designing complex human-interactive workstations related to reactor operations. They do not take into account the differences in user information processing behavior and how these behaviors may affect individual and team performance when accessing visual displays or utilizing system models in process and control room areas. Unfortunately, by ignoring the importance of the integration of the user interface at the information process level, the result can be sub-optimization and inherently error- and failure-prone systems. Therefore, to minimize or eliminate failures in human-interactive systems, it is essential that the designers understand how each user`s processing characteristics affects how the user gathers information, and how the user communicates the information to the designer and other users. A different type of approach in achieving this understanding is Neuro Linguistic Programming (NLP). The material presented in this paper is based on two studies involving the design of visual displays, NLP, and the user`s perspective model of a reactor system. The studies involve the methodology known as NLP, and its use in expanding design choices from the user`s ``model of the world,`` in the areas of virtual reality, workstation design, team structure, decision and learning style patterns, safety operations, pattern recognition, and much, much more.

  18. A mixing evolution model for bidirectional microblog user networks

    NASA Astrophysics Data System (ADS)

    Yuan, Wei-Guo; Liu, Yun

    2015-08-01

    Microblogs have been widely used as a new form of online social networking. Based on user profile data collected from Sina Weibo, we find that the number of a microblog user's bidirectional friends approximately follows a lognormal distribution. We then build two microblog user networks with real bidirectional relationships, both of which are not only small-world and scale-free but also exhibit some special properties, such as a double power-law degree distribution, disassortativity, and hierarchical and rich-club structure. Moreover, by detecting the community structures of the two real networks, we find that both of their community-size distributions follow an exponential distribution. Based on the empirical analysis, we present a novel evolving network model with mixed connection rules, including lognormal fitness preferential attachment, random attachment, nearest-neighbor interconnection within the same community, and global random association across different communities. The simulation results show that our model is consistent with the real networks in many topological features.

  19. H2A Production Model, Version 2 User Guide

    SciTech Connect

    Steward, D.; Ramsden, T.; Zuboy, J.

    2008-09-01

    The H2A Production Model analyzes the technical and economic aspects of central and forecourt hydrogen production technologies. Using a standard discounted cash flow rate of return methodology, it determines the minimum hydrogen selling price, including a specified after-tax internal rate of return from the production technology. Users have the option of accepting default technology input values--such as capital costs, operating costs, and capacity factor--from established H2A production technology cases or entering custom values. Users can also modify the model's financial inputs. This new version of the H2A Production Model features enhanced usability and functionality. Input fields are consolidated and simplified. New capabilities include performing sensitivity analyses and scaling analyses to various plant sizes. This User Guide helps users already familiar with the basic tenets of H2A hydrogen production cost analysis get started using the new version of the model. It introduces the basic elements of the model then describes the function and use of each of its worksheets.
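    The core calculation named here, finding the hydrogen price at which the discounted cash flow yields the specified after-tax internal rate of return, can be sketched as below. The cash-flow structure (lump-sum capital, constant annual O&M and output, no tax or depreciation detail) is a deliberate simplification, and all numbers are placeholders rather than H2A defaults or worksheet logic.

```python
# Sketch of the core idea only: find the hydrogen price at which the project's NPV,
# discounted at the target after-tax IRR, is zero. Inputs are placeholders.
def npv(price, capital, annual_om, annual_kg_h2, irr, years):
    cash_flows = [-capital] + [(price * annual_kg_h2 - annual_om)] * years
    return sum(cf / (1.0 + irr) ** t for t, cf in enumerate(cash_flows))

def minimum_selling_price(capital, annual_om, annual_kg_h2, irr=0.10, years=20):
    lo, hi = 0.0, 100.0                      # $/kg search bracket
    for _ in range(60):                      # bisection on the breakeven price
        mid = 0.5 * (lo + hi)
        if npv(mid, capital, annual_om, annual_kg_h2, irr, years) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

price = minimum_selling_price(capital=500e6, annual_om=25e6, annual_kg_h2=50e6)
print(f"minimum hydrogen selling price: ${price:.2f}/kg")
```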

  20. Theory of hadronic nonperturbative models

    SciTech Connect

    Coester, F.; Polyzou, W.N.

    1995-08-01

    As more data probing hadron structure become available hadron models based on nonperturbative relativistic dynamics will be increasingly important for their interpretation. Relativistic Hamiltonian dynamics of few-body systems (constituent-quark models) and many-body systems (parton models) provides a precisely defined approach and a useful phenomenology. However such models lack a quantitative foundation in quantum field theory. The specification of a quantum field theory by a Euclidean action provides a basis for the construction of nonperturbative models designed to maintain essential features of the field theory. For finite systems it is possible to satisfy axioms which guarantee the existence of a Hilbert space with a unitary representation of the Poincare group and the spectral condition which ensures that the spectrum of the four-momentum operator is in the forward light cone. The separate axiom which guarantees locality of the field operators can be weakened for the construction for few-body models. In this context we are investigating algebraic and analytic properties of model Schwinger functions. This approach promises insight into the relations between hadronic models based on relativistic Hamiltonian dynamics on one hand and Bethe-Salpeter Green`s-function equations on the other.

  1. User`s guide to the META-Net economic modeling system. Version 1.2

    SciTech Connect

    Lamont, A.

    1994-11-24

    In a market economy demands for commodities are met through various technologies and resources. Markets select the technologies and resources to meet these demands based on their costs. Over time, the competitiveness of different technologies can change due to the exhaustion of resources they depend on, the introduction of newer, more efficient technologies, or even shifts in user demands. As this happens, the structure of the economy changes. The Market Equilibrium and Technology Assessment Network Modelling System, META{center_dot}Net, has been developed for building and solving multi-period equilibrium models to analyze the shifts in the energy system that may occur as new technologies are introduced and resources are exhausted. META{center_dot}Net allows a user to build and solve complex economic models. It models a market economy as a network of nodes representing resources, conversion processes, markets, and end-use demands. Commodities flow through this network from resources, through conversion processes and markets, to the end-users. META{center_dot}Net then finds the multiperiod equilibrium prices and quantities. The solution includes the prices and quantities demanded for each commodity along with the capacity additions (and retirements) for each conversion process, and the trajectories of resource extraction. Although the changes in the economy are largely driven by consumers` behavior and the costs of technologies and resources, they are also affected by various government policies. These can include constraints on prices and quantities, and various taxes and constraints on environmental emissions. META{center_dot}Net can incorporate many of these mechanisms and evaluate their potential impact on the development of the economic system.

  2. Shawnee flue gas desulfurization computer model users manual

    SciTech Connect

    Sudhoff, F.A.; Torstrick, R.L.

    1985-03-01

    In conjunction with the US Environmental Protection Agency sponsored Shawnee test program, Bechtel National, Inc., and the Tennessee Valley Authority jointly developed a computer model capable of projecting preliminary design and economics for lime- and limestone-scrubbing flue gas desulfurization systems. The model is capable of projecting relative economics for spray tower, turbulent contact absorber, and venturi-spray tower scrubbing options. It may be used to project the effect on system design and economics of variations in required SO/sub 2/ removal, scrubber operating parameters (gas velocity, liquid-to-gas (L/G) ratio, alkali stoichiometry, liquor hold time in slurry recirculation tanks), reheat temperature, and scrubber bypass. It may also be used to evaluate the effect of alternative waste disposal methods or additives (MgO or adipic acid) on costs for the selected process. Although the model is not intended to project the economics of an individual system to a high degree of accuracy, it allows prospective users to quickly project comparative design and costs for limestone and lime case variations on a common design and cost basis. The users manual provides a general description of the Shawnee FGD computer model and detailed instructions for its use. It describes and explains the required user-supplied input data, such as boiler size, coal characteristics, and SO/sub 2/ removal requirements. Output includes a material balance, equipment list, and detailed capital investment and annual revenue requirements. The users manual provides information concerning the use of the overall model as well as sample runs to serve as a guide to prospective users in identifying applications. The FORTRAN-based model is maintained by TVA, from whom copies or individual runs are available. 25 refs., 3 figs., 36 tabs.

  3. Solid rocket booster thermal radiation model. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Lee, A. L.

    1976-01-01

    A user's manual was prepared for the computer program of a solid rocket booster (SRB) thermal radiation model. The following information was included: (1) structure of the program, (2) input information required, (3) examples of input cards and output printout, (4) program characteristics, and (5) program listing.

  4. Using Partial Credit and Response History to Model User Knowledge

    ERIC Educational Resources Information Center

    Van Inwegen, Eric G.; Adjei, Seth A.; Wang, Yan; Heffernan, Neil T.

    2015-01-01

    User modelling algorithms such as Performance Factors Analysis and Knowledge Tracing seek to determine a student's knowledge state by analyzing (among other features) right and wrong answers. Anyone who has ever graded an assignment by hand knows that some answers are "more wrong" than others; i.e. they display less of an understanding…
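    The abstract names Performance Factors Analysis and Knowledge Tracing as its starting points. As a hedged illustration of the modelling ingredients involved, the sketch below shows a standard Bayesian Knowledge Tracing update extended with a naive partial-credit interpolation; it is not the authors' algorithm, and the parameter values are arbitrary.

```python
# Standard Bayesian Knowledge Tracing update, with a naive partial-credit twist:
# a score in [0, 1] linearly blends the "correct" and "incorrect" posteriors.
# This illustrates the ingredients named in the abstract, not the authors' method.
def bkt_update(p_know, score, p_learn=0.1, p_slip=0.1, p_guess=0.2):
    p_correct_given_know = 1.0 - p_slip
    p_correct_given_not = p_guess
    # posteriors after a fully correct / fully incorrect response
    post_correct = (p_know * p_correct_given_know) / (
        p_know * p_correct_given_know + (1 - p_know) * p_correct_given_not)
    post_wrong = (p_know * p_slip) / (
        p_know * p_slip + (1 - p_know) * (1 - p_guess))
    # partial credit: interpolate between the two posteriors
    posterior = score * post_correct + (1.0 - score) * post_wrong
    # account for learning on this practice opportunity
    return posterior + (1.0 - posterior) * p_learn

p = 0.3
for s in [0.0, 0.5, 1.0, 1.0]:          # a short response history with partial credit
    p = bkt_update(p, s)
    print(f"score {s:.1f} -> P(known) = {p:.3f}")
```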

  5. The Integrated Decision Modeling System (IDMS) User’s Manual

    DTIC Science & Technology

    1991-05-01

    AL-TP-1991-0009 (AD-A236 033). The Integrated Decision Modeling System (IDMS) User's Manual. Jonathan C. Fast and John N. Taylor, Metrica, Incorporated, 8301 Broadway, Suite 215, San ...

  6. Dynamic User Modeling within a Game-Based ITS

    ERIC Educational Resources Information Center

    Snow, Erica L.

    2015-01-01

    Intelligent tutoring systems are adaptive learning environments designed to support individualized instruction. The adaptation embedded within these systems is often guided by user models that represent one or more aspects of students' domain knowledge, actions, or performance. The proposed project focuses on the development and testing of user…

  7. Adaptive User Model for Web-Based Learning Environment.

    ERIC Educational Resources Information Center

    Garofalakis, John; Sirmakessis, Spiros; Sakkopoulos, Evangelos; Tsakalidis, Athanasios

    This paper describes the design of an adaptive user model and its implementation in an advanced Web-based Virtual University environment that encompasses combined and synchronized adaptation between educational material and well-known communication facilities. The Virtual University environment has been implemented to support a postgraduate…

  8. ABAREX -- A neutron spherical optical-statistical-model code -- A user`s manual

    SciTech Connect

    Smith, A.B.; Lawson, R.D.

    1998-06-01

    The contemporary version of the neutron spherical optical-statistical-model code ABAREX is summarized with the objective of providing detailed operational guidance for the user. The physical concepts involved are very briefly outlined. The code is described in some detail and a number of explicit examples are given. With this document one should very quickly become fluent with the use of ABAREX. While the code has operated on a number of computing systems, this version is specifically tailored for the VAX/VMS work station and/or the IBM-compatible personal computer.

  9. A general graphical user interface for automatic reliability modeling

    NASA Technical Reports Server (NTRS)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1991-01-01

    Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have field texts, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.
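    For context, the sketch below shows the kind of Markov reliability model such a GUI might generate for a small PMS structure: a duplex processor modeled as a three-state continuous-time Markov chain, integrated numerically to obtain reliability over a mission time. The rates and state structure are illustrative assumptions, not output of the tool described.

```python
import numpy as np

# Minimal sketch of a Markov reliability model a GUI of this kind might generate:
# a duplex processor with failure rate lam per unit and repair rate mu.
# States: {both good, one good, system failed}. Rates are illustrative.
lam, mu = 1e-4, 1e-2          # per-hour failure and repair rates (assumed)
Q = np.array([                # CTMC generator matrix
    [-2 * lam,      2 * lam,  0.0],
    [      mu, -(mu + lam),   lam],
    [     0.0,          0.0,  0.0],   # failed state is absorbing
])

p = np.array([1.0, 0.0, 0.0])          # start with both units good
dt, t_end = 0.1, 10_000.0              # hours
for _ in range(int(t_end / dt)):       # forward Euler on dp/dt = p Q
    p = p + dt * (p @ Q)

print(f"reliability at {t_end:.0f} h: {1.0 - p[2]:.6f}")
```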

  10. A User Modelling Approach for Computer-Based Critiquing

    DTIC Science & Technology

    1990-01-01

    9.1.2 Explicit Acquisition Methods; 9.1.3 Tutoring-based Methods; 9.1.4 Statistical Analysis of User's ... accomplish the second process are the subject of this research. Cooperative problem solving systems assume that the third process is inherent in the ... process of human-computer interaction. The second class of models above are psychological models developed by and for the analysis of human behavior.

  11. User interface for ground-water modeling: Arcview extension

    USGS Publications Warehouse

    Tsou, M.-S.; Whittemore, D.O.

    2001-01-01

    Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software that provides a robust user-friendly interface to facilitate data handling and display. An extension is an add-on program to ArcView that provides additional specialized functions. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension for facilitating modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the ArcView interface by external linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.

  12. Airport-Noise Levels and Annoyance Model (ALAMO) user's guide

    NASA Technical Reports Server (NTRS)

    Deloach, R.; Donaldson, J. L.; Johnson, M. J.

    1986-01-01

    A guide for the use of the Airport-Noise Level and Annoyance MOdel (ALAMO) at the Langley Research Center computer complex is provided. This document is divided into 5 primary sections, the introduction, the purpose of the model, and an in-depth description of the following subsystems: baseline, noise reduction simulation and track analysis. For each subsystem, the user is provided with a description of architecture, an explanation of subsystem use, sample results, and a case runner's check list. It is assumed that the user is familiar with the operations at the Langley Research Center (LaRC) computer complex, the Network Operating System (NOS 1.4) and CYBER Control Language. Incorporated within the ALAMO model is a census database system called SITE II.

  13. The Snowmelt-Runoff Model (SRM) user's manual

    NASA Technical Reports Server (NTRS)

    Martinec, J.; Rango, A.; Major, E.

    1983-01-01

    A manual to provide a means by which a user may apply the snowmelt runoff model (SRM) unaided is presented. Model structure, conditions of application, and data requirements, including remote sensing, are described. Guidance is given for determining various model variables and parameters. Possible sources of error are discussed, and conversion of the snowmelt runoff model (SRM) from the simulation mode to the operational forecasting mode is explained. A computer program for running SRM is presented; it is easily adaptable to most systems used by water resources agencies.
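    For reference, the daily discharge equation as commonly quoted in SRM documentation takes the form below; consult the manual itself for the zone-wise and lagged forms and the exact symbol definitions.

```latex
% SRM daily discharge equation as commonly quoted in the SRM documentation:
% Q = discharge (m^3/s), c_S and c_R = runoff coefficients for snow and rain,
% a = degree-day factor (cm C^-1 d^-1), T + Delta T = degree-days, S = snow-covered
% area fraction, P = precipitation (cm), A = basin area (km^2), k = recession coefficient.
Q_{n+1} = \bigl[c_{S,n}\, a_n\,(T_n + \Delta T_n)\, S_n + c_{R,n} P_n\bigr]
          \,\frac{A \cdot 10000}{86400}\,(1 - k_{n+1}) + Q_n\, k_{n+1}
```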

  14. SN_GUI: a graphical user interface for snowpack modeling

    NASA Astrophysics Data System (ADS)

    Spreitzhofer, G.; Fierz, C.; Lehning, M.

    2004-10-01

    SNOWPACK is a physical snow cover model. The model not only serves as a valuable research tool, but also runs operationally on a network of high Alpine automatic weather and snow measurement sites. In order to facilitate the operation of SNOWPACK and the interpretation of the results obtained by this model, a user-friendly graphical user interface for snowpack modeling, named SN_GUI, was created. This Java-based and thus platform-independent tool can be operated in two modes, one designed to fulfill the requirements of avalanche warning services (e.g. by providing information about critical layers within the snowpack that are closely related to the avalanche activity), and the other one offering a variety of additional options satisfying the needs of researchers. The user of SN_GUI is graphically guided through the entire process of creating snow cover simulations. The starting point is the efficient creation of input parameter files for SNOWPACK, followed by the launching of SNOWPACK with a variety of parameter settings. Finally, after the successful termination of the run, a number of interactive display options may be used to visualize the model output. Among these are vertical profiles and time profiles for many parameters. Besides other features, SN_GUI allows the use of various color, time and coordinate scales, and the comparison of measured and observed parameters.

  15. Simplified analytical model of penetration with lateral loading -- User`s guide

    SciTech Connect

    Young, C.W.

    1998-05-01

    The SAMPLL (Simplified Analytical Model of Penetration with Lateral Loading) computer code was originally developed in 1984 to realistically yet economically predict penetrator/target interactions. Since the code`s inception, its use has spread throughout the conventional and nuclear penetrating weapons community. During the penetrator/target interaction, the resistance of the material being penetrated imparts both lateral and axial loads on the penetrator. These loads cause changes to the penetrator`s motion (kinematics). SAMPLL uses empirically based algorithms, formulated from an extensive experimental data base, to replicate the loads the penetrator experiences during penetration. The lateral loads resulting from angle of attack and trajectory angle of the penetrator are explicitly treated in SAMPLL. The loads are summed and the kinematics calculated at each time step. SAMPLL has been continually improved, and the current version, Version 6.0, can handle cratering and spall effects, multiple target layers, penetrator damage/failure, and complex penetrator shapes. Version 6 uses the latest empirical penetration equations, and also automatically adjusts the penetrability index for certain target layers to account for layer thickness and confinement. This report describes the SAMPLL code, including assumptions and limitations, and includes a user`s guide.

  16. EpiPOD : community vaccination and dispensing model user's guide.

    SciTech Connect

    Berry, M.; Samsa, M.; Walsh, D.; Decision and Information Sciences

    2009-01-09

    EpiPOD is a modeling system that enables local, regional, and county health departments to evaluate and refine their plans for mass distribution of antiviral and antibiotic medications and vaccines. An intuitive interface requires users to input as few or as many plan specifics as are available in order to simulate a mass treatment campaign. Behind the input interface, a system dynamics model simulates pharmaceutical supply logistics, hospital and first-responder personnel treatment, population arrival dynamics and treatment, and disease spread. When the simulation is complete, users have estimates of the number of illnesses in the population at large, the number of ill persons seeking treatment, and queuing and delays within the mass treatment system--all metrics by which the plan can be judged.

  17. Modeling the user preference on broadcasting contents using Bayesian networks

    NASA Astrophysics Data System (ADS)

    Kang, Sanggil; Lim, Jeongyeon; Kim, Munchurl

    2004-01-01

    In this paper, we introduce a new supervised learning method of a Bayesian network for user preference models. Unlike other preference models, our method traces the trend of a user's preference as time passes. It allows online learning, so exhaustive data collection is not needed. The trend is traced by modifying the frequency of attributes in order to force the old preference to be correlated with the current preference, under the assumption that the current preference is correlated with the near-future preference. The objective of our learning method is to reinforce the mutual information by modifying the frequency of the attributes in the old preference through weights assigned to the attributes. Along with the mathematical derivation of our learning method, we present experimental results on learning and reasoning performance for TV genre preference using a real set of TV program watching history data.
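    The general idea of re-weighting old attribute frequencies so that the old preference stays correlated with the current one can be sketched as follows. The exponential decay scheme and the genre examples here are illustrative assumptions, not the authors' derivation or weighting formula.

```python
from collections import defaultdict

# Sketch of the general idea only: keep per-attribute frequency counts for a
# preference model (e.g., TV genre) and re-weight the old counts at each update so
# the old preference stays correlated with the current one. The exponential decay
# is an illustrative assumption, not the authors' derivation.
class GenrePreference:
    def __init__(self, decay=0.8):
        self.decay = decay
        self.counts = defaultdict(float)

    def update(self, watched_genres):
        for genre in self.counts:            # down-weight the old preference
            self.counts[genre] *= self.decay
        for genre in watched_genres:         # reinforce the current preference
            self.counts[genre] += 1.0

    def distribution(self):
        total = sum(self.counts.values()) or 1.0
        return {g: c / total for g, c in sorted(self.counts.items())}

model = GenrePreference()
for day in [["news", "drama"], ["drama"], ["sports", "drama"], ["sports"]]:
    model.update(day)
print(model.distribution())
```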

  18. Dynamic Trust Models between Users over Social Networks

    DTIC Science & Technology

    2016-03-30

    AFRL-AFOSR-JP-TR-2016-0039: Dynamic Trust Models between Users over Social Networks. Kazumi Saito, University of Shizuoka. Final report, 04/05/2016 (responsible person: Hiroshi Motoda, Ph.D.).

  19. Design of personalized search engine based on user-webpage dynamic model

    NASA Astrophysics Data System (ADS)

    Li, Jihan; Li, Shanglin; Zhu, Yingke; Xiao, Bo

    2013-12-01

    A personalized search engine focuses on establishing a user-webpage dynamic model. In this model, users' personalized factors are introduced so that the search engine is better able to provide the user with targeted feedback. This paper constructs user and webpage dynamic vector tables, introduces singular value decomposition analysis into the process of topic categorization, and extends the traditional PageRank algorithm.
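    A generic personalized PageRank iteration, in which the teleport distribution is biased by the user's interest weights, is sketched below to illustrate the idea of folding user factors into the ranking. It is not the paper's specific user-webpage dynamic model, and the graph and weights are made up.

```python
import numpy as np

# Generic personalized PageRank: the teleport distribution is biased toward pages
# matching the user's profile instead of being uniform. Illustrative only.
def personalized_pagerank(adj, user_pref, damping=0.85, iters=100):
    n = adj.shape[0]
    out_deg = adj.sum(axis=1, keepdims=True)
    out_deg[out_deg == 0] = 1.0                    # guard against divide-by-zero
    transition = adj / out_deg                     # row-stochastic link matrix
    personalize = user_pref / user_pref.sum()      # user-specific teleport vector
    rank = np.full(n, 1.0 / n)
    for _ in range(iters):
        rank = damping * rank @ transition + (1 - damping) * personalize
    return rank

adj = np.array([[0, 1, 1, 0],                      # tiny web graph (page i links to page j)
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]], dtype=float)
user_pref = np.array([0.1, 0.1, 0.1, 0.7])         # user's interest weights per page
print(personalized_pagerank(adj, user_pref).round(3))
```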

  20. Effectiveness of Anabolic Steroid Preventative Intervention among Gym Users: Applying Theory of Planned Behavior

    PubMed Central

    Jalilian, Farzad; Allahverdipour, Hamid; Moeini, Babak; Moghimbeigi, Abbas

    2011-01-01

    Background: Use of anabolic androgenic steroids (AAS) has been associated with adverse physical and psychiatric effects, and it is a growing problem among young people. This study was conducted to evaluate the efficiency of an anabolic steroid preventive intervention among gym users in Iran, with the theory of planned behaviour applied as the theoretical framework. Methods: Overall, 120 male gym users participated in this study, divided into intervention and control groups. This was a longitudinal, randomized, pretest-posttest control group panel study implementing a behaviour-modification-based intervention to prevent AAS use. Cross-tabulation and t-tests, using the SPSS statistical package version 13, were used for the statistical analysis. Results: Significant improvements were found in average responses for knowledge about the side effects of AAS (P<0.001), attitudes toward AAS, and intention not to use AAS. Additionally, after the intervention the rate of AAS and supplement use decreased in the intervention group. Conclusion: Comprehensive interventions against AAS abuse among gym users and adolescents would be effective in improving healthy behaviors and strengthening intentions not to use AAS. PMID:24688897

  1. NASA AVOSS Fast-Time Wake Prediction Models: User's Guide

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew

    2014-01-01

    The National Aeronautics and Space Administration (NASA) is developing and testing fast-time wake transport and decay models to safely enhance the capacity of the National Airspace System (NAS). The fast-time wake models are empirical algorithms used for real-time predictions of wake transport and decay based on aircraft parameters and ambient weather conditions. The aircraft dependent parameters include the initial vortex descent velocity and the vortex pair separation distance. The atmospheric initial conditions include vertical profiles of temperature or potential temperature, eddy dissipation rate, and crosswind. The current distribution includes the latest versions of the APA (3.4) and the TDP (2.1) models. This User's Guide provides detailed information on the model inputs, file formats, and the model output. An example of a model run and a brief description of the Memphis 1995 Wake Vortex Dataset is also provided.

  2. The dynamical modeling and simulation analysis of the recommendation on the user-movie network

    NASA Astrophysics Data System (ADS)

    Zhang, Shujuan; Jin, Zhen; Zhang, Juan

    2016-12-01

    At present, most research on recommender systems is based on graph theory and algebraic methods, but these methods cannot predict how the system evolves over time under a given recommendation method, nor can they dynamically analyze the long-term utility of that method. Both aspects can, however, be studied with dynamical methods, which investigate the intrinsic evolution mechanism of a system and are widely used for a variety of practical problems. In this paper, network dynamics is used to study recommendation on the user-movie network, which consists of users and movies; movies are watched either through personal search or through recommendation. First, dynamical models are established to characterize personal search and the system recommendation mechanisms: a personal search model, a random recommendation model, a preference recommendation model, a degree recommendation model and a hybrid recommendation model. The rationality of these models is verified by comparing stochastic simulation with numerical simulation. The validity of the recommendation methods is then evaluated by studying the movie degree, defined as the number of times a movie has been watched. Finally, personal search and recommendation are combined into a more general model, and the change in the average degree of all movies is given as a function of the strength of the recommendation. Results show that the change of movie degree differs between recommendation methods and depends on the initial degree of the movies, the adjacency matrix A representing the relation between users and movies, and the time t. Additionally, we find that over a long time the degree recommendation is not as good as it is over a short time, which demonstrates the advantage of the dynamical method. For the whole user-movie system, the preference recommendation is the best.
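    A toy stochastic simulation in the spirit of this abstract is shown below: each step a user watches a movie chosen either uniformly at random or in proportion to its current degree, and the resulting concentration of popularity is compared. It is only an illustration of the degree-feedback effect, not the paper's dynamical model.

```python
import random

# Toy stochastic simulation, not the paper's model: at each step a movie is watched,
# chosen either uniformly at random or in proportion to its current degree
# (times watched). Degree-based recommendation concentrates popularity.
def simulate(recommend="random", movies=50, steps=5000, seed=1):
    rng = random.Random(seed)
    degree = [1] * movies                       # start every movie with degree 1
    for _ in range(steps):
        if recommend == "degree":
            m = rng.choices(range(movies), weights=degree)[0]
        else:
            m = rng.randrange(movies)
        degree[m] += 1
    return sum(sorted(degree)[-5:]) / sum(degree)   # share held by the top 10% (5 of 50) movies

print(f"random recommendation, top-10% share: {simulate('random'):.2f}")
print(f"degree recommendation, top-10% share: {simulate('degree'):.2f}")
```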

  3. Heliport Noise Model (HNM) version 2.2 (user's guide)

    NASA Astrophysics Data System (ADS)

    Fleming, Gregg G.; Rickley, Edward J.

    1994-05-01

    The John A. Volpe National Transportation Systems Center (Volpe Center), in support of the Federal Aviation Administration, Office of Environment and Energy, has developed Version 2.2 of the Heliport Noise Model (HNM). The HNM is a computer program used for determining the impact of helicopter noise in the vicinity of terminal operations. This document, prepared by the Volpe Center's Acoustics Facility, is a User's Guide for HNM Version 2.2. It presents: (1) computer system requirements and installation procedures; (2) an overview of HNM capabilities and the user's implementation of these capabilities; (3) the elements of a heliport case study; (4) a step-by-step tutorial for preparing and running a case study; and (5) the interpretation of HNM Version 2.2 output. Also presented, in the Appendices of this document, are the following: (1) a discussion of the technical revisions made to several internal algorithms - primarily revisions which are transparent to HNM users; (2) a discussion of the helicopter noise Data Base used by the HNM; and (3) a summary of error messages in the HNM.

  4. Interactive Rapid Dose Assessment Model (IRDAM): user's guide

    SciTech Connect

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This User's Guide provides instruction in the setup and operation of the equipment necessary to run IRDAM. Instructions are also given on how to load the magnetic disks and access the interactive part of the program. Two other companion volumes to this one provide additional information on IRDAM. Reactor Accident Assessment Methods (NUREG/CR-3012, Volume 2) describes the technical bases for IRDAM including methods, models and assumptions used in calculations. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios.

  5. Engine structures modeling software system: Computer code. User's manual

    NASA Technical Reports Server (NTRS)

    1992-01-01

    ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.

  6. Implementation of a Nonisothermal Unified Inelastic-Strain Theory for a Titanium Alloy into ABAQUS 5.4 User Guide

    DTIC Science & Technology

    1996-05-01

    AFRL-ML-WP-TR-1998-4143: Implementation of a Nonisothermal Unified Inelastic-Strain Theory for a Titanium Alloy into ABAQUS 5.4, User Guide. ... titanium alloy Timetal 21S. The Bodner-Partom form of unified theory satisfactorily describes the Timetal 21S stress-strain response for a range of ... other related input files. Subject terms: titanium, Timetal 21S, finite element methods, ABAQUS, unified inelastic strain theory.

  7. An optimization program based on the method of feasible directions: Theory and users guide

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

    1994-01-01

    The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed focusing on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.
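    For orientation, a commonly quoted form of the direction-finding subproblem in the method of feasible directions is given below; the exact formulation solved by the code described here may differ.

```latex
% A common direction-finding subproblem in the method of feasible directions
% (a positive beta yields a usable feasible direction d; the push-off factors
% theta_j control how far d moves into the feasible region):
\begin{aligned}
\max_{\mathbf{d},\,\beta}\quad & \beta \\
\text{s.t.}\quad & \nabla f(\mathbf{x})^{T}\mathbf{d} + \beta \le 0, \\
& \nabla g_j(\mathbf{x})^{T}\mathbf{d} + \theta_j \beta \le 0,
  \qquad j \in \text{active constraints}, \\
& -1 \le d_i \le 1 .
\end{aligned}
```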

  8. Care 3 model overview and user's guide, first revision

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Petersen, P. L.

    1985-01-01

    A manual was written to introduce the CARE III (Computer-Aided Reliability Estimation) capability to reliability and design engineers who are interested in predicting the reliability of highly reliable fault-tolerant systems. It was also structured to serve as a quick-look reference manual for more experienced users. The guide covers CARE III modeling and reliability predictions for execution on the CDC Cyber 170 series computers, the DEC VAX-11/700 series computers, and most machines that compile ANSI Standard FORTRAN 77.

  9. User-friendly software for modeling collective spin wave excitations

    NASA Astrophysics Data System (ADS)

    Hahn, Steven; Peterson, Peter; Fishman, Randy; Ehlers, Georg

    There exists a great need for user-friendly, integrated software that assists in the scientific analysis of collective spin wave excitations measured with inelastic neutron scattering. SpinWaveGenie is a C++ software library that simplifies the modeling of collective spin wave excitations, allowing scientists to analyze neutron scattering data with sophisticated models quickly and efficiently. Furthermore, one can calculate the four-dimensional scattering function S(Q,E) to directly compare and fit calculations to experimental measurements. Its generality has been both enhanced and verified through successful modeling of a wide array of magnetic materials. Recently, we have spent considerable effort transforming SpinWaveGenie from an early prototype to a high-quality, free, open-source software package for the scientific community. S.E.H. acknowledges support by the Laboratory's Director's fund, ORNL. Work was sponsored by the Division of Scientific User Facilities, Office of Basic Energy Sciences, US Department of Energy, under Contract No. DE-AC05-00OR22725 with UT-Battelle, LLC.

  10. Halo modelling in chameleon theories

    SciTech Connect

    Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu E-mail: kazuya.koyama@port.ac.uk

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.

  11. User's Guide for Monthly Vector Wind Profile Model

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    1999-01-01

    The background, theoretical concepts, and methodology for construction of vector wind profiles based on a statistical model are presented. The derived monthly vector wind profiles are to be applied by the launch vehicle design community for establishing realistic estimates of critical vehicle design parameter dispersions related to wind profile dispersions. During initial studies a number of months are used to establish the model profiles that produce the largest monthly dispersions of ascent vehicle aerodynamic load indicators. The largest monthly dispersions for wind, which occur during the winter high-wind months, are used for establishing the design reference dispersions for the aerodynamic load indicators. This document includes a description of the computational process for the vector wind model including specification of input data, parameter settings, and output data formats. Sample output data listings are provided to aid the user in the verification of test output.

  12. Users manual for a one-dimensional Lagrangian transport model

    USGS Publications Warehouse

    Schoellhamer, D.H.; Jobson, H.E.

    1986-01-01

    A Users Manual for the Lagrangian Transport Model (LTM) is presented. The LTM uses Lagrangian calculations that are based on a reference frame moving with the river flow. The Lagrangian reference frame eliminates the need to numerically solve the convective term of the convection-diffusion equation and provides significant numerical advantages over the more commonly used Eulerian reference frame. When properly applied, the LTM can simulate riverine transport and decay processes within the accuracy required by most water quality studies. The LTM is applicable to steady or unsteady one-dimensional unidirectional flows in fixed channels with tributary and lateral inflows. Application of the LTM is relatively simple and optional capabilities improve the model's convenience. Appendices give file formats and three example LTM applications that include the incorporation of the QUAL II water quality model's reaction kinetics into the LTM. (Author's abstract)
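    The point about eliminating the convective term can be written compactly for the standard one-dimensional advection-dispersion equation with first-order decay:

```latex
% Eulerian form of 1-D transport with first-order decay ...
\frac{\partial c}{\partial t} + u\,\frac{\partial c}{\partial x}
   = \frac{\partial}{\partial x}\!\left(D\,\frac{\partial c}{\partial x}\right) - k c
% ... reduces, in a reference frame moving with the flow (dx/dt = u), to
\frac{dc}{dt} = \frac{\partial}{\partial x}\!\left(D\,\frac{\partial c}{\partial x}\right) - k c
```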

  13. Stochastic models: theory and simulation.

    SciTech Connect

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
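    A minimal example of the kind of sample-generation algorithm the report describes is shown below: exact sampling of a stationary Ornstein-Uhlenbeck process on a uniform time grid. The process choice and parameters are arbitrary illustrations, not algorithms taken from the report.

```python
import numpy as np

# Minimal example of generating independent samples of a stochastic process:
# exact recursion for a stationary Ornstein-Uhlenbeck process on a uniform grid,
#   x_{k+1} = rho * x_k + sqrt(1 - rho^2) * sigma * z_k,  rho = exp(-theta * dt)
def ou_samples(n_samples=3, n_steps=1000, dt=0.01, theta=1.0, sigma=0.5, seed=42):
    rng = np.random.default_rng(seed)
    rho = np.exp(-theta * dt)
    x = np.empty((n_samples, n_steps))
    x[:, 0] = rng.normal(0.0, sigma, size=n_samples)        # draw from the stationary law
    for k in range(1, n_steps):
        z = rng.normal(size=n_samples)
        x[:, k] = rho * x[:, k - 1] + np.sqrt(1.0 - rho**2) * sigma * z
    return x

samples = ou_samples()
print("sample variances:", samples.var(axis=1).round(3))    # should be near sigma^2 = 0.25
```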

  14. Investigating Agile User-Centered Design in Practice: A Grounded Theory Perspective

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    This paper investigates how the integration of agile methods and User-Centered Design (UCD) is carried out in practice. For this study, we have applied grounded theory as a suitable qualitative approach to determine what is happening in actual practice. The data was collected by semi-structured interviews with professionals who have already worked with an integrated agile UCD methodology. Further data was collected by observing these professionals in their working context, and by studying their documents, where possible. The emerging themes that the study found show that there is an increasing realization of the importance of usability in software development among agile team members. The requirements are emerging; and both low and high fidelity prototypes based usability tests are highly used in agile teams. There is an appreciation of each other's work from both UCD professionals and developers and both sides can learn from each other.

  15. COSTEAM, an industrial steam generation cost model: updated users' manual

    SciTech Connect

    Murphy, Mary; Reierson, James; Lethi, Minh- Triet

    1980-10-01

    COSTEAM is a tool for designers and managers faced with choosing among alternative systems for generating process steam, whether for new or replacement applications. Such a decision requires a series of choices among overall system concepts, component characteristics, fuel types and financial assumptions, all of which are interdependent and affect the cost of steam. COSTEAM takes the user's input on key characteristics of a proposed process steam generation facility, and computes its capital, operating and maintenance costs. Versatility and simplicity of operation are major goals of the COSTEAM system. As a user, you can work to almost any level of detail necessary and appropriate to a given stage of planning. Since the values you specify are retained and used by the computer throughout each terminal session, you can set up a hypothetical steam generation system fixed in all characteristics but one or two of special interest. It is then quick and easy to obtain a series of results by changing only those one or two values between computer runs. This updated version of the Users' Manual contains instructions for using the expanded and improved COSTEAM model. COSTEAM has three technology submodels which address conventional coal, conventional oil and atmospheric fluidized bed combustion. The structure and calculation methods of COSTEAM are not discussed in this guide, and need not be understood in order to use the model. However, you may consult the companion volume of this report, COSTEAM Expansion and Improvements: Design of a Coal-Fired Atmospheric Fluidized Bed Submodel, an Oil-Fired Submodel, and Input/Output Improvements, MTR80W00048, which presents the design details.

  16. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Benjamin A. Baker; Joseph Grimm

    2009-08-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating “what if” scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time-varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., “reactor types” not individual reactors and “separation types” not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage and sent either to separations or to disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU are designated as wastes. VISION is comprised of several

  17. User-Defined Material Model for Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)

    2006-01-01

    An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis, including failure initiation and material degradation, are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.
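    A hedged sketch of the ply-discounting idea with a maximum-stress initiation check is given below. The real UMAT is a Fortran subroutine called by ABAQUS at every integration point; the allowables and degradation factors here are placeholders, and only one of the several criteria named in the abstract is shown.

```python
# Sketch of ply discounting with a maximum-stress initiation check.
# Allowables and degradation factors are placeholders; this is not the paper's UMAT.
ALLOWABLES = {"Xt": 2000.0, "Yt": 50.0, "S": 80.0}      # MPa, fiber/matrix/shear (assumed)
DEGRADE = {"fiber": 1e-6, "matrix": 0.2, "shear": 0.2}  # stiffness knock-down factors (assumed)

def check_and_degrade(stress, moduli, failed):
    """stress = (s11, s22, s12) in MPa; moduli = dict of E1, E2, G12; failed = set of modes."""
    s11, s22, s12 = stress
    if s11 / ALLOWABLES["Xt"] >= 1.0 and "fiber" not in failed:
        failed.add("fiber");  moduli["E1"] *= DEGRADE["fiber"]
    if s22 / ALLOWABLES["Yt"] >= 1.0 and "matrix" not in failed:
        failed.add("matrix"); moduli["E2"] *= DEGRADE["matrix"]
    if abs(s12) / ALLOWABLES["S"] >= 1.0 and "shear" not in failed:
        failed.add("shear");  moduli["G12"] *= DEGRADE["shear"]
    return moduli, failed

moduli = {"E1": 140e3, "E2": 10e3, "G12": 5e3}          # MPa
moduli, failed = check_and_degrade((900.0, 60.0, 20.0), moduli, set())
print(failed, moduli)
```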

  18. Hanford Soil Inventory Model (SIM) Rev. 1 Users Guide

    SciTech Connect

    Simpson, Brett C.; Corbin, Rob A.; Anderson, Michael J.; Kincaid, Charles T.

    2006-09-25

    The focus of the development and application of a soil inventory model as part of the Remediation and Closure Science (RCS) Project managed by PNNL was to develop a probabilistic approach to estimate comprehensive, mass-balance-based contaminant inventories for the Hanford Site post-closure setting. The outcome of this effort was the Hanford Soil Inventory Model (SIM). This document is a user's guide for the Hanford SIM. The principal project requirement for the SIM was to provide comprehensive quantitative estimates of contaminant inventory and its uncertainty for the various liquid waste sites, unplanned releases, and past tank farm leaks as a function of time and location at Hanford. The majority, but not all, of these waste sites are in the 200 Areas of Hanford, where chemical processing of spent fuel occurred. A computer model capable of performing these calculations and providing quantitative output that robustly describes contaminant inventory and its uncertainty, for use in subsequent models, was determined to satisfy the needs of the RCS Project. The ability to use familiar, commercially available software on high-performance personal computers for data input, modeling, and analysis, rather than custom software on a workstation or mainframe computer for modeling, was desired.
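
    A minimal sketch of the probabilistic, mass-balance style of estimate described above (not the SIM implementation): sample a discharge volume and a contaminant concentration, multiply, and report the mean and an uncertainty interval. The distributions and values below are hypothetical; real inputs come from site process records.

```python
import random

# Hedged sketch of a probabilistic mass-balance inventory estimate
# (not the Hanford SIM).  Waste-stream volumes and concentrations are
# hypothetical placeholders.

random.seed(0)

def sample_inventory(n=10000):
    """Monte Carlo estimate of contaminant mass = volume x concentration."""
    masses = []
    for _ in range(n):
        volume = random.triangular(900.0, 1100.0, 1000.0)   # m3 discharged
        concentration = random.lognormvariate(0.0, 0.5)     # kg per m3
        masses.append(volume * concentration)
    masses.sort()
    return (sum(masses) / n,            # mean inventory, kg
            masses[int(0.05 * n)],      # 5th percentile
            masses[int(0.95 * n)])      # 95th percentile

mean, p5, p95 = sample_inventory()
print(f"inventory ~ {mean:.0f} kg (90% interval {p5:.0f}-{p95:.0f} kg)")
```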

  19. User's guide to the MESOI diffusion model: Version 1. 1 (for Data General Eclipse S/230 with AFOS)

    SciTech Connect

    Athey, G.F.; Ramsdell, J.V.

    1982-09-01

    MESOI is an interactive, Lagrangian puff trajectory model. The model theory is documented separately (Ramsdell and Athey, 1981). Version 1.1 is a modified form of the original Version 1.0 and is designed to run on a Data General Eclipse computer. The model has improved support features which make it useful as an emergency response tool. This report is intended to provide the user with the information necessary to successfully conduct model simulations using MESOI Version 1.1 and to use the support programs STAPREP and EXPLT. The user is also provided with information on the use of the data file maintenance and review program UPDATE. Examples are given for the operation of the programs. Test data sets are described which allow the user to practice with the programs and to confirm proper implementation and execution.
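
    For orientation, the sketch below shows the basic ingredients of a Lagrangian puff calculation: advect a puff centre with the wind and evaluate a Gaussian ground-level concentration with a ground-reflection term. It is a hedged illustration only; the wind field, sigma-growth rates, and release data are hypothetical and are not MESOI's parameterizations.

```python
import math

# Minimal Lagrangian puff sketch.  Wind, dispersion growth, and release data
# are hypothetical placeholders, not MESOI's formulation.

def advect(puff, u, v, dt):
    """Move the puff centre with the local wind and grow its sigmas (one step of dt seconds)."""
    puff["x"] += u * dt
    puff["y"] += v * dt
    # Crude, constant-rate dispersion growth; real models use stability-dependent curves.
    puff["sx"] += 0.2 * dt
    puff["sy"] += 0.2 * dt
    puff["sz"] += 0.05 * dt
    return puff

def ground_concentration(puff, x, y):
    """Ground-level concentration from a Gaussian puff, with ground reflection."""
    q, h = puff["mass"], puff["height"]
    sx, sy, sz = puff["sx"], puff["sy"], puff["sz"]
    norm = q / ((2.0 * math.pi) ** 1.5 * sx * sy * sz)
    horiz = math.exp(-((x - puff["x"]) ** 2) / (2.0 * sx ** 2)
                     - ((y - puff["y"]) ** 2) / (2.0 * sy ** 2))
    vert = 2.0 * math.exp(-h ** 2 / (2.0 * sz ** 2))   # image source doubles the term at z = 0
    return norm * horiz * vert

puff = {"x": 0.0, "y": 0.0, "sx": 10.0, "sy": 10.0, "sz": 5.0,
        "mass": 1.0e6, "height": 30.0}              # metres and grams, hypothetical release
for _ in range(60):                                 # one hour of 60 s steps
    puff = advect(puff, u=3.0, v=1.0, dt=60.0)
print(ground_concentration(puff, x=10000.0, y=3500.0))
```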

  20. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model are described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter is suited for small surface wave heights compared to depth, because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  1. CORCON-MOD3: An integrated computer model for analysis of molten core-concrete interactions. User`s manual

    SciTech Connect

    Bradley, D.R.; Gardner, D.R.; Brockmann, J.E.; Griffith, R.O.

    1993-10-01

    The CORCON-Mod3 computer code was developed to mechanistically model the important core-concrete interaction phenomena, including those phenomena relevant to the assessment of containment failure and radionuclide release. The code can be applied to a wide range of severe accident scenarios and reactor plants. The code represents the current state of the art for simulating core debris interactions with concrete. This document comprises the user`s manual and gives a brief description of the models and the assumptions and limitations in the code. Also discussed are the input parameters and the code output. Two sample problems are also given.

  2. Modeling of rolling element bearing mechanics. Computer program user's manual

    NASA Technical Reports Server (NTRS)

    Greenhill, Lyn M.; Merchant, David H.

    1994-01-01

    This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code, which determines the quasistatic response to external loads or displacements of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It comprises two main programs: the Preprocessor for Bearing Analysis (PREBAN), which creates the input files for the main analysis program, and the Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  3. Modeling integrated water user decisions in intermittent supply systems

    NASA Astrophysics Data System (ADS)

    Rosenberg, David E.; Tarawneh, Tarek; Abdel-Khaleq, Rania; Lund, Jay R.

    2007-07-01

    We apply systems analysis to estimate household water use in an intermittent supply system, considering numerous interdependent water user behaviors. Some 39 household actions include conservation; improving local storage or water quality; and accessing sources having variable costs, availabilities, reliabilities, and qualities. A stochastic optimization program with recourse decisions identifies the infrastructure investments and short-term coping actions a customer can adopt to cost-effectively respond to a probability distribution of piped water availability. Monte Carlo simulations show effects for a population of customers. Model calibration reproduces the distribution of billed residential water use in Amman, Jordan. Parametric analyses suggest economic and demand responses to increased availability and alternative pricing. They also suggest potential market penetration for conservation actions, associated water savings, and subsidies to entice further adoption. We discuss new insights to size, target, and finance conservation.
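
    The two-stage structure can be illustrated with a much-simplified sketch: a first-stage storage investment is fixed, and a second-stage (recourse) coping rule is evaluated over Monte Carlo draws of piped-water availability. The demand, prices, tank size, and availability distribution below are hypothetical and do not reproduce the Amman model.

```python
import random

# Much-simplified sketch of the two-stage idea: fixed "infrastructure" choices
# first, then short-term coping for each realization of piped-water availability.
# All numbers are hypothetical, not the Amman data.

random.seed(1)
DEMAND = 5.0          # m3/week per household
TANK   = 2.0          # m3 of roof storage bought up front (first-stage decision)
PIPED_PRICE, TANKER_PRICE = 0.5, 4.0   # $/m3

def weekly_cost(piped_available):
    """Second-stage (recourse) cost: use piped water, then stored water, then tanker water."""
    piped = min(DEMAND, piped_available)
    stored = min(DEMAND - piped, TANK)        # stored water treated as free here
    tanker = DEMAND - piped - stored
    return piped * PIPED_PRICE + tanker * TANKER_PRICE

draws = [random.uniform(0.0, DEMAND) for _ in range(10000)]   # intermittent supply
expected_cost = sum(weekly_cost(a) for a in draws) / len(draws)
print(f"expected weekly coping cost: ${expected_cost:.2f}")
```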

  4. Gravitational Model of the Three Elements Theory

    NASA Astrophysics Data System (ADS)

    Lassiaille, Frederic

    The gravitational model of the three elements theory is an alternative theory to dark matter. It uses a modification of Newton's law in order to explain gravitational mysteries. The results of this model are explanations for the dark matter mysteries, the Pioneer anomaly, and the disparities among measurements of G. Concerning the Earth flyby anomalies, the theoretical order of magnitude is the same as the experimental one. A very small change in the perihelion advance of planetary orbits is calculated by this model. Meanwhile, this gravitational model is perfectly compatible with special relativity and general relativity, and is part of the three elements theory, a unifying theory.

  5. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report: describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  6. Theory, modeling, and simulation annual report, 1992

    SciTech Connect

    Not Available

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  7. Model validation software -- Theory manual

    SciTech Connect

    Dolin, R.M.

    1997-11-04

    Work began in May of 1991 on the initial Independent Spline (IS) technology. The IS technology was based on research by Dolin showing that numerical topology and geometry could be validated through their topography. A unique contribution of this research is that the IS technology provides a capability to modify one spline`s topology to match another spline`s topography. Work began in May of 1996 to extend the original IS capability to allow solid model topologies to be compared with corresponding two-dimensional topologies. Work began in July 1996 to extend the IS capability to allow for tool path and inspection data analyses. Tool path analysis involves spline-spline comparisons. Inspection data analysis involves fitting inspection data with some type of analytical curve and then comparing that curve with the original (i.e., nominal) curve topology. There are three types of curves with which the inspection data can be fit; using all three types of curve fit helps engineers understand the as-built state of whatever is being interrogated. The ability to compute axisymmetric volumes of revolution for a data set fit with any of the three curve-fitting methods described above will be added later. This involves integrating the area under each curve and then revolving the area through 2π radians to get a volume of revolution. The algorithms for doing this will be taken from the IGVIEW software system. The main IS program module parses out the desired activities into four different logical paths: (1) original IS spline modification; (2) two- or three-dimensional topography evaluated against a 2D spline; (3) tool path analysis with tool path modifications; and (4) tool path and inspection data comparisons with nominal topography. Users have the option of running the traditional IS application software, comparing 3D ASCII data to a Wilson-Fowler spline interpolation of 2D data, comparing a Wilson-Fowler spline interpolation to analytical topology, or
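
    For the volume-of-revolution step described above, revolving the area under a fitted profile y(x) >= 0 through 2π about the x-axis is equivalent, by Pappus's theorem, to the disk formula V = π ∫ y(x)² dx. The sketch below is a generic numerical illustration of that computation, not IGVIEW code; the profile is a hypothetical stand-in for a fitted inspection curve.

```python
import math

# Generic numerical sketch of the axisymmetric volume-of-revolution step.
# Revolving the area under y(x) through 2*pi about the x-axis (Pappus) is
# equivalent to the disk formula V = pi * integral of y(x)^2 dx.

def volume_of_revolution(y, a, b, n=10000):
    """Composite-trapezoid estimate of pi * integral_a^b y(x)^2 dx."""
    h = (b - a) / n
    total = 0.5 * (y(a) ** 2 + y(b) ** 2)
    for i in range(1, n):
        total += y(a + i * h) ** 2
    return math.pi * h * total

# Example: y(x) = sqrt(x) on [0, 4] gives exactly pi * 4^2 / 2 = 8*pi.
v = volume_of_revolution(lambda x: math.sqrt(x), 0.0, 4.0)
assert abs(v - 8.0 * math.pi) < 1e-6
print(v)
```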

  8. WASP7 BENTHIC ALGAE - MODEL THEORY AND USER'S GUIDE

    EPA Science Inventory

    The standard WASP7 eutrophication module includes nitrogen and phosphorus cycling, dissolved oxygen-organic matter interactions, and phytoplankton kinetics. In many shallow streams and rivers, however, the attached algae (benthic algae, or periphyton, attached to submerged substr...

  9. Modelling Psychological Needs for User-dependent Contextual Suggestion

    DTIC Science & Technology

    2014-11-01

    This work addresses the Contextual Suggestion track of the 2014 Text REtrieval Conference (TREC). The track aims to provide recommendations on points of interest (POIs) for various kinds of users by modelling each user's preference towards different POIs; for example, a female user may be more likely to prefer shopping than a male user. Keywords (partial): extraction, intelligent information systems, support vector machines, text mining, machine learning.

  10. Graphical User Interface for Simulink Integrated Performance Analysis Model

    NASA Technical Reports Server (NTRS)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X engine (built by Pratt & Whitney Rocketdyne) in the Upper Stage of the Ares I Crew Launch Vehicle will only start within a certain range of temperature and pressure for the liquid hydrogen and liquid oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink model, without opening or altering the contents of the model. The GUI must allow test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink model, and get the output from the Simulink model. The GUI was built using MATLAB and runs the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI constructs a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared to other values.

  11. User Acceptance of Long-Term Evolution (LTE) Services: An Application of Extended Technology Acceptance Model

    ERIC Educational Resources Information Center

    Park, Eunil; Kim, Ki Joon

    2013-01-01

    Purpose: The aim of this paper is to propose an integrated path model in order to explore user acceptance of long-term evolution (LTE) services by examining potential causal relationships between key psychological factors and user intention to use the services. Design/methodology/approach: Online survey data collected from 1,344 users are analysed…

  12. Making the Invisible Visible: Personas and Mental Models of Distance Education Library Users

    ERIC Educational Resources Information Center

    Lewis, Cynthia; Contrino, Jacline

    2016-01-01

    Gaps between users' and designers' mental models of digital libraries often result in adverse user experiences. This article details an exploratory user research study at a large, predominantly online university serving non-traditional distance education students with the goal of understanding these gaps. Using qualitative data, librarians created…

  13. BPACK -- A computer model package for boiler reburning/co-firing performance evaluations. User`s manual, Volume 1

    SciTech Connect

    Wu, K.T.; Li, B.; Payne, R.

    1992-06-01

    This manual presents and describes a package of computer models uniquely developed for boiler thermal performance and emissions evaluations by the Energy and Environmental Research Corporation. The model package permits boiler heat transfer, fuels combustion, and pollutant emissions predictions related to a number of practical boiler operations such as fuel switching, fuels co-firing, and reburning NOx reductions. The models are adaptable to most boiler/combustor designs and can handle burner fuels in solid, liquid, gaseous, and slurried forms. The models are also capable of performing predictions for combustion applications involving gaseous-fuel reburning, and co-firing of solid/gas, liquid/gas, gas/gas, and slurry/gas fuels. The model package is named BPACK (Boiler Package) and consists of six computer codes, of which three are main computational codes and three are input codes. The three main codes are: (a) a two-dimensional furnace heat-transfer and combustion code; (b) a detailed chemical-kinetics code; and (c) a boiler convective passage code. This user`s manual presents the computer model package in two volumes. Volume 1 describes in detail a number of topics which are of general users` interest, including the physical and chemical basis of the models, a complete description of the model applicability, options, input/output, and the default inputs. Volume 2 contains a detailed record of the worked examples to assist users in applying the models, and to illustrate the versatility of the codes.

  14. Long Fibre Composite Modelling Using Cohesive User's Element

    NASA Astrophysics Data System (ADS)

    Kozák, Vladislav; Chlup, Zdeněk

    2010-09-01

    The development of glass matrix composites reinforced by unidirectional long ceramic fibres has resulted in a family of very promising structural materials. The only disadvantage of such materials is their relatively high brittleness at room temperature. The main micromechanisms acting as toughening mechanisms are pull-out, crack bridging, and matrix cracking. There are other mechanisms, such as crack deflection, but the primary mechanism is the aforementioned pull-out, which is governed by the interface between fibre and matrix. The contribution shows a way to predict and/or optimise the mechanical behaviour of the composite by applying the cohesive zone method and writing a user's cohesive element for the FEM numerical package Abaqus. The presented results from numerical calculations are compared with experimental data. Crack extension is simulated by means of element extinction algorithms. The principal effort is concentrated on the application of the cohesive zone model with a special traction-separation (bridging) law and on cohesive zone modelling. Determination of micro-mechanical parameters is based on the combination of static tests, microscopic observations, and numerical calibration procedures.
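
    As a generic illustration of a traction-separation (bridging) law of the kind used in cohesive elements, the sketch below implements a bilinear law: a linear-elastic rise, linear softening, then complete separation (at which point an element would be made extinct). It does not reproduce the authors' law; the peak traction and opening displacements are hypothetical.

```python
# Hedged sketch of a bilinear traction-separation (bridging) law of the kind
# used in cohesive-zone elements.  sigma_max, delta_0 and delta_f are
# hypothetical placeholders, not the paper's calibrated values.

def bilinear_traction(delta, sigma_max=50.0, delta_0=0.001, delta_f=0.02):
    """Traction (MPa) as a function of opening displacement delta (mm)."""
    if delta <= 0.0:
        return 0.0
    if delta <= delta_0:                       # linear-elastic rise
        return sigma_max * delta / delta_0
    if delta <= delta_f:                       # linear softening (bridging zone)
        return sigma_max * (delta_f - delta) / (delta_f - delta_0)
    return 0.0                                 # fully separated: element "extinct"

# The fracture energy of a bilinear law is the area under the curve:
g_c = 0.5 * 50.0 * 0.02   # 0.5 MPa*mm = 500 J/m^2
print(bilinear_traction(0.005), g_c)
```

    In practice such parameters are tied to measured toughness and strength through a calibration step, which is consistent with the combined experimental/numerical calibration mentioned in the abstract.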

  15. Long Fibre Composite Modelling Using Cohesive User's Element

    SciTech Connect

    Kozak, Vladislav; Chlup, Zdenek

    2010-09-30

    The development of glass matrix composites reinforced by unidirectional long ceramic fibres has resulted in a family of very promising structural materials. The only disadvantage of such materials is their relatively high brittleness at room temperature. The main micromechanisms acting as toughening mechanisms are pull-out, crack bridging, and matrix cracking. There are other mechanisms, such as crack deflection, but the primary mechanism is the aforementioned pull-out, which is governed by the interface between fibre and matrix. The contribution shows a way to predict and/or optimise the mechanical behaviour of the composite by applying the cohesive zone method and writing a user's cohesive element for the FEM numerical package Abaqus. The presented results from numerical calculations are compared with experimental data. Crack extension is simulated by means of element extinction algorithms. The principal effort is concentrated on the application of the cohesive zone model with a special traction-separation (bridging) law and on cohesive zone modelling. Determination of micro-mechanical parameters is based on the combination of static tests, microscopic observations, and numerical calibration procedures.

  16. Propeller aircraft interior noise model: User's manual for computer program

    NASA Technical Reports Server (NTRS)

    Wilby, E. G.; Pope, L. D.

    1985-01-01

    A computer program entitled PAIN (Propeller Aircraft Interior Noise) has been developed to permit calculation of the sound levels in the cabin of a propeller-driven airplane. The fuselage is modeled as a cylinder with a structurally integral floor, the cabin sidewall and floor being stiffened by ring frames, stringers and floor beams of arbitrary configurations. The cabin interior is covered with acoustic treatment and trim. The propeller noise consists of a series of tones at harmonics of the blade passage frequency. Input data required by the program include the mechanical and acoustical properties of the fuselage structure and sidewall trim. Also, the precise propeller noise signature must be defined on a grid that lies in the fuselage skin. The propeller data are generated with a propeller noise prediction program such as the NASA Langley ANOPP program. The program PAIN permits the calculation of the space-average interior sound levels for the first ten harmonics of a propeller rotating alongside the fuselage. User instructions for PAIN are given in the report. Development of the analytical model is presented in NASA CR 3813.

  17. Towards Model-Driven End-User Development in CALL

    ERIC Educational Resources Information Center

    Farmer, Rod; Gruba, Paul

    2006-01-01

    The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…

  18. Theories of addiction: methamphetamine users' explanations for continuing drug use and relapse.

    PubMed

    Newton, Thomas F; De La Garza, Richard; Kalechstein, Ari D; Tziortzis, Desey; Jacobsen, Caitlin A

    2009-01-01

    A variety of preclinical models have been constructed to emphasize unique aspects of addiction-like behavior. These include Negative Reinforcement ("Pain Avoidance"), Positive Reinforcement ("Pleasure Seeking"), Incentive Salience ("Craving"), Stimulus Response Learning ("Habits"), and Inhibitory Control Dysfunction ("Impulsivity"). We used a survey to better understand why methamphetamine-dependent research volunteers (N = 73) continue to use methamphetamine, or relapse to methamphetamine use after a period of cessation of use. All participants met DSM-IV criteria for methamphetamine abuse or dependence, and did not meet criteria for other current Axis I psychiatric disorders or dependence on other drugs of abuse, other than nicotine. The questionnaire consisted of a series of face-valid questions regarding drug use, which in this case referred to methamphetamine use. Examples of questions include: "Do you use drugs mostly to make bad feelings like boredom, loneliness, or apathy go away?", "Do you use drugs mostly because you want to get high?", "Do you use drugs mostly because of cravings?", "Do you find yourself getting ready to take drugs without thinking about it?", and "Do you impulsively take drugs?". The scale was anchored at 1 (not at all) and 7 (very much). For each question, the numbers of participants rating each question negatively (1 or 2), neither negatively or affirmatively (3-5), and affirmatively (6 or 7) were tabulated. The greatest number of respondents (56%) affirmed that they used drugs due to "pleasure seeking." The next highest categories selected were "impulsivity" (27%) and "habits"(25%). Surprisingly, many participants reported that "pain avoidance" (30%) and "craving" (30%) were not important for their drug use. Results from this study support the contention that methamphetamine users (and probably other drug users as well) are more heterogeneous than is often appreciated, and imply that treatment development might be more successful if

  19. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.

  20. A User-Centered Approach to Adaptive Hypertext Based on an Information Relevance Model

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Chen, James

    1994-01-01

    Rapid and effective access to information in large electronic documentation systems can be facilitated if information relevant in an individual user's context can be automatically supplied to that user. However, most of this knowledge on contextual relevance is not found within the contents of documents; rather, it is established incrementally by users during information access. We propose a new model for interactively learning contextual relevance during information retrieval, and for incrementally adapting retrieved information to individual user profiles. The model, called a relevance network, records the relevance of references based on user feedback for specific queries and user profiles. It also generalizes such knowledge to later derive relevant references for similar queries and profiles. The relevance network lets users filter information by context of relevance. Compared to other approaches, it does not require any prior knowledge or training. More importantly, our approach to adaptivity is user-centered. It facilitates acceptance and understanding by users by giving them shared control over the adaptation without disturbing their primary task. Users easily control when to adapt and when to use the adapted system. Lastly, the model is independent of the particular application used to access information, and supports sharing of adaptations among users.
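
    A loose, illustrative sketch of the general idea (not the relevance-network model itself): store feedback keyed by (query, profile) and reuse it for similar queries via a simple term-overlap weight. The class name, similarity measure, and update rule below are hypothetical.

```python
from collections import defaultdict

# Illustrative only: a toy store of contextual relevance feedback that
# generalizes to similar queries with a Jaccard overlap weight.  The actual
# relevance-network model is more elaborate.

class RelevanceStore:
    def __init__(self):
        # (frozenset of query terms, profile) -> {reference: score}
        self.scores = defaultdict(lambda: defaultdict(float))

    def feedback(self, query_terms, profile, reference, relevant=True):
        key = (frozenset(query_terms), profile)
        self.scores[key][reference] += 1.0 if relevant else -1.0

    def rank(self, query_terms, profile):
        """Aggregate stored scores for this profile, weighted by query overlap."""
        query = set(query_terms)
        totals = defaultdict(float)
        for (terms, prof), refs in self.scores.items():
            if prof != profile or not terms & query:
                continue
            weight = len(terms & query) / len(terms | query)   # Jaccard overlap
            for ref, score in refs.items():
                totals[ref] += weight * score
        return sorted(totals, key=totals.get, reverse=True)

store = RelevanceStore()
store.feedback({"engine", "start"}, "technician", "doc-42")
store.feedback({"engine", "noise"}, "technician", "doc-17")
print(store.rank({"engine", "start", "sequence"}, "technician"))
```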

  1. User Manual for Graphical User Interface Version 2.10 with Fire and Smoke Simulation Model (FSSIM) Version 1.2

    DTIC Science & Technology

    2010-05-10

    Naval Research Laboratory, Washington, DC 20375-5320, report NRL/MR/6180--10-9244: User Manual for Graphical User Interface Version 2.10 with Fire and Smoke Simulation Model (FSSIM) Version 1.2. The graphical user interface provides a runtime environment for a third-party simulation package, the Fire and Smoke Simulation Model (FSSIM), developed by HAI. This updated user's manual for the

  2. APEX user`s guide - (Argonne production, expansion, and exchange model for electrical systems), version 3.0

    SciTech Connect

    VanKuiken, J.C.; Veselka, T.D.; Guziel, K.A.; Blodgett, D.W.; Hamilton, S.; Kavicky, J.A.; Koritarov, V.S.; North, M.J.; Novickas, A.A.; Paprockas, K.R.

    1994-11-01

    This report describes operating procedures and background documentation for the Argonne Production, Expansion, and Exchange Model for Electrical Systems (APEX). This modeling system was developed to provide the U.S. Department of Energy, Division of Fossil Energy, Office of Coal and Electricity with in-house capabilities for addressing policy options that affect electrical utilities. To meet this objective, Argonne National Laboratory developed a menu-driven programming package that enables the user to develop and conduct simulations of production costs, system reliability, spot market network flows, and optimal system capacity expansion. The APEX system consists of three basic simulation components, supported by various databases and data management software. The components include (1) the Investigation of Costs and Reliability in Utility Systems (ICARUS) model, (2) the Spot Market Network (SMN) model, and (3) the Production and Capacity Expansion (PACE) model. The ICARUS model provides generating-unit-level production-cost and reliability simulations with explicit recognition of planned and unplanned outages. The SMN model addresses optimal network flows with recognition of marginal costs, wheeling charges, and transmission constraints. The PACE model determines long-term (e.g., longer than 10 years) capacity expansion schedules on the basis of candidate expansion technologies and load growth estimates. In addition, the Automated Data Assembly Package (ADAP) and case management features simplify user-input requirements. The ADAP, ICARUS, and SMN modules are described in detail. The PACE module is expected to be addressed in a future publication.

  3. The Design and Implementation of a Visual User Interface for a Structured Model Management System

    DTIC Science & Technology

    1988-03-01

    Opening from Marshall McLuhan and Quentin Fiore, The Medium is the Message (1967), the thesis continues that the computer interface is an extension of the user. Managers may feel overly dependent on MS/OR practitioners who more fully understand the underlying concepts of modeling. The program presupposes that the user understands structured modeling concepts, but makes no further assumptions regarding the user's computer

  4. A user-centered model for web site design: needs assessment, user interface design, and rapid prototyping.

    PubMed

    Kinzie, Mable B; Cohn, Wendy F; Julian, Marti F; Knaus, William A

    2002-01-01

    As the Internet continues to grow as a delivery medium for health information, the design of effective Web sites becomes increasingly important. In this paper, the authors provide an overview of one effective model for Web site design, a user-centered process that includes techniques for needs assessment, goal/task analysis, user interface design, and rapid prototyping. They detail how this approach was employed to design a family health history Web site, Health Heritage. This Web site helps patients record and maintain their family health histories in a secure, confidential manner. It also supports primary care physicians through analysis of health histories, identification of potential risks, and provision of health care recommendations. Visual examples of the design process are provided to show how the use of this model resulted in an easy-to-use Web site that is likely to meet user needs. The model is effective across diverse content arenas and is appropriate for applications in varied media.

  5. Satellite services system analysis study. Volume 2: Satellite and services user model

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Satellite services needs are analyzed. Topics include methodology: a satellite user model; representative servicing scenarios; potential service needs; manned, remote, and automated involvement; and inactive satellites/debris. Satellite and services user model development is considered. Groundrules and assumptions, servicing events, and sensitivity analysis are included. Selection of reference satellites is also discussed.

  6. Understanding the Impact of User Frustration Intensities on Task Performance Using the OCC Theory of Emotions

    NASA Technical Reports Server (NTRS)

    Washington, Gloria

    2012-01-01

    Have you heard the saying "frustration is written all over your face"? Well, this saying is true, but the face is not the only place. Frustration is written all over your face and your body. The human body has various means to communicate an emotion without the utterance of a single word. The Media Equation says that people interact with computers as if they were human; this includes experiencing frustration. This research measures frustration by monitoring body-based measures such as heart rate, posture, skin temperature, and respiration. The OCC Theory of Emotions is used to separate frustration into different levels or intensities. The results of this study showed that there are individual intensities of frustration at which task performance is not degraded. Results from this study can be used by usability testers to model how much frustration is needed before task performance measures start to decrease.

  7. Graphical Model Theory for Wireless Sensor Networks

    SciTech Connect

    Davis, William B.

    2002-12-08

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including sensor validation and fusion; data compression and channel coding; expert systems with decentralized data structures and efficient local queries; and pattern classification and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.
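
    At its smallest scale, the kind of local probabilistic computation the junction tree algorithm organizes looks like the following two-variable sensor-validation example, where a noisy reading updates belief about the monitored state. The probabilities are hypothetical; a real deployment would distribute such computations over tree-structured clusters of variables.

```python
# Illustrative sketch only: exact posterior inference on a tiny two-variable
# graphical model by summing out the joint.  The junction tree algorithm
# generalizes this kind of local computation to tree-structured clusters.
# All probabilities below are hypothetical.

# P(state): probability the monitored quantity is "high".
p_state = {"high": 0.2, "low": 0.8}
# P(reading | state): a noisy sensor.
p_reading = {("high", "high"): 0.9, ("low", "high"): 0.1,
             ("high", "low"): 0.05, ("low", "low"): 0.95}

def posterior_state(observed_reading):
    """P(state | reading) via Bayes' rule."""
    joint = {s: p_state[s] * p_reading[(observed_reading, s)] for s in p_state}
    z = sum(joint.values())
    return {s: v / z for s, v in joint.items()}

# A "high" reading shifts belief strongly toward the "high" state (~0.82).
print(posterior_state("high"))
```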

  8. Self Modeling: Expanding the Theories of Learning

    ERIC Educational Resources Information Center

    Dowrick, Peter W.

    2012-01-01

    Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…

  9. The Oak Ridge National Laboratory automobile heat pump model: User`s guide

    SciTech Connect

    Kyle, D.M.

    1993-05-01

    A computer program has been developed to predict the steady-state performance of vapor compression automobile air conditioners and heat pumps. The code is based on the residential heat pump model developed at Oak Ridge National Laboratory. Most calculations are based on fundamental physical principles, in conjunction with generalized correlations available in the research literature. Automobile air conditioning components that can be specified as inputs to the program include open and hermetic compressors; finned tube condensers; finned tube and plate-fin style evaporators; thermal expansion valve, capillary tube and short tube expansion devices; refrigerant mass; evaporator pressure regulator; and all interconnecting tubing. The program can be used with a variety of refrigerants, including R134a. Methodologies are discussed for using the model as a tool for designing all new systems or, alternatively, as a tool for simulating a known system for a variety of operating conditions.

  10. Teaching hydrological modeling with a user-friendly catchment-runoff-model software package

    NASA Astrophysics Data System (ADS)

    Seibert, J.; Vis, M. J. P.

    2012-09-01

    Computer models, especially conceptual models, are frequently used for catchment hydrology studies. Teaching hydrological modeling, however, is challenging, since students have to both understand general model concepts and be able to use particular computer programs when learning to apply computer models. Here we present a new version of the HBV (Hydrologiska Byråns Vattenavdelning) model. This software provides a user-friendly version that is especially useful for education. Different functionalities, such as an automatic calibration using a genetic algorithm or a Monte Carlo approach, as well as the possibility to perform batch runs with predefined model parameters make the software interesting especially for teaching in more advanced classes and research projects. Different teaching goals related to hydrological modeling are discussed and a series of exercises is suggested to reach these goals.
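
    For readers unfamiliar with conceptual runoff models, the sketch below shows a minimal bucket model (soil-moisture accounting feeding a single linear reservoir) in the general spirit of such teaching exercises. It is not the HBV software; the parameters and forcing are hypothetical and would normally be calibrated against observed runoff.

```python
# Minimal conceptual bucket model for teaching purposes (not the HBV code).
# Parameters and forcing are hypothetical placeholders.

def simulate(precip, pet, fc=150.0, beta=2.0, k=0.05):
    """Soil-moisture accounting plus a single linear reservoir (mm/day units)."""
    soil, storage, runoff = 0.5 * fc, 0.0, []
    for p, e in zip(precip, pet):
        recharge = p * (soil / fc) ** beta          # portion of rain that recharges
        soil = max(0.0, min(fc, soil + p - recharge) - e)
        storage += recharge
        q = k * storage                             # linear-reservoir outflow
        storage -= q
        runoff.append(q)
    return runoff

precip = [0, 12, 3, 0, 25, 8, 0, 0, 5, 0]     # mm/day
pet    = [2] * len(precip)                    # mm/day
print(simulate(precip, pet))
```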

  11. Teaching hydrological modeling with a user-friendly catchment-runoff-model software package

    NASA Astrophysics Data System (ADS)

    Seibert, J.; Vis, M. J. P.

    2012-05-01

    Computer models, and especially conceptual models, are frequently used for catchment hydrology studies. Teaching hydrological modeling, however, is challenging because students, when learning to apply computer models, have both to understand general model concepts and to be able to use particular computer programs. Here we present a new version of the HBV model. This software provides a user-friendly version that is especially useful for education. Different functionalities, such as an automatic calibration using a genetic algorithm or a Monte Carlo approach, as well as the possibility to perform batch runs with predefined model parameters, also make the software interesting for teaching in more advanced classes and research projects. Different teaching goals related to hydrological modeling are discussed and a series of exercises is suggested to reach these goals.

  12. An Information Transfer Model to Define Information Users and Outputs with Specific Application to Environmental Technology.

    ERIC Educational Resources Information Center

    Landau, Herbert B.; And Others

    1982-01-01

    Develops an information transfer model which relates information products to the user's innovation decision-making process and highlights the linkage between specific products and user needs at each decision point. Specific applications to environmental technology are discussed. Three figures, five tables, and a reference list accompany the text.…

  13. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    NASA Astrophysics Data System (ADS)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    Developments in computer technology in the last decade have changed the ways computers are used. Emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform, and environment) makes the development of such ubiquitous applications for smart environments, and especially of their user interfaces, a challenging and time-consuming task. We propose a model-based approach which allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language defining the fundamentals and constraints of the user interface, a runtime architecture exploits the description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting the applications also to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  14. Viscous wing theory development. Volume 2: GRUMWING computer program user's manual

    NASA Technical Reports Server (NTRS)

    Chow, R. R.; Ogilvie, P. L.

    1986-01-01

    This report is a user's manual which describes the operation of the computer program GRUMWING. The program computes the viscous transonic flow over three-dimensional wings using a boundary layer type viscid-inviscid interaction approach. The inviscid solution is obtained by an approximate factorization (AFZ) method for the full potential equation. The boundary layer solution is based on integral entrainment methods.

  15. Informatic system for a global tissue-fluid biorepository with a graph theory-oriented graphical user interface.

    PubMed

    Butler, William E; Atai, Nadia; Carter, Bob; Hochberg, Fred

    2014-01-01

    The Richard Floor Biorepository supports collaborative studies of extracellular vesicles (EVs) found in human fluids and tissue specimens. The current emphasis is on biomarkers for central nervous system neoplasms but its structure may serve as a template for collaborative EV translational studies in other fields. The informatic system provides specimen inventory tracking with bar codes assigned to specimens and containers and projects, is hosted on globalized cloud computing resources, and embeds a suite of shared documents, calendars, and video-conferencing features. Clinical data are recorded in relation to molecular EV attributes and may be tagged with terms drawn from a network of externally maintained ontologies thus offering expansion of the system as the field matures. We fashioned the graphical user interface (GUI) around a web-based data visualization package. This system is now in an early stage of deployment, mainly focused on specimen tracking and clinical, laboratory, and imaging data capture in support of studies to optimize detection and analysis of brain tumour-specific mutations. It currently includes 4,392 specimens drawn from 611 subjects, the majority with brain tumours. As EV science evolves, we plan biorepository changes which may reflect multi-institutional collaborations, proteomic interfaces, additional biofluids, changes in operating procedures and kits for specimen handling, novel procedures for detection of tumour-specific EVs, and for RNA extraction and changes in the taxonomy of EVs. We have used an ontology-driven data model and web-based architecture with a graph theory-driven GUI to accommodate and stimulate the semantic web of EV science.

  16. Tracking and Analysis Framework (TAF) model documentation and user`s guide

    SciTech Connect

    Bloyd, C.; Camp, J.; Conzelmann, G.

    1996-12-01

    With passage of the 1990 Clean Air Act Amendments, the United States embarked on a policy for controlling acid deposition that has been estimated to cost at least $2 billion. Title IV of the Act created a major innovation in environmental regulation by introducing market-based incentives - specifically, by allowing electric utility companies to trade allowances to emit sulfur dioxide (SO2). The National Acid Precipitation Assessment Program (NAPAP) has been tasked by Congress to assess what Senator Moynihan has termed this "grand experiment." Such a comprehensive assessment of the economic and environmental effects of this legislation has been a major challenge. To help NAPAP face this challenge, the U.S. Department of Energy (DOE) has sponsored development of an integrated assessment model, known as the Tracking and Analysis Framework (TAF). This section summarizes TAF`s objectives and its overall design.

  17. Solid Waste Projection Model: Database User`s Guide. Version 1.4

    SciTech Connect

    Blackburn, C.L.

    1993-10-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for using Version 1.4 of the SWPM database: system requirements and preparation, entering and maintaining data, and performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions and does not provide instruction in the use of Paradox, the database management system in which the SWPM database is established.

  18. User's guide to the MESOI diffusion model and to the utility programs UPDATE and LOGRVU

    SciTech Connect

    Athey, G.F.; Allwine, K.J.; Ramsdell, J.V.

    1981-11-01

    MESOI is an interactive, Lagrangian puff trajectory diffusion model. The model is documented separately (Ramsdell and Athey, 1981); this report is intended to provide MESOI users with the information needed to successfully conduct model simulations. The user is also provided with guidance in the use of the data file maintenance and review programs, UPDATE and LOGRVU. Complete examples are given for the operation of all three programs, and an appendix documents UPDATE and LOGRVU.

  19. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia.

    PubMed

    Segkouli, Sofia; Paliokas, Ioannis; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos

    2015-01-01

    Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI), such as usability and utility, through a large number of analytic, usability-oriented approaches based on cognitive models, in order to provide users with experiences fitting their specific needs. However, there is demand for more specific modules, embodied in a cognitive architecture, that can detect abnormal cognitive decline across new synthetic task environments. In addition, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort to enhance the accessibility of ICT products for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied in cognitive models and defined by estimates of cognitive parameters. Well-established MCI detection tests were used to assess users' cognition, elaborate their ability to perform multiple tasks, and monitor performance on infotainment-related tasks, providing more accurate simulation results within existing conceptual frameworks and enhanced predictive validity for interface design; increased task complexity captures a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data, to be used for more reliable evaluation of interfaces through simulation on the basis of virtual models of MCI users.

  20. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia

    PubMed Central

    Segkouli, Sofia; Paliokas, Ioannis; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos

    2015-01-01

    Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI), such as usability and utility, through a large number of analytic, usability-oriented approaches based on cognitive models, in order to provide users with experiences fitting their specific needs. However, there is demand for more specific modules, embodied in a cognitive architecture, that can detect abnormal cognitive decline across new synthetic task environments. In addition, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort to enhance the accessibility of ICT products for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied in cognitive models and defined by estimates of cognitive parameters. Well-established MCI detection tests were used to assess users' cognition, elaborate their ability to perform multiple tasks, and monitor performance on infotainment-related tasks, providing more accurate simulation results within existing conceptual frameworks and enhanced predictive validity for interface design; increased task complexity captures a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data, to be used for more reliable evaluation of interfaces through simulation on the basis of virtual models of MCI users. PMID:26339282

  1. Users Manual for the Dynamic Student Flow Model.

    DTIC Science & Technology

    1981-07-31

    Accordingly, the expository method that will be followed in setting forth the user information is to describe a normal application with indications... [The remainder of the excerpt is table residue listing FY79-FY83 jet, propeller, and helicopter totals by service (Navy, Marine, CG&F).]

  2. Maintenance Resource Prediction Model (MRPM) User’s Manual

    DTIC Science & Technology

    1991-01-01

    When the G = GRAPH function is selected, the user can enter the number of the row to be graphed... [The remainder of the excerpt is screen-capture residue from the Resource Summary display (Figure 2-61), showing material, equipment, and total cost rows together with the command-mode function keys (F1=TOP, F2=BOT, F3=FIND, PgUp=PREV, PgDn=NEXT, G=GRAPH, F10=EXIT).]

  3. Vulnerability and the intention to anabolic steroids use among Iranian gym users: an application of the theory of planned behavior.

    PubMed

    Allahverdipour, Hamid; Jalilian, Farzad; Shaghaghi, Abdolreza

    2012-02-01

    This correlational study explored the psychological antecedents of 253 Iranian gym users' intentions to use anabolic-androgenic steroids (AAS), based on the Theory of Planned Behavior (TPB). The three predictor variables of (1) attitude, (2) subjective norms, and (3) perceived behavioral control accounted for 63% of the variation in the outcome measure of the intention to use AAS. There is some support for using the TPB to design and implement interventions to modify and/or improve individuals' beliefs that athletic goals are achievable without the use of AAS.

  4. Communications network design and costing model users manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.

  5. Modeling Spoken Word Recognition Performance by Pediatric Cochlear Implant Users using Feature Identification

    PubMed Central

    Frisch, Stefan A.; Pisoni, David B.

    2012-01-01

    Objective: Computational simulations were carried out to evaluate the appropriateness of several psycholinguistic theories of spoken word recognition for children who use cochlear implants. These models also investigate the interrelations of commonly used measures of closed-set and open-set tests of speech perception. Design: A software simulation of phoneme recognition performance was developed that uses feature identification scores as input. Two simulations of lexical access were developed. In one, early phoneme decisions are used in a lexical search to find the best matching candidate. In the second, phoneme decisions are made only when lexical access occurs. Simulated phoneme and word identification performance was then applied to behavioral data from the Phonetically Balanced Kindergarten test and Lexical Neighborhood Test of open-set word recognition. Simulations of performance were evaluated for children with prelingual sensorineural hearing loss who use cochlear implants with the MPEAK or SPEAK coding strategies. Results: Open-set word recognition performance can be successfully predicted using feature identification scores. In addition, we observed no qualitative differences in performance between children using MPEAK and SPEAK, suggesting that both groups of children process spoken words similarly despite differences in input. Word recognition ability was best predicted in the model in which phoneme decisions were delayed until lexical access. Conclusions: Closed-set feature identification and open-set word recognition focus on different, but related, levels of language processing. Additional insight for clinical intervention may be achieved by collecting both types of data. The most successful model of performance is consistent with current psycholinguistic theories of spoken word recognition. Thus it appears that the cognitive process of spoken word recognition is fundamentally the same for pediatric cochlear implant users and children and adults with

  6. Users guide for the hydroacoustic coverage assessment model (HydroCAM)

    SciTech Connect

    Farrell, T., LLNL

    1997-12-01

    A model for predicting the detection and localization performance of hydroacoustic monitoring networks has been developed. The model accounts for major factors affecting global-scale acoustic propagation in the ocean, including horizontal refraction, travel time variability due to spatial and temporal fluctuations in the ocean, and detailed characteristics of the source. Graphical user interfaces are provided to set up the models and visualize the results. The model produces maps of network detection coverage and localization area of uncertainty, as well as intermediate results such as predicted path amplitudes, travel time, and travel time variance. This Users Guide for the model is organized into three sections. First, a summary of the functionality available in the model is presented, including example output products. The second section provides detailed descriptions of each of the models contained in the system. The last section describes how to run the model, including a summary of each data input form in the user interface.

  7. Recursive renormalization group theory based subgrid modeling

    NASA Technical Reports Server (NTRS)

    Zhou, YE

    1991-01-01

    Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  8. A Probabilistic Model of Theory Formation

    ERIC Educational Resources Information Center

    Kemp, Charles; Tenenbaum, Joshua B.; Niyogi, Sourabh; Griffiths, Thomas L.

    2010-01-01

    Concept learning is challenging in part because the meanings of many concepts depend on their relationships to other concepts. Learning these concepts in isolation can be difficult, but we present a model that discovers entire systems of related concepts. These systems can be viewed as simple theories that specify the concepts that exist in a…

  9. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  10. Modeling Integrated Water-User Decisions with Intermittent Supplies

    NASA Astrophysics Data System (ADS)

    Lund, J. R.; Rosenberg, D.

    2006-12-01

    We present an economic-engineering method to estimate urban water use demands with intermittent water supplies. A two-stage, probabilistic optimization formulation includes a wide variety of water supply enhancement and conservation actions that individual households can adopt to meet multiple water quality uses with uncertain water availability. We embed the optimization in Monte-Carlo simulations to show aggregate effects at a utility (citywide) scale for a population of user conditions and decisions. Parametric analysis provides derivations of supply curves to subsidize conservation, demand responses to alternative pricing, and customer willingness-to-pay to avoid shortages. Results show a good empirical fit for the average and distribution of billed residential water use in Amman, Jordan. Additional outputs give likely market penetration rates for household conservation actions, associated water savings, and subsidies required to entice further adoption. We discuss new insights to size, target, market, and finance conservation programs and interpret a demand curve with block pricing.

  11. FAA Integrated Noise Model User’s Guide,

    DTIC Science & Technology

    1976-03-01

    Table of contents excerpt: 2.4 Flight Track Definitions; 2.5 Total Traffic Mix; 2.6 Traffic Mix Allocation; 2.7 User Options; 3. Alternate Noise Data (3.1 Background; 3.2 Specifying Alternate Noise Library Entries: 3.2.1 Library Identification, 3.2.2 Profile Data, 3.2.3 Acoustic Data).

  12. Morpheus: a user-friendly modeling environment for multiscale and multicellular systems biology.

    PubMed

    Starruß, Jörn; de Back, Walter; Brusch, Lutz; Deutsch, Andreas

    2014-05-01

    Morpheus is a modeling environment for the simulation and integration of cell-based models with ordinary differential equations and reaction-diffusion systems. It allows rapid development of multiscale models in biological terms and mathematical expressions rather than programming code. Its graphical user interface supports the entire workflow from model construction and simulation to visualization, archiving and batch processing.

  13. Theory, Modeling, and Simulation of Semiconductor Lasers

    NASA Technical Reports Server (NTRS)

    Ning, Cun-Zheng; Saini, Subbash (Technical Monitor)

    1998-01-01

    Semiconductor lasers play very important roles in many areas of information technology. In this talk, I will first give an overview of semiconductor laser theory. This will be followed by a description of different models and their shortcomings in modeling and simulation. Our recent efforts in constructing a fully space and time resolved simulation model will then be described. Simulation results based on our model will be presented. Finally the effort towards a self-consistent and comprehensive simulation capability for the opto-electronics integrated circuits (OEICs) will be briefly reviewed.

  14. The Woodcock-Johnson Tests of Cognitive Abilities III's Cognitive Performance Model: Empirical Support for Intermediate Factors within CHC Theory

    ERIC Educational Resources Information Center

    Taub, Gordon E.; McGrew, Kevin S.

    2014-01-01

    The Woodcock-Johnson Tests of Cognitive Ability Third Edition is developed using the Cattell-Horn-Carroll (CHC) measurement-theory test design as the instrument's theoretical blueprint. The instrument provides users with cognitive scores based on the Cognitive Performance Model (CPM); however, the CPM is not a part of CHC theory. Within the…

  15. Jobs and Economic Development Impact (JEDI) Model Geothermal User Reference Guide

    SciTech Connect

    Johnson, C.; Augustine, C.; Goldberg, M.

    2012-09-01

    The Geothermal Jobs and Economic Development Impact (JEDI) model, developed through the National Renewable Energy Laboratory (NREL), is an Excel-based, user-friendly tool that estimates the local economic impacts of constructing and operating hydrothermal and Enhanced Geothermal System (EGS) power generation projects; JEDI models are available for a range of conventional and renewable energy technologies. The JEDI Model Geothermal User Reference Guide was developed to assist users in using and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted.

  16. Lattice gauge theories and spin models

    NASA Astrophysics Data System (ADS)

    Mathur, Manu; Sreeraj, T. P.

    2016-10-01

    The Wegner Z2 gauge theory-Z2 Ising spin model duality in (2+1) dimensions is revisited and derived through a series of canonical transformations. The Kramers-Wannier duality is similarly obtained. The Wegner Z2 gauge-spin duality is directly generalized to SU(N) lattice gauge theory in (2+1) dimensions to obtain the SU(N) spin model in terms of the SU(N) magnetic fields and their conjugate SU(N) electric scalar potentials. The exact and complete solutions of the Z2, U(1), SU(N) Gauss law constraints in terms of the corresponding spin or dual potential operators are given. The gauge-spin duality naturally leads to a new gauge invariant magnetic disorder operator for SU(N) lattice gauge theory which produces a magnetic vortex on the plaquette. A variational ground state of the SU(2) spin model with nearest neighbor interactions is constructed to analyze SU(2) gauge theory.

  17. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    EPA Science Inventory

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  18. User's manual for interactive LINEAR: A FORTRAN program to derive linear aircraft models

    NASA Technical Reports Server (NTRS)

    Antoniewicz, Robert F.; Duke, Eugene L.; Patterson, Brian P.

    1988-01-01

    An interactive FORTRAN program that provides the user with a powerful and flexible tool for the linearization of aircraft aerodynamic models is documented in this report. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied linear or nonlinear aerodynamic model. The nonlinear equations of motion used are six-degree-of-freedom equations with stationary atmosphere and flat, nonrotating earth assumptions. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.
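
    The kind of numerical linearization LINEAR performs can be illustrated, in spirit, with central-difference Jacobians of a user-supplied nonlinear state function. The sketch below is a generic Python illustration, not the FORTRAN program's algorithm; the pendulum dynamics, trim point, and step size are placeholder choices.

      import numpy as np

      def linearize(f, x0, u0, eps=1e-6):
          """Central-difference A and B matrices of x_dot = f(x, u) about (x0, u0)."""
          n, m = len(x0), len(u0)
          A = np.zeros((n, n))
          B = np.zeros((n, m))
          for j in range(n):
              dx = np.zeros(n)
              dx[j] = eps
              A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2.0 * eps)
          for j in range(m):
              du = np.zeros(m)
              du[j] = eps
              B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2.0 * eps)
          return A, B

      # Placeholder nonlinear model: a torque-driven pendulum with state x = [angle, rate].
      def pendulum(x, u, g=9.81, length=1.0, damping=0.1):
          return np.array([x[1], -(g / length) * np.sin(x[0]) - damping * x[1] + u[0]])

      A, B = linearize(pendulum, x0=np.zeros(2), u0=np.zeros(1))
      print(A)  # approximately [[0, 1], [-9.81, -0.1]]
      print(B)  # approximately [[0], [1]]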

  19. HIGHWAY 3.1: An enhanced HIGHWAY routing model: Program description, methodology, and revised user's manual

    SciTech Connect

    Johnson, P.E.; Joy, D.S.; Clarke, D.B.; Jacobi, J.M.

    1993-03-01

    The HIGHWAY program provides a flexible tool for predicting highway routes for transporting radioactive materials in the United States. The HIGHWAY data base is essentially a computerized road atlas that currently describes over 240,000 miles of highways. Complete descriptions of all Interstate System and most US highways (except those that parallel a nearby Interstate highway) are included in the data base. Many of the principal state highways and a number of local and county highways are also identified. The data base also includes locations of nuclear facilities and major airports. Several different types of routes may be calculated, depending on a set of user-supplied constraints. Routes are calculated by minimizing the total impedance between the origin and the destination. Basically, the impedance is defined as a function of distance and driving time along a particular highway segment. Several routing constraints can be imposed during the calculations. One of the special features of the HIGHWAY model is its ability to calculate routes that maximize use of Interstate System highways. This feature allows the user to predict routes for shipments of radioactive materials that conform to the US Department of Transportation routing regulations. Other features of the model include the ability to predict routes that bypass a specific state, city, town, or highway segment. Two special features have been incorporated in HIGHWAY, version 3.1. The first is the ability to automatically calculate alternative routes. Frequently, there are a number of routes between the source and destination that vary slightly in distance and estimated driving time. The HIGHWAY program offers a selection of different but nearly equal routes. The second special feature is the capability to calculate route-specific population density statistics. The population density distribution is calculated for each highway segment in the route and is reported on a state-by-state basis.
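
    The impedance-minimizing route search described above can be pictured with a short, hypothetical sketch (not the HIGHWAY code itself): a Dijkstra shortest-path search over highway segments whose edge weight combines distance and driving time. The weighting coefficients and the tiny example network are invented for illustration.

      import heapq

      def impedance(miles, hours, alpha=1.0, beta=30.0):
          # Illustrative impedance: a weighted sum of distance and driving time.
          # alpha and beta are made-up coefficients, not HIGHWAY's actual values.
          return alpha * miles + beta * hours

      def min_impedance_route(graph, origin, destination):
          # graph: {node: [(neighbor, miles, hours), ...]}
          best = {origin: 0.0}
          previous = {}
          queue = [(0.0, origin)]
          while queue:
              cost, node = heapq.heappop(queue)
              if node == destination:
                  break
              if cost > best.get(node, float("inf")):
                  continue
              for neighbor, miles, hours in graph.get(node, []):
                  new_cost = cost + impedance(miles, hours)
                  if new_cost < best.get(neighbor, float("inf")):
                      best[neighbor] = new_cost
                      previous[neighbor] = node
                      heapq.heappush(queue, (new_cost, neighbor))
          # Reconstruct the route by walking back through predecessors.
          route, node = [destination], destination
          while node != origin:
              node = previous[node]
              route.append(node)
          return list(reversed(route)), best[destination]

      # Hypothetical three-node network: origin -> A -> destination versus a direct segment.
      network = {
          "origin": [("A", 120, 2.0), ("destination", 200, 3.5)],
          "A": [("destination", 90, 1.5)],
      }
      print(min_impedance_route(network, "origin", "destination"))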

  20. Crack propagation modeling using Peridynamic theory

    NASA Astrophysics Data System (ADS)

    Hafezi, M. H.; Alebrahim, R.; Kundu, T.

    2016-04-01

    Crack propagation and branching are modeled using nonlocal peridynamic theory. One major advantage of this nonlocal theory-based analysis tool is its unifying approach to material behavior modeling, irrespective of whether or not a crack has formed in the material. No separate damage law is needed for crack initiation and propagation. This theory overcomes the weaknesses of existing continuum-mechanics-based numerical tools (e.g., FEM, XFEM) for identifying fracture modes and does not require any simplifying assumptions. Cracks grow autonomously and not necessarily along a prescribed path. However, in some special situations, such as ductile fracture, damage evolution and failure depend on parameters characterizing the local stress state rather than on the peridynamic damage modeling technique developed for brittle fracture. For brittle fracture modeling, the bond is simply broken when the failure criterion is satisfied. This capability supports the design of a more reliable modeling tool for crack propagation and branching in both brittle and ductile materials. Peridynamic analysis has been found to be computationally very demanding, particularly for real-world structures (e.g., vehicles, aircraft), and it also requires an expensive visualization process. The goal of this paper is to make researchers aware of the impact of this cutting-edge simulation tool on understanding cracked-material response. A computer code has been developed to implement the peridynamic-theory-based modeling tool for two-dimensional analysis. Good agreement between our predictions and previously published results is observed. Some interesting new results that have not been reported earlier by others are also obtained and presented in this paper. The final objective of this investigation is to increase the mechanics knowledge of self-similar and self-affine cracks.
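
    As a rough illustration of the brittle-fracture bond-breaking rule mentioned above, the sketch below computes the stretch of each bond in a bond-based peridynamic grid and marks a bond as broken once its stretch exceeds a critical value. It is an assumed, generic sketch rather than the authors' two-dimensional code, and the particle coordinates and critical stretch are placeholders.

      import numpy as np

      def update_broken_bonds(ref_pos, cur_pos, bonds, intact, s_critical):
          """Mark bonds whose stretch exceeds the critical stretch as broken.

          ref_pos, cur_pos : (n_particles, dim) reference and current coordinates
          bonds            : (n_bonds, 2) particle index pairs within the horizon
          intact           : (n_bonds,) boolean array of currently intact bonds
          s_critical       : critical bond stretch (illustrative value)
          """
          i, j = bonds[:, 0], bonds[:, 1]
          xi = np.linalg.norm(ref_pos[j] - ref_pos[i], axis=1)   # reference bond length
          y = np.linalg.norm(cur_pos[j] - cur_pos[i], axis=1)    # deformed bond length
          stretch = (y - xi) / xi
          # Bond breaking is irreversible: a broken bond never heals.
          return intact & (stretch <= s_critical)

      # Tiny example: three particles on a line, the right-most one pulled outward.
      ref = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
      cur = np.array([[0.0, 0.0], [1.0, 0.0], [2.6, 0.0]])
      bonds = np.array([[0, 1], [1, 2]])
      intact = np.array([True, True])
      print(update_broken_bonds(ref, cur, bonds, intact, s_critical=0.3))  # [ True False]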

  1. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    USGS Publications Warehouse

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable

  2. Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation.

    NASA Astrophysics Data System (ADS)

    Lawley, Russell; Barron, Mark; Lee, Katy

    2014-05-01

    Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation. R. Lawley, M. Barron and K. Lee. NERC - British Geological Survey, Environmental Science Centre, Keyworth, Nottingham, UK, NG12 5GG The boundaries mapped in traditional field geological survey are subject to a wide range of inherent uncertainties. A map at a survey-scale of 1:10,000 is created by a combination of terrain interpretation, direct observations from boreholes and exposures (often sparsely distributed), and indirect interpretation of proxy variables such as soil properties, vegetation and remotely sensed images. A critical factor influencing the quality of the final map is the skill and experience of the surveyor to bring this information together in a coherent conceptual model. The users of geological data comprising or based on mapped boundaries are increasingly aware of these uncertainties, and want to know how to manage them. The growth of 3D modelling, which takes 2D surveys as a starting point, adds urgency to the need for a better understanding of survey uncertainties; particularly where 2D mapping of variable vintage has been compiled into a national coverage. Previous attempts to apply confidence on the basis of metrics such as data density, survey age or survey techniques have proved useful for isolating single, critical, factors but do not generally succeed in evaluating geological mapping 'in the round', because they cannot account for the 'conceptual' skill set of the surveyor. The British Geological Survey (BGS) is using expert elicitation methods to gain a better understanding of uncertainties within the national geological map of Great Britain. The expert elicitation approach starts with the assumption that experienced surveyors have an intuitive sense of the uncertainty of the boundaries that they map, based on a tacit model of geology and its complexity and the nature of the surveying process. The objective of

  3. Solar Advisor Model User Guide for Version 2.0

    SciTech Connect

    Gilman, P.; Blair, N.; Mehos, M.; Christensen, C.; Janzou, S.; Cameron, C.

    2008-08-01

    The Solar Advisor Model (SAM) provides a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies and markets, from photovoltaic systems for residential and commercial markets to concentrating solar power and large photovoltaic systems for utility markets. This manual describes Version 2.0 of the software, which can model photovoltaic and concentrating solar power technologies for electric applications for several markets. The current version of the Solar Advisor Model does not model solar heating and lighting technologies.

  4. Combining Unsupervised and Supervised Classification to Build User Models for Exploratory Learning Environments

    ERIC Educational Resources Information Center

    Amershi, Saleema; Conati, Cristina

    2009-01-01

    In this paper, we present a data-based user modeling framework that uses both unsupervised and supervised classification to build student models for exploratory learning environments. We apply the framework to build student models for two different learning environments and using two different data sources (logged interface and eye-tracking data).…

  5. STORM WATER MANAGEMENT MODEL USER'S MANUAL VERSION 5.0

    EPA Science Inventory

    The EPA Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. SWMM was first developed in 1971 and has undergone several major upgrade...

  6. Topos models for physics and topos theory

    NASA Astrophysics Data System (ADS)

    Wolters, Sander

    2014-08-01

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a "quantum logic" in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.

  7. Topos models for physics and topos theory

    SciTech Connect

    Wolters, Sander

    2014-08-15

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.

  8. Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training

    NASA Astrophysics Data System (ADS)

    Macris, A.; Malamateniou, F.; Vassilacopoulos, G.

    Successful business process design requires active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise in business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge/content reusability and semantic web enablement) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e. training scenarios serving specific training needs) so that it is made reusable.

  9. A Neural Network Approach to Intention Modeling for User-Adapted Conversational Agents

    PubMed Central

    Griol, David

    2016-01-01

    Spoken dialogue systems have been proposed to enable a more natural and intuitive interaction with the environment and human-computer interfaces. In this contribution, we present a framework based on neural networks that allows modeling of the user's intention during the dialogue and uses this prediction to dynamically adapt the dialogue model of the system taking into consideration the user's needs and preferences. We have evaluated our proposal to develop a user-adapted spoken dialogue system that facilitates tourist information and services and provide a detailed discussion of the positive influence of our proposal in the success of the interaction, the information and services provided, and the quality perceived by the users. PMID:26819592

  10. Prospects for Advanced RF Theory and Modeling

    SciTech Connect

    Batchelor, D.B.

    1999-04-12

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need for RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed.

  11. Prospects for advanced RF theory and modeling

    NASA Astrophysics Data System (ADS)

    Batchelor, D. B.

    1999-09-01

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need for RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed.

  12. Users guide for SAMM: A prototype southeast Alaska multiresource model. Forest Service general technical report

    SciTech Connect

    Weyermann, D.L.; Fight, R.D.; Garrett, F.D.

    1991-08-01

    This paper instructs resource analysts on using the southeast Alaska multiresource model (SAMM). SAMM is an interactive microcomputer program that allows users to explore relations among several resources in southeast Alaska (timber, anadromous fish, deer, and hydrology) and the effects of timber management activities (logging, thinning, and road building) on those relations and resources. This guide assists users in installing SAMM on a microcomputer, developing input data files, making simulation runs, and storing output data for external analysis and graphic display.

  13. Modeling Second Language Change Using Skill Retention Theory

    DTIC Science & Technology

    2013-06-01

    Thesis front matter and reference excerpts: Modeling Second Language Change Using Skill Retention Theory, by Samuel R.; approved for public release, distribution unlimited. Cited works include B. VanPatten & J. Williams (Eds.), Theories in Second... and Spolsky, B. (1985), Formulating a theory of second language learning, Studies in Second...

  14. Conceptual Models and Theory-Embedded Principles on Effective Schooling.

    ERIC Educational Resources Information Center

    Scheerens, Jaap

    1997-01-01

    Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…

  15. Petroleum Refinery Jobs and Economic Development Impact (JEDI) Model User Reference Guide

    SciTech Connect

    Goldberg, M.

    2013-12-31

    The Jobs and Economic Development Impact (JEDI) models, developed through the National Renewable Energy Laboratory (NREL), are user-friendly tools utilized to estimate the economic impacts at the local level of constructing and operating fuel and power generation projects for a range of conventional and renewable energy technologies. The JEDI Petroleum Refinery Model User Reference Guide was developed to assist users in employing and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted. Based on project-specific inputs from the user, the model estimates job creation, earnings, and output (total economic activity) for a given petroleum refinery. This includes the direct, indirect, and induced economic impacts on the local economy associated with the refinery's construction and operation phases. Project cost and job data used in the model are derived from the most current cost estimations available. Local direct and indirect economic impacts are estimated using economic multipliers derived from IMPLAN software. By determining the regional economic impacts and job creation for a proposed refinery, the JEDI Petroleum Refinery model can be used to field questions about the added value refineries may bring to the local community.

  16. Computerized Shawnee lime/limestone scrubbing model users manual

    SciTech Connect

    Anders, W.L.; Torstrick, R.L.

    1981-03-01

    The manual gives a general description of a computerized model for estimating design and cost of lime or limestone scrubber systems for flue gas desulfurization (FGD). It supplements PB80-123037 by extending the number of scrubber options which can be evaluated. It includes spray tower and venturi/spray-tower absorbers, forced oxidation systems, systems with absorber loop additives (MgO or adipic acid), revised design and economic premises, and other changes reflecting process improvements and variations. It describes all inputs and outputs, along with detailed procedures for using the model and all its options. The model is based on prototype scrubber data from the EPA/Shawnee test facility and should be useful to utility companies, as well as to architectural and engineering contractors who are involved in selecting and designing FGD facilities. As key features, the model provides estimates of capital investment and operating revenue requirements. It also provides a material balance, equipment list, and a breakdown of costs by processing areas. The primary uses of the model are to project comparative economics of lime and limestone FGD processes and to evaluate system alternatives prior to the development of a detailed design.

  17. User's manual for LINEAR, a FORTRAN program to derive linear aircraft models

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.; Patterson, Brian P.; Antoniewicz, Robert F.

    1987-01-01

    This report documents a FORTRAN program that provides a powerful and flexible tool for the linearization of aircraft models. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied nonlinear aerodynamic model. The system model determined by LINEAR consists of matrices for both state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.

  18. IoT-based user-driven service modeling environment for a smart space management system.

    PubMed

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-11-20

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service.

  19. IoT-Based User-Driven Service Modeling Environment for a Smart Space Management System

    PubMed Central

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-01-01

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service. PMID:25420153

  20. Mars Global Reference Atmospheric Model 2010 Version: Users Guide

    NASA Technical Reports Server (NTRS)

    Justh, H. L.

    2014-01-01

    This Technical Memorandum (TM) presents the Mars Global Reference Atmospheric Model 2010 (Mars-GRAM 2010) and its new features. Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Additionally, this TM includes instructions on obtaining the Mars-GRAM source code and data files as well as running Mars-GRAM. It also contains sample Mars-GRAM input and output files and an example of how to incorporate Mars-GRAM as an atmospheric subroutine in a trajectory code.

  1. User-owned utility models for rural electrification

    SciTech Connect

    Waddle, D.

    1997-12-01

    The author discusses the history of rural electric cooperatives (REC) in the United States and the broader question of whether such organizations can serve as a model for rural electrification in other countries. He points out the features of such cooperatives that have given them stability and strength, and emphasizes that many of these same features must be present for such programs to succeed. He concludes that the cooperative model is not outdated, but it requires strong local support and a governmental structure that is supportive or, at a minimum, not obstructive.

  2. Coastal Modeling System (CMS) User’s Manual

    DTIC Science & Technology

    1991-09-01

    Manual excerpts: |∇s| denotes the magnitude of the phase function gradient and θ the local wave direction; Equations 5-5 and 5-6 can be combined, and a better first guess for the wave angles can be computed using the approximation in Equation 5-8 (Figure 5-2). SHALWV is the Corps' most versatile spectral wave model and is capable of modeling a range of applications.

  3. Storm-Water Management Model, Version 4. Part a: user's manual

    SciTech Connect

    Huber, W.C.; Dickinson, R.E.

    1988-06-01

    The EPA Storm-Water Management Model (SWMM) is a comprehensive mathematical model for simulation of urban runoff water quality and quantity in storm and combined sewer systems. All aspects of the urban hydrologic and quality cycles are simulated, including surface and subsurface runoff, transport through the drainage network, storage and treatment. Part A of the two-volume report is an update of the user's manuals issued in 1971, 1975, and 1981. Part B is a user's manual for EXTRAN, a flow-routing model that can be used both as a block of the SWMM package and as an independent model. The SWMM user's manual provides detailed descriptions for program blocks for Runoff, Transport, Storage/Treatment, Combine, Statistics, Rain, Temp and Graph (part of the Executive Block). EXTRAN represents a drainage system as links and nodes, allowing simulation of parallel or looped-pipe networks; weirs, orifices, and pumps; and system surcharges.

  4. Policy Building--An Extension to User Modeling

    ERIC Educational Resources Information Center

    Yudelson, Michael V.; Brunskill, Emma

    2012-01-01

    In this paper we combine a logistic regression student model with an exercise selection procedure. As opposed to the body of prior work on strategies for selecting practice opportunities, we are working on an assumption of a finite amount of opportunities to teach the student. Our goal is to prescribe activities that would maximize the amount…

  5. Supporting user-defined granularities in a spatiotemporal conceptual model

    USGS Publications Warehouse

    Khatri, V.; Ram, S.; Snodgrass, R.T.; O'Brien, G. M.

    2002-01-01

    Granularities are integral to spatial and temporal data. A large number of applications require storage of facts along with their temporal and spatial context, which needs to be expressed in terms of appropriate granularities. For many real-world applications, a single granularity in the database is insufficient. In order to support any type of spatial or temporal reasoning, the semantics related to granularities needs to be embedded in the database. Specifying granularities related to facts is an important part of conceptual database design because under-specifying the granularity can restrict an application, affect the relative ordering of events and impact the topological relationships. Closely related to granularities is indeterminacy, i.e., an occurrence time or location associated with a fact that is not known exactly. In this paper, we present an ontology for spatial granularities that is a natural analog of temporal granularities. We propose an upward-compatible, annotation-based spatiotemporal conceptual model that can comprehensively capture the semantics related to spatial and temporal granularities, and indeterminacy without requiring new spatiotemporal constructs. We specify the formal semantics of this spatiotemporal conceptual model via translation to a conventional conceptual model. To underscore the practical focus of our approach, we describe an on-going case study. We apply our approach to a hydrogeologic application at the United States Geologic Survey and demonstrate that our proposed granularity-based spatiotemporal conceptual model is straightforward to use and is comprehensive.

  6. Program PHFMOPT, Planning Hull Feasibility Model, User’s Manual. Revision.

    DTIC Science & Technology

    1981-01-01

    Report front matter: Program PHFMOPT, Planning Hull Feasibility Model, User's Manual (final report), by E. Nadine Hubble; approved for public release, distribution unlimited. Keywords: planing craft, feasibility model.

  7. Discrete-time dynamic user-optimal departure time/route choice model

    SciTech Connect

    Chen, H.K.; Hsueh, C.F.

    1998-05-01

    This paper concerns a discrete-time, link-based, dynamic user-optimal departure time/route choice model using the variational inequality approach. The model complies with a dynamic user-optimal equilibrium condition in which, for each origin-destination pair, the actual route travel times experienced by travelers, regardless of departure time, are equal and minimal. A nested diagonalization procedure is proposed to solve the model. Numerical examples are then provided for demonstration, with detailed elaboration of multiple solutions and Braess's paradox.

  8. Can behavioral theory inform the understanding of depression and medication nonadherence among HIV-positive substance users?

    PubMed

    Magidson, Jessica F; Listhaus, Alyson; Seitz-Brown, C J; Safren, Steven A; Lejuez, C W; Daughters, Stacey B

    2015-04-01

    Medication adherence is highly predictive of health outcomes across chronic conditions, particularly HIV/AIDS. Depression is consistently associated with worse adherence, yet few studies have sought to understand how depression relates to adherence. This study tested three components of behavioral depression theory--goal-directed activation, positive reinforcement, and environmental punishment--as potential indirect effects in the relation between depressive symptoms and medication nonadherence among low-income, predominantly African American substance users (n = 83). Medication nonadherence was assessed as frequency of doses missed across common reasons for nonadherence. Non-parametric bootstrapping was used to evaluate the indirect effects. Of the three intermediary variables, there was only an indirect effect of environmental punishment; depressive symptoms were associated with greater nonadherence through greater environmental punishment. Goal-directed activation and positive reinforcement were unrelated to adherence. Findings suggest the importance of environmental punishment in the relation between depression and medication adherence and may inform future intervention efforts for this population.
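
    The non-parametric bootstrapping of an indirect effect described above can be sketched as resampling cases and recomputing the product of the a-path (predictor to mediator) and b-path (mediator to outcome, controlling for the predictor) coefficients. The simulated data and variable names below are placeholders; this is not the authors' analysis code.

      import numpy as np

      def indirect_effect(x, m, y):
          # a-path: regress mediator on predictor; b-path: outcome on mediator and predictor.
          a = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), m, rcond=None)[0][1]
          b = np.linalg.lstsq(np.column_stack([np.ones_like(x), x, m]), y, rcond=None)[0][2]
          return a * b

      def bootstrap_ci(x, m, y, n_boot=5000, seed=0):
          rng = np.random.default_rng(seed)
          n = len(x)
          estimates = np.empty(n_boot)
          for i in range(n_boot):
              idx = rng.integers(0, n, n)        # resample cases with replacement
              estimates[i] = indirect_effect(x[idx], m[idx], y[idx])
          return np.percentile(estimates, [2.5, 97.5])

      # Placeholder data: depressive symptoms -> environmental punishment -> nonadherence.
      rng = np.random.default_rng(1)
      depress = rng.normal(size=200)
      punish = 0.5 * depress + rng.normal(size=200)
      nonadh = 0.4 * punish + rng.normal(size=200)
      print(indirect_effect(depress, punish, nonadh), bootstrap_ci(depress, punish, nonadh))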

  9. Theory, modeling and simulation: Annual report 1993

    SciTech Connect

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  10. Future Air Traffic Growth and Schedule Model User's Guide

    NASA Technical Reports Server (NTRS)

    Kimmel, William M. (Technical Monitor); Smith, Jeremy C.; Dollyhigh, Samuel M.

    2004-01-01

    The Future Air Traffic Growth and Schedule Model was developed as an implementation of the Fratar algorithm that projects future traffic flow between airports in a system and then schedules the additional flights to reflect current passenger time-of-travel preferences. The methodology produces an unconstrained future schedule from a current (or baseline) schedule and the airport operations growth rates. As an example of the use of the model, future schedules are projected for 2010 and 2022 for all flights arriving at, departing from, or flying between all continental United States airports that had commercial scheduled service on May 17, 2002. Intercontinental US traffic and airports are included, and that traffic is also grown with the Fratar methodology to account for its arrivals at and departures from the continental US airports. Input data sets derived from the Official Airline Guide (OAG) data and the FAA Terminal Area Forecast (TAF) are included in the examples of the computer code execution.
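
    The growth-factoring step at the heart of the methodology can be illustrated with a Furness-style iterative balancing routine, one common way of implementing Fratar growth factors: a base origin-destination flight matrix is rescaled until its row and column totals match forecast airport operations. The matrix and growth factors below are invented, and the sketch is not the NASA implementation.

      import numpy as np

      def fratar(base, row_targets, col_targets, iterations=50, tol=1e-9):
          """Iteratively scale a trip/flight matrix so its margins match forecast totals."""
          t = base.astype(float).copy()
          for _ in range(iterations):
              t *= (row_targets / t.sum(axis=1))[:, None]   # balance departures per airport
              t *= (col_targets / t.sum(axis=0))[None, :]   # balance arrivals per airport
              if (np.abs(t.sum(axis=1) - row_targets).max() < tol
                      and np.abs(t.sum(axis=0) - col_targets).max() < tol):
                  break
          return t

      # Hypothetical baseline daily flights between three airports and per-airport growth.
      base = np.array([[0, 40, 25],
                       [38, 0, 12],
                       [27, 10, 0]])
      growth = np.array([1.3, 1.1, 1.5])                    # airport operations growth rates
      future = fratar(base, base.sum(axis=1) * growth, base.sum(axis=0) * growth)
      print(future.round(1))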

  11. Location contexts of user check-ins to model urban geo life-style patterns.

    PubMed

    Hasan, Samiul; Ukkusuri, Satish V

    2015-01-01

    Geo-location data from social media offers us information, in new ways, to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e. g. location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) to characterize the patterns of user interests to different types of places and ii) to characterize the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins of the users from New York City. The co-existence of several location contexts and the corresponding probabilities in a given pattern provide useful information about user interests and choices. It is found that geo life-style patterns have similar items-either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior.
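
    A minimal sketch of the probabilistic topic-modeling idea, assuming each user's check-in venue categories are treated as one "document" and fitted with scikit-learn's LDA; the toy check-in histories are invented, and this is not the authors' pipeline.

      from sklearn.decomposition import LatentDirichletAllocation
      from sklearn.feature_extraction.text import CountVectorizer

      # Each "document" is one user's history of check-in venue categories (toy data).
      user_checkins = [
          "coffee_shop office gym office coffee_shop",
          "bar nightclub bar late_night_food nightclub",
          "office coffee_shop office gym",
          "nightclub bar bar coffee_shop",
          "park museum art_gallery museum park",
      ]

      vectorizer = CountVectorizer()
      counts = vectorizer.fit_transform(user_checkins)
      lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

      # Each inferred "geo life-style" pattern is a distribution over venue categories.
      vocab = vectorizer.get_feature_names_out()
      for k, topic in enumerate(lda.components_):
          top = topic.argsort()[::-1][:3]
          print(f"pattern {k}:", [vocab[i] for i in top])
      print(lda.transform(counts).round(2))  # per-user mixture over the patterns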

  12. Location Contexts of User Check-Ins to Model Urban Geo Life-Style Patterns

    PubMed Central

    Hasan, Samiul; Ukkusuri, Satish V.

    2015-01-01

    Geo-location data from social media offers us information, in new ways, to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e. g. location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) to characterize the patterns of user interests to different types of places and ii) to characterize the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins of the users from New York City. The co-existence of several location contexts and the corresponding probabilities in a given pattern provide useful information about user interests and choices. It is found that geo life-style patterns have similar items—either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior. PMID:25970430

  13. Algorithm for model validation: theory and applications.

    PubMed

    Sornette, D; Davis, A B; Ide, K; Vixie, K R; Pisarenko, V; Kamm, J R

    2007-04-17

    Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. Thus, we replace static claims on the impossibility of validating a given model by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the methodology first with the maturation of quantum mechanics as the arguably best established physics theory and then with several concrete examples drawn from some of our primary scientific interests: a cellular automaton model for earthquakes, a multifractal random walk model for financial time series, an anomalous diffusion model for solar radiation transport in the cloudy atmosphere, and a computational fluid dynamics code for the Richtmyer-Meshkov instability.

  14. Algorithm for model validation: Theory and applications

    PubMed Central

    Sornette, D.; Davis, A. B.; Ide, K.; Vixie, K. R.; Pisarenko, V.; Kamm, J. R.

    2007-01-01

    Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. Thus, we replace static claims on the impossibility of validating a given model by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the methodology first with the maturation of quantum mechanics as the arguably best established physics theory and then with several concrete examples drawn from some of our primary scientific interests: a cellular automaton model for earthquakes, a multifractal random walk model for financial time series, an anomalous diffusion model for solar radiation transport in the cloudy atmosphere, and a computational fluid dynamics code for the Richtmyer–Meshkov instability. PMID:17420476

  15. User's instructions for the 41-node thermoregulatory model (steady state version)

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1974-01-01

    A user's guide for the steady-state thermoregulatory model is presented. The model was modified to provide conversational interaction on a remote terminal, greater flexibility for parameter estimation, increased efficiency of convergence, a greater choice of output variables, and more realistic equations for respiratory and skin diffusion water losses.

  16. SWAT Check: A screening tool to assist users in the identification of potential model application problems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Soil and Water Assessment Tool (SWAT) is a basin scale hydrologic model developed by the US Department of Agriculture-Agricultural Research Service. SWAT's broad applicability, user friendly model interfaces, and automatic calibration software have led to a rapid increase in the number of new u...

  17. Introducing a new open source GIS user interface for the SWAT model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Soil and Water Assessment Tool (SWAT) model is a robust watershed modelling tool. It typically uses the ArcSWAT interface to create its inputs. ArcSWAT is public domain software which works in the licensed ArcGIS environment. The aim of this paper was to develop an open source user interface ...

  18. Cross-Cultural Teamwork in End User Computing: A Theoretical Model.

    ERIC Educational Resources Information Center

    Bento, Regina F.

    1995-01-01

    Presents a theoretical model explaining how cultural influences may affect the open, dynamic system of a cross-cultural, end-user computing team. Discusses the relationship between cross-cultural factors and various parts of the model such as: input variables, the system itself, outputs, and implications for the management of such teams. (JKP)

  19. ESPVI 4.0 ELECTROSTATIS PRECIPITATOR V-1 AND PERFORMANCE MODEL: USER'S MANUAL

    EPA Science Inventory

    The manual is the companion document for the microcomputer program ESPVI 4.0, Electrostatic Precipitation VI and Performance Model. The program was developed to provide a user- friendly interface to an advanced model of electrostatic precipitation (ESP) performance. The program i...

  20. Network Theory Tools for RNA Modeling

    PubMed Central

    Kim, Namhee; Petingi, Louis; Schlick, Tamar

    2014-01-01

    An introduction into the usage of graph or network theory tools for the study of RNA molecules is presented. By using vertices and edges to define RNA secondary structures as tree and dual graphs, we can enumerate, predict, and design RNA topologies. Graph connectivity and associated Laplacian eigenvalues relate to biological properties of RNA and help understand RNA motifs as well as build, by computational design, various RNA target structures. Importantly, graph theoretical representations of RNAs reduce drastically the conformational space size and therefore simplify modeling and prediction tasks. Ongoing challenges remain regarding general RNA design, representation of RNA pseudoknots, and tertiary structure prediction. Thus, developments in network theory may help advance RNA biology. PMID:25414570
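
    A small illustration of the graph representation described above: encode a toy RNA tree graph with networkx and compute its Laplacian eigenvalues, whose second-smallest value (the algebraic connectivity) is one descriptor used to compare RNA topologies. The edge list is a made-up example, not one of the authors' graphs.

      import networkx as nx
      import numpy as np

      # Toy RNA tree graph: vertices stand for loops/junctions, edges for stems (illustrative only).
      edges = [(1, 2), (2, 3), (2, 4), (4, 5)]
      G = nx.Graph(edges)

      eigenvalues = np.sort(nx.laplacian_spectrum(G))
      print("Laplacian eigenvalues:", np.round(eigenvalues, 3))
      print("algebraic connectivity (second-smallest eigenvalue):", round(float(eigenvalues[1]), 3))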

  1. Network Theory Tools for RNA Modeling.

    PubMed

    Kim, Namhee; Petingi, Louis; Schlick, Tamar

    2013-09-01

    An introduction into the usage of graph or network theory tools for the study of RNA molecules is presented. By using vertices and edges to define RNA secondary structures as tree and dual graphs, we can enumerate, predict, and design RNA topologies. Graph connectivity and associated Laplacian eigenvalues relate to biological properties of RNA and help understand RNA motifs as well as build, by computational design, various RNA target structures. Importantly, graph theoretical representations of RNAs reduce drastically the conformational space size and therefore simplify modeling and prediction tasks. Ongoing challenges remain regarding general RNA design, representation of RNA pseudoknots, and tertiary structure prediction. Thus, developments in network theory may help advance RNA biology.

  2. LOGAM (Logistic Analysis Model). Volume 2. Users Manual.

    DTIC Science & Technology

    1982-08-01

    Glossary excerpt: the number of identical LRUs within a system whose failure does not detract from system availability; used to model the effect of...

  3. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool for the BSA technique, BSM (bivariate statistical modeler), is proposed. Three popular BSA techniques, frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
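
    Of the three bivariate techniques mentioned, the frequency ratio is the simplest to illustrate: for each class of a conditioning factor it compares the share of hazard occurrences falling in that class with the share of the study area the class occupies. The sketch below uses invented raster-like arrays and is not the BSM tool's code.

      import numpy as np

      def frequency_ratio(factor_classes, hazard_mask):
          """Frequency ratio per class: (% of hazard cells in class) / (% of all cells in class)."""
          ratios = {}
          total_cells = factor_classes.size
          total_hazard = hazard_mask.sum()
          for cls in np.unique(factor_classes):
              in_class = factor_classes == cls
              pct_area = in_class.sum() / total_cells
              pct_hazard = (hazard_mask & in_class).sum() / total_hazard
              ratios[int(cls)] = pct_hazard / pct_area
          return ratios

      # Hypothetical 1D "rasters": slope classes 1-3 and a landslide inventory mask.
      slope_class = np.array([1, 1, 1, 2, 2, 2, 2, 3, 3, 3])
      landslides = np.array([0, 0, 1, 0, 1, 1, 0, 1, 1, 0], dtype=bool)
      print(frequency_ratio(slope_class, landslides))  # values above 1 indicate higher susceptibility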

  4. UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.

    PubMed

    Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun

    2013-12-01

    Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely-used methods based on probabilistic modeling have drawbacks in terms of consistency from multiple runs and empirical convergence. Furthermore, due to the complexity of its formulation and algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.
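
    The NMF backbone of such a system can be sketched with scikit-learn: factor a TF-IDF document-term matrix into document-topic and topic-term matrices and read off the top terms per topic. The toy corpus is invented, and the sketch omits the semi-supervised, interactive steering that distinguishes UTOPIAN.

      from sklearn.decomposition import NMF
      from sklearn.feature_extraction.text import TfidfVectorizer

      documents = [
          "interactive visualization of topic models for text analytics",
          "nonnegative matrix factorization for document clustering",
          "steering topic model results with user feedback",
          "matrix factorization with sparsity constraints",
      ]

      vectorizer = TfidfVectorizer(stop_words="english")
      tfidf = vectorizer.fit_transform(documents)

      nmf = NMF(n_components=2, init="nndsvd", random_state=0)
      doc_topic = nmf.fit_transform(tfidf)          # W: document-by-topic weights
      terms = vectorizer.get_feature_names_out()
      for k, topic in enumerate(nmf.components_):   # H: topic-by-term weights
          top_terms = [terms[i] for i in topic.argsort()[::-1][:4]]
          print(f"topic {k}: {top_terms}")
      print(doc_topic.round(2))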

  5. Articulating and Locating Uncertainty for the Users of Hydrological Models: some considerations from climate informatics

    NASA Astrophysics Data System (ADS)

    Brumble, K. C.

    2015-12-01

    Climate models and constructions increasingly involve handling massive data sets which incorporate multiple and compounding sources and kinds of uncertainty. Kinds of uncertainty include measurement uncertainty involved in homogenizing data and handling data-scarce environments, as well as structural uncertainty arising from statistical activities like assimilating data and adopting data-handling conventions. Furthermore, model uncertainty arises when combining diverse models into ensembles for intercomparison. All of the above are examples of sources of typed uncertainty in model-building and interpretation. These different uncertainties also compound, "cascading" into one another. This has led climate modelers and those in climate informatics to explore methods for quantifying uncertainty in order to give the users of model outputs and conclusions explicit expression of the uncertainty involved. However, uncertainty quantification methods each have virtues and limitations in accounting for uncertainty and few explicitly locate the sources and kinds of uncertainty involved in accessible ways. Articulating and distinguishing these uncertainties accessibly is vital for policy users of models because applications of model outputs may depend heavily on particular limited scopes of possible scenarios or applications. The users of integrated climate impact and hydrological models in particular need uncertainty that is described and localized in the modeling process in order to interpret and utilize model projections. Methods for locating and articulating uncertainty in the modeling process are discussed and evaluated, and some suggestions for future projects are explored.

  6. Targeting Parents for Childhood Weight Management: Development of a Theory-Driven and User-Centered Healthy Eating App

    PubMed Central

    Lahiri, Sudakshina; Brown, Katherine Elizabeth

    2015-01-01

    Background The proliferation of health promotion apps along with mobile phones' array of features supporting health behavior change offers a new and innovative approach to childhood weight management. However, despite the critical role parents play in children’s weight related behaviors, few industry-led apps aimed at childhood weight management target parents. Furthermore, industry-led apps have been shown to lack a basis in behavior change theory and evidence. Equally important remains the issue of how to maximize users’ engagement with mobile health (mHealth) interventions where there is growing consensus that inputs from the commercial app industry and the target population should be an integral part of the development process. Objective The aim of this study is to systematically design and develop a theory and evidence-driven, user-centered healthy eating app targeting parents for childhood weight management, and clearly document this for the research and app development community. Methods The Behavior Change Wheel (BCW) framework, a theoretically-based approach for intervention development, along with a user-centered design (UCD) philosophy and collaboration with the commercial app industry, guided the development process. Current evidence, along with a series of 9 focus groups (total of 46 participants) comprised of family weight management case workers, parents with overweight and healthy weight children aged 5-11 years, and consultation with experts, provided data to inform the app development. Thematic analysis of focus groups helped to extract information related to relevant theoretical, user-centered, and technological components to underpin the design and development of the app. Results Inputs from parents and experts working in the area of childhood weight management helped to identify the main target behavior: to help parents provide appropriate food portion sizes for their children. To achieve this target behavior, the behavioral diagnosis

  7. Queuing theory models for computer networks

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

    A set of simple queuing theory models which can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. The impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed using them because of the lack of fine detail needed about the network traffic rates, traffic patterns, and the hardware used to implement the networks. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. The Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
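
    The report's spreadsheet formulas are not reproduced in the abstract; as a minimal illustration of the kind of relation such simple queuing models rest on, the sketch below computes the mean response time of a single M/M/1 channel from its arrival and service rates (parameter values are illustrative).

        # M/M/1 queue sketch: mean response time of a single channel under Poisson
        # arrivals (rate lam, messages/s) and exponential service (rate mu, messages/s).
        def mm1_response_time(lam, mu):
            if lam >= mu:
                raise ValueError("queue is unstable: arrival rate must be below service rate")
            rho = lam / mu                  # channel utilisation
            wait = rho / (mu - lam)         # mean time spent waiting in the queue
            return wait + 1.0 / mu          # waiting + service time = 1 / (mu - lam)

        print(mm1_response_time(lam=80.0, mu=100.0))   # 0.05 s mean response time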

  8. Emergent User Behavior on Twitter Modelled by a Stochastic Differential Equation

    PubMed Central

    Mollgaard, Anders; Mathiesen, Joachim

    2015-01-01

    Data from the social-media site, Twitter, is used to study the fluctuations in tweet rates of brand names. The tweet rates are the result of a strongly correlated user behavior, which leads to bursty collective dynamics with a characteristic 1/f noise. Here we use the aggregated "user interest" in a brand name to model collective human dynamics by a stochastic differential equation with multiplicative noise. The model is supported by a detailed analysis of the tweet rate fluctuations and it reproduces both the exact bursty dynamics found in the data and the 1/f noise. PMID:25955783
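
    The paper's exact drift and diffusion terms are not given in the abstract; the sketch below only illustrates Euler-Maruyama integration of a generic stochastic differential equation with multiplicative noise, dX = theta*(mu - X) dt + sigma*X dW, with illustrative parameter values.

        # Euler-Maruyama sketch of an SDE with multiplicative noise; the mean-reverting
        # drift used here is an illustrative assumption, not the paper's model.
        import numpy as np

        def simulate(theta=1.0, mu=1.0, sigma=0.5, x0=1.0, dt=1e-3, steps=10_000, seed=0):
            rng = np.random.default_rng(seed)
            x = np.empty(steps + 1)
            x[0] = x0
            for t in range(steps):
                dW = rng.normal(0.0, np.sqrt(dt))       # Wiener increment
                x[t + 1] = x[t] + theta * (mu - x[t]) * dt + sigma * x[t] * dW
            return x

        path = simulate()
        print(path.mean(), path.std())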

  9. Emergent user behavior on Twitter modelled by a stochastic differential equation.

    PubMed

    Mollgaard, Anders; Mathiesen, Joachim

    2015-01-01

    Data from the social-media site, Twitter, is used to study the fluctuations in tweet rates of brand names. The tweet rates are the result of a strongly correlated user behavior, which leads to bursty collective dynamics with a characteristic 1/f noise. Here we use the aggregated "user interest" in a brand name to model collective human dynamics by a stochastic differential equation with multiplicative noise. The model is supported by a detailed analysis of the tweet rate fluctuations and it reproduces both the exact bursty dynamics found in the data and the 1/f noise.

  10. PRay - A graphical user interface for interactive visualization and modification of rayinvr models

    NASA Astrophysics Data System (ADS)

    Fromm, T.

    2016-01-01

    PRay is a graphical user interface for interactive displaying and editing of velocity models for seismic refraction. It is optimized for editing rayinvr models but can also be used as a dynamic viewer for ray tracing results from other software. The main features are the graphical editing of nodes and fast adjusting of the display (stations and phases). It can be extended by user-defined shell scripts and links to phase picking software. PRay is open source software written in the scripting language Perl, runs on Unix-like operating systems including Mac OS X and provides a version controlled source code repository for community development.

  11. User-defined Material Model for Thermo-mechanical Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2008-01-01

    Previously a user-defined material model for orthotropic bimodulus materials was developed for linear and nonlinear stress analysis of composite structures using either shell or solid finite elements within a nonlinear finite element analysis tool. Extensions of this user-defined material model to thermo-mechanical progressive failure analysis are described, and the required input data are documented. The extensions include providing for temperature-dependent material properties, archival of the elastic strains, and a thermal strain calculation for materials exhibiting a stress-free temperature.
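
    For reference, a common form of the thermal strain term for a material with a stress-free temperature, not necessarily the exact expression used in this user-defined material model, is

        \varepsilon^{\mathrm{th}}_{ij}(T) = \alpha_{ij}(T)\,\left(T - T_{\mathrm{sf}}\right),

    where alpha_ij(T) are the temperature-dependent coefficients of thermal expansion and T_sf is the stress-free temperature; the mechanical strain passed to the constitutive relations is then the total strain minus this thermal contribution.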

  12. Compass models: Theory and physical motivations

    NASA Astrophysics Data System (ADS)

    Nussinov, Zohar; van den Brink, Jeroen

    2015-01-01

    Compass models are theories of matter in which the couplings between the internal spin (or other relevant field) components are inherently spatially (typically, direction) dependent. A simple illustrative example is furnished by the 90° compass model on a square lattice in which only couplings of the form τ_i^x τ_j^x (where {τ_i^a} denote Pauli operators at site i) are associated with nearest-neighbor sites i and j separated along the x axis of the lattice, while τ_i^y τ_j^y couplings appear for sites separated by a lattice constant along the y axis. Similar compass-type interactions can appear in diverse physical systems. For instance, compass models describe Mott insulators with orbital degrees of freedom where interactions sensitively depend on the spatial orientation of the orbitals involved as well as the low-energy effective theories of frustrated quantum magnets, and a host of other systems such as vacancy centers, and cold atomic gases. The fundamental interdependence between internal (spin, orbital, or other) and external (i.e., spatial) degrees of freedom which underlies compass models generally leads to very rich behaviors, including the frustration of (semi-)classical ordered states on nonfrustrated lattices, and to enhanced quantum effects, prompting, in certain cases, the appearance of zero-temperature quantum spin liquids. As a consequence of these frustrations, new types of symmetries and their associated degeneracies may appear. These intermediate symmetries lie midway between the extremes of global symmetries and local gauge symmetries and lead to effective dimensional reductions. In this article, compass models are reviewed in a unified manner, paying close attention to exact consequences of these symmetries and to thermal and quantum fluctuations that stabilize orders via order-out-of-disorder effects. This is complemented by a survey of numerical results. In addition to reviewing past works, a number of other models are introduced and new results
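
    Written out, the 90° square-lattice compass model described above takes the form (sign and coupling conventions vary between papers)

        H_{90^{\circ}} = -J_{x}\sum_{\langle ij\rangle_{x}} \tau_{i}^{x}\tau_{j}^{x} \;-\; J_{y}\sum_{\langle ij\rangle_{y}} \tau_{i}^{y}\tau_{j}^{y},

    where the two sums run over nearest-neighbor bonds along the x and y axes of the lattice, respectively.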

  13. A matrix model from string field theory

    NASA Astrophysics Data System (ADS)

    Zeze, Syoji

    2016-09-01

    We demonstrate that a Hermitian matrix model can be derived from level truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and U(N) vectors which are responsible for the D-brane at the tachyon vacuum. Effective potential for the scalar is evaluated both for finite and large N. Increase of potential height is observed in both cases. The large N matrix integral is identified with a system of N ZZ branes and a ghost FZZT brane.

  14. Polarimetric clutter modeling: Theory and application

    NASA Technical Reports Server (NTRS)

    Kong, J. A.; Lin, F. C.; Borgeaud, M.; Yueh, H. A.; Swartz, A. A.; Lim, H. H.; Shim, R. T.; Novak, L. M.

    1988-01-01

    The two-layer anisotropic random medium model is used to investigate fully polarimetric scattering properties of earth terrain media. The polarization covariance matrices for the untilted and tilted uniaxial random medium are evaluated using the strong fluctuation theory and distorted Born approximation. In order to account for the azimuthal randomness in the growth direction of leaves in tree and grass fields, an averaging scheme over the azimuthal direction is also applied. It is found that characteristics of terrain clutter can be identified through the analysis of each element of the covariance matrix. Theoretical results are illustrated by the comparison with experimental data provided by MIT Lincoln Laboratory for tree and grass fields.

  15. Integrated hydrochemical modeling of an alpine watershed: Sierra Nevada, California. User`s guide to the University of Arizona, Alpine Hydrochemical Model (AHM) version 1.0.

    SciTech Connect

    Wolford, R.A.; Bales, R.C.; Sorooshian, S.

    1992-12-01

    This dissertation discusses the development and testing of a model capable of predicting watershed hydrologic and hydrochemical responses to these changes. The model computes integrated water and chemical balances for watersheds with unlimited numbers of terrestrial, stream, and lake subunits, each of which may have a unique, variable snow-covered area. Model capabilities include: (1) tracking of chemical inputs from precipitation, dry deposition, snowmelt, mineral weathering, baseflow or flows from areas external to the modeled watershed, and user-defined sources and sinks, (2) tracking water and chemical movements in the canopy, snowpack, soil litter, multiple soil layers, streamflow, between terrestrial subunits (surface and subsurface movement), and within lakes (2 layers), (3) chemical speciation, including free and total soluble species, precipitates, exchange complexes, and acid-neutralizing capacity, (4) nitrogen reactions, (5) a snowmelt optimization procedure capable of exactly matching observed watershed outflows, and (6) modeling riparian areas.

  16. Mathematical models for principles of gyroscope theory

    NASA Astrophysics Data System (ADS)

    Usubamatov, Ryspek

    2017-01-01

    Gyroscope devices are primary units for navigation and control systems that have wide application in engineering. The main property of a gyroscope device is that it maintains the axis of a spinning rotor. This peculiarity is described in terms of gyroscope effects, for which the known mathematical models have been formulated from the law of kinetic energy conservation and the change in angular momentum. Gyroscope theory is represented by numerous publications whose mathematical models do not match the actual torques and motions in these devices. The nature of gyroscope effects is more complex than represented in known publications. Recent investigations in this area have demonstrated that up to eleven internal torques can act on a gyroscope simultaneously and interdependently around two axes. These torques are generated by the spinning rotor's mass elements and by the gyroscope's center of mass through the action of several inertial forces. The change in angular momentum does not play the primary role in gyroscope motions. The external load generates several internal torques whose directions may differ, which leads to changes in the angular velocities of gyroscope motions around the two axes. The formulated mathematical models of the gyroscope's internal torques represent the fundamental principles of gyroscope theory. In detail, the gyroscope experiences a resistance torque generated by the centrifugal and Coriolis forces of the spinning rotor and a precession torque generated by the common inertial forces and the change in angular momentum. The new mathematical models for the torques and motions of the gyroscope have been confirmed on problems previously considered unsolvable. The mathematical models were tested in practice and the results validate the theoretical approach.
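
    For comparison, the conventional angular-momentum treatment that the entry argues is incomplete gives, for a rotor of axial moment of inertia I spinning at rate omega_s under an applied torque tau, the steady precession rate

        \omega_{p} = \frac{\tau}{I\,\omega_{s}},

    and the entry's claim is that additional inertial torques act alongside, and modify, this textbook relation.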

  17. SB3D User Manual, Santa Barbara 3D Radiative Transfer Model

    SciTech Connect

    O'Hirok, William

    1999-01-01

    SB3D is a three-dimensional atmospheric and oceanic radiative transfer model for the Solar spectrum. The microphysics employed in the model are the same as used in the model SBDART. It is assumed that the user of SB3D is familiar with SBDART and IDL. SB3D differs from SBDART in that computations are conducted on media in three dimensions rather than a single column (i.e. plane-parallel), and a stochastic method (Monte Carlo) is employed instead of a numerical approach (Discrete Ordinates) for estimating a solution to the radiative transfer equation. Because of these two differences between SB3D and SBDART, the input and running of SB3D are more unwieldy and require compromises between model performance and computational expense. Hence, there is no one correct method for running the model and the user must develop a sense of the proper input and configuration of the model.

  18. Holographic models for theories with hyperscaling violation

    NASA Astrophysics Data System (ADS)

    Gath, Jakob; Hartong, Jelle; Monteiro, Ricardo; Obers, Niels A.

    2013-04-01

    We study in detail a variety of gravitational toy models for hyperscaling-violating Lifshitz (hvLif) space-times. These space-times have been recently explored as holographic dual models for condensed matter systems. We start by considering a model of gravity coupled to a massive vector field and a dilaton with a potential. This model supports the full class of hvLif space-times and special attention is given to the particular values of the scaling exponents appearing in certain non-Fermi liquids. We study linearized perturbations in this model, and consider probe fields whose interactions mimic those of the perturbations. The resulting equations of motion for the probe fields are invariant under the Lifshitz scaling. We derive Breitenlohner-Freedman-type bounds for these new probe fields. For the cases of interest the hvLif space-times have curvature invariants that blow up in the UV. We study the problem of constructing models in which the hvLif space-time can have an AdS or Lifshitz UV completion. We also analyze reductions of Schrödinger space-times and reductions of waves on extremal (intersecting) branes, accompanied by transverse space reductions, that are solutions to supergravity-like theories, exploring the allowed parameter range of the hvLif scaling exponents.

  19. User's manual for the REEDM (Rocket Exhaust Effluent Diffusion Model) computer program

    NASA Technical Reports Server (NTRS)

    Bjorklund, J. R.; Dumbauld, R. K.; Cheney, C. S.; Geary, H. V.

    1982-01-01

    The REEDM computer program predicts concentrations, dosages, and depositions downwind from normal and abnormal launches of rocket vehicles at NASA's Kennedy Space Center. The atmospheric dispersion models, cloud-rise models, and other formulas used in the REEDM model are described mathematically. Vehicle and source parameters, other pertinent physical properties of the rocket exhaust cloud, and meteorological layering techniques are presented, as well as user's instructions for REEDM. Worked example problems are included.

  20. A modified Klobuchar model for single-frequency GNSS users over the polar region

    NASA Astrophysics Data System (ADS)

    Bi, Tong; An, Jiachun; Yang, Jian; Liu, Shulun

    2017-02-01

    For single-frequency Global Navigation Satellite System (GNSS) users, it is necessary to select a simple and effective broadcast ionospheric model to mitigate the ionospheric delay, which is one of the most serious error sources in GNSS measurement. The widely used Global Positioning System (GPS) Klobuchar model can achieve better performance at mid-latitudes; however, this model is not applicable at high latitudes due to the more complex ionospheric structure over the polar region. Under the premise of no additional coefficients, a modified Klobuchar model is established for single-frequency GNSS users over the polar region by improving the nighttime term and the amplitude of the cosine term. The performance of the new model is validated by different ionospheric models and their applications in single-frequency single-point positioning, during different seasons and different levels of solar activity. The new model can reduce the ionospheric error by 60% over the polar region, while the GPS-Klobuchar even increases the ionospheric error in many cases. Over the polar region, the single-frequency SPP error using the new model is approximately 3 m in the vertical direction and 1 m in the horizontal direction, which is superior to GPS-Klobuchar. This study suggests that the modified Klobuchar model is more accurate in depicting the polar ionosphere and could be used to achieve better positioning accuracy for single-frequency GNSS users over the polar region.
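
    The modified nighttime and amplitude terms are not specified in the abstract; for reference, the standard GPS Klobuchar correction that they replace has the half-cosine form

        T_{\mathrm{iono}} = F\left[\,5\times10^{-9} + A\cos\!\left(\frac{2\pi\,(t - 50400)}{P}\right)\right] \quad\text{(daytime branch)}, \qquad T_{\mathrm{iono}} = 5\times10^{-9}\,F \quad\text{(nighttime)},

    where the cosine branch is used only while its argument lies within plus or minus pi/2, A and P are the amplitude and period built from the broadcast alpha and beta coefficients and the geomagnetic latitude of the ionospheric pierce point, t is the local time at the pierce point, F is the slant (obliquity) factor, and 5 x 10^-9 s is the fixed nighttime delay. The entry's modification targets that nighttime constant and the amplitude A without adding coefficients.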

  1. TERSSE: Definition of the Total Earth Resources System for the Shuttle Era. Volume 7: User Models: A System Assessment

    NASA Technical Reports Server (NTRS)

    1974-01-01

    User models, defined as any explicit process or procedure used to transform information extracted from remotely sensed data into a form useful as a resource management information input, are discussed. The role of the user models as information, technological, and operations interfaces between the TERSSE and the resource managers is emphasized. It is recommended that guidelines and management strategies be developed for a systems approach to user model development.

  2. PARFUME Theory and Model basis Report

    SciTech Connect

    Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  3. Using the Theory of Planned Behavior to predict implementation of harm reduction strategies among MDMA/ecstasy users.

    PubMed

    Davis, Alan K; Rosenberg, Harold

    2016-06-01

    This prospective study was designed to test whether the variables proposed by the Theory of Planned Behavior (TPB) were associated with baseline intention to implement and subsequent use of 2 MDMA/ecstasy-specific harm reduction interventions: preloading/postloading and pill testing/pill checking. Using targeted Facebook advertisements, an international sample of 391 recreational ecstasy users were recruited to complete questionnaires assessing their ecstasy consumption history, and their attitudes, subjective norms, perceived behavioral control, habit strength (past strategy use), and intention to use these two strategies. Attitudes, subjective norms, and perceived behavioral control were significantly associated with baseline intention to preload/postload and pill test/pill check. Out of the 391 baseline participants, 100 completed the two-month follow-up assessment. Baseline habit strength and frequency of ecstasy consumption during the three months prior to baseline were the only significant predictors of how often participants used the preloading/postloading strategy during the follow-up. Baseline intention to pill test/pill check was the only significant predictor of how often participants used this strategy during the follow-up. These findings provide partial support for TPB variables as both correlates of baseline intention to implement and predictors of subsequent use of these two strategies. Future investigations could assess whether factors related to ecstasy consumption (e.g., subjective level of intoxication, craving, negative consequences following consumption), and environmental factors (e.g., accessibility and availability of harm reduction resources) improve the prediction of how often ecstasy users employ these and other harm reduction strategies.

  4. The NATA code: Theory and analysis, volume 1. [user manuals (computer programming) - gas dynamics, wind tunnels

    NASA Technical Reports Server (NTRS)

    Bade, W. L.; Yos, J. M.

    1975-01-01

    A computer program for calculating quasi-one-dimensional gas flow in axisymmetric and two-dimensional nozzles and rectangular channels is presented. Flow is assumed to start from a state of thermochemical equilibrium at a high temperature in an upstream reservoir. The program provides solutions based on frozen chemistry, chemical equilibrium, and nonequilibrium flow with finite reaction rates. Electronic nonequilibrium effects can be included using a two-temperature model. An approximate laminar boundary layer calculation is given for the shear and heat flux on the nozzle wall. Boundary layer displacement effects on the inviscid flow are considered also. Chemical equilibrium and transport property calculations are provided by subroutines. The code contains precoded thermochemical, chemical kinetic, and transport cross section data for high-temperature air, CO2-N2-Ar mixtures, helium, and argon. It provides calculations of the stagnation conditions on axisymmetric or two-dimensional models, and of the conditions on the flat surface of a blunt wedge. The primary purpose of the code is to describe the flow conditions and test conditions in electric arc heated wind tunnels.

  5. Surface matching for correlation of virtual models: Theory and application

    NASA Technical Reports Server (NTRS)

    Caracciolo, Roberto; Fanton, Francesco; Gasparetto, Alessandro

    1994-01-01

    Virtual reality can enable a robot user to generate and test off line, in a virtual environment, a sequence of operations to be executed by the robot in an assembly cell. Virtual models of objects are to be correlated to the real entities they represent by means of a suitable transformation. A solution to the correlation problem, which is basically a problem of 3-dimensional adjusting, has been found exploiting the surface matching theory. An iterative algorithm has been developed which matches the geometric surface representing the shape of the virtual model of an object with a set of points measured on the surface in the real world. A peculiar feature of the algorithm is that it also works when there is no one-to-one correspondence between the measured points and those representing the surface model. Furthermore, the problem of convergence to local minima is avoided by defining a starting point that ensures convergence to the global minimum. The developed algorithm has been tested by simulation. Finally, this paper proposes a specific application, i.e., correlating a robot cell equipped for biomedical use with its virtual representation.
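
    The paper's algorithm, which tolerates the absence of one-to-one correspondence between measured points and the surface model, is not reproduced here; the sketch below only illustrates a generic ICP-style rigid alignment step (nearest-neighbour pairing plus an SVD-based rotation and translation) on small point arrays.

        # Minimal iterative rigid alignment sketch (nearest-neighbour + SVD/Kabsch step).
        import numpy as np

        def best_rigid_transform(P, Q):
            """Rotation R and translation t minimising ||R P + t - Q|| for paired points."""
            cP, cQ = P.mean(axis=0), Q.mean(axis=0)
            H = (P - cP).T @ (Q - cQ)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:            # avoid reflections
                Vt[-1] *= -1
                R = Vt.T @ U.T
            return R, cQ - R @ cP

        def icp(model_pts, measured_pts, iters=20):
            P = model_pts.copy()
            for _ in range(iters):
                # pair each model point with its nearest measured point
                d = ((P[:, None, :] - measured_pts[None, :, :]) ** 2).sum(-1)
                Q = measured_pts[d.argmin(axis=1)]
                R, t = best_rigid_transform(P, Q)
                P = P @ R.T + t
            return P

        # toy usage: align a point set to a rotated and translated copy of itself
        pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
        angle = 0.3
        R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                           [np.sin(angle),  np.cos(angle), 0.0],
                           [0.0, 0.0, 1.0]])
        measured = pts @ R_true.T + np.array([0.2, -0.1, 0.5])
        print(icp(pts, measured, iters=10))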

  6. Transmission Line Jobs and Economic Development Impact (JEDI) Model User Reference Guide

    SciTech Connect

    Goldberg, M.; Keyser, D.

    2013-10-01

    The Jobs and Economic Development Impact (JEDI) models, developed through the National Renewable Energy Laboratory (NREL), are freely available, user-friendly tools that estimate the potential economic impacts of constructing and operating power generation projects for a range of conventional and renewable energy technologies. The Transmission Line JEDI model can be used to field questions about the economic impacts of transmission lines in a given state, region, or local community. This Transmission Line JEDI User Reference Guide was developed to provide basic instruction on operating the model and understanding the results. This guide also provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data contained in the model.

  7. User`s guide to CAL3QHC version 2.0: A modeling methodology for predicting pollutant concentrations near roadway intersections (revised)

    SciTech Connect

    Eckhoff, P.

    1995-09-01

    CAL3QHC is a microcomputer-based model for predicting carbon monoxide (CO) or other pollutant concentrations from motor vehicles at roadway intersections. The model includes the CALINE3 dispersion model along with a traffic algorithm to estimate vehicular queue lengths at signalized intersections. CAL3QHC estimates total air pollutant concentrations from both moving and idling vehicles. This document provides a technical description of the model, user instructions, and example applications.

  8. ModelMuse: A U.S. Geological Survey Open-Source, Graphical User Interface for Groundwater Models

    NASA Astrophysics Data System (ADS)

    Winston, R. B.

    2013-12-01

    ModelMuse is a free publicly-available graphical preprocessor used to generate the input and display the output for several groundwater models. It is written in Object Pascal and the source code is available on the USGS software web site. Supported models include the MODFLOW family of models, PHAST (version 1), and SUTRA version 2.2. With MODFLOW and PHAST, the user generates a grid and uses 'objects' (points, lines, and polygons) to define boundary conditions and the spatial variation in aquifer properties. Because the objects define the spatial variation, the grid can be changed without the user needing to re-enter spatial data. The same paradigm is used with SUTRA except that the user generates a quadrilateral finite-element mesh instead of a rectangular grid. The user interacts with the model in a top view and in a vertical cross section. The cross section can be at any angle or location. There is also a three-dimensional view of the model. For SUTRA, a new method of visualizing the permeability and related properties has been introduced. In three dimensional SUTRA models, the user specifies the permeability tensor by specifying permeability in three mutually orthogonal directions that can be oriented in space in any direction. Because it is important for the user to be able to check both the magnitudes and directions of the permeabilities, ModelMuse displays the permeabilities as either a two-dimensional or a three-dimensional vector plot. Color is used to differentiate the maximum, middle, and minimum permeability vectors. The magnitude of the permeability is shown by the vector length. The vector angle shows the direction of the maximum, middle, or minimum permeability. Contour and color plots can also be used to display model input and output data.

  9. A Monthly Water-Balance Model Driven By a Graphical User Interface

    USGS Publications Warehouse

    McCabe, Gregory J.; Markstrom, Steven L.

    2007-01-01

    This report describes a monthly water-balance model driven by a graphical user interface, referred to as the Thornthwaite monthly water-balance program. Computations of monthly water-balance components of the hydrologic cycle are made for a specified location. The program can be used as a research tool, an assessment tool, and a tool for classroom instruction.
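
    The Thornthwaite program's exact formulation is documented in the report itself; the sketch below is only a generic monthly soil-moisture bucket of the same family, taking monthly precipitation P and potential evapotranspiration PET (mm) and a single soil store with an assumed capacity.

        # Generic monthly soil-moisture bucket sketch (not the USGS Thornthwaite program):
        # monthly P and PET in mm, a single soil store, and surplus reported as runoff.
        def monthly_balance(P, PET, capacity=150.0, storage=75.0):
            rows = []
            for p, pet in zip(P, PET):
                if p >= pet:
                    aet = pet
                    storage = storage + (p - pet)
                    runoff = max(0.0, storage - capacity)    # spill anything above capacity
                    storage = min(storage, capacity)
                else:
                    draw = min(storage, pet - p)             # soil supplies part of the demand
                    aet = p + draw
                    storage -= draw
                    runoff = 0.0
                rows.append({"AET": aet, "storage": storage, "runoff": runoff})
            return rows

        for r in monthly_balance(P=[90, 40, 10], PET=[30, 60, 80]):
            print(r)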

  10. A Multilayer Naïve Bayes Model for Analyzing User's Retweeting Sentiment Tendency

    PubMed Central

    Wang, Mengmeng; Zuo, Wanli; Wang, Ying

    2015-01-01

    Today microblogging has increasingly become a means of information diffusion via user's retweeting behavior. Since retweeted content, as context information, reflects how users understand a microblog, analysis of users' retweeting sentiment tendency has gradually become a hot research topic. Targeted at online microblogging, a dynamic social network, we investigate how to exploit dynamic retweeting sentiment features in retweeting sentiment tendency analysis. On the basis of time series of user's network structure information and published text information, we first model dynamic retweeting sentiment features. Then we build Naïve Bayes models from profile-, relationship-, and emotion-based dimensions, respectively. Finally, we build a multilayer Naïve Bayes model based on multidimensional Naïve Bayes models to analyze user's retweeting sentiment tendency towards a microblog. Experiments on a real-world dataset demonstrate the effectiveness of the proposed framework. Further experiments are conducted to understand the importance of dynamic retweeting sentiment features and temporal information in retweeting sentiment tendency analysis. What is more, we provide a new line of thought for retweeting sentiment tendency analysis in dynamic social networks. PMID:26417367
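
    The paper's feature definitions and data are not reproduced here; the sketch below only illustrates the layered arrangement described, with one Naive Bayes model per hypothetical profile, relationship, and emotion feature block whose class probabilities feed a second-layer Naive Bayes model, using synthetic placeholder data.

        # Sketch of a two-layer Naive Bayes arrangement on synthetic placeholder data.
        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        rng = np.random.default_rng(0)
        n = 200
        blocks = {                                  # hypothetical per-dimension features
            "profile": rng.normal(size=(n, 3)),
            "relationship": rng.normal(size=(n, 2)),
            "emotion": rng.normal(size=(n, 4)),
        }
        y = rng.integers(0, 2, size=n)              # retweeting sentiment tendency (0/1)

        # Layer 1: one Naive Bayes model per dimension.
        layer1 = {name: GaussianNB().fit(X, y) for name, X in blocks.items()}

        # Layer 2: stack the per-dimension class probabilities and fit a final model.
        stacked = np.hstack([m.predict_proba(blocks[name]) for name, m in layer1.items()])
        layer2 = GaussianNB().fit(stacked, y)
        print("training accuracy:", layer2.score(stacked, y))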

  11. Visual imagery and the user model applied to fuel handling at EBR-II

    SciTech Connect

    Brown-VanHoozer, S.A.

    1995-06-01

    The material presented in this paper is based on two studies involving visual display designs and the user`s perspective model of a system. The studies involved a methodology known as Neuro-Linguistic Programming (NLP), and its use in expanding design choices which included the ``comfort parameters`` and ``perspective reality`` of the user`s model of the world. In developing visual displays for the EBR-II fuel handling system, the focus would be to incorporate the comfort parameters that overlap from each of the representation systems (visual, auditory, and kinesthetic), then incorporate the comfort parameters of the most prominent group of the population, and last, blend in the comfort parameters of the other two representational systems. The focus of this informal study was to use the techniques of meta-modeling and synesthesia to develop a virtual environment that closely resembled the operator`s perspective of the fuel handling system of Argonne`s Experimental Breeder Reactor - II. An informal study was conducted using NLP as the behavioral model in a virtual reality (VR) setting.

  12. USER'S GUIDE TO GEOSYNTHETIC MODELING SYSTEM: GM SYSTEM VERSION 1.1

    EPA Science Inventory

    The document is a user manual for the Geosynthetic Modeling System. The menu-driven analytical system performs design calculations for 28 different landfill design applications that incorporate geosynthetic materials. The results of each set of design calculations are compared wi...

  13. Users of withdrawal method in the Islamic Republic of Iran: are they intending to use oral contraceptives? Applying the theory of planned behaviour.

    PubMed

    Rahnama, P; Hidarnia, A; Shokravi, F A; Kazemnejad, A; Montazeri, A; Najorkolaei, F R; Saburi, A

    2013-09-01

    Many couples in the Islamic Republic of Iran rely on coital withdrawal for contraception. The purpose of this cross-sectional study was to use the theory of planned behaviour to explore factors that influence withdrawal users' intent to switch to oral contraception (OC). Participants were 336 sexually active, married women, who were current users of withdrawal and were recruited from 5 public family planning clinics in Tehran. A questionnaire included measures of the theory of planned behaviour: attitude (behavioural beliefs, outcome evaluations), subjective norms (normative beliefs, motivation to comply), perceived behaviour control, past behaviour and behavioural intention. Linear regression analyses showed that past behaviour, perceived behaviour control, attitude and subjective norms accounted for the highest percentage of total variance observed for intention to use OC (36%). Beliefs-based family planning education and counselling should be designed for users of the withdrawal method.

  14. Support for significant evolutions of the user data model in ROOT files

    SciTech Connect

    Canal, P.; Brun, R.; Fine, V.; Janyst, L.; Lauret, J.; Russo, P.; /Fermilab

    2010-01-01

    One of the main strengths of ROOT input and output (I/O) is its inherent support for schema evolution. Two distinct modes are supported, one manual via a hand-coded streamer function and one fully automatic via the ROOT StreamerInfo. One drawback of the streamer functions is that they are not usable by TTree objects in split mode. Until now, the user could not customize the automatic schema evolution mechanism and the only mechanism to go beyond the default rules was to revert to using the streamer function. In ROOT 5.22/00, we introduced a new mechanism which allows user-provided extensions of the automatic schema evolution that can be used in object-wise, member-wise and split modes. This paper will describe the many possibilities ranging from the simple assignment of transient members to the complex reorganization of the user's object model.

  15. The European ALMA Regional Centre Network: A Geographically Distributed User Support Model

    NASA Astrophysics Data System (ADS)

    Hatziminaoglou, E.; Zwaan, M.; Andreani, P.; Barta, M.; Bertoldi, F.; Brand, J.; Gueth, F.; Hogerheijde, M.; Maercker, M.; Massardi, M.; Muehle, S.; Muxlow, Th.; Richards, A.; Schilke, P.; Tilanus, R.; Vlemmings, W.; Afonso, J.; Messias, H.

    2015-12-01

    In recent years there has been a paradigm shift from centralised to geographically distributed resources. Individual entities are no longer able to host or afford the necessary expertise in-house, and, as a consequence, society increasingly relies on widespread collaborations. Although such collaborations are now the norm for scientific projects, more technical structures providing support to a distributed scientific community without direct financial or other material benefits are scarce. The network of European ALMA Regional Centre (ARC) nodes is an example of such an internationally distributed user support network. It is an organised effort to provide the European ALMA user community with uniform expert support to enable optimal usage and scientific output of the ALMA facility. The network model for the European ARC nodes is described in terms of its organisation, communication strategies and user support.

  16. Jobs and Economic Development Impact (JEDI) User Reference Guide: Fast Pyrolysis Biorefinery Model

    SciTech Connect

    Zhang, Yimin; Goldberg, Marshall

    2015-02-01

    This guide -- the JEDI Fast Pyrolysis Biorefinery Model User Reference Guide -- was developed to assist users in operating and understanding the JEDI Fast Pyrolysis Biorefinery Model. The guide provides information on the model's underlying methodology, as well as the parameters and data sources used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features and a discussion of how the results should be interpreted. Based on project-specific inputs from the user, the JEDI Fast Pyrolysis Biorefinery Model estimates local (e.g., county- or state-level) job creation, earnings, and output from total economic activity for a given fast pyrolysis biorefinery. These estimates include the direct, indirect and induced economic impacts to the local economy associated with the construction and operation phases of biorefinery projects. Local revenue and supply chain impacts as well as induced impacts are estimated using economic multipliers derived from the IMPLAN software program. By determining the local economic impacts and job creation for a proposed biorefinery, the JEDI Fast Pyrolysis Biorefinery Model can be used to field questions about the added value biorefineries might bring to a local community.

  17. A Grammar-based Approach for Modeling User Interactions and Generating Suggestions During the Data Exploration Process.

    PubMed

    Dabek, Filip; Caban, Jesus J

    2017-01-01

    Despite the recent popularity of visual analytics focusing on big data, little is known about how to support users that use visualization techniques to explore multi-dimensional datasets and accomplish specific tasks. Our lack of models that can assist end-users during the data exploration process has made it challenging to learn from the user's interactive and analytical process. The ability to model how a user interacts with a specific visualization technique and what difficulties they face are paramount in supporting individuals with discovering new patterns within their complex datasets. This paper introduces the notion of visualization systems understanding and modeling user interactions with the intent of guiding a user through a task thereby enhancing visual data exploration. The challenges faced and the necessary future steps to take are discussed; and to provide a working example, a grammar-based model is presented that can learn from user interactions, determine the common patterns among a number of subjects using a K-Reversible algorithm, build a set of rules, and apply those rules in the form of suggestions to new users with the goal of guiding them along their visual analytic process. A formal evaluation study with 300 subjects was performed showing that our grammar-based model is effective at capturing the interactive process followed by users and that further research in this area has the potential to positively impact how users interact with a visualization system.
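
    The paper's K-Reversible grammar inference is not reproduced here; as a much simpler stand-in for the idea of learning interaction patterns and turning them into suggestions, the sketch below builds a bigram model over recorded interaction sequences (action names are illustrative) and suggests the most frequent follow-up actions.

        # Simple bigram-based suggestion sketch over recorded interaction sequences.
        from collections import Counter, defaultdict

        sessions = [
            ["load_data", "filter", "scatterplot", "zoom", "select"],
            ["load_data", "scatterplot", "zoom", "select", "export"],
            ["load_data", "filter", "histogram", "select"],
        ]

        following = defaultdict(Counter)
        for s in sessions:
            for a, b in zip(s, s[1:]):
                following[a][b] += 1

        def suggest(last_action, k=2):
            """Return up to k most common follow-up actions seen after last_action."""
            return [action for action, _ in following[last_action].most_common(k)]

        print(suggest("scatterplot"))   # e.g. ['zoom']
        print(suggest("load_data"))     # e.g. ['filter', 'scatterplot']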

  18. Development and implementation of (Q)SAR modeling within the CHARMMing Web-user interface

    PubMed Central

    Weidlich, Iwona E.; Pevzner, Yuri; Miller, Benjamin T.; Filippov, Igor V.; Woodcock, H. Lee; Brooks, Bernard R.

    2014-01-01

    Recent availability of large publicly accessible databases of chemical compounds and their biological activities (PubChem, ChEMBL) has inspired us to develop a Web-based tool for SAR and QSAR modeling to add to the services provided by CHARMMing (www.charmming.org). This new module implements some of the most recent advances in modern machine learning algorithms – Random Forest, Support Vector Machine (SVM), Stochastic Gradient Descent, Gradient Tree Boosting etc. A user can import training data from Pubchem Bioassay data collections directly from our interface or upload his or her own SD files which contain structures and activity information to create new models (either categorical or numerical). A user can then track the model generation process and run models on new data to predict activity. PMID:25362883
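
    The CHARMMing web interface and its descriptor pipeline are not reproduced here; the sketch below only illustrates the categorical (active/inactive) modeling step with a scikit-learn Random Forest on placeholder fingerprints, standing in for descriptors computed from uploaded SD files.

        # Minimal categorical (Q)SAR sketch: Random Forest on placeholder descriptors.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        X = rng.random((300, 64))                 # placeholder fingerprints/descriptors
        y = rng.integers(0, 2, size=300)          # placeholder activity labels

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
        model = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
        print("held-out accuracy:", model.score(X_te, y_te))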

  19. Jobs and Economic Development Impact (JEDI) Model: Offshore Wind User Reference Guide

    SciTech Connect

    Lantz, E.; Goldberg, M.; Keyser, D.

    2013-06-01

    The Offshore Wind Jobs and Economic Development Impact (JEDI) model, developed by NREL and MRG & Associates, is a spreadsheet based input-output tool. JEDI is meant to be a user friendly and transparent tool to estimate potential economic impacts supported by the development and operation of offshore wind projects. This guide describes how to use the model as well as technical information such as methodology, limitations, and data sources.

  20. Theory and modelling of nanocarbon phase stability.

    SciTech Connect

    Barnard, A. S.

    2006-01-01

    The transformation of nanodiamonds into carbon-onions (and vice versa) has been observed experimentally and has been modeled computationally at various levels of sophistication. Also, several analytical theories have been derived to describe the size, temperature and pressure dependence of this phase transition. However, in most cases a pure carbon-onion or nanodiamond is not the final product. More often than not an intermediary is formed, known as a bucky-diamond, with a diamond-like core encased in an onion-like shell. This has prompted a number of studies investigating the relative stability of nanodiamonds, bucky-diamonds, carbon-onions and fullerenes, in various size regimes. Presented here is a review outlining results of numerous theoretical studies examining the phase diagrams and phase stability of carbon nanoparticles, to clarify the complicated relationship between fullerenic and diamond structures at the nanoscale.

  1. Modeling missing data in knowledge space theory.

    PubMed

    de Chiusole, Debora; Stefanutti, Luca; Anselmi, Pasquale; Robusto, Egidio

    2015-12-01

    Missing data are a well known issue in statistical inference, because some responses may be missing, even when data are collected carefully. The problem that arises in these cases is how to deal with missing data. In this article, the missingness is analyzed in knowledge space theory, and in particular when the basic local independence model (BLIM) is applied to the data. Two extensions of the BLIM to missing data are proposed: The former, called ignorable missing BLIM (IMBLIM), assumes that missing data are missing completely at random; the latter, called missing BLIM (MissBLIM), introduces specific dependencies of the missing data on the knowledge states, thus assuming that the missing data are missing not at random. The IMBLIM and the MissBLIM modeled the missingness in a satisfactory way, in both a simulation study and an empirical application, depending on the process that generates the missingness: If the missing data-generating process is of type missing completely at random, then either IMBLIM or MissBLIM provide adequate fit to the data. However, if the pattern of missingness is functionally dependent upon unobservable features of the data (e.g., missing answers are more likely to be wrong), then only a correctly specified model of the missingness distribution provides an adequate fit to the data.
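
    In the BLIM that both extensions build on, the probability of observing a response pattern R given a knowledge state K, with a careless-error rate beta_q and a lucky-guess rate eta_q for each item q, is

        P(R \mid K) = \prod_{q \in K\setminus R}\beta_{q}\ \prod_{q \in K\cap R}\left(1-\beta_{q}\right)\ \prod_{q \in R\setminus K}\eta_{q}\ \prod_{q \notin R\cup K}\left(1-\eta_{q}\right),

    and the unconditional pattern probability is obtained by summing P(R | K) over the states K of the knowledge structure, weighted by the state probabilities; the two extensions add a model of the missingness mechanism on top of this likelihood.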

  2. Modeling active memory: Experiment, theory and simulation

    NASA Astrophysics Data System (ADS)

    Amit, Daniel J.

    2001-06-01

    Neuro-physiological experiments on cognitively performing primates are described to argue that strong evidence exists for localized, non-ergodic (stimulus specific) attractor dynamics in the cortex. The specific phenomena are delay activity distributions-enhanced spike-rate distributions resulting from training, which we associate with working memory. The anatomy of the relevant cortex region and the physiological characteristics of the participating elements (neural cells) are reviewed to provide a substrate for modeling the observed phenomena. Modeling is based on the properties of the integrate-and-fire neural element in presence of an input current of Gaussian distribution. Theory of stochastic processes provides an expression for the spike emission rate as a function of the mean and the variance of the current distribution. Mean-field theory is then based on the assumption that spike emission processes in different neurons in the network are independent, and hence the input current to a neuron is Gaussian. Consequently, the dynamics of the interacting network is reduced to the computation of the mean and the variance of the current received by a cell of a given population in terms of the constitutive parameters of the network and the emission rates of the neurons in the different populations. Within this logic we analyze the stationary states of an unstructured network, corresponding to spontaneous activity, and show that it can be stable only if locally the net input current of a neuron is inhibitory. This is then tested against simulations and it is found that agreement is excellent down to great detail. A confirmation of the independence hypothesis. On top of stable spontaneous activity, keeping all parameters fixed, training is described by (Hebbian) modification of synapses between neurons responsive to a stimulus and other neurons in the module-synapses are potentiated between two excited neurons and depressed between an excited and a quiescent neuron
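
    As a minimal illustration of the described setup, the sketch below simulates a leaky integrate-and-fire neuron driven by a Gaussian white-noise input current of given mean and standard deviation and reports its empirical spike rate; all parameter values are illustrative and the mean-field rate formula itself is not reproduced.

        # Leaky integrate-and-fire neuron driven by a Gaussian (white-noise) input
        # current with mean mu and standard deviation sigma; parameters are illustrative.
        import numpy as np

        def lif_rate(mu=1.2, sigma=0.5, tau=0.02, v_th=1.0, v_reset=0.0,
                     dt=1e-4, t_max=10.0, seed=0):
            rng = np.random.default_rng(seed)
            v, spikes, steps = 0.0, 0, int(t_max / dt)
            for _ in range(steps):
                noise = sigma * np.sqrt(dt / tau) * rng.normal()   # stochastic input term
                v += (mu - v) * dt / tau + noise                   # membrane update
                if v >= v_th:                                      # threshold crossing
                    spikes += 1
                    v = v_reset
            return spikes / t_max                                  # spikes per second

        print(lif_rate(), "Hz")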

  3. National Aviation Fuel Scenario Analysis Program (NAFSAP). Volume I. Model description. Volume II. User manual. Final report

    SciTech Connect

    Vahovich, S.G.

    1980-03-01

    This report forecasts air carrier jet fuel usage by body type for three user defined markets. The model contains options which allow the user to easily change the composition of the future fleet so that fuel usage scenarios can be 'run'. Both Volumes I and II are contained in this report. Volume I describes the structure of the model. Volume II is a computer users manual.

  4. Gravothermal Star Clusters - Theory and Computer Modelling

    NASA Astrophysics Data System (ADS)

    Spurzem, Rainer

    2010-11-01

    In the George Darwin lecture, delivered to the British Royal Astronomical Society in 1960 by Viktor A. Ambartsumian he wrote on the evolution of stellar systems that it can be described by the "dynamic evolution of a gravitating gas" complemented by "a statistical description of the changes in the physical states of stars". This talk will show how this physical concept has inspired theoretical modeling of star clusters in the following decades up to the present day. The application of principles of thermodynamics shows, as Ambartsumian argued in his 1960 lecture, that there is no stable state of equilibrium of a gravitating star cluster. The trend to local thermodynamic equilibrium is always disturbed by escaping stars (Ambartsumian), as well as by gravothermal and gravogyro instabilities, as it was detected later. Here the state-of-the-art of modeling the evolution of dense stellar systems based on principles of thermodynamics and statistical mechanics (Fokker-Planck approximation) will be reviewed. Recent progress including rotation and internal correlations (primordial binaries) is presented. The models have also very successfully been used to study dense star clusters around massive black holes in galactic nuclei and even (in a few cases) relativistic supermassive dense objects in centres of galaxies (here again briefly touching one of the many research fields of V.A. Ambartsumian). For the modern present time of high-speed supercomputing, where we are tackling direct N-body simulations of star clusters, we will show that such direct modeling supports and proves the concept of the statistical models based on the Fokker-Planck theory, and that both theoretical concepts and direct computer simulations are necessary to support each other and make scientific progress in the study of star cluster evolution.

  5. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  6. User-driven update of a high-resolution geopotential model

    NASA Astrophysics Data System (ADS)

    Sebera, Josef; Bezděk, Aleš; Kostelecký, Jan; Pešek, Ivan

    2014-05-01

    Almost every year, a large amount of new gravity data, not least from satellite altimetry, becomes available to users. This is in contrast to the situation over land, where financial and time costs are usually much higher. Hence, it might be reasonable to update global gravity field models in specific areas with new data. In this contribution, we outline a simple and user-driven concept for updating geopotential models over the oceans if relevant new data become available. The approach employs a grid-wise ellipsoidal harmonic analysis applied to gravity disturbance, while the resolution can achieve a higher maximum degree compared to recent combination models like EGM2008. The obtained harmonic coefficients represent global but regionally updated gravity information. As a test case, we present the concept using EGM2008 and DTU10.

  7. An achievement-weighted constraint satisfaction approach to modeling user preferences

    NASA Astrophysics Data System (ADS)

    Kokawa, Takashi; Ogawa, Hitoshi

    The presented study deals with the so-called soft constraint satisfaction problem (SCSP) and proposes an extension to the standard SCSP formulation to accommodate a wider class of over-constrained situations and allow for a generally higher level of flexibility in the constraint-driven problem-solving. The extended modeling approach called Achievement-Weighted Constraint Satisfaction (AWCS) assumes the definition of constraint parameters ``traditional'' for SCSPs, as well as additional parameters specified to dynamically manipulate constraint weights in the course of solution search. These latter parameters make it possible to ``relax'' over-constrained models and obtain a solution even when there are mutually contradicting rules utilized by an AWCS problem-solver. To explore the proposed modeling framework, a task of finding an optimal route in car navigation, based on user preferences - a popular area of research in SCSP studies - is considered. A case study is presented, in which an optimal route is first modeled with constraints reflecting user preferences. Problem solutions having different optimality levels are then obtained. A software system is developed to automate both the optimal route modeling (via interaction with the user) and the solution search processes. The system is applied in an experiment conducted to validate the theoretical ideas. Experimental results are discussed, and conclusions are drawn.
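
    The achievement-weighted dynamics that adjust constraint weights during search are not reproduced here; the sketch below only illustrates the underlying weighted (soft) CSP idea, where each preference carries a weight and a brute-force solver returns the route assignment with the lowest total weight of violated constraints (variables, domains, and weights are illustrative).

        # Minimal weighted (soft) CSP sketch: minimise the total weight of violated constraints.
        from itertools import product

        variables = {"route": ["highway", "scenic", "shortest"],
                     "departure": ["early", "late"]}

        # (weight, predicate over an assignment dict); weights encode user preferences
        constraints = [
            (5, lambda a: a["route"] != "highway"),           # prefers avoiding highways
            (3, lambda a: a["departure"] == "early"),         # prefers leaving early
            (4, lambda a: not (a["route"] == "scenic" and a["departure"] == "late")),
        ]

        def solve(variables, constraints):
            names = list(variables)
            best, best_cost = None, float("inf")
            for values in product(*(variables[n] for n in names)):
                assignment = dict(zip(names, values))
                cost = sum(w for w, ok in constraints if not ok(assignment))
                if cost < best_cost:
                    best, best_cost = assignment, cost
            return best, best_cost

        print(solve(variables, constraints))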

  8. ModelMuse - A Graphical User Interface for MODFLOW-2005 and PHAST

    USGS Publications Warehouse

    Winston, Richard B.

    2009-01-01

    ModelMuse is a graphical user interface (GUI) for the U.S. Geological Survey (USGS) models MODFLOW-2005 and PHAST. This software package provides a GUI for creating the flow and transport input file for PHAST and the input files for MODFLOW-2005. In ModelMuse, the spatial data for the model is independent of the grid, and the temporal data is independent of the stress periods. Being able to input these data independently allows the user to redefine the spatial and temporal discretization at will. This report describes the basic concepts required to work with ModelMuse. These basic concepts include the model grid, data sets, formulas, objects, the method used to assign values to data sets, and model features. The ModelMuse main window has a top, front, and side view of the model that can be used for editing the model, and a 3-D view of the model that can be used to display properties of the model. ModelMuse has tools to generate and edit the model grid. It also has a variety of interpolation methods and geographic functions that can be used to help define the spatial variability of the model. ModelMuse can be used to execute both MODFLOW-2005 and PHAST and can also display the results of MODFLOW-2005 models. An example of using ModelMuse with MODFLOW-2005 is included in this report. Several additional examples are described in the help system for ModelMuse, which can be accessed from the Help menu.

  9. A Global User-Driven Model for Tile Prefetching in Web Geographical Information Systems

    PubMed Central

    Pan, Shaoming; Chong, Yanwen; Zhang, Hang; Tan, Xicheng

    2017-01-01

    A web geographical information system is a typical service-intensive application. Tile prefetching and cache replacement can improve cache hit ratios by proactively fetching tiles from storage and replacing the appropriate tiles in the high-speed cache buffer without waiting for a client's requests, which reduces disk latency and improves system access performance. Most popular prefetching strategies consider only relative tile popularities to predict which tile should be prefetched, or consider only a single user's access behavior to determine which neighboring tiles need to be prefetched. Some studies show that comprehensively considering all users' access behaviors and all tiles' relationships in the prediction process can achieve more significant improvements. Thus, this work proposes a new global user-driven model for tile prefetching and cache replacement. First, based on all users' access behaviors, an expression method for tile correlation is designed and implemented. Then, a conditional prefetching probability can be computed based on the proposed correlation expression model. Tiles to be prefetched can thus be found by computing and comparing the conditional prefetching probability over the set of uncached tiles and, similarly, replacement tiles can be found in the cache buffer according to multi-step prefetching. Finally, experiments are provided comparing the proposed model with other global user-driven models, other single user-driven models, and other client-side prefetching strategies. The results show that the proposed model can achieve a prefetching hit rate approximately 10.6% to 110.5% higher than the compared methods. PMID:28085937
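
    A minimal sketch of a global, user-driven conditional prefetching probability of the kind described is shown below; the access sessions, tile identifiers, and the simple bigram counting are illustrative assumptions, not the paper's correlation expression or its multi-step replacement rule.

```python
# Estimate P(next tile = b | current tile = a) from all users' sessions and
# rank uncached tiles for prefetching. Logs and tile IDs are hypothetical.
from collections import defaultdict

def build_correlations(sessions):
    """Count how often tile b immediately follows tile a across all sessions."""
    follows = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for seq in sessions:
        for a, b in zip(seq, seq[1:]):
            follows[a][b] += 1
            totals[a] += 1
    return follows, totals

def prefetch_candidates(current_tile, follows, totals, cached, k=2):
    """Rank uncached tiles by the estimated conditional prefetching probability."""
    if totals[current_tile] == 0:
        return []
    probs = {b: c / totals[current_tile] for b, c in follows[current_tile].items()
             if b not in cached}
    return sorted(probs, key=probs.get, reverse=True)[:k]

sessions = [["t1", "t2", "t3"], ["t1", "t2", "t4"], ["t2", "t3", "t1"]]
follows, totals = build_correlations(sessions)
print(prefetch_candidates("t2", follows, totals, cached={"t3"}))  # -> ['t4']
```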

  10. Hiding the system from the user: Moving from complex mental models to elegant metaphors

    SciTech Connect

    Curtis W. Nielsen; David J. Bruemmer

    2007-08-01

    In previous work, increased complexity of robot behaviors and the accompanying interface design often led to operator confusion and/or a fight for control between the robot and operator. We believe the reason for the conflict was that the design of the interface and interactions presented too much of the underlying robot design model to the operator. Since the design model includes the implementation of sensors, behaviors, and sophisticated algorithms, the result was that the operator’s cognitive efforts were focused on understanding the design of the robot system as opposed to focusing on the task at hand. This paper illustrates how this very problem emerged at the INL and how the implementation of new metaphors for interaction has allowed us to hide the design model from the user and allow the user to focus more on the task at hand. Supporting the user’s focus on the task rather than on the design model allows increased use of the system and significant performance improvement in a search task with novice users.

  11. End users transforming experiences into formal information and process models for personalised health interventions.

    PubMed

    Lindgren, Helena; Lundin-Olsson, Lillemor; Pohl, Petra; Sandlund, Marlene

    2014-01-01

    Five physiotherapists organised a user-centric design process of a knowledge-based support system for promoting exercise and preventing falls. The process integrated focus group studies with 17 older adults and prototyping. The transformation of informal medical and rehabilitation expertise and older adults' experiences into formal information and process models during the development was studied. As a tool they used ACKTUS, a development platform for knowledge-based applications. The process became agile and incremental, partly due to the diversity of expectations and preferences among both older adults and physiotherapists, and partly due to the participatory approach to design and development. In addition, there was a need to develop the knowledge content alongside the formal models and their presentations, which allowed the participants to test hands-on and evaluate the ideas, content and design. The resulting application is modular, extendable, flexible and adaptable to the individual end user. Moreover, the physiotherapists are able to modify the information and process models, and in this way further develop the application. The main constraint was found to be the lack of support for the initial phase of concept modelling, which led to a redesigned user interface and functionality of ACKTUS.

  12. User modeling techniques for enhanced usability of OPSMODEL operations simulation software

    NASA Technical Reports Server (NTRS)

    Davis, William T.

    1991-01-01

    The PC-based OPSMODEL operations software for modeling and simulation of space station crew activities supports engineering and cost analyses and operations planning. Using top-down modeling, the level of detail in the database can be limited to what is commensurate with the results required of a particular analysis. To perform a simulation, a resource environment consisting of locations, crew definition, equipment, and consumables is first defined. Activities to be simulated are then defined as operations and scheduled as desired. These operations are defined within a 1000-level priority structure. A simulation in OPSMODEL thus consists of user-defined, user-scheduled operations executing within an environment of user-defined resource and priority constraints. Techniques for prioritizing operations to realistically model a representative daily scenario of on-orbit space station crew activities are discussed. The large number of priority levels allows priorities to be assigned commensurate with the detail necessary for a given simulation. Several techniques for realistic modeling of day-to-day work carryover are also addressed.
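
    The priority-driven scheduling idea can be illustrated with a small sketch; the operation names, crew-hour resource, and the assumption that lower numbers mean higher priority are hypothetical, since OPSMODEL's actual data structures are not described here.

```python
# Minimal sketch of priority-driven operation scheduling under a resource
# constraint (illustrative only, not OPSMODEL itself).

def schedule(operations, crew_hours_available):
    """Schedule higher-priority operations first (assumed: priority 1 = highest),
    skipping any operation that no longer fits in the remaining crew hours."""
    scheduled, remaining = [], crew_hours_available
    for op in sorted(operations, key=lambda o: o["priority"]):
        if op["crew_hours"] <= remaining:
            scheduled.append(op["name"])
            remaining -= op["crew_hours"]
    return scheduled, remaining

ops = [
    {"name": "exercise",        "priority": 100, "crew_hours": 2.0},
    {"name": "payload_ops",     "priority": 300, "crew_hours": 5.0},
    {"name": "housekeeping",    "priority": 600, "crew_hours": 3.0},
    {"name": "carryover_maint", "priority": 900, "crew_hours": 4.0},  # low priority: carries over
]
print(schedule(ops, crew_hours_available=10.0))
```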

  13. Theory and Modeling in Support of Tether

    NASA Technical Reports Server (NTRS)

    Chang, C. L.; Bergeron, G.; Drobot, A. D.; Papadopoulos, K.; Riyopoulos, S.; Szuszczewicz, E.

    1999-01-01

    This final report summarizes the work performed by SAIC's Applied Physics Operation on the modeling and support of the Tethered Satellite System missions (TSS-1 and TSS-1R). The SAIC team, known as the Theory and Modeling in Support of Tether (TMST) investigation, was one of the original twelve teams selected in July 1985 for the first TSS mission. The accomplishments described in this report cover the period December 19, 1985 to September 30, 1999 and are the result of a continuous effort aimed at supporting the TSS missions in the following major areas. During the contract period, the SAIC TMST investigation acted to: participate in the planning and the execution of both TSS missions; provide scientific understanding of the issues involved in electrodynamic tether system operation prior to the TSS missions; predict ionospheric conditions encountered during the re-flight mission (TSS-1R) based on real-time global ionosonde data; perform post-mission analyses to enhance our understanding of the TSS results (specifically, we have 1) constructed and improved current collection models and enhanced our understanding of the current-voltage data; 2) investigated the effects of neutral gas in the current collection processes; 3) conducted laboratory experiments to study the discharge phenomena during and after tether break; and 4) performed numerical simulations to understand data collected by the plasma instrument SPES onboard the TSS satellite); and design and produce a multimedia CD that highlights TSS mission achievements and conveys knowledge of tether technology to the general public. Along with discussions of this work, a list of publications and presentations derived from the TMST investigation spanning the reporting period is compiled.

  14. User's manual for a parameter identification technique. [with options for model simulation for fixed input forcing functions and identification from wind tunnel and flight measurements

    NASA Technical Reports Server (NTRS)

    Kanning, G.

    1975-01-01

    A digital computer program written in FORTRAN is presented that implements the system identification theory for deterministic systems using input-output measurements. The user supplies programs simulating the mathematical model of the physical plant whose parameters are to be identified. The user may choose any one of three options. The first option allows for a complete model simulation for fixed input forcing functions. The second option identifies up to 36 parameters of the model from wind tunnel or flight measurements. The third option performs a sensitivity analysis for up to 36 parameters. The use of each option is illustrated with an example using input-output measurements for a helicopter rotor tested in a wind tunnel.
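
    For readers unfamiliar with identification from input-output measurements, the sketch below shows the basic idea in Python using least squares on a hypothetical first-order model; it is illustrative only and does not reproduce the report's FORTRAN program or its three options.

```python
# Identify the parameters of a simple discrete model x[k+1] = a*x[k] + b*u[k]
# from simulated input-output "measurements" (synthetic data, not the report's).
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(size=200)                      # input forcing function
x = np.zeros(201)
for k in range(200):                          # simulated plant with a=0.9, b=0.5
    x[k + 1] = 0.9 * x[k] + 0.5 * u[k] + 0.01 * rng.normal()

# Stack regressors and solve for [a, b] in a least-squares sense.
A = np.column_stack([x[:-1], u])
theta, *_ = np.linalg.lstsq(A, x[1:], rcond=None)
print("identified a, b:", theta)
```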

  15. Catastrophe Theory: A Unified Model for Educational Change.

    ERIC Educational Resources Information Center

    Cryer, Patricia; Elton, Lewis

    1990-01-01

    Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force-field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)

  16. A Leadership Identity Development Model: Applications from a Grounded Theory

    ERIC Educational Resources Information Center

    Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.

    2006-01-01

    This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…

  17. Emissions of indoor air pollutants from six user scenarios in a model room

    NASA Astrophysics Data System (ADS)

    Höllbacher, Eva; Ters, Thomas; Rieder-Gradinger, Cornelia; Srebotnik, Ewald

    2017-02-01

    In this study, six common user scenarios putatively influencing indoor air quality were performed in a model room constructed according to the specifications of the European Reference Room given in the new horizontal prestandard prEN 16516, in order to gain further information about the influence of user activities on indoor air quality. These scenarios included the use of a cleaning agent, an electric air freshener, an ethanol fireplace and cosmetics, as well as cigarette smoking and orange peeling. Four common indoor air pollutants were monitored: volatile organic compounds (VOC), particulate matter (PM), carbonyl compounds and CO2. The concentrations of all pollutants were followed during and after each test. For each measured pollutant, well-defined maximum values could be assigned to one or more of the individual user scenarios. The highest VOC concentration was measured during orange peeling, reaching a maximum value of 3547 μg m-3. Carbonyl compounds and PM were strongly elevated during cigarette smoking; here, a maximum formaldehyde concentration of 76 μg m-3 and a PM concentration of 378 μg m-3 were measured. CO2 was only slightly affected by most of the tests, except for the use of the ethanol fireplace, where a maximum concentration of 1612 ppm was reached. Generally, the user scenarios resulted in a distinct increase of several indoor pollutants, which usually decreased rapidly after removal of the source.

  18. NETPATH-WIN: an interactive user version of the mass-balance model, NETPATH

    USGS Publications Warehouse

    El-Kadi, A. I.; Plummer, L.N.; Aggarwal, P.

    2011-01-01

    NETPATH-WIN is an interactive user version of NETPATH, an inverse geochemical modeling code used to find mass-balance reaction models that are consistent with the observed chemical and isotopic composition of waters from aquatic systems. NETPATH-WIN was constructed to migrate NETPATH applications into the Microsoft WINDOWS® environment. The new version facilitates model utilization by eliminating difficulties in data preparation and results analysis of the DOS version of NETPATH, while preserving all of the capabilities of the original version. Through example applications, the note describes some of the features of NETPATH-WIN as applied to adjustment of radiocarbon data for geochemical reactions in groundwater systems.

  19. Reliability and Maintainability Model (RAM): User and Maintenance Manual. Part 2; Improved Supportability Analysis

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1996-01-01

    This report documents the procedures for utilizing and maintaining the Reliability & Maintainability Model (RAM) developed by the University of Dayton for the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). The purpose of the grant is to provide support to NASA in establishing operational and support parameters and costs of proposed space systems. As part of this research objective, the model described here was developed. This Manual updates and supersedes the 1995 RAM User and Maintenance Manual. Changes and enhancements from the 1995 version of the model are primarily a result of the addition of more recent aircraft and shuttle R&M data.

  20. User's Manual for Data for Validating Models for PV Module Performance

    SciTech Connect

    Marion, W.; Anderberg, A.; Deline, C.; Glick, S.; Muller, M.; Perrin, G.; Rodriguez, J.; Rummel, S.; Terwilliger, K.; Silverman, T. J.

    2014-04-01

    This user's manual describes performance data measured for flat-plate photovoltaic (PV) modules installed in Cocoa, Florida, Eugene, Oregon, and Golden, Colorado. The data include PV module current-voltage curves and associated meteorological data for approximately one-year periods. These publicly available data are intended to facilitate the validation of existing models for predicting the performance of PV modules, and for the development of new and improved models. For comparing different modeling approaches, using these public data will provide transparency and more meaningful comparisons of the relative benefits.

  1. Micromechanics of metal matrix composites using the Generalized Method of Cells model (GMC) user's guide

    NASA Technical Reports Server (NTRS)

    Aboudi, Jacob; Pindera, Marek-Jerzy

    1992-01-01

    A user's guide for the program gmc.f is presented. The program is based on the generalized method of cells model (GMC), which is capable, via a micromechanical analysis, of predicting the overall inelastic behavior of unidirectional, multi-phase composites from knowledge of the properties of the viscoplastic constituents. In particular, the program is sufficiently general to predict the response of unidirectional composites having variable fiber shapes and arrays.

  2. AE9/AP9/SPM Radiation Environment Model: User’s Guide

    DTIC Science & Technology

    2014-02-18

    calculation results. Dose calculation results are also available. The GUI program ‘testAe9Ap9Gui.exe’ provides a graphical user interface to specify an...automatically generated according to the user’s selections in the interface. Basic 2D plots of the model results may also be produced. In addition to this...interface for producing mission statistics; aggregates results of many MC scenarios (flux, fluence, mean, percentiles); provides access to orbit

  3. Observation-Based Dissipation and Input Terms for Spectral Wave Models, with End-User Testing

    DTIC Science & Technology

    2013-09-30

    Spectral Wave Models, with End-User Testing Alexander V. Babanin Swinburne University of Technology PO Box 218 Hawthorn, Victoria 3140 Australia...the course of the ONR Lake George (Australia) project, estimates of the spectral distribution of the wave-breaking dissipation were obtained, and...testing and hindcasting, a set of field sites and datasets were chosen which include Lake Michigan (deep water, no swell, Rogers et al., 2012), Lake

  4. Observation-Based Dissipation and Input Terms for Spectral Wave Models, with End-User Testing

    DTIC Science & Technology

    2011-09-30

    Spectral Wave Models, with End-User Testing Alexander V. Babanin Swinburne University of Technology, PO Box 218 Hawthorn, Victoria 3140 Australia...Australasian Coastal and Ocean Eng. Conf. and 10th Australasian Port and Harbour Conf., 20-23 September 2005, Adelaide, South Australia, Eds. M.Townsend and

  5. User-Driven Geolocation of Untagged Desert Imagery Using Digital Elevation Models

    DTIC Science & Technology

    2013-01-01

    User-Driven Geolocation of Untagged Desert Imagery Using Digital Elevation Models Eric Tzeng, Andrew Zhai, Matthew Clements, Raphael Townshend, and...normalizing skylines with respect to their features, we achieve similarity invariance. Furthermore, our system is robust to partial occlusions; since the...feature is used as a key to an entry to the hash-table, we ensure that only a convex normalized feature can be used to get that entry in the hash-table

  6. User Manual for ATILA, a Finite-Element Code for Modeling Piezoelectric Transducers.

    DTIC Science & Technology

    1987-09-01

    bandwidth of a radiating Tonpilz transducer", communication L9, 112th ASA meeting, Anaheim (1986). B. HAMONIC, J.C. DEBUS, J.N. DECARPIGNY, "Analyse modale...generally well suited to the modelling of Tonpilz-type transducers and allows large savings of CPU time...USER MANUAL FOR ATILA, A FINITE-ELEMENT CODE FOR MODELING PIEZOELECTRIC TRANSDUCERS (U), NAVAL POSTGRADUATE SCHOOL, MONTEREY, CA, J.N. DECARPIGNY ET AL.

  7. Accelerating Human-Computer Collaborative Search through Learning Comparative and Predictive User Models

    DTIC Science & Technology

    2012-07-09

    comes from prior work with the Estimation-Exploration Algorithm [2, 3], in which a coevolutionary system is used to evolve an estimation population...and an exploration population, which evolves intelligent tests to perform on the hidden target system using the best models so far. In this case...can be used tirelessly to perform thousands or millions of evaluations and thereby circumvent the limitations of having human users act as the fitness

  8. Object relations theory and activity theory: a proposed link by way of the procedural sequence model.

    PubMed

    Ryle, A

    1991-12-01

    An account of object relations theory (ORT), represented in terms of the procedural sequence model (PSM), is compared to the ideas of Vygotsky and activity theory (AT). The two models are seen to be compatible and complementary and their combination offers a satisfactory account of human psychology, appropriate for the understanding and integration of psychotherapy.

  9. MFIX documentation: User's manual

    SciTech Connect

    Syamlal, M.

    1994-11-01

    MFIX (Multiphase Flow with Interphase exchanges) is a general-purpose hydrodynamic model for describing chemical reactions and heat transfer in dense or dilute fluid-solids flows, which typically occur in energy conversion and chemical processing reactors. MFIX calculations give time-dependent information on pressure, temperature, composition, and velocity distributions in the reactors. The theoretical basis of the calculations is described in the MFIX Theory Guide. This report, which is the MFIX User's Manual, gives an overview of the numerical technique, and describes how to install the MFIX code and post-processing codes, set up data files and run MFIX, graphically analyze MFIX results, and retrieve data from the output files. Two tutorial problems that highlight various features of MFIX are also discussed.

  10. Streamflow forecasting using the modular modeling system and an object-user interface

    USGS Publications Warehouse

    Jeton, A.E.

    2001-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Bureau of Reclamation (BOR), developed a computer program to provide a general framework needed to couple disparate environmental resource models and to manage the necessary data. The Object-User Interface (OUI) is a map-based interface for models and modeling data. It provides a common interface to run hydrologic models and acquire, browse, organize, and select spatial and temporal data. One application is to assist river managers in utilizing streamflow forecasts generated with the Precipitation-Runoff Modeling System running in the Modular Modeling System (MMS), a distributed-parameter watershed model, and the National Weather Service Extended Streamflow Prediction (ESP) methodology.

  11. Theory and modeling of active brazing.

    SciTech Connect

    van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.

    2013-09-01

    Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex, nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, the nonequilibrium composition-dependent surface tension, and the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.

  12. Hawaii demand-side management resource assessment. Final report, Reference Volume 5: The DOETRAN user's manual; The DOE-2/DBEDT DSM forecasting model interface

    SciTech Connect

    1995-04-01

    The DOETRAN model is a DSM database manager, developed to act as an intermediary between the whole building energy simulation model, DOE-2, and the DBEDT DSM Forecasting Model. DOETRAN accepts output data from DOE-2 and TRANslates that into the format required by the forecasting model. DOETRAN operates in the Windows environment and was developed using the relational database management software, Paradox 5.0 for Windows. It is not necessary to have any knowledge of Paradox to use DOETRAN. DOETRAN utilizes the powerful database manager capabilities of Paradox through a series of customized user-friendly windows displaying buttons and menus with simple and clear functions. The DOETRAN model performs three basic functions, with an optional fourth. The first function is to configure the user's computer for DOETRAN. The second function is to import DOE-2 files with energy and loadshape data for each building type. The third main function is to then process the data into the forecasting model format. As DOETRAN processes the DOE-2 data, graphs of the total electric monthly impacts for each DSM measure appear, providing the user with a visual means of inspecting DOE-2 data, as well as following program execution. DOETRAN provides three tables for each building type for the forecasting model, one for electric measures, gas measures, and basecases. The optional fourth function provided by DOETRAN is to view graphs of total electric annual impacts by measure. This last option allows a comparative view of how one measure rates against another. A section in this manual is devoted to each of the four functions mentioned above, as well as computer requirements and exiting DOETRAN.

  13. Prediction of User's Web-Browsing Behavior: Application of Markov Model.

    PubMed

    Awad, M A; Khalil, I

    2012-08-01

    Web prediction is a classification problem in which we attempt to predict the next set of Web pages that a user may visit based on knowledge of the previously visited pages. Predicting users' behavior while surfing the Internet can be applied effectively in various critical applications. Such applications involve traditional tradeoffs between modeling complexity and prediction accuracy. In this paper, we analyze and study the Markov model and the all-Kth Markov model in Web prediction. We propose a new modified Markov model to alleviate the issue of scalability in the number of paths. In addition, we present a new two-tier prediction framework that creates an example classifier EC based on the training examples and the generated classifiers. We show that such a framework can improve the prediction time without compromising prediction accuracy. We have used standard benchmark data sets to analyze, compare, and demonstrate the effectiveness of our techniques using variations of Markov models and association rule mining. Our experiments show the effectiveness of our modified Markov model in reducing the number of paths without compromising accuracy. Additionally, the results support our analysis conclusions that accuracy improves with higher orders of the all-Kth model.
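
    The following sketch illustrates a plain first-order Markov predictor of the kind the paper builds on; it is not the authors' modified Markov model or their two-tier framework, and the page names and sessions are hypothetical.

```python
# First-order Markov "next page" predictor trained on browsing sessions.
from collections import defaultdict

def train(sessions):
    """Count page-to-page transitions across all sessions."""
    counts = defaultdict(lambda: defaultdict(int))
    for s in sessions:
        for cur, nxt in zip(s, s[1:]):
            counts[cur][nxt] += 1
    return counts

def predict_next(counts, page):
    """Return the most frequent successor of `page`, or None if unseen."""
    if not counts[page]:
        return None
    return max(counts[page], key=counts[page].get)

sessions = [["home", "products", "cart"],
            ["home", "products", "reviews"],
            ["home", "products", "cart"]]
model = train(sessions)
print(predict_next(model, "products"))  # -> 'cart'
```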

  14. Labor Market Projections Model: a user's guide to the population, labor force, and unemployment projections model at Lawrence Berkeley Laboratory

    SciTech Connect

    Schroeder, E.

    1980-08-01

    In an effort to assist SESA analysts and CETA prime sponsor planners in the development of labor-market information suitable to their annual plans, the Labor Market Projections Model (LMPM) was initiated. The purpose of LMPM is to provide timely information on the demographic characteristics of local populations, labor supply, and unemployment. In particular, the model produces short-term projections of the distributions of population, labor force, and unemployment by age, sex, and race. LMPM was designed to carry out these projections at various geographic levels - counties, prime-sponsor areas, SMSAs, and states. While LMPM can project population distributions for areas without user input, the labor force and unemployment projections rely upon inputs from analysts or planners familiar with the economy of the area of interest. Thus, LMPM utilizes input from the SESA analysts. This User's Guide to LMPM was specifically written as an aid to SESA analysts and other users in improving their understanding of LMPM. The basic method of LMPM is a demographic cohort aging model that relies upon 1970 Census data. LMPM integrates data from several sources in order to produce current projections from the 1970 baseline for all the local areas of the nation. This User's Guide documents the procedures, data, and output of LMPM. 11 references.
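
    As a rough illustration of the demographic cohort-aging step that a model of this kind rests on, consider the sketch below; the age groups, survival rates, and birth count are hypothetical, and LMPM's actual Census-based inputs and adjustment procedures are not reproduced.

```python
# One-year cohort-aging step: each cohort ages by a year, thinned by its
# survival rate, while a new birth cohort enters at age 0 (synthetic values).

def age_population(pop_by_age, survival, births):
    """Advance a population one year using a simple cohort-component step."""
    next_pop = {0: births}
    for age, count in pop_by_age.items():
        next_pop[age + 1] = count * survival.get(age, 1.0)
    return next_pop

pop = {0: 1000, 1: 990, 2: 985}
survival = {0: 0.995, 1: 0.998, 2: 0.998}
print(age_population(pop, survival, births=1010))
```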

  15. User Guide for VISION 3.4.7 (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Wendell D. Hintze

    2011-07-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters and options; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating 'what if' scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., 'reactor types' not individual reactors and 'separation types' not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separation or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. VISION is comprised of several Microsoft Excel input files, a Powersim Studio core, and several Microsoft Excel output files. All must be co-located in the same folder on a PC to function. You must use Powersim Studio 8 or better. We have tested VISION with the Studio 8 Expert, Executive, and Education versions. The Expert and Education

  16. Mars Global Reference Atmospheric Model 2000 Version (Mars-GRAM 2000): Users Guide

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; James, B. F.

    2000-01-01

    This report presents Mars Global Reference Atmospheric Model 2000 Version (Mars-GRAM 2000) and its new features. All parameterizations for temperature, pressure, density, and winds versus height, latitude, longitude, time of day, and L(sub s) have been replaced by input data tables from NASA Ames Mars General Circulation Model (MGCM) for the surface through 80-km altitude and the University of Arizona Mars Thermospheric General Circulation Model (MTGCM) for 80 to 170 km. A modified Stewart thermospheric model is still used for higher altitudes and for dependence on solar activity. "Climate factors" to tune for agreement with GCM data are no longer needed. Adjustment of exospheric temperature is still an option. Consistent with observations from Mars Global Surveyor, a new longitude-dependent wave model is included with user input to specify waves having 1 to 3 wavelengths around the planet. A simplified perturbation model has been substituted for the earlier one. An input switch allows users to select either East or West longitude positive. This memorandum includes instructions on obtaining Mars-GRAM source code and data files and for running the program. It also provides sample input and output and an example for incorporating Mars-GRAM as an atmospheric subroutine in a trajectory code.

  17. User-driven Cloud Implementation of environmental models and data for all

    NASA Astrophysics Data System (ADS)

    Gurney, R. J.; Percy, B. J.; Elkhatib, Y.; Blair, G. S.

    2014-12-01

    Environmental data and models come from disparate sources over a variety of geographical and temporal scales with different resolutions and data standards, often including terabytes of data and model simulations. Unfortunately, these data and models tend to remain solely within the custody of the private and public organisations which create the data, and the scientists who build models and generate results. Although many models and datasets are theoretically available to others, the lack of ease of access tends to keep them out of reach of many. We have developed an intuitive web-based tool that utilises environmental models and datasets located in a cloud to produce results that are appropriate to the user. Storyboards showing the interfaces and visualisations have been created for each of several exemplars. A library of virtual machine images has been prepared to serve these exemplars. Each virtual machine image has been tailored to run computer models appropriate to the end user. Two approaches have been used: first, RESTful web services conforming to the Open Geospatial Consortium (OGC) Web Processing Service (WPS) interface standard, using the Python-based PyWPS; second, a MySQL database interrogated using PHP code. In all cases, the web client sends the server an HTTP GET request to execute the process with a number of parameter values and, once execution terminates, an XML or JSON response is sent back and parsed at the client side to extract the results. All web services are stateless, i.e. application state is not maintained by the server, reducing its operational overheads and simplifying infrastructure management tasks such as load balancing and failure recovery. A hybrid cloud solution has been used, with models and data sited on both private and public clouds. The storyboards have been transformed into intuitive web interfaces at the client side using HTML, CSS and JavaScript, utilising plug-ins such as jQuery and Flot (for graphics), and Google Maps
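
    The stateless request/response pattern described above can be sketched as follows; the endpoint URL, process name, and parameters are hypothetical, and the snippet uses only the Python standard library rather than the project's PyWPS services or PHP code.

```python
# Client-side sketch: issue an HTTP GET with process parameters and parse a
# JSON response. Endpoint and parameter names are hypothetical placeholders.
import json
import urllib.parse
import urllib.request

def run_process(base_url, process, params):
    """Call a stateless web service and return its parsed JSON response."""
    query = urllib.parse.urlencode({"process": process, **params})
    with urllib.request.urlopen(f"{base_url}?{query}") as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example call (hypothetical service and parameters):
# result = run_process("https://example.org/wps", "runoff_model",
#                      {"catchment": "thames", "start": "2014-01-01"})
# print(result["summary"])
```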

  18. A user's manual for the method of moments Aircraft Modeling Code (AMC)

    NASA Technical Reports Server (NTRS)

    Peters, M. E.; Newman, E. H.

    1989-01-01

    This report serves as a user's manual for the Aircraft Modeling Code or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. The input command language is described and several examples which illustrate typical code inputs and outputs are also included.

  19. A users manual for the method of moments Aircraft Modeling Code (AMC), version 2

    NASA Technical Reports Server (NTRS)

    Peters, M. E.; Newman, E. H.

    1994-01-01

    This report serves as a user's manual for Version 2 of the 'Aircraft Modeling Code' or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. This report describes the input command language and also includes several examples which illustrate typical code inputs and outputs.

  20. A Quantitative Causal Model Theory of Conditional Reasoning

    ERIC Educational Resources Information Center

    Fernbach, Philip M.; Erb, Christopher D.

    2013-01-01

    The authors propose and test a causal model theory of reasoning about conditional arguments with causal content. According to the theory, the acceptability of modus ponens (MP) and affirming the consequent (AC) reflect the conditional likelihood of causes and effects based on a probabilistic causal model of the scenario being judged. Acceptability…
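
    One common probabilistic reading of this claim can be written compactly as below; this is an illustrative paraphrase rather than the authors' exact formalism.

```latex
% For a causal conditional "if p then q", acceptability of the two argument
% forms is read as tracking conditional probabilities under the causal model:
\text{MP (from $p$, infer $q$):}\quad P(q \mid p),
\qquad
\text{AC (from $q$, infer $p$):}\quad P(p \mid q).
```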

  1. User's guide to the Penn State/NCAR Mesoscale Modeling System

    NASA Astrophysics Data System (ADS)

    Gill, David O.

    1992-10-01

    An updated version of the Pennsylvania State University/National Center for Atmospheric Research (PSU/NCAR) Mesoscale Modeling system (the MM4 system) is presented. The standard MM4 modeling package employs a Cressman multi-scan isobaric and surface analysis, with a hydrostatic predictive component using a leapfrog integration of the flux form of the primitive equations on sigma coordinates. An experimental version has expanded the data ingest routines to allow hybrid isentropic-isobaric + surface analyses. Experimental versions of the model allow split-explicit time integration, several cumulus parameterizations coupled with an explicit moisture scheme, multiple levels of movable nests, relaxation of the hydrostatic assumption, additional planetary boundary layer schemes, and microphysical packages. Due to the developmental nature of the modeling system, periodic upgrades in documentation are required to keep the manuals in accord with the programs. This document supersedes the Penn State/NCAR Mesoscale Model User's Manual, Version 8.
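
    For reference, the leapfrog (three-time-level) step mentioned above has the generic form shown below, where the symbols are generic rather than MM4's own: phi stands for any prognostic field and F for its flux-form tendency.

```latex
% Generic leapfrog time step for a prognostic field \phi with tendency F.
\phi^{\,n+1} \;=\; \phi^{\,n-1} \;+\; 2\,\Delta t \, F\!\left(\phi^{\,n}\right)
```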

  2. BOOK REVIEW: Supersymmetry and String Theory: Beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Rocek, Martin

    2007-11-01

    When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically.

  3. Program evaluation models and related theories: AMEE guide no. 67.

    PubMed

    Frye, Ann W; Hemmer, Paul A

    2012-01-01

    This Guide reviews theories of science that have influenced the development of common educational evaluation models. Educators can be more confident when choosing an appropriate evaluation model if they first consider the model's theoretical basis against their program's complexity and their own evaluation needs. Reductionism, system theory, and (most recently) complexity theory have inspired the development of models commonly applied in evaluation studies today. This Guide describes experimental and quasi-experimental models, Kirkpatrick's four-level model, the Logic Model, and the CIPP (Context/Input/Process/Product) model in the context of the theories that influenced their development and that limit or support their ability to do what educators need. The goal of this Guide is for educators to become more competent and confident in being able to design educational program evaluations that support intentional program improvement while adequately documenting or describing the changes and outcomes-intended and unintended-associated with their programs.

  4. Hawaii demand-side management resource assessment. Final report, Reference Volume 4: The DBEDT DSM assessment model user's manual

    SciTech Connect

    1995-04-01

    The DBEDT DSM Assessment Model (DSAM) is a spreadsheet model developed in Quattro Pro for Windows that is based on the integration of the DBEDT energy forecasting model, ENERGY 2020, with the output from the building energy use simulation model, DOE-2. DOE-2 provides DSM impact estimates for both energy and peak demand. The "User's Guide" is designed to assist DBEDT staff in the operation of DSAM. Supporting information on model structure and data inputs is provided in Volumes 2 and 3 of the Final Report. DSAM is designed to provide DBEDT estimates of the potential DSM resource for each county in Hawaii by measure, program, sector, year, and levelized cost category. The results are provided for gas and electric and for both energy and peak demand. There are two main portions of DSAM: the residential sector and the commercial sector. The basic underlying logic for both sectors is the same. However, there are some modeling differences between the two sectors. The differences are primarily the result of (1) the more complex nature of the commercial sector, (2) memory limitations within Quattro Pro, and (3) the fact that the commercial sector portion of the model was written four months after the residential sector portion. The structure for both sectors essentially consists of a series of input spreadsheets, the portion of the model where the calculations are performed, and a series of output spreadsheets. The output spreadsheets contain both detailed and summary tables and graphs.

  5. Theory of stellar convection - II. First stellar models

    NASA Astrophysics Data System (ADS)

    Pasetto, S.; Chiosi, C.; Chiosi, E.; Cropper, M.; Weiss, A.

    2016-07-01

    We present here the first stellar models on the Hertzsprung-Russell diagram in which convection is treated according to the new scale-free convection theory (SFC theory) by Pasetto et al. The aim is to compare the results of the new theory with those from the classical, calibrated mixing-length (ML) theory to examine differences and similarities. We integrate the equations describing the structure of the atmosphere from the stellar surface down to a few per cent of the stellar mass using both the ML theory and the SFC theory. The key temperature-over-pressure gradients, the energy fluxes, and the extension of the convective zones are compared in both theories. The analysis is first made for the Sun and then extended to other stars of different mass and evolutionary stage. The results are adequate: the SFC theory yields convective zones, temperature gradients ∇ and ∇e, and energy fluxes that are very similar to those derived from the "calibrated" ML theory for main-sequence stars. We conclude that the old scale-dependent ML theory can now be replaced with a self-consistent scale-free theory able to predict correct results, as it is more physically grounded than the ML theory. Fundamentally, the SFC theory offers a deeper insight into the underlying physics than numerical simulations.

  6. Large field inflation models from higher-dimensional gauge theories

    NASA Astrophysics Data System (ADS)

    Furuuchi, Kazuyuki; Koyama, Yoji

    2015-02-01

    Motivated by the recent detection of B-mode polarization of CMB by BICEP2 which is possibly of primordial origin, we study large field inflation models which can be obtained from higher-dimensional gauge theories. The constraints from CMB observations on the gauge theory parameters are given, and their naturalness are discussed. Among the models analyzed, Dante's Inferno model turns out to be the most preferred model in this framework.

  7. Large field inflation models from higher-dimensional gauge theories

    SciTech Connect

    Furuuchi, Kazuyuki; Koyama, Yoji

    2015-02-23

    Motivated by the recent detection of B-mode polarization of CMB by BICEP2 which is possibly of primordial origin, we study large field inflation models which can be obtained from higher-dimensional gauge theories. The constraints from CMB observations on the gauge theory parameters are given, and their naturalness are discussed. Among the models analyzed, Dante’s Inferno model turns out to be the most preferred model in this framework.

  8. An information model to support user-centered design of medical devices.

    PubMed

    Hagedorn, Thomas J; Krishnamurty, Sundar; Grosse, Ian R

    2016-08-01

    The process of engineering design requires the product development team to balance the needs and limitations of many stakeholders, including those of the user, regulatory organizations, and the designing institution. This is particularly true in medical device design, where additional consideration must be given to a much more complex user base that can only be accessed on a limited basis. Given this inherent challenge, few projects consider design domain concepts, such as aspects of a detailed design and a detailed view of various stakeholders and their capabilities, together with the user needs simultaneously. In this paper, we present a novel information model approach that combines a detailed model of design elements with a model of the design itself, the customer requirements, and the capabilities of the customers themselves. The information model is used to facilitate knowledge capture and automated reasoning across domains with a minimal set of rules by adopting a terminology that treats customer-specific and design-specific factors identically, thus enabling straightforward assessments. A unique aspect of this approach is that it systematically provides an integrated perspective on the key usability information that drives design decisions towards more universal or effective outcomes, together with the very design information impacted by that usability information. This can lead to cost-efficient optimal designs based on a direct inclusion of the needs of customers alongside those of business, marketing, and engineering requirements. Two case studies are presented to show the method's potential as a more effective knowledge management tool with built-in automated inferences that provide design insight, as well as its overall effectiveness as a platform to develop and execute medical device design from a holistic perspective.

  9. Using the NIATx Model to Implement User-Centered Design of Technology for Older Adults

    PubMed Central

    Maus, Adam; Judkins, Julianne; Dinauer, Susan; Isham, Andrew; Johnson, Roberta; Landucci, Gina; Atwood, Amy K

    2016-01-01

    What models can effectively guide the creation of eHealth and mHealth technologies? This paper describes the use of the NIATx model as a framework for the user-centered design of a new technology for older adults. The NIATx model is a simple framework of process improvement based on the following principles derived from an analysis of decades of research from various industries about why some projects fail and others succeed: (1) Understand and involve the customer; (2) fix key problems; (3) pick an influential change leader; (4) get ideas from outside the field; (5) use rapid-cycle testing. This paper describes the use of these principles in technology development, the strengths and challenges of using this approach in this context, and lessons learned from the process. Overall, the NIATx model enabled us to produce a user-focused technology that the anecdotal evidence available so far suggests is engaging and useful to older adults. The first and fourth principles were especially important in developing the technology; the fourth proved the most challenging to use. PMID:27025985

  10. Using the NIATx Model to Implement User-Centered Design of Technology for Older Adults.

    PubMed

    Gustafson, David H; Maus, Adam; Judkins, Julianne; Dinauer, Susan; Isham, Andrew; Johnson, Roberta; Landucci, Gina; Atwood, Amy K

    2016-01-14

    What models can effectively guide the creation of eHealth and mHealth technologies? This paper describes the use of the NIATx model as a framework for the user-centered design of a new technology for older adults. The NIATx model is a simple framework of process improvement based on the following principles derived from an analysis of decades of research from various industries about why some projects fail and others succeed: (1) Understand and involve the customer; (2) fix key problems; (3) pick an influential change leader; (4) get ideas from outside the field; (5) use rapid-cycle testing. This paper describes the use of these principles in technology development, the strengths and challenges of using this approach in this context, and lessons learned from the process. Overall, the NIATx model enabled us to produce a user-focused technology that the anecdotal evidence available so far suggests is engaging and useful to older adults. The first and fourth principles were especially important in developing the technology; the fourth proved the most challenging to use.

  11. Optimal water allocation in small hydropower plants between traditional and non-traditional water users: merging theory and existing practices.

    NASA Astrophysics Data System (ADS)

    Gorla, Lorenzo; Crouzy, Benoît; Perona, Paolo

    2014-05-01

    Water demand for hydropower production is increasing, together with awareness of the importance of riparian ecosystems and biodiversity. Some cantons in Switzerland and other alpine regions in Austria and Südtirol (Italy) have started replacing the inadequate concept of Minimum Flow Requirement (MFR) with a dynamic one, by releasing a fixed percentage of the total inflow (e.g. 25%) to the environment. Starting from a model proposed by Perona et al. (2013) and the need to include the environment as an actual water user, we arrived at similar qualitative results and better quantitative performance. In this paper we explore the space of non-proportional water repartition rules analysed by Gorla and Perona (2013), and we propose new ecological indicators which are directly derived from current ecological evaluation practices (fish habitat modelling and hydrological alteration). We demonstrate that both the MFR water redistribution policy and proportional repartition rules can be improved using nothing but available information. Furthermore, all water redistribution policies can be described by the model proposed by Perona et al. (2013) in terms of the Principle of Equal Marginal Utility (PEMU) and a suitable class of nonlinear functions. This is particularly useful for highlighting implicit assumptions and choosing best-compromise solutions, providing analytical reasons why efficiency cannot be attained by classic repartition rules. Each water repartition policy implies a monetization of the ecosystem, and a political choice always has to be made. We make explicit the value of ecosystem health underlying each policy by means of the PEMU under a few assumptions, and discuss how the theoretically efficient redistribution law obtained by our approach is feasible and does not imply high costs or advanced management tools. For small run-of-river power plants, this methodology answers the question "how much water should be left to the river?" and is therefore a
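
    The Principle of Equal Marginal Utility invoked above can be stated schematically as follows; the utility functions and symbols are generic placeholders, not the paper's specific formulation.

```latex
% Schematic PEMU statement for splitting the instantaneous inflow Q(t) between
% the plant intake q_h and the river q_e (generic notation):
\frac{\partial U_{\mathrm{hydro}}}{\partial q_h}
\;=\;
\frac{\partial U_{\mathrm{eco}}}{\partial q_e},
\qquad q_h + q_e = Q(t).
```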

  12. General autocatalytic theory and simple model of financial markets

    NASA Astrophysics Data System (ADS)

    Thuy Anh, Chu; Lan, Nguyen Tri; Viet, Nguyen Ai

    2015-06-01

    The concept of autocatalytic theory has become a powerful tool for understanding evolutionary processes in complex systems. A generalization of autocatalytic theory is obtained by assuming that the initial element is described by some distribution instead of a constant value, as in the traditional theory. This initial condition implies that the final element may have some distribution too. A simple physics model for financial markets is proposed using this general autocatalytic theory. Some general behaviours of the evolution process and the risk moment of a financial market are also investigated in the framework of this simple model.

  13. User-Friendly Predictive Modeling of Greenhouse Gas (GHG) Fluxes and Carbon Storage in Tidal Wetlands

    NASA Astrophysics Data System (ADS)

    Ishtiaq, K. S.; Abdul-Aziz, O. I.

    2015-12-01

    We developed user-friendly empirical models to predict instantaneous fluxes of CO2 and CH4 from coastal wetlands based on a small set of dominant hydro-climatic and environmental drivers (e.g., photosynthetically active radiation, soil temperature, water depth, and soil salinity). The dominant predictor variables were systematically identified by applying a robust data-analytics framework to a wide range of possible environmental variables driving wetland greenhouse gas (GHG) fluxes. The method comprised a multi-layered data-analytics framework, including Pearson correlation analysis, explanatory principal component and factor analyses, and partial least squares regression modeling. The identified dominant predictors were finally utilized to develop power-law based non-linear regression models to predict CO2 and CH4 fluxes under different climatic, land use (nitrogen gradient), tidal hydrology and salinity conditions. Four different tidal wetlands of Waquoit Bay, MA were considered as the case study sites to identify the dominant drivers and evaluate model performance. The study sites were dominated by native Spartina alterniflora and characterized by frequent flooding and highly saline conditions. The model estimated the potential net ecosystem carbon balance (NECB), both in gC/m2 and in metric ton C/hectare, by up-scaling the instantaneous predicted fluxes to the growing season and accounting for the lateral C flux exchanges between the wetlands and the estuary. The entire model was presented in a single Excel spreadsheet as a user-friendly ecological engineering tool. The model can aid the development of appropriate GHG offset protocols for setting monitoring plans for tidal wetland restoration and maintenance projects. The model can also be used to estimate wetland GHG fluxes and potential carbon storage under various IPCC climate change and sea level rise scenarios; facilitating an appropriate management of carbon stocks in tidal wetlands and their incorporation into a
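
    A power-law regression of the general kind described can be sketched as below, fitted by ordinary least squares in log space; the predictors, coefficients, and data are synthetic placeholders, not the study's fitted model.

```python
# Fit flux = a * PAR^b * Tsoil^c by linear regression on log-transformed data
# (synthetic example; the study's predictors and coefficients differ).
import numpy as np

rng = np.random.default_rng(1)
par = rng.uniform(100, 2000, 50)        # photosynthetically active radiation
soil_temp = rng.uniform(10, 30, 50)     # soil temperature, deg C
flux = 0.05 * par**0.6 * soil_temp**0.8 * rng.lognormal(0, 0.1, 50)  # synthetic CO2 flux

# log(flux) = log(a) + b*log(PAR) + c*log(Tsoil)
X = np.column_stack([np.ones(50), np.log(par), np.log(soil_temp)])
coef, *_ = np.linalg.lstsq(X, np.log(flux), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"flux ~ {a:.3f} * PAR^{b:.2f} * Tsoil^{c:.2f}")
```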

  14. SALMOD: a population model for salmonids: user's manual. Version W3

    USGS Publications Warehouse

    Bartholow, John; Heasley, John; Laake, Jeff; Sandelin, Jeff; Coughlan, Beth A.K.; Moos, Alan

    2002-01-01

    SALMOD is a computer model that simulates the dynamics of freshwater salmonid populations, both anadromous and resident. The conceptual model was developed in a workshop setting (Williamson et al. 1993) with fish experts concerned with Trinity River chinook restoration. The model builds on the foundation laid by similar models (see Cheslak and Jacobson 1990). The model's premise is that egg and fish mortality are directly related to spatially and temporally variable micro- and macrohabitat limitations, which themselves are related to the timing and amount of streamflow and other meteorological variables. Habitat quality and capacity are characterized by the hydraulic and thermal properties of individual mesohabitats, which we use as spatial "computation units" in the model. The model tracks a population of spatially distinct cohorts that originate as eggs and grow from one life stage to another as a function of local water temperature. Individual cohorts either remain in the computational unit in which they emerged or move, in whole or in part, to nearby units (see McCormick et al. 1998). Model processes include spawning (with redd superimposition and incubation losses), growth (including egg maturation), mortality, and movement (freshet-induced, habitat-induced, and seasonal). Model processes are implemented such that the user (modeler) has the ability to more or less program the model on the fly to create the dynamics thought to animate the population. SALMOD then tabulates the various causes of mortality and the whereabouts of fish.

  15. Documentation, User Support, and Verification of Wind Turbine and Plant Models

    SciTech Connect

    Robert Zavadil; Vadim Zheglov; Yuriy Kazachkov; Bo Gong; Juan Sanchez; Jun Li

    2012-09-18

    As part of the Utility Wind Energy Integration Group (UWIG) and EnerNex's Wind Turbine Modeling Project, EnerNex has received ARRA (federal stimulus) funding through the Department of Energy (DOE) to further the progress of wind turbine and wind plant models. Despite the large existing and planned wind generation deployment, industry-standard models for wind generation have not been formally adopted. Models commonly provided for interconnection studies are not adequate for use in general transmission planning studies, where public, non-proprietary, documented and validated models are needed. NERC MOD (North American Electric Reliability Corporation) reliability standards require that power flow and dynamics models be provided, in accordance with regional requirements and procedures. The goal of this project is to accelerate the appropriate use of generic wind turbine models for transmission network analysis by: (1) Defining proposed enhancements to the generic wind turbine model structures that would allow representation of more advanced; (2) Comparative testing of the generic models against more detailed (and sometimes proprietary) versions developed by turbine vendors; (3) Developing recommended parameters for the generic models to best mimic the performance of specific commercial wind turbines; (4) Documenting results of the comparative simulations in an application guide for users; (5) Conducting technology transfer activities in regional workshops for dissemination of knowledge and information gained, and to engage electric power and wind industry personnel in the project while underway; (6) Designing of a "living" homepage to establish an online resource for transmission planners.

  16. User Information Fusion Decision Making Analysis with the C-OODA Model

    DTIC Science & Technology

    2011-07-01

    Erik P. Blasch, Richard Breton, Pierre Valin, and Eloi Bosse; Defence R&D Canada-Valcartier, 2459 Pie-XI Blvd. North, Québec City, QC G3J 1X5.

  17. Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1994-01-01

    An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.

  18. AIRCL: A programmed system for generating NC tapes for airplane models. User's manual

    NASA Technical Reports Server (NTRS)

    Akgerman, N.; Billhardt, C. F.

    1981-01-01

    A computer program is presented which calculates the cutter location file needed to machine models of airplane wings or wing-fuselage combinations on numerically controlled machine tools. Input to the program is a data file consisting of coordinates on the fuselage and wing. From this data file, the program calculates tool offsets, determines the intersection between wing and fuselage tool paths, and generates additional information needed to machine the fuselage and/or wing. Output from the program can be post processed for use on a variety of milling machines. Information on program structure and methodology is given as well as the user's manual for implementation of the program.

  19. Modeled estimates of myocardial infarction and venous thromboembolic disease in users of second and third generation oral contraceptives.

    PubMed

    Schwingl, P J; Shelton, J

    1997-03-01

    Consistent reports from several recent studies suggest that users of third generation oral contraceptives (OCs) containing gestodene and desogestrel may be at increased risk of venous thromboembolic disease (VTE). Paradoxically, other reports indicate that these users may be at decreased risk of acute myocardial infarction (MI) compared with users of second generation OCs. To determine whether the potentially increased risk of VTE would outweigh the potentially reduced risk of MI in users of third generation OCs, we conducted an analysis to quantify the trade-offs providers and users may face in choosing between these formulations. The baseline rates of VTE and MI among non-users were calculated using US data on incidence and mortality of these conditions and estimates of the proportion of women exposed to these formulations in the US. These were multiplied by relative risks published in recent studies on third generation progestins to produce age- and formulation-specific risks. Results indicate that there would be small differences in disease burden between users of second and third generation OCs under the model assumptions at younger ages. However, among women 35-44 years of age, modeling results indicate that the potentially decreased incidence of MI among users of third generation OCs more than offsets the potentially increased risk of VTE at this age.
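
    The comparison the authors describe (baseline incidence among non-users multiplied by formulation-specific relative risks, then compared across formulations) can be illustrated with a few lines of arithmetic. All rates and relative risks below are hypothetical placeholders rather than the study's estimates, and the attributable() helper exists only for this sketch.

```python
# Illustration of the excess-risk arithmetic described above: baseline incidence
# among non-users is multiplied by formulation-specific relative risks, and the
# excess VTE cases are weighed against the MI cases averted. All numbers are
# hypothetical placeholders, not the study's estimates.
baseline_per_100k = {"VTE": 5.0, "MI": 2.0}          # annual incidence, non-users
relative_risk = {
    "second_gen": {"VTE": 3.0, "MI": 2.0},
    "third_gen":  {"VTE": 6.0, "MI": 1.0},
}

def attributable(formulation: str, outcome: str) -> float:
    """Excess cases per 100,000 users per year relative to non-users."""
    return baseline_per_100k[outcome] * (relative_risk[formulation][outcome] - 1.0)

extra_vte = attributable("third_gen", "VTE") - attributable("second_gen", "VTE")
averted_mi = attributable("second_gen", "MI") - attributable("third_gen", "MI")
print(f"Extra VTE cases per 100k with third-gen: {extra_vte:.1f}")
print(f"MI cases averted per 100k with third-gen: {averted_mi:.1f}")
```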

  20. Applications of Generalizability Theory and Their Relations to Classical Test Theory and Structural Equation Modeling.

    PubMed

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2017-01-23

    Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting of results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced.
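
    For readers unfamiliar with the mechanics, the sketch below runs a one-facet (persons crossed with items) generalizability analysis of the kind the article surveys: variance components are estimated from ANOVA mean squares and combined into generalizability and dependability coefficients. The 5 x 4 score matrix is a made-up illustration, not data from the article.

```python
# Minimal sketch of a one-facet (persons x items) generalizability analysis:
# estimate variance components from ANOVA mean squares, then form the
# generalizability and dependability coefficients. The data matrix is a small
# hypothetical example, not from the article.
import numpy as np

X = np.array([[4, 5, 3, 4],
              [2, 3, 2, 3],
              [5, 5, 4, 5],
              [3, 4, 3, 2],
              [4, 4, 5, 4]], dtype=float)   # rows = persons, cols = items
n_p, n_i = X.shape
grand = X.mean()
person_means = X.mean(axis=1)
item_means = X.mean(axis=0)

ms_p = n_i * np.sum((person_means - grand) ** 2) / (n_p - 1)
ms_i = n_p * np.sum((item_means - grand) ** 2) / (n_i - 1)
resid = X - person_means[:, None] - item_means[None, :] + grand
ms_res = np.sum(resid ** 2) / ((n_p - 1) * (n_i - 1))

var_pi_e = ms_res                      # person-by-item interaction + error
var_p = (ms_p - ms_res) / n_i          # universe-score (person) variance
var_i = (ms_i - ms_res) / n_p          # item variance

g_coef = var_p / (var_p + var_pi_e / n_i)             # relative (norm-referenced)
phi = var_p / (var_p + (var_i + var_pi_e) / n_i)      # absolute (criterion-referenced)
print(f"G coefficient: {g_coef:.3f}, dependability (phi): {phi:.3f}")
```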

  1. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) coding has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (Java, C++, Python, etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools, etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article focuses on exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps that are carried out repeatedly for the same study region. As such, the developed tool is user friendly and can be used efficiently for these repetitive processes, reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
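
    A minimal sketch of the plugin mechanism the article builds on is shown below: a QGIS Python plugin exposing classFactory() and wiring a single toolbar action into the interface. The plugin and class names are hypothetical, and the imports follow the current QGIS 3 / PyQt5 layout rather than the 2012-era API the article would have used.

```python
# Minimal QGIS plugin skeleton (Python): a module exposing classFactory() and a
# class wiring one toolbar action into the QGIS interface. The plugin name and
# callback are illustrative placeholders; a real data-assimilation tool would add
# dialogs and processing logic here.
from qgis.PyQt.QtWidgets import QAction, QMessageBox


class HydroDataTool:
    def __init__(self, iface):
        self.iface = iface          # reference to the running QGIS interface
        self.action = None

    def initGui(self):
        self.action = QAction("Assimilate gridded inputs", self.iface.mainWindow())
        self.action.triggered.connect(self.run)
        self.iface.addToolBarIcon(self.action)

    def unload(self):
        self.iface.removeToolBarIcon(self.action)

    def run(self):
        QMessageBox.information(self.iface.mainWindow(), "HydroDataTool",
                                "Data preparation would be launched here.")


def classFactory(iface):
    """Entry point QGIS calls when the plugin is loaded."""
    return HydroDataTool(iface)
```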

  2. Modeling transonic aerodynamic response using nonlinear systems theory for use with modern control theory

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    1993-01-01

    The presentation begins with a brief description of the motivation and approach that has been taken for this research. This will be followed by a description of the Volterra Theory of Nonlinear Systems and the CAP-TSD code which is an aeroelastic, transonic CFD (Computational Fluid Dynamics) code. The application of the Volterra theory to a CFD model and, more specifically, to a CAP-TSD model of a rectangular wing with a NACA 0012 airfoil section will be presented.
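
    To make the Volterra approach concrete, the sketch below evaluates a truncated discrete Volterra series (a first-order convolution kernel plus a short-memory second-order kernel) for an impulse input. The kernels are arbitrary illustrative choices, not kernels identified from CAP-TSD.

```python
# Sketch of a truncated discrete Volterra-series response of the kind used to
# characterize a nonlinear system: a first-order (convolution) kernel plus a
# simple second-order kernel. The kernels here are arbitrary illustrative
# choices, not identified from CAP-TSD.
import numpy as np

n = 64
u = np.zeros(n)
u[0] = 1.0                          # unit impulse input

lags = np.arange(n)
h1 = 0.8 ** lags                    # hypothetical first-order kernel (decaying)
h2 = np.outer(0.5 ** lags[:8], 0.5 ** lags[:8]) * 0.1   # hypothetical 2nd-order kernel

def volterra_response(u, h1, h2):
    n = len(u)
    m = h2.shape[0]
    y = np.zeros(n)
    for t in range(n):
        # first-order term: ordinary convolution
        y[t] = sum(h1[k] * u[t - k] for k in range(min(t + 1, len(h1))))
        # second-order term: double convolution over a short memory window
        for k1 in range(min(t + 1, m)):
            for k2 in range(min(t + 1, m)):
                y[t] += h2[k1, k2] * u[t - k1] * u[t - k2]
    return y

y = volterra_response(u, h1, h2)
print(y[:5])
```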

  3. A user-friendly model for spray drying to aid pharmaceutical product development.

    PubMed

    Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L J; Frijlink, Henderik W

    2013-01-01

    The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach.
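
    The core of such a spreadsheet model is a steady-state mass and energy balance. The sketch below shows one simplified form in which the drying gas supplies the heat of evaporation, which fixes the outlet temperature; the property values and operating settings are hypothetical placeholders, not the fitted Büchi B-290 parameters.

```python
# Simplified steady-state energy balance of the kind a spreadsheet spray-dryer
# model can implement: the drying gas supplies the heat to evaporate the feed
# water, which fixes the outlet temperature. Property values and operating
# settings are hypothetical placeholders, not the fitted Buchi B-290 parameters.
CP_AIR = 1.006e3       # J/(kg K), specific heat of drying air
H_VAP = 2.26e6         # J/kg, latent heat of water evaporation (approximate)

def outlet_temperature(t_in_c, air_flow_kg_s, feed_water_kg_s, heat_loss_w=0.0):
    """Predict outlet gas temperature from an energy balance over the dryer."""
    q_evaporation = feed_water_kg_s * H_VAP
    dt = (q_evaporation + heat_loss_w) / (air_flow_kg_s * CP_AIR)
    return t_in_c - dt

# Example: 120 C inlet, 0.01 kg/s of drying air, 2e-4 kg/s of water evaporated
print(f"Predicted outlet T: {outlet_temperature(120.0, 0.01, 2e-4):.1f} C")
```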

  4. A User-Friendly Model for Spray Drying to Aid Pharmaceutical Product Development

    PubMed Central

    Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L. J.; Frijlink, Henderik W.

    2013-01-01

    The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach. PMID:24040240

  5. User's guide for a personal computer model of turbulence at a wind turbine rotor

    NASA Astrophysics Data System (ADS)

    Connell, J. R.; Powell, D. C.; Gower, G. L.

    1989-08-01

    This document is primarily: (1) a user's guide for the personal computer (PC) version of the code for the PNL computational model of the rotationally sampled wind speed (RODASIM11), and (2) a brief guide to the growing literature on the subject of rotationally sampled turbulence, from which the model is derived. The model generates values of turbulence experienced by single points fixed in the rotating frame of reference of an arbitrary wind turbine blade. The character of the turbulence depends on the specification of mean wind speed, the variance of turbulence, the crosswind and along-wind integral scales of turbulence, mean wind shear, and the hub height, radius, and angular speed of rotation of any point at which wind fluctuation is to be calculated.
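
    The essential idea of rotational sampling can be sketched in a few lines: a point fixed on the rotating blade sweeps through the vertical wind-shear profile once per revolution, so the sampled wind speed picks up a strong once-per-revolution component. The power-law shear, white-noise turbulence, and rotor geometry below are illustrative simplifications, not the RODASIM11 formulation.

```python
# Sketch of rotational sampling of a sheared, turbulent wind field at a point
# fixed on a rotating blade. The power-law shear, white-noise turbulence (real
# rotationally sampled turbulence uses correlated turbulence with integral
# scales), and rotor geometry are illustrative placeholders, not RODASIM11.
import numpy as np

rng = np.random.default_rng(0)
hub_height = 30.0      # m
radius = 10.0          # m
omega = 4.0            # rad/s rotor speed
u_hub = 8.0            # m/s mean wind speed at hub height
alpha = 0.2            # power-law shear exponent
sigma_u = 0.8          # m/s turbulence standard deviation (white noise here)

t = np.arange(0.0, 20.0, 0.05)
z = hub_height + radius * np.cos(omega * t)          # height of the rotating point
mean_u = u_hub * (z / hub_height) ** alpha           # sheared mean wind profile
sampled_u = mean_u + rng.normal(0.0, sigma_u, t.size)

print(f"mean {sampled_u.mean():.2f} m/s, std {sampled_u.std():.2f} m/s")
```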

  6. User's guide for a personal computer model of turbulence at a wind turbine rotor

    SciTech Connect

    Connell, J.R.; Powell, D.C.; Gower, G.L.

    1989-08-01

    This document is primarily (1) a user's guide for the personal computer (PC) version of the code for the PNL computational model of the rotationally sampled wind speed (RODASIM11) and (2) a brief guide to the growing literature on the subject of rotationally sampled turbulence, from which the model is derived. The model generates values of turbulence experienced by single points fixed in the rotating frame of reference of an arbitrary wind turbine blade. The character of the turbulence depends on the specification of mean wind speed, the variance of turbulence, the crosswind and along-wind integral scales of turbulence, mean wind shear, and the hub height, radius, and angular speed of rotation of any point at which wind fluctuation is to be calculated. 13 refs., 4 figs., 4 tabs.

  7. Psycholinguistic Theory of Learning to Read Compared to the Traditional Theory Model.

    ERIC Educational Resources Information Center

    Murphy, Robert F.

    A comparison of two models of the reading process--the psycholinguistic model, in which learning to read is seen as a top-down, holistic procedure, and the traditional theory model, in which learning to read is seen as a bottom-up, atomistic procedure--is provided in this paper. The first part of the paper provides brief overviews of the following…

  8. Posterior Predictive Assessment of Item Response Theory Models

    ERIC Educational Resources Information Center

    Sinharay, Sandip; Johnson, Matthew S.; Stern, Hal S.

    2006-01-01

    Model checking in item response theory (IRT) is an underdeveloped area. There is no universally accepted tool for checking IRT models. The posterior predictive model-checking method is a popular Bayesian model-checking tool because it has intuitive appeal, is simple to apply, has a strong theoretical basis, and can provide graphical or numerical…

  9. Posterior Predictive Model Checking for Multidimensionality in Item Response Theory

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.; Sinharay, Sandip

    2009-01-01

    If data exhibit multidimensionality, key conditional independence assumptions of unidimensional models do not hold. The current work pursues posterior predictive model checking, a flexible family of model-checking procedures, as a tool for criticizing models due to unaccounted for dimensions in the context of item response theory. Factors…

  10. ANIMO 3.5: User's guide for the ANIMO version 3.5 nutrient leaching model

    SciTech Connect

    Kroes, J.; Roelsma, J.

    1998-12-31

    This document describes the use of the nutrient leaching model ANIMO (Agricultural Nutrient Model) version 3.5, with special emphasis on input instructions. Each input parameter is characterized by its unit, range, data type, variable name in the computer code, and symbol in the theoretical description. Program outputs and program execution are briefly described. An example is presented with values of input parameters and model results. A technical program description is given as a brief description of program structure, nomenclature, and source code.

  11. Theory and model use in social marketing health interventions.

    PubMed

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmark criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.

  12. The danger model: questioning an unconvincing theory.

    PubMed

    Józefowski, Szczepan

    2016-02-01

    Janeway's pattern recognition theory holds that the immune system detects infection through a limited number of so-called pattern recognition receptors (PRRs). These receptors bind specific chemical compounds expressed by entire groups of related pathogens, but not by host cells (pathogen-associated molecular patterns, PAMPs). In contrast, Matzinger's danger hypothesis postulates that products released from stressed or damaged cells have a more important role in the activation of the immune system than the recognition of nonself. These products, named by analogy to PAMPs as danger-associated molecular patterns (DAMPs), are proposed to act through the same receptors (PRRs) as PAMPs and, consequently, to stimulate largely similar responses. Herein, I review direct and indirect evidence that contradicts the widely accepted danger theory, and suggest that it may be false.

  13. Technical documentation and user's guide for City-County Allocation Model (CCAM). Version 1. 0

    SciTech Connect

    Clark, L.T. Jr.; Scott, M.J.; Hammer, P.

    1986-05-01

    The City-County Allocation Model (CCAM) was developed as part of the Monitored Retrievable Storage (MRS) Program. The CCAM model was designed to allocate population changes forecasted by the MASTER model to specific local communities within commuting distance of the MRS facility. The CCAM model was then designed to forecast the potential changes in demand for key community services such as housing, police protection, and utilities for these communities. The CCAM model uses a flexible on-line data base on demand for community services that is based on a combination of local service levels and state and national service standards. The CCAM model can be used to quickly forecast the potential community service consequences of economic development for local communities anywhere in the country. The purpose of this manual is to assist the user in understanding and operating the CCAM; it explains the data sources for the model and code modifications as well as the operational procedures.

  14. The theory research of multi-user quantum access network with Measurement Device Independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Ji, Yi-Ming; Li, Yun-Xia; Shi, Lei; Meng, Wen; Cui, Shu-Min; Xu, Zhen-Yu

    2015-10-01

    A quantum access network cannot guarantee the absolute security of the shared multi-user detector, and an eavesdropper can gain access to key information through time-shift attacks and other means. Measurement-device-independent quantum key distribution is immune to all detection attacks and accomplishes the safe sharing of quantum keys. In this paper, the application of measurement-device-independent quantum key distribution to multi-user quantum access networks is investigated. By adopting time-division multiplexing to share the detector among multiple users, the system structure is simplified and secure quantum key sharing is achieved.

  15. Thoughts about conceptual models, theories, and quality improvement projects.

    PubMed

    Fawcett, Jacqueline

    2014-10-01

    This essay focuses on how a conceptual model of nursing can be the basis for identification of the phenomenon of interest for a quality improvement project and how a theory of quality improvement or a theory of change is the methodological guide for the project. An explanation and examples of conceptual-theoretical-empirical structures for quality improvement projects are given.

  16. A continuum theory for modeling the dynamics of crystalline materials.

    PubMed

    Xiong, Liming; Chen, Youping; Lee, James D

    2009-02-01

    This paper introduces a multiscale field theory for modeling and simulation of the dynamics of crystalline materials. The atomistic formulation of a multiscale field theory is briefly introduced. Its applicability is discussed. A few application examples, including phonon dispersion relations of ferroelectric materials BiScO3 and MgO nano dot under compression are presented.

  17. Reframing Leadership Pedagogy through Model and Theory Building.

    ERIC Educational Resources Information Center

    Mello, Jeffrey A.

    1999-01-01

    Leadership theories formed the basis of a course assignment with four objectives: understanding complex factors affecting leadership dynamics, developing abilities to assess organizational factors influencing leadership, practicing model and theory building, and viewing leadership from a multicultural perspective. The assignment was to develop a…

  18. Scaling theory of depinning in the Sneppen model

    SciTech Connect

    Maslov, S.; Paczuski, M. (Department of Physics, State University of New York at Stony Brook, Stony Brook, New York 11790; The Isaac Newton Institute for Mathematical Sciences, 20 Clarkson Road, Cambridge CB4 0EH)

    1994-08-01

    We develop a scaling theory for the critical depinning behavior of the Sneppen interface model [Phys. Rev. Lett. 69, 3539 (1992)]. This theory is based on a "gap" equation that describes the self-organization process to a critical state of the depinning transition. All of the critical exponents can be expressed in terms of two independent exponents, ν∥

  19. A Model of the Economic Theory of Regulation for Undergraduates.

    ERIC Educational Resources Information Center

    Wilson, Brooks

    1995-01-01

    Presents a model of the economic theory of regulation and recommends its use in undergraduate economics classes. Describes the use of computer-assisted instruction to teach the theory. Maintains that the approach enables students to gain access to graphs and tables that they produce themselves. (CFR)

  20. An evaluation of high-risk behaviors among female drug users based on Health Belief Model.

    PubMed

    Ilika, F; Jamshidimanesh, M; Hoseini, M; Saffari, M; Peyravi, H

    2015-01-01

    Objectives. Because of the physiological nature of the female reproductive system, women are susceptible to infectious diseases, especially STDs and AIDS. Addiction and high-risk behaviors further increase the danger of these diseases. The purpose of this paper was to examine high-risk behaviors among female drug users based on the Health Belief Model. Methods. Participants of this study were 106 female drug users aged 18 years and older, with at least a minimal level of literacy, who had been involved in sexual relationships. They attended Drop-In Centers (DIC) in Tehran, the capital of Iran. Data were analyzed using logistic regression and Pearson correlation analysis. Results. The results showed that women's overall awareness was moderate. There were considerable relationships between awareness and age (p=0.006), awareness and education (p<0.0001), and awareness and marital status (p=0.062). Perceived susceptibility and severity differed clearly by education level (p=0.007 and p=0.014, respectively). Mean scores of perceived benefits and perceived severity of high-risk behaviors were higher than those of the other components. Conclusion. Awareness and perceived susceptibility must be raised through educational programs based on the Health Belief Model in the addiction field, to reduce perceived barriers to the prevention of risky behavior among women who use drugs.

  1. Incentivizing biodiversity conservation in artisanal fishing communities through territorial user rights and business model innovation.

    PubMed

    Gelcich, Stefan; Donlan, C Josh

    2015-08-01

    Territorial user rights for fisheries are being promoted to enhance the sustainability of small-scale fisheries. Using Chile as a case study, we designed a market-based program aimed at improving fishers' livelihoods while incentivizing the establishment and enforcement of no-take areas within areas managed with territorial user right regimes. Building on explicit enabling conditions (i.e., high levels of governance, participation, and empowerment), we used a place-based, human-centered approach to design a program that will have the necessary support and buy-in from local fishers to result in landscape-scale biodiversity benefits. Transactional infrastructure must be complex enough to capture the biodiversity benefits being created, but simple enough so that the program can be scaled up and is attractive to potential financiers. Biodiversity benefits created must be commoditized, and desired behavioral changes must be verified within a transactional context. Demand must be generated for fisher-created biodiversity benefits in order to attract financing and to scale the market model. Important design decisions around these 3 components-supply, transactional infrastructure, and demand-must be made based on local social-ecological conditions. Our market model, which is being piloted in Chile, is a flexible foundation on which to base scalable opportunities to operationalize a scheme that incentivizes local, verifiable biodiversity benefits via conservation behaviors by fishers that could likely result in significant marine conservation gains and novel cross-sector alliances.

  2. Financial constraints in capacity planning: a national utility regulatory model (NUREG). Volume II of III: user's guide. Final report

    SciTech Connect

    Not Available

    1981-10-29

    This volume is a User's Guide to the National Utility Regulatory Model (NUREG) and its implementation of the National Coal Model. This is the second of three volumes provided by ICF under contract number DEAC-01-79EI-10579. These three volumes are: a manual describing the NUREG methodology; a users guide; and a description of the software. This manual provides a brief introduction to the National Utility Regulation Model, describes the various programs that comprise the National Utility Regulatory Model, gives sample input files, and provides information needed to run the model.

  3. Using Web 2.0 Techniques To Bring Global Climate Modeling To More Users

    NASA Astrophysics Data System (ADS)

    Chandler, M. A.; Sohl, L. E.; Tortorici, S.

    2012-12-01

    The Educational Global Climate Model has been used for many years in undergraduate courses and professional development settings to teach the fundamentals of global climate modeling and climate change simulation to students and teachers. While course participants have reported a high level of satisfaction in these courses and overwhelmingly claim that EdGCM projects are worth the effort, there is often a high level of frustration during the initial learning stages. Many of the problems stem from issues related to installation of the software suite and to the length of time it can take to run initial experiments. Two or more days of continuous run time may be required before enough data has been gathered to begin analyses. Asking users to download existing simulation data has not been a solution because the GCM data sets are several gigabytes in size, requiring substantial bandwidth and stable dedicated internet connections. As a means of getting around these problems we have been developing a Web 2.0 utility called EzGCM (Easy G-C-M), which emphasizes that participants learn the steps involved in climate modeling research: constructing a hypothesis, designing an experiment, running a computer model and assessing when an experiment has finished (reached equilibrium), using scientific visualization to support analysis, and finally communicating the results through social networking methods. We use classic climate experiments that can be "rediscovered" through exercises with EzGCM and are attempting to make this Web 2.0 tool an entry point into climate modeling for teachers with little time to cover the subject, users with limited computer skills, and for those who want an introduction to the process before tackling more complex projects with EdGCM.

  4. Coordinating the Complexity of Tools, Tasks, and Users: On Theory-Based Approaches to Authoring Tool Usability

    ERIC Educational Resources Information Center

    Murray, Tom

    2016-01-01

    Intelligent Tutoring Systems authoring tools are highly complex educational software applications used to produce highly complex software applications (i.e. ITSs). How should our assumptions about the target users (authors) impact the design of authoring tools? In this article I first reflect on the factors leading to my original 1999 article on…

  5. Multicategorical Spline Model for Item Response Theory.

    ERIC Educational Resources Information Center

    Abrahamowicz, Michal; Ramsay, James O.

    1992-01-01

    A nonparametric multicategorical model for multiple-choice data is proposed as an extension of the binary spline model of J. O. Ramsay and M. Abrahamowicz (1989). Results of two Monte Carlo studies illustrate the model, which approximates probability functions by rational splines. (SLD)

  6. Development of a dynamic computational model of social cognitive theory.

    PubMed

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C

    2016-12-01

    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
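
    In the spirit of the fluid analogies mentioned above, the sketch below treats self-efficacy and behavior as first-order inventories that fill in response to an intervention input and drain with their own time constants. The gains and time constants are hypothetical, not the paper's fitted values.

```python
# Minimal fluid-analogy sketch in the spirit of a dynamical SCT model:
# self-efficacy and behavior are first-order "inventories" driven by an
# intervention input, integrated with Euler steps. Gains and time constants
# are hypothetical placeholders, not the paper's fitted values.
import numpy as np

dt = 1.0                       # days
days = 60
intervention = np.zeros(days)
intervention[10:40] = 1.0      # a pulse of intervention dose

self_efficacy = np.zeros(days)
behavior = np.zeros(days)
tau_se, tau_b = 7.0, 5.0       # time constants (days)
gain_in, gain_se = 0.6, 0.8    # how strongly inputs drive each inventory

for k in range(1, days):
    # dSE/dt = (gain_in * intervention - SE) / tau_se
    self_efficacy[k] = self_efficacy[k-1] + dt * (
        gain_in * intervention[k-1] - self_efficacy[k-1]) / tau_se
    # dB/dt = (gain_se * SE - B) / tau_b   (behavior driven by self-efficacy)
    behavior[k] = behavior[k-1] + dt * (
        gain_se * self_efficacy[k-1] - behavior[k-1]) / tau_b

print(f"peak self-efficacy {self_efficacy.max():.2f}, peak behavior {behavior.max():.2f}")
```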

  7. User's manual for heat-pump seasonal-performance model (SPM) with selected parametric examples

    SciTech Connect

    Not Available

    1982-06-30

    The Seasonal Performance Model (SPM) was developed to provide an accurate source of seasonal energy consumption and cost predictions for the evaluation of heat pump design options. The program uses steady state heat pump performance data obtained from manufacturers' or Computer Simulation Model runs. The SPM was originally developed in two forms - a cooling model for central air conditioners and heat pumps and a heating model for heat pumps. The original models have undergone many modifications, which are described, to improve the accuracy of predictions and to increase flexibility for use in parametric evaluations. Insights are provided into the theory and construction of the major options, and into the use of the available options and output variables. Specific investigations provide examples of the possible applications of the model. (LEW)

  8. HIGHWAY, a transportation routing model: program description and revised users' manual

    SciTech Connect

    Joy, D.S.; Johnson, P.E.

    1983-10-01

    A computerized transportation routing model has been developed at the Oak Ridge National Laboratory to be used for predicting likely routes for shipping radioactive materials. The HIGHWAY data base is a computerized road atlas containing descriptions of the entire Interstate System, the federal highway system, and most of the principal state roads. In addition to its prediction of the most likely commercial route, options incorporated in the HIGHWAY model can allow for maximum use of Interstate highways or routes that will bypass urbanized areas containing populations greater than 100,000 persons. The user may also interactively modify the data base to predict routes that bypass any particular state, city, town, or specific highway segment.
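
    The kind of routing computation such a model performs can be sketched as a shortest-path search over a road network, with an optional penalty on segments passing through large urban areas to imitate a bypass option. The toy network, costs, and penalty factor below are illustrative and unrelated to the actual HIGHWAY data base.

```python
# Sketch of the kind of routing computation a model such as HIGHWAY performs:
# shortest path over a road network by Dijkstra's algorithm, with an optional
# penalty on segments through large urban areas to imitate a bypass option.
# The toy network and penalty factor are illustrative placeholders.
import heapq

# (from, to, miles, passes_large_city)
segments = [
    ("A", "B", 120, False), ("B", "C", 100, True),
    ("A", "D", 150, False), ("D", "C", 110, False),
    ("C", "E", 90, False),
]

def build_graph(urban_penalty=1.0):
    graph = {}
    for u, v, miles, urban in segments:
        cost = miles * (urban_penalty if urban else 1.0)
        graph.setdefault(u, []).append((v, cost))
        graph.setdefault(v, []).append((u, cost))
    return graph

def shortest_path(graph, start, goal):
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(pq, (nd, nxt))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]

print(shortest_path(build_graph(), "A", "E"))                   # likely route
print(shortest_path(build_graph(urban_penalty=3.0), "A", "E"))  # bypass urban areas
```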

  9. User manual for ATILA, a finite-element code for modeling piezoelectric transducers

    NASA Astrophysics Data System (ADS)

    Decarpigny, Jean-Noel; Debus, Jean-Claude

    1987-09-01

    This manual for the user of the finite-element code ATILA provides instructions for entering information and running the code on a VAX computer. The manual does not include the code. The finite element code ATILA has been specifically developed to aid the design of piezoelectric devices, mainly for sonar applications. Thus, it is able to perform the modal analyses of both axisymmetrical and fully three-dimensional piezoelectric transducers. It can also provide their harmonic response under radiating conditions: nearfield and farfield pressure, transmitting voltage response, directivity pattern, electrical impedance, as well as displacement field, nodal plane positions, stress field and various stress criteria. Its accuracy and its ability to describe the physical behavior of various transducers (Tonpilz transducers, double headmass symmetrical length expanders, free flooded rings, flextensional transducers, bender bars, cylindrical and trilaminar hydrophones...) have been checked by modelling more than twenty different structures and comparing numerical and experimental results.

  10. Linear control theory for gene network modeling.

    PubMed

    Shin, Yong-Jun; Bleris, Leonidas

    2010-09-16

    Systems biology is an interdisciplinary field that aims at understanding complex interactions in cells. Here we demonstrate that linear control theory can provide valuable insight and practical tools for the characterization of complex biological networks. We provide the foundation for such analyses through the study of several case studies including cascade and parallel forms, feedback and feedforward loops. We reproduce experimental results and provide rational analysis of the observed behavior. We demonstrate that methods such as the transfer function (frequency domain) and linear state-space (time domain) can be used to predict reliably the properties and transient behavior of complex network topologies and point to specific design strategies for synthetic networks.
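
    As a small time-domain illustration of the linear tools discussed above, the sketch below simulates a two-stage transcriptional cascade as two first-order stages in series and checks the step response against the analytic steady states. The rate constants are illustrative placeholders.

```python
# Sketch of a linear time-domain analysis of a gene-network motif: a two-stage
# transcriptional cascade modeled as two first-order stages in series, simulated
# by Euler integration. Rate constants are illustrative placeholders.
import numpy as np

dt = 0.01
t = np.arange(0.0, 20.0, dt)
u = np.ones_like(t)            # step input (e.g., inducer turned on at t = 0)

k1, g1 = 1.0, 0.5              # production gain and degradation rate, stage 1
k2, g2 = 0.8, 0.3              # production gain and degradation rate, stage 2

x1 = np.zeros_like(t)          # first gene product
x2 = np.zeros_like(t)          # second gene product (driven by the first)
for i in range(1, t.size):
    x1[i] = x1[i-1] + dt * (k1 * u[i-1] - g1 * x1[i-1])
    x2[i] = x2[i-1] + dt * (k2 * x1[i-1] - g2 * x2[i-1])

# Steady states should approach k1/g1 and (k2/g2)*(k1/g1) for the cascade
print(f"x1 -> {x1[-1]:.2f} (expect {k1/g1:.2f}), "
      f"x2 -> {x2[-1]:.2f} (expect {k2*k1/(g2*g1):.2f})")
```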

  11. Scaling Users' Perceptions of Library Service Quality Using Item Response Theory: A LibQUAL+ [TM] Study

    ERIC Educational Resources Information Center

    Wei, Youhua; Thompson, Bruce; Cook, C. Colleen

    2005-01-01

    LibQUAL+[TM] data to date have not been subjected to the modern measurement theory called polytomous item response theory (IRT). The data interpreted here were collected from 42,090 participants who completed the "American English" version of the 22 core LibQUAL+[TM] items, and 12,552 participants from Australia and Europe who…

  12. A model of the measurement process in quantum theory

    NASA Astrophysics Data System (ADS)

    Diel, H. H.

    2015-07-01

    The so-called measurement problem of quantum theory (QT) is still lacking a satisfactory, or at least widely agreed upon, solution. A number of theories, known as interpretations of quantum theory, have been proposed and found differing acceptance among physicists. Most of the proposed theories try to explain what happens during a QT measurement using a modification of the declarative equations that define the possible results of a measurement of QT observables or by making assumptions outside the scope of falsifiable physics. This paper proposes a solution to the QT measurement problem in terms of a model of the process for the evolution of two QT systems that interact in a way that represents a measurement. The model assumes that the interactions between the measured QT object and the measurement apparatus are "normal" interactions which adhere to the laws of quantum field theory.

  13. A Graphical User Interface for Parameterizing Biochemical Models of Photosynthesis and Chlorophyll Fluorescence

    NASA Astrophysics Data System (ADS)

    Kornfeld, A.; Van der Tol, C.; Berry, J. A.

    2015-12-01

    Recent advances in optical remote sensing of photosynthesis offer great promise for estimating gross primary productivity (GPP) at leaf, canopy and even global scale. These methods -including solar-induced chlorophyll fluorescence (SIF) emission, fluorescence spectra, and hyperspectral features such as the red edge and the photochemical reflectance index (PRI) - can be used to greatly enhance the predictive power of global circulation models (GCMs) by providing better constraints on GPP. The way to use measured optical data to parameterize existing models such as SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes) is not trivial, however. We have therefore extended a biochemical model to include fluorescence and other parameters in a coupled treatment. To help parameterize the model, we then use nonlinear curve-fitting routines to determine the parameter set that enables model results to best fit leaf-level gas exchange and optical data measurements. To make the tool more accessible to all practitioners, we have further designed a graphical user interface (GUI) based front-end to allow researchers to analyze data with a minimum of effort while, at the same time, allowing them to change parameters interactively to visualize how variation in model parameters affect predicted outcomes such as photosynthetic rates, electron transport, and chlorophyll fluorescence. Here we discuss the tool and its effectiveness, using recently-gathered leaf-level data.
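
    The parameter-estimation step can be illustrated with a nonlinear least-squares fit of a simple saturating light-response curve to leaf gas-exchange data. Both the model form and the sample data below are hypothetical stand-ins for the much richer coupled model and measurements described above.

```python
# Sketch of the parameter-estimation step: fit a simple rectangular-hyperbola
# light-response curve to (hypothetical) leaf gas-exchange data with nonlinear
# least squares. The model form and sample data are illustrative; SCOPE's coupled
# biochemical model has many more parameters.
import numpy as np
from scipy.optimize import curve_fit

def light_response(par, a_max, quantum_yield, r_dark):
    """Net assimilation as a saturating function of absorbed light (PAR)."""
    return a_max * quantum_yield * par / (quantum_yield * par + a_max) - r_dark

par = np.array([0, 50, 100, 200, 400, 800, 1200, 1600], dtype=float)  # umol m-2 s-1
a_net = np.array([-1.0, 1.5, 3.5, 6.0, 9.0, 11.5, 12.3, 12.6])        # hypothetical

popt, pcov = curve_fit(light_response, par, a_net, p0=[15.0, 0.05, 1.0])
a_max, quantum_yield, r_dark = popt
print(f"A_max={a_max:.1f}, phi={quantum_yield:.3f}, R_d={r_dark:.2f}")
```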

  14. Effect of Human Model Height and Sex on Induced Current Dosimetry in Household Induction Heater Users

    NASA Astrophysics Data System (ADS)

    Tarao, Hiroo; Hayashi, Noriyuki; Isaka, Katsuo

    Induced currents in high-resolution, anatomical human models are numerically calculated by the impedance method. The human models are supposed to be exposed to highly inhomogeneous 20.9 kHz magnetic fields from a household induction heater (IH). In the case of the adult models, currents ranging from 5 to 19 mA/m2 are induced between the shoulder and the lower abdomen. Meanwhile, in the case of the child models, currents ranging from 5 to 21 mA/m2 are induced between the head and the abdomen. In particular, the induced currents near the brain tissue are almost the same as those near the abdomen. When the induced currents in the central nervous system tissues are considered, the induced currents in the child model are 2.1 to 6.9 times as large as those in the adult model under the same B-field exposure environment. These results suggest the importance of further investigation of pregnant women who use the IH as well as of children (or IH users of small standing height).

  15. Users Manual for the Geospatial Stream Flow Model (GeoSFM)

    USGS Publications Warehouse

    Artan, Guleid A.; Asante, Kwabena; Smith, Jodie; Pervez, Md Shahriar; Entenmann, Debbie; Verdin, James P.; Rowland, James

    2008-01-01

    The monitoring of wide-area hydrologic events requires the manipulation of large amounts of geospatial and time series data into concise information products that characterize the location and magnitude of the event. To perform these manipulations, scientists at the U.S. Geological Survey Center for Earth Resources Observation and Science (EROS), with the cooperation of the U.S. Agency for International Development, Office of Foreign Disaster Assistance (USAID/OFDA), have implemented a hydrologic modeling system. The system includes a data assimilation component to generate data for a Geospatial Stream Flow Model (GeoSFM) that can be run operationally to identify and map wide-area streamflow anomalies. GeoSFM integrates a geographical information system (GIS) for geospatial preprocessing and postprocessing tasks and hydrologic modeling routines implemented as dynamically linked libraries (DLLs) for time series manipulations. Model results include maps depicting the status of streamflow and soil water conditions. This Users Manual provides step-by-step instructions for running the model and for downloading and processing the input data required for initial model parameterization and daily operation.

  16. TIME Impact - a new user-friendly tuberculosis (TB) model to inform TB policy decisions.

    PubMed

    Houben, R M G J; Lalli, M; Sumner, T; Hamilton, M; Pedrazzoli, D; Bonsu, F; Hippner, P; Pillay, Y; Kimerling, M; Ahmedov, S; Pretorius, C; White, R G

    2016-03-24

    Tuberculosis (TB) is the leading cause of death from infectious disease worldwide, predominantly affecting low- and middle-income countries (LMICs), where resources are limited. As such, countries need to be able to choose the most efficient interventions for their respective setting. Mathematical models can be valuable tools to inform rational policy decisions and improve resource allocation, but are often unavailable or inaccessible for LMICs, particularly in TB. We developed TIME Impact, a user-friendly TB model that enables local capacity building and strengthens country-specific policy discussions to inform support funding applications at the (sub-)national level (e.g. Ministry of Finance) or to international donors (e.g. the Global Fund to Fight AIDS, Tuberculosis and Malaria).TIME Impact is an epidemiological transmission model nested in TIME, a set of TB modelling tools available for free download within the widely-used Spectrum software. The TIME Impact model reflects key aspects of the natural history of TB, with additional structure for HIV/ART, drug resistance, treatment history and age. TIME Impact enables national TB programmes (NTPs) and other TB policymakers to better understand their own TB epidemic, plan their response, apply for funding and evaluate the implementation of the response.The explicit aim of TIME Impact's user-friendly interface is to enable training of local and international TB experts towards independent use. During application of TIME Impact, close involvement of the NTPs and other local partners also builds critical understanding of the modelling methods, assumptions and limitations inherent to modelling. This is essential to generate broad country-level ownership of the modelling data inputs and results. In turn, it stimulates discussions and a review of the current evidence and assumptions, strengthening the decision-making process in general.TIME Impact has been effectively applied in a variety of settings. In South Africa, it

  17. A Sharing Item Response Theory Model for Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Segall, Daniel O.

    2004-01-01

    A new sharing item response theory (SIRT) model is presented that explicitly models the effects of sharing item content between informants and test takers. This model is used to construct adaptive item selection and scoring rules that provide increased precision and reduced score gains in instances where sharing occurs. The adaptive item selection…

  18. Bianchi class A models in Sàez-Ballester's theory

    NASA Astrophysics Data System (ADS)

    Socorro, J.; Espinoza-García, Abraham

    2012-08-01

    We apply the Sàez-Ballester (SB) theory to Bianchi class A models, with a barotropic perfect fluid in a stiff matter epoch. We obtain exact classical solutions à la Hamilton for Bianchi type I, II and VI_{h=-1} models. We also find exact quantum solutions to all Bianchi class A models employing a particular ansatz for the wave function of the universe.

  19. A Dynamic Systems Theory Model of Visual Perception Development

    ERIC Educational Resources Information Center

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  20. Social Learning Theory and the Health Belief Model.

    ERIC Educational Resources Information Center

    Rosenstock, Irwin M.; And Others

    1988-01-01

    This article shows how the Health Belief Model, social learning theory, and locus of control may be related and posits an explanatory model that incorporates self-efficacy into the Health Belief Model. Self-efficacy is proposed as an independent variable with the traditional variables of perceived susceptibility, severity, benefits, and barriers.…

  1. The monster sporadic group and a theory underlying superstring models

    SciTech Connect

    Chapline, G.

    1996-09-01

    The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13 dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs.

  2. Theory and Modeling of Stimulated Raman Scattering

    DTIC Science & Technology

    1993-06-01

    To model a nondiffraction-limited pump beam, Gaussian-Hermite (G-H) beams, Gaussian-Laguerre (G-L) beams, and Gaussian-Schell-model (GSM) beams are used. The M2 factor of these beams can be calculated analytically. M2 is defined for elliptical beams and is not changed by astigmatic lenses. The GSM beam has a Gaussian intensity profile given

  3. CIRCE2/DEKGEN2: A software package for facilitated optical analysis of 3-D distributed solar energy concentrators. Theory and user manual

    SciTech Connect

    Romero, V.J.

    1994-03-01

    CIRCE2 is a computer code for modeling the optical performance of three-dimensional dish-type solar energy concentrators. Statistical methods are used to evaluate the directional distribution of reflected rays from any given point on the concentrator. Given concentrator and receiver geometries, sunshape (angular distribution of incident rays from the sun), and concentrator imperfections such as surface roughness and random deviation in slope, the code predicts the flux distribution and total power incident upon the target. Great freedom exists in the variety of concentrator and receiver configurations that can be modeled. Additionally, provisions for shading and receiver aperturing are included. DEKGEN2 is a preprocessor designed to facilitate input of geometry, error distributions, and sun models. This manual describes the optical model, user inputs, code outputs, and operation of the software package. A user tutorial is included in which several collectors are built and analyzed in step-by-step examples.

  4. Consumer preference models: fuzzy theory approach

    NASA Astrophysics Data System (ADS)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).

  5. Hypertext Browsing: A New Model for Information Filtering Based on User Profiles and Data Clustering.

    ERIC Educational Resources Information Center

    Shapira, Bracha; And Others

    1996-01-01

    Discussion of hypertext browsing proposes a filtering algorithm which restricts the amount of information made available to the user by calculating the set of most relevant hypertext nodes for the user, utilizing the user profile and data clustering technique. An example is provided of an optimal cluster of relevant data items. (Author/LRW)
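
    One way to realize the filtering idea described here is to score each hypertext node by cosine similarity to a user-profile term vector and retain the highest-scoring cluster. The term vectors and threshold below are illustrative placeholders.

```python
# Sketch of profile-based filtering: score hypertext nodes by cosine similarity
# to a user-profile vector, then keep a small cluster of the most relevant nodes.
# The term space, vectors, and threshold are illustrative placeholders.
import numpy as np

terms = ["hydrology", "gis", "python", "opera", "recipes"]   # toy term space
user_profile = np.array([0.9, 0.8, 0.6, 0.0, 0.1])

nodes = {
    "node_gis_tool":     np.array([0.7, 0.9, 0.5, 0.0, 0.0]),
    "node_cookbook":     np.array([0.0, 0.0, 0.1, 0.0, 0.9]),
    "node_hydro_model":  np.array([0.9, 0.4, 0.3, 0.0, 0.0]),
    "node_opera_review": np.array([0.0, 0.0, 0.0, 1.0, 0.0]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = {name: cosine(vec, user_profile) for name, vec in nodes.items()}
relevant_cluster = [n for n, s in sorted(scores.items(), key=lambda kv: -kv[1]) if s > 0.6]
print(relevant_cluster)
```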

  6. INTERLINE 5.0 -- An expanded railroad routing model: Program description, methodology, and revised user`s manual

    SciTech Connect

    Johnson, P.E.; Joy, D.S.; Clarke, D.B.; Jacobi, J.M.

    1993-03-01

    A rail routing model, INTERLINE, has been developed at the Oak Ridge National Laboratory to investigate potential routes for transporting radioactive materials. In Version 5.0, the INTERLINE routing algorithms have been enhanced to include the ability to predict alternative routes, barge routes, and population statistics for any route. The INTERLINE railroad network is essentially a computerized rail atlas describing the US railroad system. All rail lines, with the exception of industrial spurs, are included in the network. Inland waterways and deep water routes along with their interchange points with the US railroad system are also included. The network contains over 15,000 rail and barge segments (links) and over 13,000 stations, interchange points, ports, and other locations (nodes). The INTERLINE model has been converted to operate on an IBM-compatible personal computer. At least a 286 computer with a hard disk containing approximately 6 MB of free space is recommended. Enhanced program performance will be obtained by using a random-access memory drive on a 386 or 486 computer.

  7. FINDING POTENTIALLY UNSAFE NUTRITIONAL SUPPLEMENTS FROM USER REVIEWS WITH TOPIC MODELING.

    PubMed

    Sullivan, Ryan; Sarker, Abeed; O'Connor, Karen; Goodin, Amanda; Karlsrud, Mark; Gonzalez, Graciela

    2016-01-01

    Although dietary supplements are widely used and generally are considered safe, some supplements have been identified as causative agents for adverse reactions, some of which may even be fatal. The Food and Drug Administration (FDA) is responsible for monitoring supplements and ensuring that supplements are safe. However, current surveillance protocols are not always effective. Leveraging user-generated textual data, in the form of Amazon.com reviews for nutritional supplements, we use natural language processing techniques to develop a system for the monitoring of dietary supplements. We use topic modeling techniques, specifically a variation of Latent Dirichlet Allocation (LDA), and background knowledge in the form of an adverse reaction dictionary to score products based on their potential danger to the public. Our approach generates topics that semantically capture adverse reactions from a document set consisting of reviews posted by users of specific products, and based on these topics, we propose a scoring mechanism to categorize products as "high potential danger", "average potential danger" and "low potential danger." We evaluate our system by comparing the system categorization with human annotators, and we find that our system agrees with the annotators 69.4% of the time. With these results, we demonstrate that our methods show promise and that our system represents a proof of concept as a viable low-cost, active approach for dietary supplement monitoring.
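
    A stripped-down version of the described pipeline is sketched below: fit an LDA topic model to review text and count how many of each topic's top words appear in an adverse-reaction term list. The reviews and dictionary are tiny invented examples, and plain scikit-learn LDA stands in for the paper's modified variant and scoring scheme.

```python
# Sketch of the pipeline described above: fit an LDA topic model to product
# reviews and score each topic against a small adverse-reaction term list.
# The reviews and dictionary are tiny illustrative placeholders, and plain LDA
# stands in for the paper's modified variant.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

reviews = [
    "great energy boost and better focus every morning",
    "gave me severe headache and nausea after two days",
    "my heart was racing and I felt dizzy, stopped taking it",
    "tastes fine, mixes well, no complaints so far",
]
adverse_terms = {"headache", "nausea", "dizzy", "racing", "rash"}

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(reviews)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

vocab = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_words = [vocab[i] for i in weights.argsort()[::-1][:6]]
    danger_hits = sum(w in adverse_terms for w in top_words)
    print(f"topic {topic_idx}: {top_words} (adverse-term hits: {danger_hits})")
```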

  8. Hybrid2: The hybrid system simulation model, Version 1.0, user manual

    SciTech Connect

    Baring-Gould, E.I.

    1996-06-01

    In light of the large scale desire for energy in remote communities, especially in the developing world, the need for a detailed long term performance prediction model for hybrid power systems was seen. To meet these ends, engineers from the National Renewable Energy Laboratory (NREL) and the University of Massachusetts (UMass) have spent the last three years developing the Hybrid2 software. The Hybrid2 code provides a means to conduct long term, detailed simulations of the performance of a large array of hybrid power systems. This work acts as an introduction and users manual to the Hybrid2 software. The manual describes the Hybrid2 code, what is included with the software and instructs the user on the structure of the code. The manual also describes some of the major features of the Hybrid2 code as well as how to create projects and run hybrid system simulations. The Hybrid2 code test program is also discussed. Although every attempt has been made to make the Hybrid2 code easy to understand and use, this manual will allow many organizations to consider the long term advantages of using hybrid power systems instead of conventional petroleum based systems for remote power generation.

  9. INM, Integrated Noise Model. Version 4.11: User's guide, supplement

    NASA Astrophysics Data System (ADS)

    Fleming, Gregg G.; Daprile, John R.

    1993-12-01

    The John A. Volpe National Transportation Systems Center (Volpe Center), in support of the Federal Aviation Administration, Office of Environment and Energy, has developed Version 4.11 of the Integrated Noise Model (INM). The User's Guide for the Version 4.11 computer software is a supplement to INM, Version 3, User's Guide - Revision 1 for the Version 3.10 computer software released in June, 1992. The Version 4.11 supplement, prepared by the Volpe Center's Acoustics Facility, presents computer system requirements as well as installation procedures and enhancements. Specific enhancements discussed include: (1) the introduction of a takeoff profile generator; (2) the ability to account for terrain elevation around a specified airport; (3) the ability to compute the CNEL, WECPNL, LEQDAY, and LEQNIGHT noise metrics; (4) the ability to account for airplane runup operations; (5) the ability to account for displaced runway thresholds during approach operations; (6) an enhancement to the noise contour computations; (7) an increase in the number of takeoff profile segments; and (8) enhancements to the echo file.

  10. Baldrige Theory into Practice: A Generic Model

    ERIC Educational Resources Information Center

    Arif, Mohammed

    2007-01-01

    Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer centric system. Malcolm Baldrige Quality Award (MBQA) model happens to be one of the latest additions to the pull based models. The purpose of this paper is to develop a generic framework for MBQA that can be used by…

  11. Modeling pyramidal sensors in ray-tracing software by a suitable user-defined surface

    NASA Astrophysics Data System (ADS)

    Antichi, Jacopo; Munari, Matteo; Magrin, Demetrio; Riccardi, Armando

    2016-04-01

    Following the unprecedented results in terms of performance delivered by the first-light adaptive optics system at the Large Binocular Telescope, there has been widespread and increasing interest in the pyramid wavefront sensor (PWFS), which is the key component, together with the adaptive secondary mirror, of the adaptive optics (AO) module. Currently, there is no straightforward way to model a PWFS in standard sequential ray-tracing software. Common modeling strategies tend to be user-specific and, in general, are unsatisfactory for general applications. To address this problem, we have developed an approach to PWFS modeling based on a user-defined surface (UDS), whose properties reside in a specific code written in the C language, for the ray-tracing software ZEMAX™. With our approach, the pyramid optical component is implemented as a standard surface in ZEMAX™, exploiting its dynamic link library (DLL) conversion and thus greatly simplifying ray tracing and analysis. We have utilized the pyramid UDS DLL surface, referred to as PAM2R (Pyramidal Acronyms May be too Risky), in order to design the current PWFS-based AO system for the Giant Magellan Telescope, evaluating tolerances, with particular attention to the angular sensitivities, by means of sequential ray-tracing tools only, thus verifying PAM2R's reliability and robustness. This work indicates that PAM2R makes the design of a PWFS as simple as that of other standard optical components. This is particularly suitable with the advent of the extremely large telescope era, for which complexity is definitely one of the main challenges.

  12. The IDA/BPT Crisis Relocation Planning Model: Description, Documentation and User’s Guide to the Computer Program.

    DTIC Science & Technology

    1982-12-22

    Prepared for the Federal Emergency Management Agency, Office of Research. Users of the IDA/BPT model may choose among three options for controlling traffic during an evacuation; one of these options is Normal Traffic.

  13. Measurement Models for Reasoned Action Theory.

    PubMed

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.
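
    In structural equation terms, the distinction drawn above is the standard contrast between reflective and formative measurement (generic notation, quoted for reference rather than taken from the article):

    $$ \text{effect (reflective) indicators: } y_k = \lambda_k \eta + \varepsilon_k, \qquad \text{causal (formative) indicators: } \eta = \textstyle\sum_k \gamma_k x_k + \zeta . $$

    The attitude items are thus modeled as functions of a latent scale, while the latent belief index is itself a function of its belief items; mixing the two in a single model is what raises the analytic issues the authors examine.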

  15. User's guide to Model Viewer, a program for three-dimensional visualization of ground-water model results

    USGS Publications Warehouse

    Hsieh, Paul A.; Winston, Richard B.

    2002-01-01

    Model Viewer is a computer program that displays the results of three-dimensional groundwater models. Scalar data (such as hydraulic head or solute concentration) may be displayed as a solid or a set of isosurfaces, using a red-to-blue color spectrum to represent a range of scalar values. Vector data (such as velocity or specific discharge) are represented by lines oriented to the vector direction and scaled to the vector magnitude. Model Viewer can also display pathlines, cells or nodes that represent model features such as streams and wells, and auxiliary graphic objects such as grid lines and coordinate axes. Users may crop the model grid in different orientations to examine the interior structure of the data. For transient simulations, Model Viewer can animate the time evolution of the simulated quantities. The current version (1.0) of Model Viewer runs on Microsoft Windows 95, 98, NT and 2000 operating systems, and supports the following models: MODFLOW-2000, MODFLOW-2000 with the Ground-Water Transport Process, MODFLOW-96, MOC3D (Version 3.5), MODPATH, MT3DMS, and SUTRA (Version 2D3D.1). Model Viewer is designed to directly read input and output files from these models, thus minimizing the need for additional postprocessing. This report provides an overview of Model Viewer. Complete instructions on how to use the software are provided in the on-line help pages.

  16. User Guide for the International Jobs and Economic Development Impacts Model

    SciTech Connect

    Keyser, David; Flores-Espino, Francisco; Uriarte, Caroline; Cox, Sadie

    2016-09-01

    The International Jobs and Economic Development Impacts (I-JEDI) model is a freely available economic model that estimates gross economic impacts from wind, solar, and geothermal energy projects for several different countries. Building on the original JEDI model, which was developed for the United States, I-JEDI was developed under the USAID Enhancing Capacity for Low Emission Development Strategies (EC-LEDS) program to support countries in assessing economic impacts of LEDS actions in the energy sector. I-JEDI estimates economic impacts by characterizing the construction and operation of energy projects in terms of expenditures and the portion of these expenditures made within the country of analysis. These data are then used in a country-specific input-output (I-O) model to estimate employment, earnings, gross domestic product (GDP), and gross output impacts. Total economic impacts are presented as well as impacts by industry. This user guide presents general information about how to use I-JEDI and interpret results as well as detailed information about methodology and model limitations.
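
    The impact calculation sketched in this abstract is the standard Leontief input-output chain: in-country expenditures become a final-demand shock, the I-O matrix propagates it through the economy, and employment and earnings coefficients convert gross output into impacts. The snippet below is a minimal illustration of that chain; the matrices, coefficients, and industry breakdown are placeholders, not I-JEDI's country data.

```python
import numpy as np

# A[i, j]: input required from industry i per unit of output of industry j
# (illustrative technical coefficients, not an actual I-JEDI table)
A = np.array([[0.10, 0.05],
              [0.20, 0.15]])

# In-country project expenditures by industry (the final-demand shock)
d = np.array([1_000_000.0, 250_000.0])

# Leontief inverse applied to the shock: x = (I - A)^(-1) d
x = np.linalg.solve(np.eye(2) - A, d)

# Placeholder jobs/earnings per unit of gross output, by industry
jobs_per_output = np.array([8e-6, 12e-6])
earnings_per_output = np.array([0.30, 0.45])

print("Gross output impact by industry:", x)
print("Employment impact (jobs):", jobs_per_output @ x)
print("Earnings impact:", earnings_per_output @ x)
```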

  17. Surrogacy theory and models of convoluted organic systems.

    PubMed

    Konopka, Andrzej K

    2007-03-01

    The theory of surrogacy is briefly outlined as one of the conceptual foundations of systems biology, developed over the last 30 years in the context of the Hertz-Rosen modeling relationship. Conceptual foundations of modeling convoluted (biologically complex) systems are briefly reviewed and discussed in terms of current and future research in systems biology. New as well as older results that pertain to the concepts of modeling relationship, sequence of surrogacies, cascade of representations, complementarity, analogy, metaphor, and epistemic time are presented together with a classification of models in a cascade. Examples of anticipated future applications of surrogacy theory in the life sciences are briefly discussed.

  18. Effective Lagrangian Models for gauge theories of fundamental interactions

    NASA Astrophysics Data System (ADS)

    Sannino, Francesco

    The non-abelian gauge theory which describes, in the perturbative regime, the strong interactions is Quantum Chromodynamics (QCD). Quarks and gluons are the fundamental degrees of freedom of the theory. A key feature of the theory (due to quantum corrections) is asymptotic freedom, i.e. the strong coupling constant increases as the energy scale of interest decreases. The perturbative approach becomes unreliable below a characteristic scale of the theory (Λ). Quarks and gluons confine themselves into colorless particles called hadrons (pions, protons, ...). The latter are the true physical states of the theory. We need to investigate alternative ways to describe strong interactions, and in general any asymptotically free theory, in the non-perturbative regime. This is the fundamental motivation of the present thesis. Although the underlying gauge theory cannot be easily treated in the non-perturbative regime, we can still use its global symmetries as a guide to build Effective Lagrangian Models. These models are written directly in terms of the colorless physical states of the theory, i.e. hadrons.
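
    For orientation, the asymptotic freedom described above corresponds to the standard one-loop running of the strong coupling (a textbook result quoted here for reference, not a formula taken from the thesis):

    $$ \alpha_s(Q^2) = \frac{12\pi}{(33 - 2 n_f)\,\ln(Q^2/\Lambda^2)}, $$

    where $n_f$ is the number of active quark flavours. The coupling grows without bound as the momentum scale $Q$ approaches $\Lambda$, which is precisely the regime where the effective Lagrangian description in terms of hadrons takes over.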

  19. Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation.

    NASA Astrophysics Data System (ADS)

    Lawley, Russell; Barron, Mark; Lark, Murray

    2015-04-01

    At BGS, expert elicitation has been used to evaluate uncertainty of surveyed boundaries in several common geological scenarios. As a result, a 'collective' understanding of the issues surrounding each scenario has emerged. The work has provoked wider debate in three key areas: (a) what can we do to resolve those scenarios where a 'consensus' of understanding cannot be achieved; (b) what does it mean for survey practices and the subsequent use of maps in 3D models; and (c) how do we communicate the 'collective' understanding of geological mapping (with or without consensus for specific scenarios). Previous work elicited expert judgement for uncertainty in six contrasting mapping scenarios. In five cases it was possible to arrive at a consensus model; in a sixth case, experts with different experience (length of service, academic background) took very different views of the nature of the mapping problem. The scenario concerned identification of the boundary between two contrasting tills (one derived from Triassic source materials, being red in colour; the other, derived from Jurassic materials, being grey in colour). Initial debate during the elicitation identified that the colour contrast should provide some degree of confidence in locating the boundary via traditional auger-traverse survey methods. However, as the elicitation progressed, it became clear that the complexities of the relationship between the two tills were not uniformly understood across the experts, and the panel could not agree a consensus regarding the spatial uncertainty of the boundary. The elicitation process allowed a significant degree of structured knowledge exchange between experts of differing backgrounds and was successful in identifying a measure of uncertainty for what was considered a contentious scenario. However, the findings have significant implications for a boundary scenario that is widely mapped across the central regions of Great Britain. We will discuss our experience of the use of

  20. User Requirements from the Climate Modelling Community for Next Generation Global Products from Land Cover CCI Project

    NASA Astrophysics Data System (ADS)

    Kooistra, Lammert; van Groenestijn, Annemarie; Kalogirou, Vasileios; Arino, Olivier; Herold, Martin

    2011-01-01

    Land Cover has been selected as one of 11 Essential Climate Variables which will be elaborated during the first phase of the ESA Climate Change Initiative (2010-2013). In the first stage of the Land Cover CCI project, a user requirements analysis was carried out, on the basis of which the detailed specifications of a global land cover product can be defined that match the requirements from the Global Climate Observing System (GCOS) and the climate modelling community. As part of the requirements analysis, a user consultation mechanism was set up to actively involve different climate modelling groups, by sending out surveys to different types of users within the climate modelling community and the broader land cover data user community. The evolution of requirements from current models to future new modelling approaches was specifically taken into account. In addition, requirements from the GCOS Implementation Plans of 2004 and 2010 and associated strategic earth observation documents for land cover were assessed, and a detailed literature review was carried out. The outcome of the user requirements assessment shows that although the range of requirements coming from the climate modelling community is broad, there is a good match among the requirements coming from different user groups and the broader requirements derived from GCOS, CMUG, and other relevant international panels. More specific requirements highlight that future land cover datasets should be both stable and have a dynamic component; deal with the consistency in relationships between land cover classes and land surface parameters; should provide flexibility to serve different scales and purposes; and should provide transparency of product quality. As a next step within the Land Cover CCI project, the outcome of this user requirements analysis will be used as input for the product specification of the next generation of global land cover datasets.

  1. Homogeneous cosmological models in Yang's gravitation theory

    NASA Technical Reports Server (NTRS)

    Fennelly, A. J.; Pavelle, R.

    1979-01-01

    We present a dynamic, spatially homogeneous solution of Yang's pure space gravitational field equations which is non-Einsteinian. The predictions of this cosmological model seem to be at variance with observations.

  2. Modeling workplace bullying using catastrophe theory.

    PubMed

    Escartin, J; Ceja, L; Navarro, J; Zapf, D

    2013-10-01

    Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the present study examines whether a nonlinear dynamical systems model (i.e., a cusp catastrophe model) is superior to the linear combination of variables for predicting the effect of psychosocial safety climate and workplace bullying victimization on workplace bullying perpetration. According to the AICc and BIC indices, the linear regression model fits the data better than the cusp catastrophe model. The study concludes that some phenomena, especially unhealthy behaviors at work (like workplace bullying), may be better studied using linear approaches as opposed to nonlinear dynamical systems models. This can be explained through the healthy variability hypothesis, which argues that positive organizational behavior is likely to present nonlinear behavior, while a decrease in such variability may indicate the occurrence of negative behaviors at work.
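
    For context, the cusp catastrophe referred to above is conventionally written as a potential over the behavioral variable $y$ with two control parameters; the mapping of the study's measures onto those controls below is illustrative, not the authors' exact specification:

    $$ V(y) = \tfrac{1}{4}y^{4} - \tfrac{1}{2}\beta y^{2} - \alpha y, \qquad \frac{\partial V}{\partial y} = y^{3} - \beta y - \alpha = 0 . $$

    Equilibria satisfy the cubic on the right; for $\beta > 0$ two stable states can coexist, so small changes in the controls (here, for example, psychosocial safety climate and victimization) could in principle produce sudden jumps in perpetration. This is the nonlinear alternative that, by the AICc and BIC comparison, fit the data worse than the linear model.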

  3. User's Guide for the Agricultural Non-Point Source (AGNPS) Pollution Model Data Generator

    USGS Publications Warehouse

    Finn, Michael P.; Scheidt, Douglas J.; Jaromack, Gregory M.

    2003-01-01

    BACKGROUND: Throughout this user guide, we refer to datasets that we used in conjunction with the development of this software for supporting cartographic research and producing the datasets to conduct research. However, this software can be used with these datasets or with more 'generic' versions of data of the appropriate type. For example, throughout the guide, we refer to national land cover data (NLCD) and digital elevation model (DEM) data from the U.S. Geological Survey (USGS) at 30-m resolution, but any digital terrain model or land cover data at any appropriate resolution will produce results. Another key point to keep in mind is to use a consistent data resolution for all the datasets in a given model run. The U.S. Department of Agriculture (USDA) developed the Agricultural Nonpoint Source (AGNPS) pollution model of watershed hydrology in response to the complex problem of managing nonpoint sources of pollution. AGNPS simulates the behavior of runoff, sediment, and nutrient transport from watersheds that have agriculture as their prime use. The model operates on a cell basis and is a distributed-parameter, event-based model. The model requires 22 input parameters. Output parameters are grouped primarily by hydrology, sediment, and chemical output (Young and others, 1995). Elevation, land cover, and soil are the base data from which to extract the 22 input parameters required by AGNPS. For automatic parameter extraction, follow the general process described in this guide of extraction from the geospatial data through the AGNPS Data Generator to generate the input parameters required by the pollution model (Finn and others, 2002).

  4. Teaching-Learning by Means of a Fuzzy-Causal User Model

    NASA Astrophysics Data System (ADS)

    Peña Ayala, Alejandro

    In this research the teaching-learning phenomenon that occurs during an E-learning experience is tackled from a fuzzy-causal perspective. The approach is suitable for dealing with intangible objects of a domain, such as personality, that are stated as linguistic variables. In addition, the bias that teaching content exerts on the user's mind is sketched through causal relationships. Moreover, by means of fuzzy-causal inference, the user's apprenticeship is estimated prior to delivering a lecture. This estimate is used to adapt the behavior of a Web-based education system (WBES). As a result of an experimental trial, volunteers who received lecture options chosen by this user model (UM) achieved higher learning than participants whose lecture options were randomly selected. Such empirical evidence highlights, for researchers, the added value that a UM offers in adapting a WBES.
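
    One common way to operationalize fuzzy-causal inference over linguistic variables is a fuzzy cognitive map: concepts hold activations in [0, 1] and signed weights encode causal influence. The sketch below illustrates that general mechanism only; the concepts, weights, and update rule are invented for illustration and are not the paper's actual user model.

```python
import numpy as np

# Toy fuzzy-causal user model (illustrative concepts and weights only)
concepts = ["openness", "lecture_difficulty", "apprenticeship"]

# W[i, j] in [-1, 1]: causal influence of concept i on concept j
W = np.array([
    [0.0, 0.0,  0.6],   # openness promotes apprenticeship
    [0.0, 0.0, -0.4],   # difficulty hinders apprenticeship
    [0.0, 0.0,  0.0],
])

def squash(x):
    """Keep activations in (0, 1), as in common fuzzy-cognitive-map variants."""
    return 1.0 / (1.0 + np.exp(-x))

def infer(state, steps=10):
    """Iterate the causal map; each concept absorbs its weighted causes."""
    for _ in range(steps):
        state = squash(state + W.T @ state)
    return state

# Linguistic inputs already fuzzified into [0, 1]
state0 = np.array([0.8, 0.6, 0.5])
print(dict(zip(concepts, infer(state0).round(3))))
```

    The resulting "apprenticeship" activation is the kind of estimate a WBES could consult before choosing which lecture option to deliver.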

  5. Integrating developmental theory and methodology: Using derivatives to articulate change theories, models, and inferences

    PubMed Central

    Nicholson, Jody; Kouros, Chrystyna; Little, Todd D.; Garber, Judy

    2015-01-01

    Matching theories about growth, development, and change to appropriate statistical models can present a challenge, which can result in misuse, misinterpretation, and underutilization of different analytical approaches. We discuss the use of derivatives --- the change of a construct with respect to changes in another construct. Derivatives provide a common language linking developmental theory and statistical methods. Conceptualizing change in terms of derivatives allows precise translation of theory into method and highlights commonly overlooked models of change. A wide variety of models can be understood in terms of the level, velocity and acceleration of constructs: the 0th, 1st, and 2nd derivatives, respectively. We introduce the language of derivatives, and highlight the conceptually differing questions that can be addressed in developmental studies. A substantive example is presented to demonstrate how common and unfamiliar statistical methodology can be understood as addressing relations between differing pairs of derivatives. PMID:26949327
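
    As a concrete illustration of the level/velocity/acceleration vocabulary (the quadratic growth curve is an example chosen here, not one from the article):

    $$ y(t) = \beta_0 + \beta_1 t + \beta_2 t^{2}, \qquad \frac{dy}{dt} = \beta_1 + 2\beta_2 t, \qquad \frac{d^{2}y}{dt^{2}} = 2\beta_2 . $$

    The intercept sets the level, the first derivative is the (time-varying) velocity of change, and the constant second derivative is the acceleration; differing developmental hypotheses then amount to relating different pairs of these derivatives within or across constructs.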

  6. HIGHWAY 3. 1: An enhanced HIGHWAY routing model: Program description, methodology, and revised user's manual

    SciTech Connect

    Johnson, P.E.; Joy, D.S. ); Clarke, D.B.; Jacobi, J.M. . Transportation Center)

    1993-03-01

    The HIGHWAY program provides a flexible tool for predicting highway routes for transporting radioactive materials in the United States. The HIGHWAY data base is essentially a computerized road atlas that currently describes over 240,000 miles of highways. Complete descriptions of all Interstate System and most US highways (except those that parallel a nearby Interstate highway) are included in the data base. Many of the principal state highways and a number of local and county highways are also identified. The data base also includes locations of nuclear facilities and major airports. Several different types of routes may be calculated, depending on a set of user-supplied constraints. Routes are calculated by minimizing the total impedance between the origin and the destination. Basically, the impedance is defined as a function of distance and driving time along a particular highway segment. Several routing constraints can be imposed during the calculations. One of the special features of the HIGHWAY model is its ability to calculate routes that maximize use of Interstate System highways. This feature allows the user to predict routes for shipments of radioactive materials that conform to the US Department of Transportation routing regulations. Other features of the model include the ability to predict routes that bypass a specific state, city, town, or highway segment. Two special features have been incorporated in HIGHWAY, version 3.1. The first is the ability to automatically calculate alternative routes. Frequently, there are a number of routes between the source and destination that vary slightly in distance and estimated driving time. The HIGHWAY program offers a selection of different but nearly equal routes. The second special feature is the capability to calculate route-specific population density statistics. The population density distribution is calculated for each highway segment in the route and is reported on a state-by-state basis.
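
    Minimizing total impedance over a road network is a shortest-path computation. The sketch below shows that idea on a toy graph with an illustrative impedance of the form distance plus weighted driving time; the network, weights, and functional form are assumptions for illustration, not the HIGHWAY data base or algorithm.

```python
import heapq

# Toy road network: node -> list of (neighbor, miles, hours)
roads = {
    "A": [("B", 120, 2.0), ("C", 90, 2.5)],
    "B": [("D", 100, 1.6)],
    "C": [("D", 150, 2.4)],
    "D": [],
}

def impedance(miles, hours, time_weight=40.0):
    """Illustrative impedance: distance plus weighted driving time."""
    return miles + time_weight * hours

def min_impedance_route(origin, destination):
    """Dijkstra search that minimizes summed segment impedance."""
    best = {origin: 0.0}
    queue = [(0.0, origin, [origin])]
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == destination:
            return cost, path
        if cost > best.get(node, float("inf")):
            continue
        for nbr, miles, hours in roads[node]:
            new_cost = cost + impedance(miles, hours)
            if new_cost < best.get(nbr, float("inf")):
                best[nbr] = new_cost
                heapq.heappush(queue, (new_cost, nbr, path + [nbr]))
    return float("inf"), []

print(min_impedance_route("A", "D"))  # -> (364.0, ['A', 'B', 'D'])
```

    Alternative routes, bypass constraints, and Interstate preference can be expressed in the same framework by modifying the impedance function or pruning segments before the search.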

  7. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    SciTech Connect

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  8. Landfill Air Emissions Estimation Model, Version 1. 1. User's manual. Final report

    SciTech Connect

    Pelt, W.R.; Bass, R.L.; Kuo, I.R.; Blackard, A.L.

    1991-04-01

    The document is a user's guide for the computer program, Landfill Air Emissions Estimation Model. It provides step-by-step guidance for using the program to estimate landfill air emissions. The purpose of the program is to aid local and state agencies in estimating landfill air emission rates for nonmethane organic compounds and individual air toxics. The program will also be helpful to landfill owners and operators affected by the upcoming New Source Performance Standard (NSPS) and Emission Guidelines for Municipal Solid Waste Landfill Air Emissions. The model is based on the Scholl Canyon Gas Generation Model, used in the development of the soon-to-be-proposed regulation for landfill air emissions. The Scholl Canyon Model is a first order decay equation that uses site-specific characteristics for estimating the gas generation rate. In the absence of site-specific data, the program provides conservative default values from the soon-to-be-proposed NSPS for new landfills and emission guidelines for existing landfills. These default values may be revised based on future information collected by the Agency.
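
    For orientation, Scholl Canyon-type models estimate the gas generation rate from the waste in place with a first-order decay term; a generic single-placement form (the symbols here are illustrative, not the program's input names) is

    $$ Q(t) = L_0\, M\, k\, e^{-k t}, $$

    where $M$ is the mass of refuse accepted, $L_0$ the ultimate gas yield per unit mass, $k$ the first-order decay constant, and $t$ the time since placement; the contributions of successive yearly placements are summed to obtain the landfill total.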

  9. A computer graphical user interface for survival mixture modelling of recurrent infections.

    PubMed

    Lee, Andy H; Zhao, Yun; Yau, Kelvin K W; Ng, S K

    2009-03-01

    Recurrent infections data are commonly encountered in medical research, where the recurrent events are characterised by an acute phase followed by a stable phase after the index episode. Two-component survival mixture models, in both proportional hazards and accelerated failure time settings, are presented as a flexible method of analysing such data. To account for the inherent dependency of the recurrent observations, random effects are incorporated within the conditional hazard function, in the manner of generalised linear mixed models. Assuming a Weibull or log-logistic baseline hazard in both mixture components of the survival mixture model, an EM algorithm is developed for the residual maximum quasi-likelihood estimation of fixed effect and variance component parameters. The methodology is implemented as a graphical user interface coded using Microsoft Visual C++. Application to modelling recurrent urinary tract infections in elderly women is illustrated, where significant individual variations are evident at both acute and stable phases. The survival mixture methodology developed enables practitioners to identify pertinent risk factors affecting the recurrent times and to draw valid conclusions from these correlated and heterogeneous survival data.
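
    Schematically, a two-component survival mixture of the kind described combines an acute-phase and a stable-phase survival function (the generic Weibull parameterization below omits the covariates and random effects that the paper builds into the conditional hazards):

    $$ S(t) = \pi\, S_1(t) + (1-\pi)\, S_2(t), \qquad S_j(t) = \exp\!\left[-(\lambda_j t)^{\gamma_j}\right], $$

    where $\pi$ is the mixing proportion and $(\lambda_j, \gamma_j)$ are the scale and shape parameters of component $j$.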

  10. User's manual for the Sandia Waste-Isolation Flow and Transport model (SWIFT).

    SciTech Connect

    Reeves, Mark; Cranwell, Robert M.

    1981-11-01

    This report describes a three-dimensional finite-difference model (SWIFT) which is used to simulate flow and transport processes in geologic media. The model was developed for use by the Nuclear Regulatory Commission in the analysis of deep geologic nuclear waste-disposal facilities. This document, as indicated by the title, is a user's manual and is intended to facilitate the use of the SWIFT simulator. Mathematical equations, submodels, application notes, and a description of the program itself are given herein. In addition, a complete input data guide is given along with several appendices which are helpful in setting up a data-input deck. Computer code SWIFT (Sandia Waste Isolation, Flow and Transport Model) is a fully transient, three-dimensional model which solves the coupled equations for transport in geologic media. The processes considered are: (1) fluid flow; (2) heat transport; (3) dominant-species miscible displacement; and (4) trace-species miscible displacement. The first three processes are coupled via fluid density and viscosity. Together they provide the velocity field on which the fourth process depends.

  11. BUMPER v1.0: a Bayesian user-friendly model for palaeo-environmental reconstruction

    NASA Astrophysics Data System (ADS)

    Holden, Philip B.; Birks, H. John B.; Brooks, Stephen J.; Bush, Mark B.; Hwang, Grace M.; Matthews-Bird, Frazer; Valencia, Bryan G.; van Woesik, Robert

    2017-02-01

    We describe the Bayesian user-friendly model for palaeo-environmental reconstruction (BUMPER), a Bayesian transfer function for inferring past climate and other environmental variables from microfossil assemblages. BUMPER is fully self-calibrating, straightforward to apply, and computationally fast, requiring ˜ 2 s to build a 100-taxon model from a 100-site training set on a standard personal computer. We apply the model's probabilistic framework to generate thousands of artificial training sets under ideal assumptions. We then use these to demonstrate the sensitivity of reconstructions to the characteristics of the training set, considering assemblage richness, taxon tolerances, and the number of training sites. We find that a useful guideline for the size of a training set is to provide, on average, at least 10 samples of each taxon. We demonstrate general applicability to real data, considering three different organism types (chironomids, diatoms, pollen) and different reconstructed variables. An identically configured model is used in each application, the only change being the input files that provide the training-set environment and taxon-count data. The performance of BUMPER is shown to be comparable with weighted average partial least squares (WAPLS) in each case. Additional artificial datasets are constructed with similar characteristics to the real data, and these are used to explore the reasons for the differing performances of the different training sets.
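
    The core of a Bayesian transfer function of this general kind can be stated schematically (this is the generic structure of the approach, not BUMPER's exact likelihood):

    $$ p(x \mid \mathbf{y}) \;\propto\; p(x) \prod_{k} p(y_k \mid x), $$

    where $x$ is the environmental variable being reconstructed, $y_k$ is the count or occurrence of taxon $k$ in the fossil assemblage, and each $p(y_k \mid x)$ is a taxon response function (for example, a Gaussian response with an optimum and tolerance) calibrated from the training set.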

  12. A user-friendly one-dimensional model for wet volcanic plumes

    USGS Publications Warehouse

    Mastin, Larry G.

    2007-01-01

    This paper presents a user-friendly graphically based numerical model of one-dimensional steady state homogeneous volcanic plumes that calculates and plots profiles of upward velocity, plume density, radius, temperature, and other parameters as a function of height. The model considers effects of water condensation and ice formation on plume dynamics as well as the effect of water added to the plume at the vent. Atmospheric conditions may be specified through input parameters of constant lapse rates and relative humidity, or by loading profiles of actual atmospheric soundings. To illustrate the utility of the model, we compare calculations with field-based estimates of plume height (∼9 km) and eruption rate (>∼4 × 105 kg/s) during a brief tephra eruption at Mount St. Helens on 8 March 2005. Results show that the atmospheric conditions on that day boosted plume height by 1–3 km over that in a standard dry atmosphere. Although the eruption temperature was unknown, model calculations most closely match the observations for a temperature that is below magmatic but above 100°C.

  13. A simplistic model for identifying prominent web users in directed multiplex social networks: a case study using Twitter networks

    NASA Astrophysics Data System (ADS)

    Loucif, Hemza; Boubetra, Abdelhak; Akrouf, Samir

    2016-10-01

    This paper aims to describe a new simplistic model dedicated to gauging the online influence of Twitter users based on a mixture of structural and interactional features. The model is an additive mathematical formulation with two main parts. The first part measures the influence of the Twitter user on his immediate neighbourhood, i.e. his followers. The second part evaluates the potential influence of the Twitter user beyond the circle of his followers; in particular, it measures the likelihood that the user's tweets will spread further within the social graph through the retweeting process. The model is tested on a data set involving four kinds of real-world egocentric networks. The empirical results reveal that an active ordinary user is more prominent than an inactive celebrity. A simple comparison is conducted between the proposed model and two existing simplistic approaches. The results show that our model generates the most realistic influence scores due to its handling of both explicit (structural and interactional) and implicit features.
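
    A minimal sketch of an additive two-part influence score of the kind described follows; the feature names, weights, and spread proxy are hypothetical stand-ins meant only to illustrate the additive structure, not the paper's actual formulation.

```python
def influence_score(followers, interactions_per_tweet,
                    retweets_per_tweet, followers_of_retweeters,
                    w_local=0.5, w_spread=0.5):
    """Additive two-part influence score (illustrative formulation).

    Part 1: influence over the user's own followers, weighted by how
            actively those followers interact with the user's tweets.
    Part 2: potential reach beyond the followers, proxied by retweet
            frequency and the audience size of the retweeters.
    """
    local = interactions_per_tweet * followers
    spread = retweets_per_tweet * followers_of_retweeters
    return w_local * local + w_spread * spread

# Toy comparison: an active ordinary user vs. an inactive celebrity
print(influence_score(2_000, 0.05, 0.02, 50_000))             # -> 550.0
print(influence_score(1_000_000, 0.0005, 0.0001, 2_000_000))  # -> 350.0
```

    Under these toy numbers the active ordinary user outscores the inactive celebrity, which is the qualitative behavior the abstract reports.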

  14. Theory and modeling of electron fishbones

    NASA Astrophysics Data System (ADS)

    Vlad, G.; Fusco, V.; Briguglio, S.; Fogaccia, G.; Zonca, F.; Wang, X.

    2016-10-01

    Internal kink instabilities exhibiting fishbone-like behavior have been observed in a variety of experiments where a high-energy electron population, generated by strong auxiliary heating and/or current drive systems, was present. After briefly reviewing the experimental evidence for energetic-electron-driven fishbones, and the main results of the linear and nonlinear theory of electron fishbones, the results of global, self-consistent, nonlinear hybrid MHD-Gyrokinetic simulations will be presented. To this purpose, the extended/hybrid MHD-Gyrokinetic code XHMGC will be used. Linear dynamics analysis will highlight the effect of considering kinetic thermal ion compressibility and diamagnetic response, and kinetic thermal electron compressibility, in addition to the energetic electron contribution. Nonlinear saturation and energetic electron transport will also be addressed, making extensive use of Hamiltonian mapping techniques, discussing both centrally peaked and off-axis peaked energetic electron profiles. It will be shown that centrally peaked energetic electron profiles are characterized by resonant excitation and nonlinear response of deeply trapped energetic electrons. On the other hand, off-axis peaked energetic electron profiles are characterized by resonant excitation and nonlinear response of barely circulating energetic electrons which experience toroidal precession reversal of their motion.

  15. A catastrophe theory model of the conflict helix, with tests.

    PubMed

    Rummel, R J

    1987-10-01

    Macro social field theory has undergone extensive development and testing since the 1960s. One of these developments has been the articulation of an appropriate conceptual micro model, called the conflict helix, for understanding the process from conflict to cooperation and vice versa. Conflict and cooperation are viewed as distinct equilibria of forces in a social field; the movement between these equilibria is a jump, energized by a gap between social expectations and power, and triggered by some minor event. Quite independently, there has also been much recent application of catastrophe theory to social behavior, but usually without a clear substantive theory and lacking empirical testing. This paper uses catastrophe theory, namely the butterfly model, to structure the conflict helix mathematically. The social field framework and helix provide the substantive interpretation for the catastrophe theory, and catastrophe theory provides a suitable mathematical model for the conflict helix. The model is tested on the annual conflict and cooperation between India and Pakistan, 1948 to 1973. The results are generally positive and encouraging.

  16. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  17. Comparison of kinetic theory models of laser ablation of carbon

    SciTech Connect

    Shusser, Michael

    2010-05-15

    The paper compares the predictions of three-dimensional kinetic theory models of laser ablation of carbon. All the models are based on the moment solution of the Boltzmann equation for arbitrary strong evaporation but use different approximations. Comparison of the model predictions demonstrated that the choice of the particular model has very little influence on the results. The influence of the heat conduction from the gas to the solid phase was also found to be negligible in this problem.

  18. Building Models in the Classroom: Taking Advantage of Sophisticated Geomorphic Numerical Tools Using a Simple Graphical User Interface

    NASA Astrophysics Data System (ADS)

    Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.

    2014-12-01

    Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphical user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry- and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.

  19. Modeling Developmental Transitions in Adaptive Resonance Theory

    ERIC Educational Resources Information Center

    Raijmakers, Maartje E. J.; Molenaar, Peter C. M.

    2004-01-01

    Neural networks are applied to a theoretical subject in developmental psychology: modeling developmental transitions. Two issues that are involved will be discussed: discontinuities and acquiring qualitatively new knowledge. We will argue that by the appearance of a bifurcation, a neural network can show discontinuities and may acquire…

  20. Modeling Environmental Concern: Theory and Application.

    ERIC Educational Resources Information Center

    Hackett, Paul M. W.

    1993-01-01

    Human concern for the quality and protection of the natural environment forms the basis of successful environmental conservation activities. Considers environmental concern research and proposes a model that incorporates the multiple dimensions of research through which environmental concern may be evaluated. (MDH)

  1. PARFUME User's Guide

    SciTech Connect

    Kurt Hamman

    2010-09-01

    PARFUME, a fuel performance analysis and modeling code, is being developed at the Idaho National Laboratory for evaluating gas reactor coated particle fuel assemblies for prismatic, pebble bed, and plate type fuel geometries. The code is an integrated mechanistic analysis tool that evaluates the thermal, mechanical, and physico-chemical behavior of coated fuel particles (TRISO) and the probability for fuel failure given the particle-to-particle statistical variations in physical dimensions and material properties that arise during the fuel fabrication process. Using a robust finite difference numerical scheme, PARFUME is capable of performing steady state and transient heat transfer and fission product diffusion analyses for the fuel. Written in FORTRAN 90, PARFUME is easy to read, maintain, and modify. Currently, PARFUME is supported only on MS Windows platforms. This document represents the initial version of the PARFUME User Guide, a supplement to the PARFUME Theory and Model Basis Report which describes the theoretical aspects of the code. User information is provided including: 1) code development, 2) capabilities and limitations, 3) installation and execution, 4) user input and output, 5) sample problems, and 6) error messages. In the near future, the INL plans to release a fully benchmarked and validated beta version of PARFUME.

  2. Attachment theory and theory of planned behavior: an integrative model predicting underage drinking.

    PubMed

    Lac, Andrew; Crano, William D; Berger, Dale E; Alvaro, Eusebio M

    2013-08-01

    Research indicates that peer and maternal bonds play important but sometimes contrasting roles in the outcomes of children. Less is known about attachment bonds to these 2 reference groups in young adults. Using a sample of 351 participants (18 to 20 years of age), the research integrated two theoretical traditions: attachment theory and theory of planned behavior (TPB). The predictive contribution of both theories was examined in the context of underage adult alcohol use. Using full structural equation modeling, results substantiated the hypotheses that secure peer attachment positively predicted norms and behavioral control toward alcohol, but secure maternal attachment inversely predicted attitudes and behavioral control toward alcohol. Alcohol attitudes, norms, and behavioral control each uniquely explained alcohol intentions, which anticipated an increase in alcohol behavior 1 month later. The hypothesized processes were statistically corroborated by tests of indirect and total effects. These findings support recommendations for programs designed to curtail risky levels of underage drinking using the tenets of attachment theory and TPB.

  3. PREDICTING ATTENUATION OF VIRUSES DURING PERCOLATION IN SOILS: 2. USER'S GUIDE TO THE VIRULO 1.0 COMPUTER MODEL

    EPA Science Inventory

    In the EPA document Predicting Attenuation of Viruses During Percolation in Soils 1. Probabilistic Model the conceptual, theoretical, and mathematical foundations for a predictive screening model were presented. In this current volume we present a User's Guide for the computer mo...

  4. An analytical approach to thermal modeling of Bridgman type crystal growth: One dimensional analysis. Computer program users manual

    NASA Technical Reports Server (NTRS)

    Cothran, E. K.

    1982-01-01

    The computer program written in support of one dimensional analytical approach to thermal modeling of Bridgman type crystal growth is presented. The program listing and flow charts are included, along with the complete thermal model. Sample problems include detailed comments on input and output to aid the first time user.

  5. Group theory and biomolecular conformation: I. Mathematical and computational models

    PubMed Central

    Chirikjian, Gregory S

    2010-01-01

    Biological macromolecules, and the complexes that they form, can be described in a variety of ways ranging from quantum mechanical and atomic chemical models, to coarser grained models of secondary structure and domains, to continuum models. At each of these levels, group theory can be used to describe both geometric symmetries and conformational motion. In this survey, a detailed account is provided of how group theory has been applied across computational structural biology to analyze the conformational shape and motion of macromolecules and complexes. PMID:20827378

  6. Minimal model of a heat engine: information theory approach.

    PubMed

    Zhou, Yun; Segal, Dvira

    2010-07-01

    We construct a generic model for a heat engine using information theory concepts, attributing irreversible energy dissipation to the information transmission channels. Using several forms for the channel capacity, classical and quantum, we demonstrate that our model recovers both the Carnot principle in the reversible limit, and the universal maximum power efficiency expression of nonreversible thermodynamics in the linear response regime. We expect the model to be very useful as a testbed for studying fundamental topics in thermodynamics, and for providing new insights into the relationship between information theory and actual thermal devices.
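
    The reversible limit cited above is the Carnot efficiency, and the commonly quoted finite-time benchmark is the Curzon-Ahlborn efficiency at maximum power,

    $$ \eta_C = 1 - \frac{T_c}{T_h}, \qquad \eta_{CA} = 1 - \sqrt{\frac{T_c}{T_h}} , $$

    for reservoirs at hot and cold temperatures $T_h$ and $T_c$; the linear-response expansion of the latter gives the universal maximum-power efficiency $\eta_C/2$ referred to in the abstract (these are standard thermodynamic results, quoted here for reference).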

  7. Automated Physico-Chemical Cell Model Development through Information Theory

    SciTech Connect

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  8. User's guide to revised method-of-characteristics solute-transport model (MOC--version 31)

    USGS Publications Warehouse

    Konikow, L.F.; Granato, G.E.; Hornberger, G.Z.

    1994-01-01

    The U.S. Geological Survey computer model to simulate two-dimensional solute transport and dispersion in ground water (Konikow and Bredehoeft, 1978; Goode and Konikow, 1989) has been modified to improve management of input and output data and to provide progressive run-time information. All opening and closing of files are now done automatically by the program. Names of input data files are entered either interactively or using a batch-mode script file. Names of output files, created automatically by the program, are based on the name of the input file. In the interactive mode, messages are written to the screen during execution to allow the user to monitor the status and progress of the simulation and to anticipate total running time. Information reported and updated during a simulation include the current pumping period and time step, number of particle moves, and percentage completion of the current time step. The batch mode enables a user to run a series of simulations consecutively, without additional control. A report of the model's activity in the batch mode is written to a separate output file, allowing later review. The user has several options for creating separate output files for different types of data. The formats are compatible with many commercially available applications, which facilitates graphical postprocessing of model results.

  9. Theory and Practice: An Integrative Model Linking Class and Field

    ERIC Educational Resources Information Center

    Lesser, Joan Granucci; Cooper, Marlene

    2006-01-01

    Social work has evolved over the years taking on the challenges of the times. The profession now espouses a breadth of theoretical approaches and treatment modalities. We have developed a model to help graduate social work students master the skill of integrating theory and social work practice. The Integrative Model has five components: (l) The…

  10. Chiral field theories as models for hadron substructure

    SciTech Connect

    Kahana, S.H.

    1987-03-01

    A model for the nucleon as soliton of quarks interacting with classical meson fields is described. The theory, based on the linear sigma model, is renormalizable and capable of including sea quarks straightforwardly. Application to nuclear matter is made in a Wigner-Seitz approximation.

  11. Minimax D-Optimal Designs for Item Response Theory Models.

    ERIC Educational Resources Information Center

    Berger, Martjin P. F.; King, C. Y. Joy; Wong, Weng Kee

    2000-01-01

    Proposed minimax designs for item response theory (IRT) models to overcome the problem of local optimality. Compared minimax designs to sequentially constructed designs for the two parameter logistic model. Results show that minimax designs can be nearly as efficient as sequentially constructed designs. (Author/SLD)
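
    For reference, the two-parameter logistic model referred to above gives the probability of a correct response to item $i$ by an examinee of ability $\theta$ as

    $$ P_i(\theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}, $$

    with discrimination $a_i$ and difficulty $b_i$. Because the information a design carries depends on the unknown item parameters themselves, locally optimal designs can perform poorly when the parameter guesses are wrong; minimax designs guard against the worst case over a plausible parameter region, which is the problem addressed here.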

  12. Minimal Pati-Salam model from string theory unification

    SciTech Connect

    Dent, James B.; Kephart, Thomas W.

    2008-06-01

    We provide what we believe is the minimal three-family N=1 SUSY and conformal Pati-Salam model from type IIB superstring theory. This Z_3 orbifolded AdS x S^5 model has long-lived protons and has potential phenomenological consequences for the LHC (Large Hadron Collider).

  13. The Mapping Model: A Cognitive Theory of Quantitative Estimation

    ERIC Educational Resources Information Center

    von Helversen, Bettina; Rieskamp, Jorg

    2008-01-01

    How do people make quantitative estimations, such as estimating a car's selling price? Traditionally, linear-regression-type models have been used to answer this question. These models assume that people weight and integrate all information available to estimate a criterion. The authors propose an alternative cognitive theory for quantitative…

  14. Reciprocal Ontological Models Show Indeterminism Comparable to Quantum Theory

    NASA Astrophysics Data System (ADS)

    Bandyopadhyay, Somshubhro; Banik, Manik; Bhattacharya, Some Sankar; Ghosh, Sibasish; Kar, Guruprasad; Mukherjee, Amit; Roy, Arup

    2017-02-01

    We show that within the class of ontological models due to Harrigan and Spekkens, those satisfying preparation-measurement reciprocity must allow indeterminism comparable to that in quantum theory. Our result implies that one can design a quantum random number generator for which it is impossible, even in principle, to construct a reciprocal deterministic model.

  15. Dust in fusion plasmas: theory and modeling

    SciTech Connect

    Smirnov, R. D.; Pigarov, A. Yu.; Krasheninnikov, S. I.; Mendis, D. A.; Rosenberg, M.; Rudakov, D.; Tanaka, Y.; Rognlien, T. D.; Soboleva, T. K.; Shukla, P. K.; Bray, B. D.; West, W. P.; Roquemore, A. L.; Skinner, C. H.

    2008-09-07

    Dust may have a large impact on ITER-scale plasma experiments including both safety and performance issues. However, the physics of dust in fusion plasmas is very complex and multifaceted. Here, we discuss different aspects of dust dynamics including dust-plasma, and dust-surface interactions. We consider the models of dust charging, heating, evaporation/sublimation, dust collision with material walls, etc., which are suitable for the conditions of fusion plasmas. The physical models of all these processes have been incorporated into the DUST Transport (DUSTT) code. Numerical simulations demonstrate that dust particles are very mobile and accelerate to large velocities due to the ion drag force (cruise speed >100 m/s). Deep penetration of dust particles toward the plasma core is predicted. It is shown that DUSTT is capable of reproducing many features of recent dust-related experiments, but much more work is still needed.

  16. A New User-Friendly Model to Reduce Cost for Headwater Benefits Assessment

    SciTech Connect

    Bao, Y.S.; Cover, C.K.; Perlack, R.D.; Sale, M.J.; Sarma, V.

    1999-07-07

    Headwater benefits at a downstream hydropower project are energy gains that are derived from the installation of upstream reservoirs. The Federal Energy Regulatory Commission is required by law to assess charges for such energy gains to downstream owners of non-federal hydropower projects. The high costs of determining headwater benefits prohibit the use of a complicated model in basins where the magnitude of the benefits is expected to be small. This paper presents a new user-friendly computer model, EFDAM (Enhanced Flow Duration Analysis Method), that not only improves the accuracy of the standard flow duration method but also reduces the costs of determining headwater benefits. The EFDAM model includes an MS Windows-based interface module that provides tools for automating input data file preparation, linking and executing a generic program, editing/viewing input/output files, and application guidance. EFDAM was applied to various river basins. An example is given to illustrate the main features of an EFDAM application for creating input files and assessing headwater benefits at the Tulloch Hydropower Plant in the Stanislaus River Basin, California.

  17. New theories of root growth modelling

    NASA Astrophysics Data System (ADS)

    Landl, Magdalena; Schnepf, Andrea; Vanderborght, Jan; Huber, Katrin; Javaux, Mathieu; Bengough, A. Glyn; Vereecken, Harry

    2016-04-01

    In dynamic root architecture models, root growth is represented by moving root tips whose line trajectory results in the creation of new root segments. Typically, the direction of root growth is calculated as the vector sum of various direction-affecting components. However, in our simulations this did not reproduce experimental observations of root growth in structured soil. We therefore developed a new approach to predict the root growth direction. In this approach we distinguish between, firstly, driving forces for root growth, i.e. the force exerted by the root which points in the direction of the previous root segment and gravitropism, and, secondly, the soil mechanical resistance to root growth or penetration resistance. The latter can be anisotropic, i.e. depending on the direction of growth, which leads to a difference between the direction of the driving force and the direction of the root tip movement. Anisotropy of penetration resistance can be caused either by microscale differences in soil structure or by macroscale features, including macropores. Anisotropy at the microscale is neglected in our model. To allow for this, we include a normally distributed random deflection angle α to the force which points in the direction of the previous root segment with zero mean and a standard deviation σ. The standard deviation σ is scaled, so that the deflection from the original root tip location does not depend on the spatial resolution of the root system model. Similarly to the water flow equation, the direction of the root tip movement corresponds to the water flux vector while the driving forces are related to the water potential gradient. The analogue of the hydraulic conductivity tensor is the root penetrability tensor. It is determined by the inverse of soil penetration resistance and describes the ease with which a root can penetrate the soil. By adapting the three dimensional soil and root water uptake model R-SWMS (Javaux et al., 2008) in this way

  18. Downsizer - A Graphical User Interface-Based Application for Browsing, Acquiring, and Formatting Time-Series Data for Hydrologic Modeling

    USGS Publications Warehouse

    Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.

    2009-01-01

    The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.

  19. Nanofluid Drop Evaporation: Experiment, Theory, and Modeling

    NASA Astrophysics Data System (ADS)

    Gerken, William James

    Nanofluids, stable colloidal suspensions of nanoparticles in a base fluid, have potential applications in the heat transfer, combustion and propulsion, manufacturing, and medical fields. Experiments were conducted to determine the evaporation rate of room temperature, millimeter-sized pendant drops of ethanol laden with varying amounts (0-3% by weight) of 40-60 nm aluminum nanoparticles (nAl). Time-resolved high-resolution drop images were collected for the determination of early-time evaporation rate (D^2/D_0^2 > 0.75), shown to exhibit D-square law behavior, and surface tension. Results show an asymptotic decrease in pendant drop evaporation rate with increasing nAl loading. The evaporation rate decreases by approximately 15% at around 1% to 3% nAl loading relative to the evaporation rate of pure ethanol. Surface tension was observed to be unaffected by nAl loading up to 3% by weight. A model was developed to describe the evaporation of the nanofluid pendant drops based on D-square law analysis for the gas domain and a description of the reduction in liquid fraction available for evaporation due to nanoparticle agglomerate packing near the evaporating drop surface. Model predictions are in relatively good agreement with experiment, within a few percent of measured nanofluid pendant drop evaporation rate. The evaporation of pinned nanofluid sessile drops was also considered via modeling. It was found that the same mechanism for nanofluid evaporation rate reduction used to explain pendant drops could be used for sessile drops. That mechanism is a reduction in evaporation rate due to a reduction in available ethanol for evaporation at the drop surface caused by the packing of nanoparticle agglomerates near the drop surface. Comparisons of the present modeling predictions with sessile drop evaporation rate measurements reported for nAl/ethanol nanofluids by Sefiane and Bennacer [11] are in fairly good agreement. Portions of this abstract previously appeared as: W. J
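
    The D-square law invoked above states that the squared drop diameter decreases linearly in time,

    $$ D^{2}(t) = D_0^{2} - K\,t, $$

    with $D_0$ the initial diameter and $K$ the evaporation rate constant; the roughly 15% reduction in $K$ with nanoparticle loading is the quantity the agglomerate-packing model is built to reproduce.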

  20. Theory and modeling of stereoselective organic reactions.

    PubMed

    Houk, K N; Paddon-Row, M N; Rondan, N G; Wu, Y D; Brown, F K; Spellmeyer, D C; Metz, J T; Li, Y; Loncharich, R J

    1986-03-07

    Theoretical investigations of the transition structures of additions and cycloadditions reveal details about the geometries of bond-forming processes that are not directly accessible by experiment. The conformational analysis of transition states has been developed from theoretical generalizations about the preferred angle of attack by reagents on multiple bonds and predictions of conformations with respect to partially formed bonds. Qualitative rules for the prediction of the stereochemistries of organic reactions have been devised, and semi-empirical computational models have also been developed to predict the stereoselectivities of reactions of large organic molecules, such as nucleophilic additions to carbonyls, electrophilic hydroborations and cycloadditions, and intramolecular radical additions and cycloadditions.

  1. Genetic model compensation: Theory and applications

    NASA Astrophysics Data System (ADS)

    Cruickshank, David Raymond

    1998-12-01

    The adaptive filtering algorithm known as Genetic Model Compensation (GMC) was originally presented in the author's Master's Thesis. The current work extends this earlier work. GMC uses a genetic algorithm to optimize filter process noise parameters in parallel with the estimation of the state, based only on the observational information available to the filter. The original stochastic state model underlying GMC was inherited from the antecedent, non-adaptive Dynamic Model Compensation (DMC) algorithm. The current work develops the stochastic state model from a linear system viewpoint, avoiding the simplifications and approximations of the earlier development, and establishes Riemann sums as unbiased estimators of the stochastic integrals which describe the evolution of the random state components. These are significant developments which provide GMC with a solid theoretical foundation. Orbit determination is the area of application in this work, and two types of problems are studied: real-time autonomous filtering using absolute GPS measurements and precise post-processed filtering using differential GPS measurements. The first type is studied in a satellite navigation simulation in which pseudorange and pseudorange rate measurements are processed by an Extended Kalman Filter which incorporates both DMC and GMC. Both estimators are initialized by a geometric point solution algorithm. Using measurements corrupted by simulated Selective Availability errors, GMC reduces mean RSS position error by 6.4 percent, reduces mean clock bias error by 46 percent, and displays a marked improvement in covariance consistency relative to DMC. To study the second type of problem, GMC is integrated with NASA Jet Propulsion Laboratory's Gipsy/Oasis-II (GOA-II) precision orbit determination program, creating an adaptive version of GOA-II's Reduced Dynamic Tracking (RDT) process noise formulation. When run as a sequential estimator with GPS measurements from the TOPEX satellite and
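
    As a schematic of the adaptive idea described above (a genetic algorithm tuning filter process noise using only information available to the filter), the sketch below evolves a scalar process-noise variance for a toy linear Kalman filter by scoring candidates on their innovation log-likelihood. The fitness definition, the mutation scheme, and all numbers are assumptions for illustration; they are not the GMC algorithm itself.

        # Toy GA-tuned process noise for a scalar random-walk Kalman filter.
        import numpy as np

        def innovation_log_likelihood(q, zs, r=1.0):
            """Run the filter with process noise q; return summed innovation log-likelihood."""
            x, p, ll = 0.0, 1.0, 0.0
            for z in zs:
                p += q                       # predict
                s = p + r                    # innovation variance
                v = z - x                    # innovation
                ll += -0.5 * (np.log(2 * np.pi * s) + v * v / s)
                k = p / s                    # update
                x += k * v
                p *= (1 - k)
            return ll

        def genetic_tune(zs, pop_size=20, generations=30, rng=np.random.default_rng(0)):
            pop = rng.uniform(1e-4, 1.0, pop_size)                # candidate q values
            for _ in range(generations):
                fitness = np.array([innovation_log_likelihood(q, zs) for q in pop])
                parents = pop[np.argsort(fitness)][-pop_size // 2:]    # keep the best half
                children = parents * np.exp(0.1 * rng.standard_normal(parents.size))  # mutate
                pop = np.concatenate([parents, children])
            return pop[np.argmax([innovation_log_likelihood(q, zs) for q in pop])]

        # toy measurements of a slowly drifting state
        rng = np.random.default_rng(1)
        truth = np.cumsum(rng.normal(0.0, 0.05, 300))
        zs = truth + rng.normal(0.0, 1.0, 300)
        print("tuned process noise:", genetic_tune(zs))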

  2. Integrated Modeling Program, Applied Chemical Theory (IMPACT)

    PubMed Central

    BANKS, JAY L.; BEARD, HEGE S.; CAO, YIXIANG; CHO, ART E.; DAMM, WOLFGANG; FARID, RAMY; FELTS, ANTHONY K.; HALGREN, THOMAS A.; MAINZ, DANIEL T.; MAPLE, JON R.; MURPHY, ROBERT; PHILIPP, DEAN M.; REPASKY, MATTHEW P.; ZHANG, LINDA Y.; BERNE, BRUCE J.; FRIESNER, RICHARD A.; GALLICCHIO, EMILIO; LEVY, RONALD M.

    2009-01-01

    We provide an overview of the IMPACT molecular mechanics program with an emphasis on recent developments and a description of its current functionality. With respect to core molecular mechanics technologies we include a status report for the fixed charge and polarizable force fields that can be used with the program and illustrate how the force fields, when used together with new atom typing and parameter assignment modules, have greatly expanded the coverage of organic compounds and medicinally relevant ligands. As we discuss in this review, explicit solvent simulations have been used to guide our design of implicit solvent models based on the generalized Born framework and a novel nonpolar estimator that have recently been incorporated into the program. With IMPACT it is possible to use several different advanced conformational sampling algorithms based on combining features of molecular dynamics and Monte Carlo simulations. The program includes two specialized molecular mechanics modules: Glide, a high-throughput docking program, and QSite, a mixed quantum mechanics/molecular mechanics module. These modules employ the IMPACT infrastructure as a starting point for the construction of the protein model and assignment of molecular mechanics parameters, but have then been developed to meet specialized objectives with respect to sampling and the energy function. PMID:16211539

  3. Integrated Modeling Program, Applied Chemical Theory (IMPACT).

    PubMed

    Banks, Jay L; Beard, Hege S; Cao, Yixiang; Cho, Art E; Damm, Wolfgang; Farid, Ramy; Felts, Anthony K; Halgren, Thomas A; Mainz, Daniel T; Maple, Jon R; Murphy, Robert; Philipp, Dean M; Repasky, Matthew P; Zhang, Linda Y; Berne, Bruce J; Friesner, Richard A; Gallicchio, Emilio; Levy, Ronald M

    2005-12-01

    We provide an overview of the IMPACT molecular mechanics program with an emphasis on recent developments and a description of its current functionality. With respect to core molecular mechanics technologies we include a status report for the fixed charge and polarizable force fields that can be used with the program and illustrate how the force fields, when used together with new atom typing and parameter assignment modules, have greatly expanded the coverage of organic compounds and medicinally relevant ligands. As we discuss in this review, explicit solvent simulations have been used to guide our design of implicit solvent models based on the generalized Born framework and a novel nonpolar estimator that have recently been incorporated into the program. With IMPACT it is possible to use several different advanced conformational sampling algorithms based on combining features of molecular dynamics and Monte Carlo simulations. The program includes two specialized molecular mechanics modules: Glide, a high-throughput docking program, and QSite, a mixed quantum mechanics/molecular mechanics module. These modules employ the IMPACT infrastructure as a starting point for the construction of the protein model and assignment of molecular mechanics parameters, but have then been developed to meet specialized objectives with respect to sampling and the energy function.

  4. An AIDS model with distributed incubation and variable infectiousness: applications to i.v. drug users in Latium, Italy.

    PubMed

    Iannelli, M; Loro, R; Milner, F; Pugliese, A; Rabbiolo, G

    1992-07-01

    An AIDS model with distributed incubation and variable infectiousness is considered and simulated via a second-order numerical method. The method is applied to the HIV epidemic among IV drug users in the Latium region of Italy, using available data on the length of the incubation period before the onset of AIDS, on the infectivity of infected individuals during that period, and on the demography of drug users. The contact rate is adjusted to match the actual number of AIDS cases. The sensitivity of the model to uncertainties in the parameters is finally investigated, by performing several simulations.
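
    A crude discrete-time sketch of the kind of model described above (infection-age-structured infecteds, infectiousness that varies with time since infection, and a distributed incubation period before AIDS onset) is given below. The functional forms, parameter values, and time step are assumptions for illustration only; they are not the second-order scheme or the Latium parameterization used in the paper.

        # Illustrative infection-age-structured epidemic model; all parameters assumed.
        import numpy as np

        def simulate(weeks=520, S0=10_000, contact=0.8):
            amax = 520                                          # track 10 years of infection age
            ages = np.arange(amax)
            beta = 0.002 * np.exp(-ages / 100.0)                # infectiousness by infection age
            cdf = 1.0 - np.exp(-(ages / 400.0) ** 2)            # incubation-period distribution
            hazard = np.diff(cdf, prepend=0.0) / np.maximum(1.0 - cdf, 1e-9)

            S, i, aids = float(S0), np.zeros(amax), 0.0
            for _ in range(weeks):
                N = S + i.sum()
                foi = contact * np.sum(beta * i) / N            # force of infection
                new_inf = foi * S
                onsets = hazard * i                             # progression to AIDS this week
                aids += onsets.sum()
                i -= onsets
                i = np.roll(i, 1)                               # age the infected cohort
                i[0] = new_inf
                S -= new_inf
            return aids

        print("cumulative AIDS cases after 10 years:", round(simulate()))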

  5. A Brinkmanship Game Theory Model of Terrorism

    NASA Astrophysics Data System (ADS)

    Melese, Francois

    This study reveals conditions under which a world leader might credibly issue a brinkmanship threat of preemptive action to deter sovereign states or transnational terrorist organizations from acquiring weapons of mass destruction (WMD). The model consists of two players: the United Nations (UN) “Principal,” and a terrorist organization “Agent.” The challenge in issuing a brinkmanship threat is that it needs to be sufficiently unpleasant to deter terrorists from acquiring WMD, while not being so repugnant to those that must carry it out that they would refuse to do so. Two “credibility constraints” are derived. The first relates to the unknown terrorist type (Hard or Soft), and the second to acceptable risks (“blowback”) to the World community. Graphing the incentive-compatible Nash equilibrium solutions reveals when a brinkmanship threat is credible, and when it is not - either too weak to be effective, or unacceptably dangerous to the World community.

  6. Theory and modeling of stereoselective organic reactions

    SciTech Connect

    Houk, K.N.; Paddon-Row, M.N.; Rondan, N.G.; Wu, Y.D.; Brown, F.K.; Spellmeyer, D.C.; Metz, J.T.; Li, Y.; Loncharich, R.J.

    1986-03-07

    Theoretical investigations of the transition structures of additions and cycloadditions reveal details about the geometries of bond-forming processes that are not directly accessible by experiment. The conformational analysis of transition states has been developed from theoretical generalizations about the preferred angle of attack by reagents on multiple bonds and predictions of conformations with respect to partially formed bonds. Qualitative rules for the prediction of the stereochemistries of organic reactions have been devised, and semi-empirical computational models have also been developed to predict the stereoselectivities of reactions of large organic molecules, such as nucleophilic additions to carbonyls, electrophilic hydroborations and cycloadditions, and intramolecular radical additions and cycloadditions. 52 references, 7 figures.

  7. Statistical inference for stochastic simulation models--theory and application.

    PubMed

    Hartig, Florian; Calabrese, Justin M; Reineking, Björn; Wiegand, Thorsten; Huth, Andreas

    2011-08-01

    Statistical models are the traditional choice to test scientific theories when observations, processes or boundary conditions are subject to stochasticity. Many important systems in ecology and biology, however, are difficult to capture with statistical models. Stochastic simulation models offer an alternative, but they were hitherto associated with a major disadvantage: their likelihood functions can usually not be calculated explicitly, and thus it is difficult to couple them to well-established statistical theory such as maximum likelihood and Bayesian statistics. A number of new methods, among them Approximate Bayesian Computing and Pattern-Oriented Modelling, bypass this limitation. These methods share three main principles: aggregation of simulated and observed data via summary statistics, likelihood approximation based on the summary statistics, and efficient sampling. We discuss principles as well as advantages and caveats of these methods, and demonstrate their potential for integrating stochastic simulation models into a unified framework for statistical modelling.
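
    The three shared principles listed above (summary statistics, likelihood approximation based on those summaries, and sampling) can be illustrated with the simplest member of this family, an ABC rejection sampler. The toy Poisson simulator, the choice of summaries, and the tolerance below are assumptions for illustration only.

        # Toy ABC rejection sampler; simulator, summaries, and tolerance are assumed.
        import numpy as np

        rng = np.random.default_rng(1)
        observed = rng.poisson(4.0, size=200)                   # stand-in for field data
        obs_summary = np.array([observed.mean(), observed.var()])

        def summarize(x):
            return np.array([x.mean(), x.var()])

        accepted = []
        for _ in range(20_000):
            lam = rng.uniform(0.1, 10.0)                        # draw parameter from the prior
            sim = rng.poisson(lam, size=200)                    # run the stochastic simulation
            if np.linalg.norm(summarize(sim) - obs_summary) < 1.0:   # likelihood approximated by a distance
                accepted.append(lam)                            # keep parameters reproducing the summaries

        print(len(accepted), "accepted; approximate posterior mean:", round(np.mean(accepted), 2))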

  8. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    ERIC Educational Resources Information Center

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  9. Integrating Developmental Theory and Methodology: Using Derivatives to Articulate Change Theories, Models, and Inferences

    ERIC Educational Resources Information Center

    Deboeck, Pascal R.; Nicholson, Jody; Kouros, Chrystyna; Little, Todd D.; Garber, Judy

    2015-01-01

    Matching theories about growth, development, and change to appropriate statistical models can present a challenge, which can result in misuse, misinterpretation, and underutilization of different analytical approaches. We discuss the use of "derivatives": the change of a construct with respect to the change in another construct.…

  10. Foundations of reusable and interoperable facet models using category theory.

    PubMed

    Harris, Daniel R

    2016-10-01

    Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and light-weight ontologies, but in many regards, they are implementations of faceted browsing rather than a specification of the basic, underlying structures and interactions. We will demonstrate that category theory allows us to specify faceted objects and study the relationships and interactions within a faceted browsing system. Resulting implementations can then be constructed through a category-theoretic lens using these models, allowing abstract comparison and communication that naturally support interoperability and reuse.

  11. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    PubMed

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  12. Model Based User's Access Requirement Analysis of E-Governance Systems

    NASA Astrophysics Data System (ADS)

    Saha, Shilpi; Jeon, Seung-Hwan; Robles, Rosslin John; Kim, Tai-Hoon; Bandyopadhyay, Samir Kumar

    The strategic and contemporary importance of e-governance has been recognized across the world. In India too, various ministries of the Government of India and the State Governments have taken e-governance initiatives to provide e-services to citizens and the businesses they serve. To achieve the mission objectives and make such e-governance initiatives successful, it is necessary to improve the trust and confidence of the stakeholders. It is assumed that the delivery of government services will share the same public network information that is being used in the community at large; in particular, the Internet will be the principal means by which public access to government and government services is achieved. To define the security measures, the main aim is to identify the access requirements of the stakeholders' users and then to classify them according to the models of Nath's approach. Based on this analysis, the government can also set security standards for the e-governance models, reducing human error and bias. This analysis leads to the security architecture of the specific G2C application.

  13. Columbia River Statistical Update Model, Version 4. 0 (COLSTAT4): Background documentation and user's guide

    SciTech Connect

    Whelan, G.; Damschen, D.W.; Brockhaus, R.D.

    1987-08-01

    Daily-averaged temperature and flow information on the Columbia River just downstream of Priest Rapids Dam and upstream of river mile 380 were collected and stored in a data base. The flow information corresponds to discharges that were collected daily from October 1, 1959, through July 28, 1986. The temperature information corresponds to values that were collected daily from January 1, 1965, through May 27, 1986. The computer model, COLSTAT4 (Columbia River Statistical Update - Version 4.0 model), uses the temperature-discharge data base to statistically analyze temperature and flow conditions by computing the frequency of occurrence and duration of selected temperatures and flow rates for the Columbia River. The COLSTAT4 code analyzes the flow and temperature information in a sequential time frame (i.e., a continuous analysis over a given time period); it also analyzes this information in a seasonal time frame (i.e., a periodic analysis over a specific season from year to year). A provision is included to enable the user to edit and/or extend the data base of temperature and flow information. This report describes the COLSTAT4 code and the information contained in its data base.
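
    The frequency-of-occurrence and duration statistics described above boil down to counting how often, and for how long, a daily series stays above (or below) a chosen value. A minimal sketch of that bookkeeping follows; the synthetic series and the 20 °C threshold are assumptions for illustration and not COLSTAT4's data base or input format.

        # Exceedance frequency and run-length durations for a daily series.
        import numpy as np

        def exceedance_stats(series, threshold):
            """Return (fraction of days above threshold, list of run lengths in days)."""
            above = np.asarray(series) > threshold
            runs, length = [], 0
            for flag in above:
                if flag:
                    length += 1
                elif length:
                    runs.append(length)
                    length = 0
            if length:
                runs.append(length)
            return above.mean(), runs

        # synthetic stand-in for a year of daily river temperatures (deg C)
        temps = np.random.default_rng(0).normal(15.0, 5.0, 365)
        freq, durations = exceedance_stats(temps, 20.0)
        print(f"{freq:.1%} of days above 20 C; longest spell {max(durations, default=0)} days")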

  14. Configuring a Graphical User Interface for Managing Local HYSPLIT Model Runs Through AWIPS

    NASA Technical Reports Server (NTRS)

    Wheeler, mark M.; Blottman, Peter F.; Sharp, David W.; Hoeth, Brian; VanSpeybroeck, Kurt M.

    2009-01-01

    Responding to incidents involving the release of harmful airborne pollutants is a continual challenge for Weather Forecast Offices in the National Weather Service. When such incidents occur, current protocol recommends forecaster-initiated requests of NOAA's Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model output through the National Centers for Environmental Prediction to obtain critical dispersion guidance. Individual requests are submitted manually through a secured web site, with desired multiple requests submitted in sequence, for the purpose of obtaining useful trajectory and concentration forecasts associated with the significant release of harmful chemical gases, radiation, wildfire smoke, etc., into the local atmosphere. To help manage local HYSPLIT runs for both routine and emergency use, a graphical user interface was designed for operational efficiency. The interface allows forecasters to quickly determine the current HYSPLIT configuration for the list of predefined sites (e.g., fixed sites and floating sites), and to make any necessary adjustments to key parameters such as Input Model, Number of Forecast Hours, etc. When using the interface, forecasters will obtain desired output more confidently and without the danger of corrupting essential configuration files.

  15. Population changes: contemporary models and theories.

    PubMed

    Sauvy, A

    1981-01-01

    In many developing countries rapid population growth has promoted a renewed interest in the study of the effect of population growth on economic development. This research takes either the macroeconomic viewpoint, where the nation is the framework, or the microeconomic perspective, where the family is the framework. For expository purposes, the macroeconomic viewpoint is assumed, and an example of such an investment is presented. Attention is directed to the following: a simplified model--housing; the lessons learned from experience (primitive populations, Spain in the 17th and 18th centuries, comparing development in Spain and Italy, 19th century Western Europe, and underdeveloped countries); the positive factors of population growth; and the concept of the optimal rate of growth. Housing is the typical investment that an individual makes. Hence, the housing per person (roughly 1/3 of the necessary amount of housing per family) is taken as a unit, and the calculations are made using averages. The conclusion is that growth is expensive. A population decrease might be advantageous, for this decrease would enable the entire population to benefit from past capital accumulation. It is also believed, "a priori," that population growth is more expensive for a developed than for a developing country. This belief may be attributable to the fact that the capital per person tends to be high in the developed countries. Any further increase in the population requires additional capital investments, driving this ratio even higher. Yet, investment is not the only factor inhibiting economic development. The literature describes factors regarding population growth, yet this writer prefers to emphasize 2 other factors that have been the subject of less study: a growing population's ease of adaptation and the human factor--behavior. A growing population adapts better to new conditions than does a stationary or declining population, and contrary to "a priori" belief, a growing

  16. Modelling strain localization in granular materials using micropolar theory: mathematical formulations

    NASA Astrophysics Data System (ADS)

    Alsaleh, Mustafa I.; Voyiadjis, George Z.; Alshibli, Khalid A.

    2006-12-01

    It has been known that classical continuum mechanics laws fail to describe strain localization in granular materials due to mathematical ill-posedness and mesh dependency. Therefore, a non-local theory with internal length scales is needed to overcome such problems. The micropolar and high-order gradient theories can be considered as good examples to characterize the strain localization in granular materials. The fact that internal length scales are needed requires micromechanical models or laws; however, the classical constitutive models can be enhanced through the stress invariants to incorporate the micropolar effects. In this paper, Lade's single hardening model is enhanced to account for the couple stress and Cosserat rotation, and the internal length scales are incorporated accordingly. The enhanced Lade's model and its material properties are discussed in detail; then the finite element formulations in the Updated Lagrangian Frame (UL) are used. The finite element formulations were implemented into a user element subroutine for ABAQUS (UEL) and the solution method is discussed in the companion paper. The model was found to predict the strain localization in granular materials with low dependency on the finite element mesh size. The shear band was found to reflect at a certain angle when it hits a rigid boundary. Applications for the model on plane strain specimens tested in the laboratory are discussed in the companion paper.

  17. A theory of exchange rate modeling

    SciTech Connect

    Alekseev, A.A.

    1995-09-01

    The article examines exchange rate modeling for two cases: (a) when the trading partners have mutual interests and (b) when the trading partners have antagonistic interests. Exchange rates in world markets are determined by supply and demand for the currency of each state, and states may control the exchange rate of their currency by changing the interest rate, the volume of credit, and product prices in both domestic and export markets. Abstracting from issues of production and technology in different countries and also ignoring various trade, institutional, and other barriers, we consider in this article only the effect of export and import prices on the exchange rate. We propose a new criterion of external trade activity: each trading partner earns a profit which is proportional to the volume of benefits enjoyed by the other partner. We consider a trading cycle that consists of four stages: (a) purchase of goods in the domestic market with the object of selling them abroad; (b) sale of the goods in foreign markets; (c) purchase of goods abroad with the object of selling them in the domestic market; (d) sale of the goods domestically.

  18. A Domain Specific Modeling Approach for Coordinating User-Centric Communication Services

    ERIC Educational Resources Information Center

    Wu, Yali

    2011-01-01

    Rapid advances in electronic communication devices and technologies have resulted in a shift in the way communication applications are being developed. These new development strategies provide abstract views of the underlying communication technologies and lead to the so-called "user-centric communication applications." One user-centric…

  19. Implementing a Multiple Criteria Model Base in Co-Op with a Graphical User Interface Generator

    DTIC Science & Technology

    1993-09-23

    Decision Support System (Co-op) for Windows. The algorithms and the graphical user interfaces for these modules are implemented using Microsoft Visual Basic in the Windows environment on an IBM-compatible microcomputer. The design of the MCDM program's interface is based on general interface design principles of user control, screen design, and layout.

  20. Models for User Access Patterns on the Web: Semantic Content versus Access History.

    ERIC Educational Resources Information Center

    Ross, Arun; Owen, Charles B.; Vailaya, Aditya

    This paper focuses on clustering a World Wide Web site (i.e., the 1998 World Cup Soccer site) into groups of documents that are predictive of future user accesses. Two approaches were developed and tested. The first approach uses semantic information inherent in the documents to facilitate the clustering process. User access history is then used…

  1. Decision-making in stimulant and opiate addicts in protracted abstinence: evidence from computational modeling with pure users

    PubMed Central

    Ahn, Woo-Young; Vasilev, Georgi; Lee, Sung-Ha; Busemeyer, Jerome R.; Kruschke, John K.; Bechara, Antoine; Vassileva, Jasmin

    2014-01-01

    Substance dependent individuals (SDI) often exhibit decision-making deficits; however, it remains unclear whether the nature of the underlying decision-making processes is the same in users of different classes of drugs and whether these deficits persist after discontinuation of drug use. We used computational modeling to address these questions in a unique sample of relatively “pure” amphetamine-dependent (N = 38) and heroin-dependent individuals (N = 43) who were currently in protracted abstinence, and in 48 healthy controls (HC). A Bayesian model comparison technique, a simulation method, and parameter recovery tests were used to compare three cognitive models: (1) Prospect Valence Learning with decay reinforcement learning rule (PVL-DecayRI), (2) PVL with delta learning rule (PVL-Delta), and (3) Value-Plus-Perseverance (VPP) model based on Win-Stay-Lose-Switch (WSLS) strategy. The model comparison results indicated that the VPP model, a hybrid model of reinforcement learning (RL) and a heuristic strategy of perseverance had the best post-hoc model fit, but the two PVL models showed better simulation and parameter recovery performance. Computational modeling results suggested that overall all three groups relied more on RL than on a WSLS strategy. Heroin users displayed reduced loss aversion relative to HC across all three models, which suggests that their decision-making deficits are longstanding (or pre-existing) and may be driven by reduced sensitivity to loss. In contrast, amphetamine users showed comparable cognitive functions to HC with the VPP model, whereas the second best-fitting model with relatively good simulation performance (PVL-DecayRI) revealed increased reward sensitivity relative to HC. These results suggest that some decision-making deficits persist in protracted abstinence and may be mediated by different mechanisms in opiate and stimulant users. PMID:25161631
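
    For readers unfamiliar with the models compared above, the sketch below shows the core of one of them, a Prospect Valence Learning model with the delta learning rule (PVL-Delta), written from its commonly published form: a prospect-utility transform with loss aversion λ, a delta-rule update of deck expectancies, and a softmax choice rule. Parameter values and the fitting procedure are placeholders for illustration, not the paper's analysis.

        # Hedged PVL-Delta sketch; parameter values are illustrative only.
        import numpy as np

        def pvl_delta_choice_probs(payoffs, choices, alpha=0.5, lam=1.5, A=0.2, c=1.0):
            """Return per-trial softmax probabilities of the observed deck choices."""
            ev = np.zeros(4)                                # expectancies for four decks
            theta = 3.0 ** c - 1.0                          # choice sensitivity
            probs = []
            for x, d in zip(payoffs, choices):
                p = np.exp(theta * ev)
                probs.append(p[d] / p.sum())                # softmax probability of the chosen deck
                u = x ** alpha if x >= 0 else -lam * (-x) ** alpha   # prospect utility (lam = loss aversion)
                ev[d] += A * (u - ev[d])                    # delta learning rule
            return np.array(probs)

        # toy data: payoffs received and decks chosen on five trials
        print(pvl_delta_choice_probs([100, -50, 100, -250, 50], [0, 0, 1, 1, 2]))

    The loss-aversion parameter λ in this sketch corresponds to the quantity reported above as reduced in heroin users relative to healthy controls.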

  2. A comparative 2D modeling of debris-flow propagation and outcomes for end-users

    NASA Astrophysics Data System (ADS)

    Bettella, F.; Bertoldi, G.; Pozza, E.; McArdell, B. W.; D'Agostino, V.

    2012-04-01

    In Alpine regions gravity-driven natural hazards, in particular debris flows, endanger settlements and human life. Mitigation strategies based on hazard maps are necessary tools for land planning. These maps can be made more precise by using numerical models to forecast the inundated areas after a careful setting of those 'key parameters' (K-P) which directly affect the flow motion and its interaction with the ground surface. Several physically based 2D models are available for practitioners and governmental agencies, but the selection criteria of model type and of the related K-P remain flexible and partly subjective. This remark has driven us to investigate how different models simulate different types of debris flows (from granular to muddy debris flows, going through intermediate types), in particular when the flow is influenced by the presence of deposition basins. Two commercial 2D physical models (RAMMS and FLO-2D) have been tested for five well-documented debris-flow events from five Italian catchments where different geologies and flow dynamics are observed: 1) a viscous debris flow that occurred in 2009 in a catchment with a metamorphic geology (Gadria torrent, Bolzano Province); 2) the 2009 granular debris flow in a granitic geological setting (Rio Dosson, Trento Province); 3-4) two events that occurred in the 'rio Val del Lago' and 'rio Molinara' (Trento Province) in 2010 where porphyritic lithology prevails (intermediate granular debris flow); 5) the Rotolon torrent (Vicenza Province) 2009 debris flow containing sedimentary rocks enclosed in an abundant clay-rich matrix (intermediate viscous case). Event volumes range from 5,000 to 50,000 cubic meters. The Gadria, Rotolon and Val del Lago events are also influenced by artificial retention basins. Case study simulations allowed delineation of some practical end-user suggestions and good practices in order to guide the model choice and the K-P setting, particularly related to different flow dynamics. The

  3. Theory of compressive modeling and simulation

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Cha, Jae; Espinola, Richard L.; Krapels, Keith

    2013-05-01

    Modeling and Simulation (M&S) has been evolving along two general directions: (i) a data-rich approach suffering from the curse of dimensionality and (ii) an equation-rich approach suffering from limits on computing power and turnaround time. We suggest a third approach, which we call (iii) compressive M&S (CM&S), because the basic Minimum Free-Helmholtz Energy (MFE) facilitating CM&S can reproduce and generalize the Candes, Romberg, Tao & Donoho (CRT&D) Compressive Sensing (CS) paradigm as a linear Lagrange Constraint Neural Network (LCNN) algorithm. CM&S-based MFE can generalize LCNN to second order as a nonlinear augmented LCNN. For example, during sunset we can avoid the reddish bias of sunlight illumination due to long-range Rayleigh scattering over the horizon. With CM&S we can use, instead of a day camera, a night-vision camera. We decomposed the long-wave infrared (LWIR) band with a filter into two vector components (8~10μm and 10~12μm) and used LCNN to find, pixel by pixel, the map of Emissive-Equivalent Planck Radiation Sources (EPRS). Then we up-shifted consistently, according to the de-mixed source map, to the sub-micron RGB color image. Moreover, night-vision imaging can also be down-shifted to Passive Millimeter Wave (PMMW) imaging, suffering less blur owing to dusty smoke scattering and enjoying the apparent smoothness of the surface reflectivity of man-made objects under the Rayleigh resolution. One loses three orders of magnitude in the spatial Rayleigh resolution, but gains two orders of magnitude in the reflectivity and another two orders in propagation without obscuring smog. Since CM&S can generate missing data and hard-to-get dynamic transients, CM&S can reduce unnecessary measurements and their associated cost and computing in the sense of super-saving CS: measuring one and getting one's neighborhood free.

  4. User's Manual for HPTAM: a Two-Dimensional Heat Pipe Transient Analysis Model, Including the Startup from a Frozen State

    NASA Technical Reports Server (NTRS)

    Tournier, Jean-Michel; El-Genk, Mohamed S.

    1995-01-01

    This report describes the user's manual for 'HPTAM,' a two-dimensional Heat Pipe Transient Analysis Model. HPTAM is described in detail in the UNM-ISNPS-3-1995 report which accompanies the present manual. The model offers a menu that lists a number of working fluids and wall and wick materials from which the user can choose. HPTAM is capable of simulating the startup of heat pipes from either a fully-thawed or frozen condition of the working fluid in the wick structure. The manual includes instructions for installing and running HPTAM on either a UNIX, MS-DOS or VMS operating system. Samples for input and output files are also provided to help the user with the code.

  5. The Sortie-Generation Model System. Volume II. Sortie-Generation Model User’s Guide,

    DTIC Science & Technology

    1981-09-01

    NUMBER OF PART-TYPES BEING MODELED. RESUPP(K), K=1,...,NPARTS: EXPECTED NUMBER OF TYPE-K PARTS IN RESUPPLY AT THE START OF THE SCENARIO. USED... PROCESS IS ORGANIZED IN THE FOLLOWING MANNER. FIRST, MISCELLANEOUS OPERATIONS ARE PERFORMED, THEN SPRS INITIALIZATION, AND FINALLY

  6. Theories beyond the standard model, one year before the LHC

    NASA Astrophysics Data System (ADS)

    Dimopoulos, Savas

    2006-04-01

    Next year the Large Hadron Collider at CERN will begin what may well be a new golden era of particle physics. I will discuss three theories that will be tested at the LHC. I will begin with the supersymmetric standard model, proposed with Howard Georgi in 1981. This theory made a precise quantitative prediction, the unification of couplings, that has been experimentally confirmed in 1991 by experiments at CERN and SLAC. This established it as the leading theory for physics beyond the standard model. Its main prediction, the existence of supersymmetric particles, will be tested at the large hadron collider. I will next overview theories with large new dimensions, proposed with Nima Arkani-Hamed and Gia Dvali in 1998. This links the weakness of gravity to the presence of sub-millimeter size dimensions, that are presently searched for in experiments looking for deviations from Newton's law at short distances. In this framework quantum gravity, string theory, and black holes may be experimentally investigated at the large hadron collider. I will end with the recent proposal of split supersymmetry with Nima Arkani-Hamed. This theory is motivated by the possible existence of an enormous number of ground states in the fundamental theory, as suggested by the cosmological constant problem and recent developments in string theory and cosmology. It can be tested at the large hadron collider and, if confirmed, it will lend support to the idea that our universe and its laws are not unique and that there is an enormous variety of universes each with its own distinct physical laws.

  7. Higher-rank supersymmetric models and topological conformal field theory

    NASA Astrophysics Data System (ADS)

    Kawai, Toshiya; Uchino, Taku; Yang, Sung-Kil

    1993-03-01

    In the first part of this paper we investigate the operator aspect of a higher-rank supersymmetric model which is introduced as a Lie theoretic extension of the N = 2 minimal model with the simplest case su(2) corresponding to the N = 2 minimal model. In particular we identify the analogs of chirality conditions and chiral ring. In the second part we construct a class of topological conformal field theories starting with this higher-rank supersymmetric model. We show the BRST-exactness of the twisted stress-energy tensor, find out physical observables and discuss how to make their correlation functions. It is emphasized that in the case of su(2) the topological field theory constructed in this paper is distinct from the one obtained by twisting the N = 2 minimal model through the usual procedure.

  8. Does the theory-driven program affect the risky behavior of drug injecting users in a healthy city? A quasi-experimental study

    PubMed Central

    Karimy, Mahmood; Abedi, Ahmad Reza; Abredari, Hamid; Taher, Mohammad; Zarei, Fatemeh; Rezaie Shahsavarloo, Zahra

    2016-01-01

    Background: The horror of HIV/AIDS, a non-curable and grueling disease, is a destructive issue for every country. Drug use, shared needles and unsafe sex are closely linked to the transmission of HIV/AIDS. Modifying unhealthy behavior through educational programs can lead to HIV prevention. The aim of this study was to evaluate the efficiency of a theory-based educational intervention on preventing HIV transmission in drug addicts. Methods: In this quasi-experimental study, 69 male injecting drug users were entered into the theory-based educational intervention. Data were collected using a questionnaire, before and 3 months after four sessions (group discussions, lectures, film screening and role play) of educational intervention. Results: The findings indicated that the mean scores of the constructs (self-efficacy, susceptibility, severity and benefit) significantly increased after the educational intervention, and the perceived barriers decreased (p < 0.001). Also, the history of HIV testing was reported to be 9% before the intervention, while the rate increased to 88% after the intervention. Conclusion: The present research offers a primary foundation for planning and implementing a theory-based educational program to prevent HIV/AIDS transmission in injecting drug addicts. This research revealed that the health educational intervention improved preventive behaviors and HIV/AIDS knowledge among participants.

  9. The NASA/MSFC global reference atmospheric model: 1990 version (GRAM-90). Part 1: Technical/users manual

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Alyea, F. N.; Cunnold, D. M.; Jeffries, W. R., III; Johnson, D. L.

    1991-01-01

    A technical description of the NASA/MSFC Global Reference Atmospheric Model 1990 version (GRAM-90) is presented with emphasis on the additions and new user's manual descriptions of the program operation aspects of the revised model. Some sample results for the new middle atmosphere section and comparisons with results from a three dimensional circulation model are provided. A programmer's manual with more details for those wishing to make their own GRAM program adaptations is also presented.

  10. Main problems in the theory of modeling of catalytic processes

    SciTech Connect

    Pisarenko, V.N.

    1994-09-01

    This paper formulates the main unsolved problems in the theory of modeling of catalytic processes and describes the stages of modeling. Fundamental problems of model construction for the physico-chemical phenomena and processes taking place in a catalytic reactor are considered. New methods for determining the mechanism of a catalytic reaction and selecting a kinetic model for it are analyzed. The use of the results of specially controlled experiments for the construction of models of a catalyst grain and a catalytic reactor is discussed. Algorithms are presented for determining the multiplicity of stationary states in the operation of a catalyst grain and a catalytic reactor.

  11. Traffic Games: Modeling Freeway Traffic with Game Theory

    PubMed Central

    Cortés-Berrueco, Luis E.; Gershenson, Carlos; Stephens, Christopher R.

    2016-01-01

    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers’ interactions. PMID:27855176

  12. Traffic Games: Modeling Freeway Traffic with Game Theory.

    PubMed

    Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R

    2016-01-01

    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.
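
    As a toy illustration of the game-theoretic ingredient described above, the snippet below sets up a two-driver lane-change game with hawk-dove style payoffs and enumerates its pure-strategy Nash equilibria. The strategy names and payoff values are assumptions for illustration; they are not the payoffs used in the paper's two-lane traffic model.

        # Toy two-driver lane-change game; payoff numbers are assumed.
        import itertools

        # strategies: "yield" or "insist"; payoffs (driver, neighbour) reward a
        # completed lane change but penalise mutual insistence (conflict, braking)
        payoff = {
            ("yield",  "yield"):  (0, 0),
            ("yield",  "insist"): (-1, 2),
            ("insist", "yield"):  (2, -1),
            ("insist", "insist"): (-3, -3),
        }

        def is_nash(s1, s2):
            """True if neither driver can gain by unilaterally switching strategy."""
            u1, u2 = payoff[(s1, s2)]
            best1 = all(u1 >= payoff[(a, s2)][0] for a in ("yield", "insist"))
            best2 = all(u2 >= payoff[(s1, b)][1] for b in ("yield", "insist"))
            return best1 and best2

        print([s for s in itertools.product(("yield", "insist"), repeat=2) if is_nash(*s)])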

  13. Theory, modeling and simulation of superconducting qubits

    SciTech Connect

    Berman, Gennady P; Kamenev, Dmitry I; Chumak, Alexander; Kinion, Carin; Tsifrinovich, Vladimir

    2011-01-13

    We analyze the dynamics of a qubit-resonator system coupled with a thermal bath and external electromagnetic fields. Using the evolution equations for the set of Heisenberg operators that describe the whole system, we derive an expression for the resonator field that includes the resonator-drive, the resonator-bath, and resonator-qubit interactions. The renormalization of the resonator frequency, caused by the qubit-resonator interaction, is accounted for. Using the solutions for the resonator field, we derive the equation that describes the qubit dynamics. The dependence of the qubit evolution during the measurement time on the fidelity of a single-shot measurement is studied. The relation between the fidelity and measurement time is shown explicitly. We propose a novel adiabatic method for the phase qubit measurement. The method utilizes a low-frequency, quasi-classical resonator inductively coupled to the qubit. The resonator modulates the qubit energy, and the back reaction of the qubit causes a shift in the phase of the resonator. The resonator phase shift can be used to determine the qubit state. We have simulated this measurement taking into account the energy levels outside the phase qubit manifold. We have shown that, for qubit frequencies in the range of 8-12 GHz, a resonator frequency of 500 MHz and a measurement time of 100 ns, the phase difference between the two qubit states is greater than 0.2 rad. This phase difference exceeds the measurement uncertainty, and can be detected using a classical phase-meter. A fidelity of 0.9999 can be achieved for a relaxation time of 0.5 ms. We also model and simulate a microstrip-SQUID amplifier of frequency about 500 MHz, which could be used to amplify the resonator oscillations in the phase qubit adiabatic measurement. The voltage gain and the amplifier noise temperature are calculated. We simulate the preparation of a generalized Bell state and compute the relaxation times required for achieving high

  14. Mathematical modeling of vowel perception by users of analog multichannel cochlear implants: temporal and channel-amplitude cues.

    PubMed

    Svirsky, M A

    2000-03-01

    A "multidimensional phoneme identification" (MPI) model is proposed to account for vowel perception by cochlear implant users. A multidimensional extension of the Durlach-Braida model of intensity perception, this model incorporates an internal noise model and a decision model to account separately for errors due to poor sensitivity and response bias. The MPI model provides a complete quantitative description of how listeners encode and combine acoustic cues, and how they use this information to determine which sound they heard. Thus, it allows for testing specific hypotheses about phoneme identification in a very stringent fashion. As an example of the model's application, vowel identification matrices obtained with synthetic speech stimuli (including "conflicting cue" conditions [Dorman et al., J. Acoust. Soc. Am. 92, 3428-3432 (1992)]) were examined. The listeners were users of the "compressed-analog" stimulation strategy, which filters the speech spectrum into four partly overlapping frequency bands and delivers each signal to one of four electrodes in the cochlea. It was found that a simple model incorporating one temporal cue (i.e., an acoustic cue based only on the time waveforms delivered to the most basal channel) and spectral cues (based on the distribution of amplitudes among channels) can be quite successful in explaining listener responses. The new approach represented by the MPI model may be used to obtain useful insights about speech perception by cochlear implant users in particular, and by all kinds of listeners in general.

  15. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks for a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or implementing a feedback DEDS controller (closed-loop control).
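
    The ingredients listed above (a state set, an event alphabet, an initial state, a partial transition function, and event timings) map naturally onto a small data structure. The sketch below is an illustrative local model for a hypothetical welding submachine; the machine, its events, and the durations are invented examples, not content from the report.

        # Minimal DEDS "local model": states, events, initial state, partial
        # transition function, and per-event durations (all names invented).
        from dataclasses import dataclass, field

        @dataclass
        class LocalModel:
            states: set
            events: set
            initial: str
            delta: dict = field(default_factory=dict)      # partial map: (state, event) -> next state
            duration: dict = field(default_factory=dict)   # event -> time required

            def run(self, event_sequence):
                """Apply a sequence of events; return the final state and elapsed time."""
                state, elapsed = self.initial, 0.0
                for e in event_sequence:
                    if (state, e) not in self.delta:
                        raise ValueError(f"event {e!r} not enabled in state {state!r}")
                    state = self.delta[(state, e)]
                    elapsed += self.duration.get(e, 0.0)
                return state, elapsed

        welder = LocalModel(
            states={"idle", "welding"},
            events={"start", "finish"},
            initial="idle",
            delta={("idle", "start"): "welding", ("welding", "finish"): "idle"},
            duration={"start": 0.0, "finish": 12.5},
        )
        print(welder.run(["start", "finish"]))   # -> ('idle', 12.5)

    A global model, in this picture, would compose several such local models and restrict their joint transitions by the shared constraints; that composition is not shown here.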

  16. Coarse-grained theory of a realistic tetrahedral liquid model

    NASA Astrophysics Data System (ADS)

    Procaccia, I.; Regev, I.

    2012-02-01

    Tetrahedral liquids such as water and silica-melt show unusual thermodynamic behavior such as a density maximum and an increase in specific heat when cooled to low temperatures. Previous work had shown that Monte Carlo and mean-field solutions of a lattice model can exhibit these anomalous properties with or without a phase transition, depending on the values of the different terms in the Hamiltonian. Here we use a somewhat different approach, where we start from a very popular empirical model of tetrahedral liquids —the Stillinger-Weber model— and construct a coarse-grained theory which directly quantifies the local structure of the liquid as a function of volume and temperature. We compare the theory to molecular-dynamics simulations and show that the theory can rationalize the simulation results and the anomalous behavior.

  17. Integrating social capital theory, social cognitive theory, and the technology acceptance model to explore a behavioral model of telehealth systems.

    PubMed

    Tsai, Chung-Hung

    2014-05-07

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  18. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    PubMed Central

    Tsai, Chung-Hung

    2014-01-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities. PMID:24810577

  19. The Washington Needle Depot: fitting healthcare to injection drug users rather than injection drug users to healthcare: moving from a syringe exchange to syringe distribution model

    PubMed Central

    2010-01-01

    Needle exchange programs chase political as well as epidemiological dragons, carrying within them both implicit moral and political goals. In the exchange model of syringe distribution, injection drug users (IDUs) must provide used needles in order to receive new needles. Distribution and retrieval are co-existent in the exchange model. Likewise, limitations on how many needles can be received at a time compel addicts to have multiple points of contact with professionals where the virtues of treatment and detox are impressed upon them. The centre of gravity for syringe distribution programs needs to shift from needle exchange to needle distribution, which provides unlimited access to syringes. This paper provides a case study of the Washington Needle Depot, a program operating under the syringe distribution model, showing that the distribution and retrieval of syringes can be separated with effective results. Further, the experience of IDUs is utilized, through paid employment, to provide a vulnerable population of people with clean syringes to prevent HIV and HCV. PMID:20047690

  20. Can behavioral theory inform the understanding of depression and medication nonadherence among HIV-positive substance users?

    PubMed Central

    Listhaus, Alyson; Seitz-Brown, C. J.; Safren, Steven A.; Lejuez, C. W.; Daughters, Stacey B.

    2014-01-01

    Medication adherence is highly predictive of health outcomes across chronic conditions, particularly HIV/AIDS. Depression is consistently associated with worse adherence, yet few studies have sought to understand how depression relates to adherence. This study tested three components of behavioral depression theory—goal-directed activation, positive reinforcement, and environmental punishment—as potential indirect effects in the relation between depressive symptoms and medication nonadherence among low-income, predominantly African American substance users (n = 83). Medication nonadherence was assessed as frequency of doses missed across common reasons for nonadherence. Non-parametric bootstrapping was used to evaluate the indirect effects. Of the three intermediary variables, there was only an indirect effect of environmental punishment; depressive symptoms were associated with greater nonadherence through greater environmental punishment. Goal-directed activation and positive reinforcement were unrelated to adherence. Findings suggest the importance of environmental punishment in the relation between depression and medication adherence and may inform future intervention efforts for this population. PMID:25381605
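
    The indirect-effect test named above (non-parametric bootstrapping of an a*b mediation path) can be sketched in a few lines. The variable names, the simple linear regressions, and the simulated data below are illustrative assumptions, not the study's measures or analysis code.

        # Bootstrap CI for an indirect (a*b) effect: depression -> punishment -> nonadherence.
        import numpy as np

        def bootstrap_indirect(x, m, y, n_boot=2000, seed=3):
            """x: depressive symptoms, m: environmental punishment, y: nonadherence."""
            rng = np.random.default_rng(seed)
            x, m, y = map(np.asarray, (x, m, y))
            n, est = x.size, []
            for _ in range(n_boot):
                idx = rng.integers(0, n, n)
                a = np.polyfit(x[idx], m[idx], 1)[0]                    # path a: x -> m
                design = np.column_stack([m[idx], x[idx], np.ones(n)])
                b = np.linalg.lstsq(design, y[idx], rcond=None)[0][0]   # path b: m -> y, controlling for x
                est.append(a * b)
            return np.percentile(est, [2.5, 97.5])

        # simulated data with a true indirect effect
        rng = np.random.default_rng(0)
        x = rng.normal(size=200)
        m = 0.5 * x + rng.normal(size=200)
        y = 0.4 * m + rng.normal(size=200)
        print("95% CI for a*b:", bootstrap_indirect(x, m, y))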