Science.gov

Sample records for model theory user

  1. The Sandia GeoModel : theory and user's guide.

    SciTech Connect

    Brannon, Rebecca Moss; Fossum, Arlo Frederick

    2004-08-01

    The mathematical and physical foundations and domain of applicability of Sandia's GeoModel are presented along with descriptions of the source code and user instructions. The model is designed to be used in conventional finite element architectures, and (to date) it has been installed in five host codes without requiring customization of the model subroutines for any of these different installations. Although developed for application to geological materials, the GeoModel actually applies to a much broader class of materials, including rock-like engineered materials (such as concretes and ceramics) and even metals when simplified parameters are used. Nonlinear elasticity is supported through an empirically fitted function that has been found to be well suited to a wide variety of materials. Fundamentally, the GeoModel is a generalized plasticity model. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response, including microcrack growth and pore collapse. The GeoModel supports deformation-induced anisotropy in a limited capacity through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). Aside from kinematic hardening, however, the governing equations are otherwise isotropic. The GeoModel is a genuine unification and generalization of simpler models. It can employ up to 40 material input and control parameters in the rare case when all features are used. Simpler idealizations (such as linear elasticity, von Mises yield, or Mohr-Coulomb failure) can be replicated by simply using fewer parameters. For high-strain-rate applications, the GeoModel supports rate dependence through an overstress model.
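
    As a rough illustration of the kinematic-hardening idea described above (the yield surface translating in deviatoric stress space through a backstress), the following minimal Python sketch evaluates a J2, von Mises-type yield function on a backstress-shifted stress deviator. This is not the GeoModel's actual formulation; the function names, the simple yield criterion, and all numerical values are illustrative assumptions.

        import numpy as np

        def deviator(stress):
            """Deviatoric part of a 3x3 stress tensor."""
            return stress - np.trace(stress) / 3.0 * np.eye(3)

        def shifted_yield_function(stress, backstress, yield_strength):
            """J2-type yield function on the backstress-shifted deviator.
            Positive values indicate stress outside the translated yield surface."""
            xi = deviator(stress) - backstress      # shifted stress deviator
            j2 = 0.5 * np.tensordot(xi, xi)         # second deviatoric invariant
            return np.sqrt(3.0 * j2) - yield_strength

        # Illustrative numbers (MPa); not GeoModel parameters.
        sigma = np.diag([120.0, 40.0, 40.0])
        alpha = deviator(np.diag([10.0, -5.0, -5.0]))   # backstress from prior loading
        print(shifted_yield_function(sigma, alpha, yield_strength=60.0))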

  2. User Acceptance of Information Technology: Theories and Models.

    ERIC Educational Resources Information Center

    Dillon, Andrew; Morris, Michael G.

    1996-01-01

    Reviews literature in user acceptance and resistance to information technology design and implementation. Examines innovation diffusion, technology design and implementation, human-computer interaction, and information systems. Concentrates on the determinants of user acceptance and resistance and emphasizes how researchers and developers can…

  3. A Comprehensive and Systematic Model of User Evaluation of Web Search Engines: I. Theory and Background.

    ERIC Educational Resources Information Center

    Su, Louise T.

    2003-01-01

    Reports on a project that proposes and tests a comprehensive and systematic model of user evaluation of Web search engines. This article describes the model, including a set of criteria and measures and a method for implementation. A literature review portrays settings for developing the model and places applications of the model in contemporary…

  4. Repository Integration Program: RIP performance assessment and strategy evaluation model theory manual and user's guide

    SciTech Connect

    1995-11-01

    This report describes the theory and capabilities of RIP (Repository Integration Program). RIP is a powerful and flexible computational tool for carrying out probabilistic integrated total system performance assessments for geologic repositories. The primary purpose of RIP is to provide a management tool for guiding system design and site characterization. In addition, the performance assessment model (and the process of eliciting model input) can act as a mechanism for integrating the large amount of available information into a meaningful whole (in a sense, allowing one to keep the "big picture" and the ultimate aims of the project clearly in focus). Such an integration is useful both for project managers and project scientists. RIP is based on a "top down" approach to performance assessment that concentrates on the integration of the entire system and utilizes relatively high-level descriptive models and parameters. The key point in the application of such a "top down" approach is that the simplified models and associated high-level parameters must incorporate an accurate representation of their uncertainty. RIP is designed in a very flexible manner such that details can be readily added to various components of the model without modifying the computer code. Uncertainty is also handled in a very flexible manner, and both parameter and model (process) uncertainty can be explicitly considered. Uncertainty is propagated through the integrated PA model using an enhanced Monte Carlo method. RIP must rely heavily on subjective assessment (expert opinion) for much of its input. The process of eliciting the high-level input parameters required for RIP is critical to its successful application. As a result, in order for any project to successfully apply a tool such as RIP, an enormous amount of communication and cooperation must exist between the data collectors, the process modelers, and the performance assessment modelers.
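
    To make the Monte Carlo propagation idea mentioned above concrete, the sketch below pushes sampled parameter uncertainty through a deliberately simplified, hypothetical performance measure (inventory times release fraction divided by dilution). It is not RIP's actual model or its enhanced sampling scheme; the distributions and parameter names are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000                                   # Monte Carlo realizations

        # Hypothetical high-level parameters with subjectively assessed uncertainty.
        inventory = rng.lognormal(mean=np.log(1e3), sigma=0.5, size=n)   # source inventory
        release = rng.uniform(1e-5, 1e-3, size=n)                        # annual release fraction
        dilution = rng.lognormal(mean=np.log(1e6), sigma=1.0, size=n)    # aquifer dilution factor

        # Simplified, made-up performance measure propagated through the sampled inputs.
        concentration = inventory * release / dilution

        print("mean:", concentration.mean())
        print("95th percentile:", np.quantile(concentration, 0.95))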

  5. WASP4, a hydrodynamic and water-quality model - model theory, user's manual, and programmer's guide

    SciTech Connect

    Ambrose, R.B.; Wool, T.A.; Connolly, J.P.; Schanz, R.W.

    1988-01-01

    The Water Quality Analysis Simulation Program Version 4 (WASP4) is a dynamic compartment-modeling system that can be used to analyze a variety of water-quality problems in a diverse set of water bodies. WASP4 simulates the transport and transformation of conventional and toxic pollutants in the water column and benthos of ponds, streams, lakes, reservoirs, rivers, estuaries, and coastal waters. The WASP4 modeling system covers four major subjects--hydrodynamics, conservative mass transport, eutrophication-dissolved oxygen kinetics, and toxic chemical-sediment dynamics. The WASP4 modeling system consists of two stand-alone computer programs, DYNHYD4 and WASP4, that can be run in conjunction or separately. The hydrodynamic program, DYNHYD4, simulates the movement of water, and the water quality program, WASP4, simulates the movement and interaction of pollutants within the water. The latter program is supplied with two kinetic submodels to simulate two of the major classes of water-quality problems--conventional pollution (dissolved oxygen, biochemical oxygen demand, nutrients, and eutrophication) and toxic pollution (organic chemicals, heavy metals, and sediment). Substituting one or the other kinetic submodel yields the models EUTRO4 and TOXI4, respectively.
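
    As a minimal illustration of the compartment-modeling style described above, the sketch below integrates a hypothetical two-compartment (water column/benthos) system with first-order decay, settling, and resuspension. The structure and rate constants are invented for illustration and are not WASP4's kinetics.

        import numpy as np

        # Hypothetical two-compartment first-order model (illustration only).
        k_decay, k_settle, k_resusp = 0.10, 0.05, 0.01   # 1/day, made-up rates
        dt, days = 0.1, 60.0
        c_water, c_bed = 10.0, 0.0                       # initial concentrations

        for _ in range(int(days / dt)):
            settle = k_settle * c_water                  # flux to the benthos
            resusp = k_resusp * c_bed                    # flux back to the water column
            c_water += dt * (-k_decay * c_water - settle + resusp)
            c_bed   += dt * (settle - resusp)

        print(f"after {days:.0f} days: water={c_water:.3f}, benthos={c_bed:.3f}")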

  6. WASP4, A HYDRODYNAMIC AND WATER QUALITY MODEL - MODEL THEORY, USER'S MANUAL, AND PROGRAMMER'S GUIDE

    EPA Science Inventory

    The Water Quality Analysis Simulation Program Version 4 (WASP4) is a dynamic compartment modeling system that can be used to analyze a variety of water quality problems in a diverse set of water bodies. WASP4 simulates the transport and transformation of conventional and toxic po...

  7. KAYENTA : theory and user's guide.

    SciTech Connect

    Brannon, Rebecca Moss; Fossum, Arlo Frederick; Strack, Otto Eric

    2009-03-01

    The physical foundations and domain of applicability of the Kayenta constitutive model are presented along with descriptions of the source code and user instructions. Kayenta, which is an outgrowth of the Sandia GeoModel, includes features and fitting functions appropriate to a broad class of materials including rocks, rock-like engineered materials (such as concretes and ceramics), and metals. Fundamentally, Kayenta is a computational framework for generalized plasticity models. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response including microcrack growth and pore collapse. Kayenta supports optional anisotropic elasticity associated with ubiquitous joint sets. Kayenta supports optional deformation-induced anisotropy through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). The governing equations are otherwise isotropic. Because Kayenta is a unification and generalization of simpler models, it can be run using as few as 2 parameters (for linear elasticity) to as many as 40 material and control parameters in the exceptionally rare case when all features are used. For high-strain-rate applications, Kayenta supports rate dependence through an overstress model. Isotropic damage is modeled through loss of stiffness and strength.

  8. KAYENTA: Theory and User's Guide

    SciTech Connect

    Brannon, Rebecca Moss; Fuller, Timothy Jesse; Strack, Otto Eric; Fossum, Arlo Frederick; Sanchez, Jason James

    2015-02-01

    The physical foundations and domain of applicability of the Kayenta constitutive model are presented along with descriptions of the source code and user instructions. Kayenta, which is an outgrowth of the Sandia GeoModel, includes features and fitting functions appropriate to a broad class of materials including rocks, rock-like engineered materials (such as concretes and ceramics), and metals. Fundamentally, Kayenta is a computational framework for generalized plasticity models. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response (including microcrack growth and pore collapse) that can result in non-recovered strain upon removal of loads on a material element. Kayenta supports optional anisotropic elasticity associated with joint sets, as well as optional deformation-induced anisotropy through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). The governing equations are otherwise isotropic. Because Kayenta is a unification and generalization of simpler models, it can be run using as few as 2 parameters (for linear elasticity) to as many as 40 material and control parameters in the exceptionally rare case when all features are used. For high-strain-rate applications, Kayenta supports rate dependence through an overstress model. Isotropic damage is modeled through loss of stiffness and strength.

  9. Outside users payload model

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The outside users payload model, which continues a series of documents and replaces and supersedes the July 1984 edition, is presented. The time period covered by this model is 1985 through 2000. The following sections are included: (1) definition of the scope of the model; (2) discussion of the methodology used; (3) overview of total demand; (4) summary of the estimated market segmentation by launch vehicle; (5) summary of the estimated market segmentation by user type; (6) details of the STS market forecast; (7) summary of transponder trends; (8) model overview by mission category; and (9) detailed mission models. All known non-NASA, non-DOD reimbursable payloads forecast to be flown by non-Soviet-bloc countries are included in this model with the exception of Spacelab payloads and small self-contained payloads. Certain DOD-sponsored or cosponsored payloads are included if they are reimbursable launches.

  10. ACIRF user's guide: Theory and examples

    NASA Astrophysics Data System (ADS)

    Dana, Roger A.

    1989-12-01

    Design and evaluation of radio frequency systems that must operate through ionospheric disturbances resulting from high-altitude nuclear detonations requires an accurate channel model. This model must include the effects of high gain antennas that may be used to receive the signals. Such a model can then be used to construct realizations of the received signal for use in digital simulations of trans-ionospheric links or for use in hardware channel simulators. The FORTRAN channel model ACIRF (Antenna Channel Impulse Response Function) generates random realizations of the impulse response function at the outputs of multiple antennas. This user's guide describes the FORTRAN program ACIRF (version 2.0) that generates realizations of channel impulse response functions at the outputs of multiple antennas with arbitrary beamwidths, pointing angles, and relative positions. This channel model is valid under strong scattering conditions when Rayleigh fading statistics apply. Both frozen-in and turbulent models for the temporal fluctuations are included in this version of ACIRF. The theory of the channel model is described and several examples are given.
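
    The sketch below generates one random realization of a tapped-delay-line channel whose taps have Rayleigh-distributed envelopes, the statistical regime the abstract says applies. It is a generic illustration, not ACIRF's algorithm; the exponential delay profile and all parameter values are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def rayleigh_impulse_response(n_taps=32, decay_time=5.0):
            """One random realization of a tapped-delay-line channel.
            Each tap is zero-mean complex Gaussian, so its envelope is Rayleigh
            distributed; tap powers follow an assumed exponential delay profile."""
            delays = np.arange(n_taps)
            power = np.exp(-delays / decay_time)
            power /= power.sum()                          # normalize total channel power
            taps = (rng.normal(size=n_taps) + 1j * rng.normal(size=n_taps)) \
                   * np.sqrt(power / 2.0)
            return delays, taps

        delays, h = rayleigh_impulse_response()
        print("total channel power:", np.sum(np.abs(h) ** 2))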

  11. WASP3 (WATER QUALITY ANALYSIS PROGRAM), A HYDRODYNAMIC AND WATER QUALITY MODEL - MODEL THEORY, USER'S MANUAL, AND PROGRAMMER'S GUIDE

    EPA Science Inventory

    The Water Quality Analysis Simulation Program--3 (WASP3) is a dynamic compartment modeling system that can be used to analyze a variety of water quality problems in a diverse set of water bodies. WASP3 simulates the transport and transformation of conventional and toxic pollutant...

  12. GWSCREEN: A semi-analytical model for assessment of the groundwater pathway from surface or buried contamination: Version 2.0 theory and user's manual

    SciTech Connect

    Rood, A.S.

    1993-06-01

    GWSCREEN was developed for assessment of the groundwater pathway from leaching of radioactive and nonradioactive substances from surface or buried sources. The code was designed for implementation in the Track I and Track II assessment of CERCLA (Comprehensive Environmental Response, Compensation and Liability Act) sites identified as low-probability hazards at the Idaho National Engineering Laboratory (DOE, 1992). The code calculates the limiting soil concentration such that, after leaching and transport to the aquifer, regulatory contaminant levels in groundwater are not exceeded. The code uses a mass conservation approach to model three processes: contaminant release from a source volume, contaminant transport in the unsaturated zone, and contaminant transport in the saturated zone. The source model considers the sorptive properties and solubility of the contaminant. Transport in the unsaturated zone is described by a plug flow model. Transport in the saturated zone is calculated with a semi-analytical solution to the advection-dispersion equation in groundwater. In Version 2.0, GWSCREEN has incorporated an additional source model to calculate the impacts to groundwater resulting from releases to percolation ponds. In addition, transport of radioactive progeny has also been incorporated. GWSCREEN has shown comparable results when compared against other codes using similar algorithms and techniques. This code was designed for assessment and screening of the groundwater pathway when field data are limited. It was not intended to be a predictive tool.
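
    As an example of the semi-analytical style of transport calculation described above, the sketch below evaluates the classical Ogata-Banks solution of the one-dimensional advection-dispersion equation for a constant-concentration boundary. It is not GWSCREEN's implementation; the aquifer properties are hypothetical.

        import numpy as np
        from scipy.special import erfc

        def ogata_banks(x, t, v, D, c0=1.0):
            """Ogata-Banks solution of 1-D advection-dispersion with a constant
            concentration c0 held at x = 0 (generic semi-analytical transport)."""
            a = (x - v * t) / (2.0 * np.sqrt(D * t))
            b = (x + v * t) / (2.0 * np.sqrt(D * t))
            return 0.5 * c0 * (erfc(a) + np.exp(v * x / D) * erfc(b))

        # Hypothetical aquifer properties: velocity 0.1 m/d, dispersion 0.5 m^2/d.
        print(ogata_banks(x=50.0, t=2000.0, v=0.1, D=0.5))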

  13. On the estimability of parameters in undifferenced, uncombined GNSS network and PPP-RTK user models by means of S-system theory

    NASA Astrophysics Data System (ADS)

    Odijk, Dennis; Zhang, Baocheng; Khodabandeh, Amir; Odolinski, Robert; Teunissen, Peter J. G.

    2016-01-01

    The concept of integer ambiguity resolution-enabled Precise Point Positioning (PPP-RTK) relies on appropriate network information for the parameters that are common between the single-receiver user that applies and the network that provides this information. Most of the current methods for PPP-RTK are based on forming the ionosphere-free combination using dual-frequency Global Navigation Satellite System (GNSS) observations. These methods are therefore restrictive in the light of the development of new multi-frequency GNSS constellations, as well as from the point of view that the PPP-RTK user requires ionospheric corrections to obtain integer ambiguity resolution results based on short observation time spans. The method for PPP-RTK that is presented in this article does not have the above limitations as it is based on the undifferenced, uncombined GNSS observation equations, thereby keeping all parameters in the model. Working with the undifferenced observation equations implies that the models are rank-deficient; not all parameters are unbiasedly estimable, but only combinations of them. By application of S-system theory the model is made of full rank by constraining a minimum set of parameters, or S-basis. The choice of this S-basis determines the estimability and the interpretation of the parameters that are transmitted to the PPP-RTK users. As this choice is not unique, one has to be very careful when comparing network solutions in different S-systems; in that case the S-transformation, which is provided by the S-system method, should be used to make the comparison. Knowing the estimability and interpretation of the parameters estimated by the network is shown to be crucial for a correct interpretation of the estimable PPP-RTK user parameters, among others the essential ambiguity parameters, which have the integer property that clearly follows from the interpretation of the satellite phase biases from the network. The flexibility of the S-system method is…
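
    The sketch below illustrates the S-basis idea on a deliberately tiny toy problem (one receiver clock and four satellite clocks observed only through their differences), not the undifferenced GNSS phase/code model of the article: the design matrix is rank-deficient, and fixing one clock as the S-basis makes the remaining parameters estimable only as biased combinations of the originals.

        import numpy as np

        rng = np.random.default_rng(2)

        # Toy problem: receiver clock dt_r and satellite clocks dt^s observed only
        # as y_s = dt_r - dt^s, giving a rank deficiency of size one.
        true_dtr = 3.0
        true_dts = np.array([1.0, -2.0, 0.5, 4.0])
        y = true_dtr - true_dts + rng.normal(scale=0.01, size=4)

        # Full design matrix for x = [dt_r, dt^1, ..., dt^4] has rank 4 < 5.
        A = np.hstack([np.ones((4, 1)), -np.eye(4)])
        print("rank of full design matrix:", np.linalg.matrix_rank(A))

        # Choose an S-basis: fix dt^1 = 0 by deleting its column. The remaining
        # unknowns are the biased combinations dt_r - dt^1 and dt^s - dt^1.
        A_s = np.delete(A, 1, axis=1)
        x_hat, *_ = np.linalg.lstsq(A_s, y, rcond=None)
        print("estimable combinations:", np.round(x_hat, 3))
        print("expected:", [true_dtr - true_dts[0], *(true_dts[1:] - true_dts[0])])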

  14. HTGR Cost Model Users' Manual

    SciTech Connect

    A.M. Gandrik

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Cost Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Cost Model calculates an estimate of the capital costs, annual operating and maintenance costs, and decommissioning costs for a high-temperature gas-cooled reactor. The user can generate these costs for multiple reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first-of-a-kind, or nth-of-a-kind project phases; for a single or four-pack configuration; and for a reactor size of 350 or 600 MWt. This users' manual contains the mathematical models and operating instructions for the HTGR Cost Model. Instructions, screenshots, and examples are provided to guide the user through the HTGR Cost Model. This model was designed for users who are familiar with the HTGR design and Excel. Modification of the HTGR Cost Model should only be performed by users familiar with Excel and Visual Basic.

  15. FRAC-UNIX theory and user's manual

    SciTech Connect

    Clemo, T.M.; Miller, J.D.; Hull, L.C.; Magnuson, S.O.

    1990-05-01

    The FRAC-UNIX computer code provides a two-dimensional simulation of saturated flow and transport in fractured porous media. The code incorporates a dual-permeability approach in which the rock matrix is modeled as rectangular cells and the fractures are represented as discrete elements on the edges and diagonals of the matrix cells. A single head distribution drives otherwise independent flows in the matrix and in the fractures. Steady-state or transient flow of a single-phase fluid may be simulated. Solute or heat transport is simulated by moving imaginary marker particles in the velocity field established by the flow model, under the additional influence of dispersive and diffusive processes. Sparse-matrix techniques are utilized along with a specially developed user interface. The code is installed on a CRAY XMP24 computer using the UNICOS operating system. The initial version of this code, entitled FRACSL, incorporated the same flow and transport models, but used a commercial software package for the numerics and user interface. This report describes the theoretical basis, approach, and implementation incorporated in the code; the mechanics of operating the code; several sample problems; and the integration of code development with physical modeling and field testing. The code is fully functional, for most purposes, as shown by the results of an extensive code verification effort. Work remaining consists of refining and adding capabilities needed for several of the code verification problems; relatively simple modifications to extend its application and improve its ease of use; an improvement in the treatment of fracture junctions; and correction of an error in calculating buoyancy and concentration for diagonal fractures on a rectangular grid. 42 refs., 28 figs., 5 tabs.

  16. Cognitive Behavioral Theories Used to Explain Injection Risk Behavior among Injection Drug Users: A Review and Suggestions for the Integration of Cognitive and Environmental Models

    ERIC Educational Resources Information Center

    Wagner, Karla Dawn; Unger, Jennifer B.; Bluthenthal, Ricky N.; Andreeva, Valentina A.; Pentz, Mary Ann

    2010-01-01

    Injection drug users (IDUs) are at risk for HIV and viral hepatitis, and risky injection behavior persists despite decades of intervention. Cognitive behavioral theories (CBTs) are commonly used to help understand risky injection behavior. The authors review findings from CBT-based studies of injection risk behavior among IDUs. An extensive…

  17. UNSAT-H Version 3.0: Unsaturated Soil Water and Heat Flow Model Theory, User Manual, and Examples

    SciTech Connect

    MJ Fayer

    2000-06-12

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. During the last 4 years, the UNSAT-H model received support from the Immobilized Waste Program (IWP) of the Hanford Site's River Protection Project. This program is designing and assessing the performance of on-site disposal facilities to receive radioactive wastes that are currently stored in single- and double-shell tanks at the Hanford Site (LMHC 1999). The IWP is interested in estimates of recharge rates for current conditions and long-term scenarios involving the vadose zone disposal of tank wastes. Simulation modeling with UNSAT-H is one of the methods being used to provide those estimates (e.g., Rockhold et al. 1995; Fayer et al. 1999). To achieve the above goals for assessing water dynamics and estimating recharge rates, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow as one-dimensional processes. The UNSAT-H model simulates liquid water flow using Richards' equation (Richards 1931), water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements.
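
    As a minimal illustration of the kind of one-dimensional process equation UNSAT-H integrates, the sketch below takes explicit finite-difference steps of the Fourier heat conduction equation in a soil column. It is not UNSAT-H's numerical scheme; the grid, time step, boundary conditions, and soil properties are illustrative assumptions.

        import numpy as np

        # Explicit finite-difference steps for 1-D soil heat conduction (Fourier's law).
        nz, dz, dt = 50, 0.02, 60.0            # 50 nodes, 2 cm spacing, 60 s step
        alpha = 5e-7                           # assumed thermal diffusivity, m^2/s
        T = np.full(nz, 15.0)                  # initial soil temperature, deg C
        T[0] = 30.0                            # heated surface boundary

        for _ in range(24 * 60):               # one day of 60 s steps
            lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz**2
            T[1:-1] += dt * alpha * lap
            T[-1] = T[-2]                      # zero-gradient lower boundary

        print("temperature at 20 cm depth after one day:", round(T[10], 2))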

  18. UNSAT-H Version 3.0:Unsaturated Soil Water and Heat Flow Model: Theory, User Manual, and Examples

    SciTech Connect

    Fayer, Michael J.

    2000-06-15

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. To achieve the above goals for assessing water dynamics and estimating recharge rates, UNSAT-H addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow. The UNSAT-H model simulates liquid water flow using the Richards equation, water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements. This report includes eight example problems. The first four are verification tests of UNSAT-H capabilities. The second four example problems are demonstrations of real-world situations.

  19. UNSAT-H Version 3.0:Unsaturated Soil Water and Heat Flow Model Theory, User Manual, and Examples

    SciTech Connect

    Fayer, Michael J

    2000-06-15

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. To achieve the above goals for assessing water dynamics and estimating recharge rates, UNSAT-H addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow. The UNSAT-H model simulates liquid water flow using the Richards equation, water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements. This report includes eight example problems. The first four are verification tests of UNSAT-H capabilities. The second four example problems are demonstrations of real-world situations.

  20. Cohesive Zone Model User Element

    Energy Science and Technology Software Center (ESTSC)

    2007-04-17

    Cohesive Zone Model User Element (CZM UEL) is an implementation of a Cohesive Zone Model as an element for use in finite element simulations. CZM UEL computes a nodal force vector and stiffness matrix from a vector of nodal displacements. It is designed for structural analysts using finite element software to predict crack initiation, crack propagation, and the effect of a crack on the rest of a structure.
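
    A common way to realize the element behavior described above is a traction-separation law evaluated from an opening displacement. The sketch below implements a generic bilinear law returning traction and tangent stiffness; it is an assumed illustration, not the specific law used by CZM UEL, and all parameter values are made up.

        def bilinear_cohesive_law(delta, delta0=0.01, delta_f=0.1, t_max=30.0):
            """Traction and tangent stiffness for a bilinear traction-separation law.
            delta0: separation at peak traction, delta_f: separation at full failure,
            t_max: peak traction. All values are illustrative."""
            if delta <= delta0:                       # linear loading branch
                k = t_max / delta0
                return k * delta, k
            if delta < delta_f:                       # linear softening branch
                k = -t_max / (delta_f - delta0)
                return t_max + k * (delta - delta0), k
            return 0.0, 0.0                           # fully failed, no load carried

        for d in (0.005, 0.05, 0.2):
            traction, stiffness = bilinear_cohesive_law(d)
            print(f"delta={d}: traction={traction:.2f}, tangent={stiffness:.1f}")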

  1. User's guide for the Simplified Risk Model (SRM)

    SciTech Connect

    Peatross, R.G.; Eide, S.A.

    1996-10-01

    SRM can be used to quickly compare relative values relating to risk for many environmental management activities or alternatives at US DOE sites. The purpose of this guide is to provide the user with the essential values and decision points for each model variable. The numerical results are useful for ranking and screening purposes and should not be compared directly against absolute risk results such as those in CERCLA baseline risk assessments or Safety Analysis Reports. Implementing the SRM entails performing several preliminary steps, selecting values of the risk elements, calculating the risk equations, and checking the results. SRM considers two types of waste management states: inactive (rest) and active (transition). SRM considers risk from exposures to radionuclides and hazardous chemicals, as well as industrial hazards; however, this user's guide does not cover risk from industrial hazards (Section 10 of Eide et al. (1996) must be consulted).

  2. EFDC1D - A ONE DIMENSIONAL HYDRODYNAMIC AND SEDIMENT TRANSPORT MODEL FOR RIVER AND STREAM NETWORKS: MODEL THEORY AND USERS GUIDE

    EPA Science Inventory

    This technical report describes the new one-dimensional (1D) hydrodynamic and sediment transport model EFDC1D, which can be applied to stream networks. The model code and two sample data sets are included on the distribution CD. EFDC1D can simulate bi-directional unstea...

  3. User's manual for GILDA: An infinite lattice diffusion theory calculation

    SciTech Connect

    Le, T.T.

    1991-11-01

    GILDA is a static two-dimensional diffusion theory code that performs either buckling (B^2) or k-effective (k_eff) calculations for an infinite hexagonal lattice which is constructed by repeating identical seven-cell zones (one cell is one or seven identical homogenized hexes). GILDA was written by J. W. Stewart in 1973. This user's manual is intended to provide all of the information necessary to set up and execute a GILDA calculation and to interpret the output results. It is assumed that the user is familiar with the computer (VAX/VMS or IBM/MVS) and the JOSHUA system database on which the code is implemented. Users who are not familiar with the JOSHUA database are advised to consult additional references to understand the structure of JOSHUA records and data sets before turning to section 4 of this manual. Sections 2 and 3 of this manual serve as a theory document in which the basic diffusion theory and the numerical approximations behind the code are described. Section 4 describes the functions of the program's subroutines. Section 5 describes the input data and shows the user how to set up a problem. Section 6 describes the output results and the error messages which may be encountered during execution. Users who only wish to learn how to run the code without understanding the theory can start from section 4 and use sections 2 and 3 as references. Finally, the VAX/VMS and the IBM execution command files together with sample input records are provided in the appendices at the end of this manual.
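
    For readers unfamiliar with this class of calculation, the sketch below solves a generic one-group, one-dimensional diffusion-theory eigenvalue problem by fission-source (power) iteration. It only illustrates what a k-effective calculation is; GILDA's two-dimensional hexagonal-lattice formulation is entirely different, and the cross sections here are made up.

        import numpy as np

        # Generic one-group, 1-D slab diffusion eigenvalue problem,
        #   -D phi'' + Sig_a phi = (1/k) nu Sig_f phi,
        # solved by fission-source (power) iteration with made-up cross sections.
        n, width = 100, 100.0                        # interior nodes, slab width (cm)
        dx = width / (n + 1)
        D, sig_a, nu_sig_f = 1.2, 0.03, 0.035        # cm, 1/cm, 1/cm

        main = np.full(n, 2.0 * D / dx**2 + sig_a)
        off = np.full(n - 1, -D / dx**2)
        A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)   # zero-flux boundaries

        phi, k = np.ones(n), 1.0
        for _ in range(200):
            src = nu_sig_f * phi
            phi_new = np.linalg.solve(A, src / k)
            k *= (nu_sig_f * phi_new).sum() / src.sum()
            phi = phi_new

        print("k-effective:", round(k, 5))           # roughly 1.12 for these numbers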

  4. GWSCREEN: A semi-analytical model for assessment of the groundwater pathway from surface or buried contamination. Theory and user's manual, Version 2.0: Revision 2

    SciTech Connect

    Rood, A.S.

    1994-06-01

    Multimedia exposure assessment of hazardous chemicals and radionuclides requires that all pathways of exposure be investigated. The GWSCREEN model was designed to perform initial screening calculations for groundwater pathway impacts resulting from the leaching of surficial and buried contamination at CERCLA sites identified as low-probability hazards at the INEL. In Version 2.0, an additional model was added to calculate impacts to groundwater from the operation of a percolation pond. The model was designed to make best use of the data that would potentially be available. These data include the area and depth of contamination, the sorptive properties and solubility limit of the contaminant, the depth to the aquifer, and the physical properties of the aquifer (porosity, velocity, and dispersivity). For the pond model, data on effluent flow rates and operation time are required. Model output includes the limiting soil concentration such that, after leaching and transport to the aquifer, regulatory contaminant levels in groundwater are not exceeded. Also, groundwater concentration as a function of time may be calculated. The model considers only drinking water consumption and does not include the transfer of contamination to food products due to irrigation with contaminated water. Radiological dose, carcinogenic risk, and the hazard quotient are calculated for the peak time using the user-defined input mass (or activity). Appendices contain sample problems and the source code listing.

  5. The User-Oriented Evaluator's Role in Formulating a Program Theory: Using a Theory-Driven Approach

    ERIC Educational Resources Information Center

    Christie, Christina A.; Alkin, Marvin C.

    2003-01-01

    Program theory plays a prominent role in many evaluations, not only in theory-driven evaluations. This paper presents a case study of the process of developing and refining a program's theory within a user-oriented evaluation. In user-oriented (or utilization-focused) evaluations, primary users can play a role in defining their own program theory.…

  6. Cognitive Behavioral Theories Used to Explain Injection Risk Behavior Among Injection Drug Users: A Review and Suggestions for the Integration of Cognitive and Environmental Models

    PubMed Central

    Wagner, Karla D.; Unger, Jennifer B.; Bluthenthal, Ricky N.; Andreeva, Valentina A.; Pentz, Mary Ann

    2011-01-01

    Injection drug users (IDUs) are at risk for HIV and viral hepatitis, and risky injection behavior persists despite decades of intervention. Cognitive behavioral theories (CBT) are commonly used to help understand risky injection behavior. We review findings from CBT-based studies of injection risk behavior among IDUs. An extensive literature search was conducted in Spring 2007. In total 33 studies were reviewed—26 epidemiological and 7 intervention studies. Findings suggest that some theoretical constructs have received fairly consistent support (e.g., self-efficacy, social norms), while others have yielded inconsistent or null results (e.g., perceived susceptibility, knowledge, behavioral intentions, perceived barriers, perceived benefits, response efficacy, perceived severity). We offer some possible explanations for these inconsistent findings, including differences in theoretical constructs and measures across studies and a need to examine the environmental structures that influence risky behaviors. Greater integration of CBT with a risk environment perspective may yield more conclusive findings and more effective interventions in the future. PMID:20705809

  7. User's manual for GILDA: An infinite lattice diffusion theory calculation

    SciTech Connect

    Le, T.T.

    1991-11-01

    GILDA is a static two-dimensional diffusion theory code that performs either buckling (B^2) or k-effective (k_eff) calculations for an infinite hexagonal lattice which is constructed by repeating identical seven-cell zones (one cell is one or seven identical homogenized hexes). GILDA was written by J. W. Stewart in 1973. This user's manual is intended to provide all of the information necessary to set up and execute a GILDA calculation and to interpret the output results. It is assumed that the user is familiar with the computer (VAX/VMS or IBM/MVS) and the JOSHUA system database on which the code is implemented. Users who are not familiar with the JOSHUA database are advised to consult additional references to understand the structure of JOSHUA records and data sets before turning to section 4 of this manual. Sections 2 and 3 of this manual serve as a theory document in which the basic diffusion theory and the numerical approximations behind the code are described. Section 4 describes the functions of the program's subroutines. Section 5 describes the input data and shows the user how to set up a problem. Section 6 describes the output results and the error messages which may be encountered during execution. Users who only wish to learn how to run the code without understanding the theory can start from section 4 and use sections 2 and 3 as references. Finally, the VAX/VMS and the IBM execution command files together with sample input records are provided in the appendices at the end of this manual.

  8. Predicting Facebook users' online privacy protection: risk, trust, norm focus theory, and the theory of planned behavior.

    PubMed

    Saeri, Alexander K; Ogilvie, Claudette; La Macchia, Stephen T; Smith, Joanne R; Louis, Winnifred R

    2014-01-01

    The present research adopts an extended theory of planned behavior model that included descriptive norms, risk, and trust to investigate online privacy protection in Facebook users. Facebook users (N = 119) completed a questionnaire assessing their attitude, subjective injunctive norm, subjective descriptive norm, perceived behavioral control, implicit perceived risk, trust of other Facebook users, and intentions toward protecting their privacy online. Behavior was measured indirectly 2 weeks after the study. The data show partial support for the theory of planned behavior and strong support for the independence of subjective injunctive and descriptive norms. Risk also uniquely predicted intentions over and above the theory of planned behavior, but there were no unique effects of trust on intentions, nor of risk or trust on behavior. Implications are discussed. PMID:25154118

  9. The Chaos Theory of Careers: A User's Guide

    ERIC Educational Resources Information Center

    Bright, Jim E. H.; Pryor, Robert G. L.

    2005-01-01

    The purpose of this article is to set out the key elements of the Chaos Theory of Careers. The complexity of influences on career development presents a significant challenge to traditional predictive models of career counseling. Chaos theory can provide a more appropriate description of career behavior, and the theory can be applied with clients…

  10. Information filtering via collaborative user clustering modeling

    NASA Astrophysics Data System (ADS)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Yu, Lu; Liu, Chuang; Liu, Hao; Yan, Xiao-Yong

    2014-02-01

    The past few years have witnessed the great success of recommender systems, which can significantly help users find personalized items in the information era. One of the most widely applied recommendation methods is Matrix Factorization (MF). However, most research on this topic has focused on mining the direct relationships between users and items. In this paper, we optimize the standard MF by integrating a user clustering regularization term. Our model considers not only the user-item rating information but also the user information. In addition, we compare the proposed model with three other typical methods: User-Mean (UM), Item-Mean (IM), and standard MF. Experimental results on two real-world datasets, MovieLens 1M and MovieLens 100k, show that our method performs better than the other three methods in recommendation accuracy.
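
    A minimal sketch of the approach described above (standard matrix factorization augmented with a user-clustering regularization term that pulls each user factor toward its cluster centroid) is given below. The tiny synthetic rating matrix, cluster assignment, and hyperparameters are assumptions for illustration, not the paper's MovieLens setup or its exact objective.

        import numpy as np

        rng = np.random.default_rng(0)

        # Tiny synthetic rating matrix (0 = unobserved); clusters and hyperparameters
        # are toy choices.
        R = np.array([[5, 4, 0, 1],
                      [4, 5, 1, 0],
                      [0, 1, 5, 4],
                      [1, 0, 4, 5]], dtype=float)
        cluster = np.array([0, 0, 1, 1])            # assumed user clusters
        n_users, n_items = R.shape
        k = 2                                       # latent dimension
        P = 0.1 * rng.normal(size=(n_users, k))     # user factors
        Q = 0.1 * rng.normal(size=(n_items, k))     # item factors
        lr, lam, beta = 0.02, 0.05, 0.1             # step size, L2 weight, cluster weight

        for _ in range(2000):
            centroids = np.array([P[cluster == c].mean(axis=0) for c in range(2)])
            for u, i in zip(*np.nonzero(R)):
                err = R[u, i] - P[u] @ Q[i]
                # standard MF gradient plus a pull of P[u] toward its cluster centroid
                P[u] += lr * (err * Q[i] - lam * P[u] - beta * (P[u] - centroids[cluster[u]]))
                Q[i] += lr * (err * P[u] - lam * Q[i])

        print(np.round(P @ Q.T, 2))                 # reconstructed ratings, observed and missing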

  11. ChISELS 1.0: theory and user manual: a theoretical modeler of deposition and etch processes in microsystems fabrication.

    SciTech Connect

    Plimpton, Steven James; Schmidt, Rodney Cannon; Ho, Pauline; Musson, Lawrence Cale

    2006-09-01

    Chemically Induced Surface Evolution with Level-Sets--ChISELS--is a parallel code for modeling 2D and 3D material depositions and etches at feature scales on patterned wafers at low pressures. Designed for efficient use on a variety of computer architectures ranging from single-processor workstations to advanced massively parallel computers running MPI, ChISELS is a platform on which to build and improve upon previous feature-scale modeling tools while taking advantage of the most recent advances in load balancing and scalable solution algorithms. Evolving interfaces are represented using the level-set method, and the evolution equations are time-integrated using a Semi-Lagrangian approach [1]. The computational meshes used are quad-trees (2D) and oct-trees (3D), constructed such that grid refinement is localized to regions near the surface interfaces. As the interface evolves, the mesh is dynamically reconstructed as needed for the grid to remain fine only around the interface. For parallel computation, a domain decomposition scheme with dynamic load balancing is used to distribute the computational work across processors. A ballistic transport model is employed to solve for the fluxes incident on each of the surface elements. Surface chemistry is computed by either coupling to the CHEMKIN software [2] or by providing user-defined subroutines. This report describes the theoretical underpinnings, methods, and practical use instructions of the ChISELS 1.0 computer code.
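
    As a minimal illustration of the level-set idea only (not ChISELS' Semi-Lagrangian, adaptive quad/oct-tree scheme), the sketch below advances a two-dimensional level-set function under a constant normal speed using a first-order Godunov upwind discretization; the grid size, speed, and initial circle are arbitrary choices.

        import numpy as np

        # First-order level-set update, phi_t + F |grad phi| = 0, on a uniform grid.
        n = 64
        h, dt, F = 1.0 / n, 0.2 / n, 1.0            # spacing, time step, normal speed
        x = np.linspace(0.0, 1.0, n)
        X, Y = np.meshgrid(x, x)
        phi = np.sqrt((X - 0.5) ** 2 + (Y - 0.5) ** 2) - 0.2   # signed distance to a circle

        for _ in range(40):
            dxm = np.diff(phi, axis=1, prepend=phi[:, :1]) / h   # backward differences
            dxp = np.diff(phi, axis=1, append=phi[:, -1:]) / h   # forward differences
            dym = np.diff(phi, axis=0, prepend=phi[:1, :]) / h
            dyp = np.diff(phi, axis=0, append=phi[-1:, :]) / h
            grad = np.sqrt(np.maximum(dxm, 0.0) ** 2 + np.minimum(dxp, 0.0) ** 2 +
                           np.maximum(dym, 0.0) ** 2 + np.minimum(dyp, 0.0) ** 2)
            phi -= dt * F * grad                     # Godunov upwind step for F > 0

        # The zero level set (the interface) expands outward by roughly F * elapsed time.
        print("fraction of domain inside the interface:", round((phi < 0).mean(), 3))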

  12. Anaerobic digestion analysis model: User's manual

    SciTech Connect

    Ruth, M.; Landucci, R.

    1994-08-01

    The Anaerobic Digestion Analysis Model (ADAM) has been developed to assist investigators in performing preliminary economic analyses of anaerobic digestion processes. The model, which runs under Microsoft Excel(TM), is capable of estimating the economic performance of several different waste digestion process configurations that are defined by the user through a series of option selections. The model can be used to predict required feedstock tipping fees, product selling prices, utility rates, and raw material unit costs. The model is intended to be used as a tool to perform preliminary economic estimates that could be used to carry out simple screening analyses. The model's current parameters are based on engineering judgments and are not reflective of any existing process; therefore, they should be carefully evaluated and modified if necessary to reflect the process under consideration. The accuracy and level of uncertainty of the estimated capital investment and operating costs are dependent on the accuracy and level of uncertainty of the model's input parameters. The underlying methodology is capable of producing results accurate to within ±30% of actual costs.

  13. Theory Modeling and Simulation

    SciTech Connect

    Shlachter, Jack

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  14. XRLSim model specifications and user interfaces

    SciTech Connect

    Young, K.D.; Breitfeller, E.; Woodruff, J.P.

    1989-12-01

    The two chapters in this manual document the engineering development leading to modification of XRLSim -- an Ada-based computer program developed to provide a realistic simulation of an x-ray laser weapon platform. Complete documentation of the FY88 effort to develop XRLSim was published in April 1989 as UCID-21736, XRLSIM Model Specifications and User Interfaces, by L. C. Ng, D. T. Gavel, R. M. Shectman, P. L. Sholl, and J. P. Woodruff. The FY89 effort has been primarily to enhance the x-ray laser weapon-platform model fidelity. Chapter 1 of this manual details enhancements made to XRLSim model specifications during FY89. Chapter 2 provides the user with changes in user interfaces brought about by these enhancements. This chapter is offered as a series of deletions, replacements, and insertions to the original document to enable XRLSim users to implement enhancements developed during FY89.

  15. HTGR Application Economic Model Users' Manual

    SciTech Connect

    A.M. Gandrik

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Application Economic Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Application Economic Model calculates either the required selling price of power and/or heat for a given internal rate of return (IRR) or the IRR for power and/or heat being sold at the market price. The user can generate these economic results for a range of reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first-of-a-kind, or nth-of-a-kind project phases; for up to 16 reactor modules; and for module ratings of 200, 350, or 600 MWt. This users' manual contains the mathematical models and operating instructions for the HTGR Application Economic Model. Instructions, screenshots, and examples are provided to guide the user through the HTGR Application Economic Model. This model was designed for users who are familiar with the HTGR design, Excel, and engineering economics. Modification of the HTGR Application Economic Model should only be performed by users familiar with the HTGR and its applications, Excel, and Visual Basic.
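
    As a small illustration of the "IRR for a given cash-flow stream" calculation mentioned above, the sketch below finds the internal rate of return by bisection on the net present value. The cash flows are hypothetical and the method generic; it is not the spreadsheet's actual formulation.

        def npv(rate, cash_flows):
            """Net present value of a list of end-of-year cash flows."""
            return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows))

        def irr(cash_flows, lo=1e-6, hi=1.0, tol=1e-8):
            """Internal rate of return by bisection (assumes a single sign change)."""
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if npv(mid, cash_flows) > 0.0:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        # Hypothetical cash flows: capital cost up front, then 30 years of net revenue.
        flows = [-2500.0] + [220.0] * 30        # million dollars, illustrative only
        print(f"IRR = {irr(flows):.2%}")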

  16. The Modular Modeling System (MMS): User's Manual

    USGS Publications Warehouse

    Leavesley, G.H.; Restrepo, P.J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.

    1996-01-01

    The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.

  17. Parallel community climate model: Description and user's guide

    SciTech Connect

    Drake, J.B.; Flanery, R.E.; Semeraro, B.D.; Worley, P.H.

    1996-07-15

    This report gives an overview of a parallel version of the NCAR Community Climate Model, CCM2, implemented for MIMD massively parallel computers using a message-passing programming paradigm. The parallel implementation was developed on an Intel iPSC/860 with 128 processors and on the Intel Delta with 512 processors, and the initial target platform for the production version of the code is the Intel Paragon with 2048 processors. Because the implementation uses standard, portable message-passing libraries, the code has been easily ported to other multiprocessors supporting a message-passing programming paradigm. The parallelization strategy used is to decompose the problem domain into geographical patches and assign each processor the computation associated with a distinct subset of the patches. With this decomposition, the physics calculations involve only grid points and data local to a processor and are performed in parallel. Using parallel algorithms developed for the semi-Lagrangian transport, the fast Fourier transform and the Legendre transform, both physics and dynamics are computed in parallel with minimal data movement and modest change to the original CCM2 source code. Sequential or parallel history tapes are written and input files (in history tape format) are read sequentially by the parallel code to promote compatibility with production use of the model on other computer systems. A validation exercise has been performed with the parallel code and is detailed along with some performance numbers on the Intel Paragon and the IBM SP2. A discussion of reproducibility of results is included. A user's guide for the PCCM2 version 2.1 on the various parallel machines completes the report. Procedures for compilation, setup and execution are given. A discussion of code internals is included for those who may wish to modify and use the program in their own research.

  18. GEOS-5 Chemistry Transport Model User's Guide

    NASA Technical Reports Server (NTRS)

    Kouatchou, J.; Molod, A.; Nielsen, J. E.; Auer, B.; Putman, W.; Clune, T.

    2015-01-01

    The Goddard Earth Observing System version 5 (GEOS-5) General Circulation Model (GCM) makes use of the Earth System Modeling Framework (ESMF) to enable model configurations with many functions. One of the options of the GEOS-5 GCM is the GEOS-5 Chemistry Transport Model (GEOS-5 CTM), which is an offline simulation of chemistry and constituent transport driven by a specified meteorology and other model output fields. This document describes the basic components of the GEOS-5 CTM and is a user's guide on how to obtain and run simulations on the NCCS Discover platform. In addition, we provide information on how to change the model configuration input files to meet users' needs.

  19. User behavioral model in hypertext environment

    NASA Astrophysics Data System (ADS)

    Moskvin, Oleksii M.; Sailarbek, Saltanat; Gromaszek, Konrad

    2015-12-01

    Users traversing Internet resources, and the activities they perform, play an important role that in practice is rarely taken into account by Internet resource owners when adjusting and optimizing hypertext structure. An optimal hypertext structure allows users to locate pages of interest, which are the goals of their informational search, more quickly. The paper presents a model that analyzes the behavior of the user audience in order to identify their goals in a particular hypertext segment and to find optimal routes for reaching those goals in terms of route length and informational value. The main potential application of the proposed model is in systems that evaluate hypertext networks and optimize their referential structure for faster information retrieval.

  20. CONSTRUCTION OF EDUCATIONAL THEORY MODELS.

    ERIC Educational Resources Information Center

    MACCIA, ELIZABETH S.; AND OTHERS

    This study delineated models which have potential use in generating educational theory. A theory models method was formulated. By selecting and ordering concepts from other disciplines, the investigators formulated seven theory models. The final step of devising educational theory from the theory models was performed only to the extent required to…

  1. Wake Vortex Inverse Model User's Guide

    NASA Technical Reports Server (NTRS)

    Lai, David; Delisi, Donald

    2008-01-01

    NorthWest Research Associates (NWRA) has developed an inverse model for inverting landing aircraft vortex data. The data used for the inversion are the time evolution of the lateral transport position and vertical position of both the port and starboard vortices. The inverse model performs iterative forward model runs using various estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Forward model predictions of lateral transport and altitude are then compared with the observed data. Differences between the data and model predictions guide the choice of vortex parameter values, crosswind profile and circulation evolution in the next iteration. Iterations are performed until a user-defined criterion is satisfied. Currently, the inverse model is set to stop when the improvement in the rms deviation between the data and model predictions is less than 1 percent for two consecutive iterations. The forward model used in this inverse model is a modified version of the Shear-APA model. A detailed description of this forward model, the inverse model, and its validation are presented in a different report (Lai, Mellman, Robins, and Delisi, 2007). This document is a User's Guide for the Wake Vortex Inverse Model. Section 2 presents an overview of the inverse model program. Execution of the inverse model is described in Section 3. When executing the inverse model, a user is requested to provide the name of an input file which contains the inverse model parameters, the various datasets, and directories needed for the inversion. A detailed description of the list of parameters in the inversion input file is presented in Section 4. A user has an option to save the inversion results of each lidar track in a mat-file (a condensed data file in Matlab format). These saved mat-files can be used for post-inversion analysis. A description of the contents of the saved files is given in Section 5. An example of an inversion input
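
    The sketch below mimics the iterate-the-forward-model-and-compare loop described above with a toy forward model (constant crosswind drift and constant descent rate) standing in for the Shear-APA model, and a crude random parameter search in place of the real update strategy; only the 1-percent/two-consecutive-iterations stopping rule is taken from the text.

        import numpy as np

        rng = np.random.default_rng(3)

        def forward_model(crosswind, descent, t):
            """Toy forward model (constant drift and descent); a stand-in for Shear-APA."""
            return crosswind * t, 100.0 - descent * t        # lateral position, altitude

        # Synthetic "observed" vortex track with noise, for demonstration only.
        t = np.linspace(0.0, 60.0, 31)
        y_obs, z_obs = forward_model(2.0, 1.2, t)
        y_obs = y_obs + rng.normal(scale=1.0, size=t.size)
        z_obs = z_obs + rng.normal(scale=1.0, size=t.size)

        def rms(params):
            y, z = forward_model(params[0], params[1], t)
            return np.sqrt(np.mean((y - y_obs) ** 2 + (z - z_obs) ** 2))

        params = np.array([1.0, 0.5])                 # initial parameter guess
        best, slow_iters = rms(params), 0
        for _ in range(5000):                         # hard iteration cap for this sketch
            trial = params + rng.normal(scale=0.1, size=2)
            score = rms(trial)
            if score < best:                          # keep only improving iterations
                slow_iters = slow_iters + 1 if (best - score) / best < 0.01 else 0
                params, best = trial, score
                if slow_iters >= 2:                   # <1 percent improvement twice in a row
                    break

        print("estimated crosswind, descent rate:", np.round(params, 2), " rms:", round(best, 2))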

  2. User's appraisal of yield model evaluation criteria

    NASA Technical Reports Server (NTRS)

    Warren, F. B. (Principal Investigator)

    1982-01-01

    The five major potential USDA users of AgRISTAR crop yield forecast models rated the Yield Model Development (YMD) project Test and Evaluation Criteria by the importance placed on them. These users agreed that the "TIMELINESS" and "RELIABILITY" of the forecast yields would be of major importance in determining if a proposed yield model was worthy of adoption. Although there was considerable difference of opinion as to the relative importance of the other criteria, "COST", "OBJECTIVITY", "ADEQUACY", and "MEASURES OF ACCURACY" generally were felt to be more important than "SIMPLICITY" and "CONSISTENCY WITH SCIENTIFIC KNOWLEDGE". However, some of the comments which accompanied the ratings did indicate that several of the definitions and descriptions of the criteria were confusing.

  3. Modeling a Theory-Based Approach to Examine the Influence of Neurocognitive Impairment on HIV Risk Reduction Behaviors Among Drug Users in Treatment.

    PubMed

    Huedo-Medina, Tania B; Shrestha, Roman; Copenhaver, Michael

    2016-08-01

    Although it is well established that people who use drugs (PWUDs) are characterized by significant neurocognitive impairment (NCI), there has been no examination of how NCI may impede one's ability to accrue the expected HIV prevention benefits stemming from an otherwise efficacious intervention. This paper incorporated a theoretical Information-Motivation-Behavioral Skills (IMB) model of health behavior change to examine the potential influence of NCI on HIV prevention outcomes as significantly moderating the mediation defined in the original model. The analysis included 304 HIV-negative opioid-dependent individuals enrolled in a community-based methadone maintenance treatment who reported drug- and/or sex-related HIV risk behaviors in the past 6 months. Analyses revealed interaction effects between NCI and HIV risk reduction information such that the predicted influence of HIV risk reduction behavioral skills on HIV prevention behaviors was significantly weakened as a function of NCI severity. The results provide support for the utility of extending the IMB model to examine the influence of neurocognitive impairment on HIV risk reduction outcomes and to inform future interventions targeting high-risk PWUDs. PMID:27052845

  4. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    USGS Publications Warehouse

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.

  5. Theory and modeling group

    NASA Astrophysics Data System (ADS)

    Holman, Gordon D.

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  6. Theory and modeling group

    NASA Technical Reports Server (NTRS)

    Holman, Gordon D.

    1989-01-01

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  7. The NATA code; theory and analysis. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Bade, W. L.; Yos, J. M.

    1975-01-01

    The NATA code is a computer program for calculating quasi-one-dimensional gas flow in axisymmetric nozzles and rectangular channels, primarily to describe conditions in electric arc-heated wind tunnels. The program provides solutions based on frozen chemistry, chemical equilibrium, and nonequilibrium flow with finite reaction rates. The shear and heat flux on the nozzle wall are calculated and boundary layer displacement effects on the inviscid flow are taken into account. The program contains compiled-in thermochemical, chemical kinetic and transport cross section data for high-temperature air, CO2-N2-Ar mixtures, helium, and argon. It calculates stagnation conditions on axisymmetric or two-dimensional models and conditions on the flat surface of a blunt wedge. Included in the report are: definitions of the inputs and outputs; precoded data on gas models, reactions, thermodynamic and transport properties of species, and nozzle geometries; explanations of diagnostic outputs and code abort conditions; test problems; and a user's manual for an auxiliary program (NOZFIT) used to set up analytical curvefits to nozzle profiles.

  8. Stimulation model for lenticular sands: Volume 2, Users manual

    SciTech Connect

    Rybicki, E.F.; Luiskutty, C.T.; Sutrick, J.S.; Palmer, I.D.; Shah, G.H.; Tomutsa, L.

    1987-07-01

    This User's Manual contains information for four fracture/proppant models. TUPROP1 contains a Geertsma and de Klerk type fracture model. The section of the program utilizing the proppant fracture geometry data from the pseudo three-dimensional highly elongated fracture model is called TUPROPC. The analogous proppant section of the program that was modified to accept fracture shape data from SA3DFRAC is called TUPROPS. TUPROPS also includes fracture closure. Finally there is the penny fracture and its proppant model, PENNPROP. In the first three chapters, the proppant sections are based on the same theory for determining the proppant distribution but have modifications to support variable height fractures and modifications to accept fracture geometry from three different fracture models. Thus, information about each proppant model in the User's Manual builds on information supplied in the previous chapter. The exception to the development of combined treatment models is the penny fracture and its proppant model. In this case, a completely new proppant model was developed. A description of how to use the combined treatment model for the penny fracture is contained in Chapter 4. 2 refs.

  9. Theory of Chemical Modeling

    NASA Astrophysics Data System (ADS)

    Kühn, Michael

    In order to deal with the complexity of natural systems, simplified models are employed to illustrate the principal and regulatory factors controlling a chemical system. Following the aphorism of Albert Einstein that everything should be made as simple as possible, but not simpler, models need not be completely realistic to be useful (Stumm and Morgan 1996), but they do need to strike a successful balance between realism and practicality. Properly constructed, a model is neither so simplified that it is unrealistic nor so detailed that it cannot be readily evaluated and applied to the problem of interest (Bethke 1996). The results of a model have to be at least partially observable or experimentally verifiable (Zhu and Anderson 2002). Geochemical modeling theories are presented here in a sequence of increasing complexity, from geochemical equilibrium models to kinetic, reaction path, and finally coupled transport and reaction models. The description is far from complete but provides what is needed to set up the reactive transport models of hydrothermal systems developed in subsequent chapters. Extensive reviews of geochemical models in general can be found in the literature (Appelo and Postma 1999, Bethke 1996, Melchior and Bassett 1990, Nordstrom and Ball 1984, Paschke and van der Heijde 1996).
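
    As a concrete illustration of the simplest level in this hierarchy, a geochemical equilibrium model reduces to solving mass-action and mass-balance equations. The sketch below works through a single mass-action law for the dissociation of acetic acid; the species and constants are our illustrative choice, not taken from the chapter.

        import math

        # Equilibrium model for HA <=> H+ + A- with mass action Ka = [H+][A-]/[HA]
        # and mass balance C_total = [HA] + [A-].  Assuming [H+] = [A-] = x gives
        # the quadratic  x**2 + Ka*x - Ka*C_total = 0.
        Ka = 1.8e-5        # dissociation constant of acetic acid (illustrative)
        C_total = 0.01     # total acid concentration, mol/L

        x = (-Ka + math.sqrt(Ka**2 + 4.0 * Ka * C_total)) / 2.0
        print(f"[H+] = {x:.3e} mol/L, pH = {-math.log10(x):.2f}")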

  10. CME Theory and Models

    NASA Astrophysics Data System (ADS)

    Forbes, T. G.; Linker, J. A.; Chen, J.; Cid, C.; Kóta, J.; Lee, M. A.; Mann, G.; Mikić, Z.; Potgieter, M. S.; Schmidt, J. M.; Siscoe, G. L.; Vainio, R.; Antiochos, S. K.; Riley, P.

    This chapter provides an overview of current efforts in the theory and modeling of CMEs. Five key areas are discussed: (1) CME initiation; (2) CME evolution and propagation; (3) the structure of interplanetary CMEs derived from flux rope modeling; (4) CME shock formation in the inner corona; and (5) particle acceleration and transport at CME driven shocks. In the section on CME initiation three contemporary models are highlighted. Two of these focus on how energy stored in the coronal magnetic field can be released violently to drive CMEs. The third model assumes that CMEs can be directly driven by currents from below the photosphere. CMEs evolve considerably as they expand from the magnetically dominated lower corona into the advectively dominated solar wind. The section on evolution and propagation presents two approaches to the problem. One is primarily analytical and focuses on the key physical processes involved. The other is primarily numerical and illustrates the complexity of possible interactions between the CME and the ambient medium. The section on flux rope fitting reviews the accuracy and reliability of various methods. The section on shock formation considers the effect of the rapid decrease in the magnetic field and plasma density with height. Finally, in the section on particle acceleration and transport, some recent developments in the theory of diffusive particle acceleration at CME shocks are discussed. These include efforts to combine self-consistently the process of particle acceleration in the vicinity of the shock with the subsequent escape and transport of particles to distant regions.

  11. CME Theory and Models

    NASA Astrophysics Data System (ADS)

    Forbes, T. G.; Linker, J. A.; Chen, J.; Cid, C.; Kóta, J.; Lee, M. A.; Mann, G.; Mikić, Z.; Potgieter, M. S.; Schmidt, J. M.; Siscoe, G. L.; Vainio, R.; Antiochos, S. K.; Riley, P.

    2006-03-01

    This chapter provides an overview of current efforts in the theory and modeling of CMEs. Five key areas are discussed: (1) CME initiation; (2) CME evolution and propagation; (3) the structure of interplanetary CMEs derived from flux rope modeling; (4) CME shock formation in the inner corona; and (5) particle acceleration and transport at CME driven shocks. In the section on CME initiation three contemporary models are highlighted. Two of these focus on how energy stored in the coronal magnetic field can be released violently to drive CMEs. The third model assumes that CMEs can be directly driven by currents from below the photosphere. CMEs evolve considerably as they expand from the magnetically dominated lower corona into the advectively dominated solar wind. The section on evolution and propagation presents two approaches to the problem. One is primarily analytical and focuses on the key physical processes involved. The other is primarily numerical and illustrates the complexity of possible interactions between the CME and the ambient medium. The section on flux rope fitting reviews the accuracy and reliability of various methods. The section on shock formation considers the effect of the rapid decrease in the magnetic field and plasma density with height. Finally, in the section on particle acceleration and transport, some recent developments in the theory of diffusive particle acceleration at CME shocks are discussed. These include efforts to combine self-consistently the process of particle acceleration in the vicinity of the shock with the subsequent escape and transport of particles to distant regions.

  12. Modeling users' activity on Twitter networks: validation of Dunbar's number

    NASA Astrophysics Data System (ADS)

    Goncalves, Bruno; Perra, Nicola; Vespignani, Alessandro

    2012-02-01

    Microblogging and mobile devices appear to augment human social capabilities, which raises the question of whether they remove cognitive or biological constraints on human communication. In this paper we analyze a dataset of Twitter conversations collected over six months, involving 1.7 million individuals, and test the theoretical cognitive limit on the number of stable social relationships known as Dunbar's number. We find that the data agree with Dunbar's result: users can entertain a maximum of 100-200 stable relationships. Thus, the "economy of attention" is limited in the online world by cognitive and biological constraints, as predicted by Dunbar's theory. We propose a simple model of user behavior, based on finite priority queuing and limited time resources, that reproduces the observed social behavior.
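
    The model mentioned above couples a finite priority queue of contacts with a limited messaging budget. The following is only a toy sketch of that general idea on synthetic interactions, not the authors' model; the queue size, message budget, and stability threshold are arbitrary assumptions.

        import random
        from collections import Counter

        def simulate_agent(n_contacts=1000, queue_size=150, msgs_per_day=20, days=180, seed=0):
            """Toy agent: each day it spends a fixed budget of messages, preferring
            contacts already in its finite priority queue; contacts that are not
            reinforced are evicted, which caps the number of stable ties."""
            rng = random.Random(seed)
            queue = []                      # finite priority queue of active contacts
            strength = Counter()            # messages exchanged with each contact
            for _ in range(days):
                for _ in range(msgs_per_day):
                    if queue and rng.random() < 0.8:      # mostly reinforce existing ties
                        target = rng.choice(queue)
                    else:                                  # occasionally contact someone new
                        target = rng.randrange(n_contacts)
                    strength[target] += 1
                    if target not in queue:
                        queue.append(target)
                        if len(queue) > queue_size:        # evict the weakest tie
                            queue.remove(min(queue, key=strength.__getitem__))
            # "stable" ties: contacts reinforced more than a handful of times
            return sum(1 for c in strength.values() if c >= 5)

        print(simulate_agent())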

  13. Composing user models through logic analysis.

    PubMed

    Bergeron, B P; Shiffman, R N; Rouse, R L; Greenes, R A

    1991-01-01

    The evaluation of tutorial strategies, interface designs, and courseware content is an area of active research in the medical education community. Many of the evaluation techniques that have been developed (e.g., program instrumentation) commonly produce data that are difficult to decipher or to interpret effectively. We have explored the use of decision tables to automatically simplify and categorize data for the composition of user models--descriptions of students' learning styles and preferences. An approach to user modeling that is based on decision tables has numerous advantages compared with traditional manual techniques or methods that rely on rule-based expert systems or neural networks. Decision tables provide a mechanism whereby overwhelming quantities of data can be condensed into an easily interpreted and manipulated form. Compared with conventional rule-based expert systems, decision tables are more amenable to modification. Unlike classification systems based on neural networks, the entries in decision tables are readily available for inspection and manipulation. Decision tables, as descriptions of observed behavior, also provide automatic checks for ambiguity in the tracking data. PMID:1807690
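
    To make the decision-table idea concrete, the sketch below encodes a small table mapping condensed tracking observations to learner categories and flags ambiguous rows; the conditions and category labels are invented for illustration and are not taken from the paper.

        # A toy decision table: each row maps a combination of observed conditions
        # (derived from tracking data) to a learner-style category.  The table is a
        # plain data structure, so it is easy to inspect, modify, and check for
        # ambiguity (two rows matching the same observation).
        DECISION_TABLE = [
            ({"help_requests": "high", "time_per_item": "long"},  "needs guided tutorial"),
            ({"help_requests": "high", "time_per_item": "short"}, "impulsive / guessing"),
            ({"help_requests": "low",  "time_per_item": "long"},  "reflective"),
            ({"help_requests": "low",  "time_per_item": "short"}, "proficient"),
        ]

        def classify(observation):
            matches = [label for conds, label in DECISION_TABLE
                       if all(observation.get(k) == v for k, v in conds.items())]
            if len(matches) > 1:
                raise ValueError(f"ambiguous table for {observation}: {matches}")
            return matches[0] if matches else "unclassified"

        print(classify({"help_requests": "high", "time_per_item": "long"}))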

  14. Cognitive Modeling of Video Game Player User Experience

    NASA Technical Reports Server (NTRS)

    Bohil, Corey J.; Biocca, Frank A.

    2010-01-01

    This paper argues for the use of cognitive modeling to gain a detailed and dynamic look into user experience during game play. Applying cognitive models to game play data can help researchers understand a player's attentional focus, memory status, learning state, and decision strategies (among other things) as these cognitive processes occur throughout game play. This is in stark contrast to the common approach of trying to assess the long-term impact of games on cognitive functioning after game play has ended. We describe what cognitive models are, what they can be used for, and how game researchers could benefit by adopting these methods. We also provide details of a single model - based on decision field theory - that has been successfully applied to data sets from memory, perception, and decision making experiments, and has recently found application in real world scenarios. We examine possibilities for applying this model to game-play data.
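
    Decision field theory describes choice as a stochastic accumulation of preference: at each moment attention samples an attribute, options are contrasted, and the resulting valence feeds a preference state with self-memory and lateral inhibition. The sketch below is a minimal, generic implementation of that accumulation for illustration only; the payoff matrix and parameter values are assumptions, not taken from the paper.

        import numpy as np

        def dft_choice(M, n_steps=200, s=0.95, inhibition=0.02, seed=0):
            """Minimal decision-field-theory style accumulation for illustration.
            M: (n_options, n_attributes) matrix of subjective attribute values."""
            rng = np.random.default_rng(seed)
            n_opt, n_att = M.shape
            # Feedback matrix: self-memory on the diagonal, lateral inhibition off it
            S = inhibition * -np.ones((n_opt, n_opt)) + (s + inhibition) * np.eye(n_opt)
            # Contrast matrix: compare each option against the average of the others
            C = np.eye(n_opt) - (np.ones((n_opt, n_opt)) - np.eye(n_opt)) / (n_opt - 1)
            P = np.zeros(n_opt)                           # preference state
            for _ in range(n_steps):
                w = rng.multinomial(1, [1 / n_att] * n_att)   # attention to one attribute
                P = S @ P + C @ M @ w                     # momentary valence drives accumulation
            return P

        M = np.array([[3.0, 1.0],       # option A: good on attribute 1, poor on 2
                      [1.0, 3.0],       # option B: the reverse
                      [2.0, 2.0]])      # option C: middling on both
        print(dft_choice(M))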

  15. ModelMate - A graphical user interface for model analysis

    USGS Publications Warehouse

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  16. Predicting radiative heat transfer in thermochemical nonequilibrium flow fields. Theory and user's manual for the LORAN code

    NASA Technical Reports Server (NTRS)

    Chambers, Lin Hartung

    1994-01-01

    The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.

  17. The capillary hysteresis model HYSTR: User's guide

    SciTech Connect

    Niemi, A.; Bodvarsson, G.S.

    1991-11-01

    The potential disposal of nuclear waste in the unsaturated zone at Yucca Mountain, Nevada, has generated increased interest in the study of fluid flow through unsaturated media. In the near future, large-scale field tests will be conducted at the Yucca Mountain site, and work is now being done to design and analyze these tests. As part of these efforts a capillary hysteresis model has been developed. A computer program has been written to calculate the hysteretic relationship between capillary pressure (φ) and liquid saturation (S_l); it is designed to be easily incorporated into any numerical unsaturated flow simulator that computes capillary pressure as a function of liquid saturation. This report gives a detailed description of the model along with information on how it can be interfaced with a transport code. Although the model was developed specifically for calculations related to nuclear waste disposal, it should be applicable to any capillary hysteresis problem for which the secondary and higher order scanning curves can be approximated from the first order scanning curves. HYSTR is a set of subroutines to calculate capillary pressure for a given liquid saturation under hysteretic conditions.
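
    The interface a flow simulator needs from such a module is essentially a function returning capillary pressure for a given liquid saturation and wetting/drying history. The sketch below illustrates that interface using van Genuchten-type main branches and a simple history-weighted blend; this is our illustrative stand-in, not HYSTR's actual scanning-curve algorithm or parameterization.

        import numpy as np

        def van_genuchten_pc(s_eff, alpha, n):
            """Capillary pressure [Pa] from effective saturation via the van Genuchten
            form  Pc = (1/alpha) * (s_eff**(-1/m) - 1)**(1/n),  m = 1 - 1/n."""
            m = 1.0 - 1.0 / n
            s_eff = np.clip(s_eff, 1e-6, 1.0 - 1e-12)
            return (s_eff ** (-1.0 / m) - 1.0) ** (1.0 / n) / alpha

        def pc_hysteretic(s_eff, w, alpha_drain=1e-4, alpha_wet=2e-4, n=2.0):
            """Blend the main drainage and main wetting branches with a history weight
            w in [0, 1] (0 = on the wetting branch, 1 = on the drainage branch)."""
            pc_d = van_genuchten_pc(s_eff, alpha_drain, n)
            pc_w = van_genuchten_pc(s_eff, alpha_wet, n)
            return w * pc_d + (1.0 - w) * pc_w

        print(pc_hysteretic(0.6, w=0.5))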

  18. Designing with users to meet people needs: a teaching model.

    PubMed

    Anselmi, Laura; Canina, Marita; Coccioni, Elisabetta

    2012-01-01

    In a context of great transformation of the whole company-product-market system, design becomes an interpreter of society and a strategic key point for production realities. Design must assume an ergonomic approach and a methodology oriented to product innovation in which people are the main focus of the system. Today there is a visible need for a methodological approach able to include the context of use by employing users' "creative skills". In this scenario, a design educational model based only on knowledge does not seem sufficient; the traditional "deductive" method does not meet the needs of new productive assets, hence the urgency to experiment with the "inductive" method and to develop an approach in which knowing and knowing how, theory and practice, act synergistically. The aim is to teach a method able to help a young designer understand people's needs and desires, considering both the concrete/cognitive level and the emotional level. The paper presents, through some case studies, an educational model developed by combining theoretical/conceptual and practical/applicatory aspects with user experiential aspects. The proposed approach to design enables students to investigate users' needs and desires and helps them propose innovative ideas and projects that better fit today's market realities. PMID:22316848

  19. Multiple Concentric Cylinder Model (MCCM) user's guide

    NASA Technical Reports Server (NTRS)

    Williams, Todd O.; Pindera, Marek-Jerzy

    1994-01-01

    A user's guide for the computer program mccm.f is presented. The program is based on a recently developed solution methodology for the inelastic response of an arbitrarily layered, concentric cylinder assemblage under thermomechanical loading which is used to model the axisymmetric behavior of unidirectional metal matrix composites in the presence of various microstructural details. These details include the layered morphology of certain types of ceramic fibers, as well as multiple fiber/matrix interfacial layers recently proposed as a means of reducing fabrication-induced, and in-service, residual stress. The computer code allows efficient characterization and evaluation of new fibers and/or new coating systems on existing fibers with a minimum of effort, taking into account inelastic and temperature-dependent properties and different morphologies of the fiber and the interfacial region. It also facilitates efficient design of engineered interfaces for unidirectional metal matrix composites.

  20. Videogrammetric Model Deformation Measurement System User's Manual

    NASA Technical Reports Server (NTRS)

    Dismond, Harriett R.

    2002-01-01

    The purpose of this manual is to provide the user of the NASA VMD system, running the MDef software, Version 1.10, with all information required to operate the system. The NASA Videogrammetric Model Deformation system consists of an automated videogrammetric technique used to measure the change in wing twist and bending under aerodynamic load in a wind tunnel. The basic instrumentation consists of a single CCD video camera and a frame grabber interfaced to a computer. The technique is based upon single-view photogrammetric determination of the two-dimensional coordinates of wing targets whose third coordinate, the span-wise location, is fixed and known. The major consideration in the development of the measurement system was that productivity must not be appreciably reduced.

  1. E_I model user's guide

    SciTech Connect

    Engelmeyer, D.

    1994-02-14

    The E_I model and this program were developed to assist the Office of Munitions (OM) in planning and coordination of conventional munitions programs at the macro level. OM's primary responsibilities include (1) development of an overall munitions acquisition strategy and (2) oversight of all DoD programs for conventional munitions Research and Development (R&D) and Procurement, as well as existing munitions inventories. In this role, OM faces the challenge of integrating Service budgets and priorities within the framework of overall DoD policy and objectives. OM must have a firm, quantitative basis for making acquisition decisions and stockpile disposition recommendations. To do this, OM needs a rigorous but simple means for conducting top-level analyses of the overall conventional munitions program. This analysis must be founded on a consistent, quantitative process that allows for an assessment of the existing program, as well as the capability to compare and contrast alternative resource allocation recommendations. The E_I model provides a means for quickly conducting a multitude of assessments across target classes, contingency areas, and different planning scenarios. It is neither data intensive nor a complex combat simulation. Its goal is to allow for rapid tradeoff analyses of competing munitions alternatives, relative to acquisition and stockpile mix.

  2. SubDyn User's Guide and Theory Manual

    SciTech Connect

    Damiani, Rick; Jonkman, Jason; Hayman, Greg

    2015-09-01

    SubDyn is a time-domain structural-dynamics module for multimember fixed-bottom substructures created by the National Renewable Energy Laboratory (NREL) through U.S. Department of Energy Wind and Water Power Program support. The module has been coupled into the FAST aero-hydro-servo-elastic computer-aided engineering (CAE) tool. Substructure types supported by SubDyn include monopiles, tripods, jackets, and other lattice-type substructures common for offshore wind installations in shallow and transitional water depths. SubDyn can also be used to model lattice support structures for land-based wind turbines. This document is organized as follows. Section 1 details how to obtain the SubDyn and FAST software archives and run either the stand-alone SubDyn or SubDyn coupled to FAST. Section 2 describes the SubDyn input files. Section 3 discusses the output files generated by SubDyn; these include echo files, a summary file, and the results file. Section 4 provides modeling guidance when using SubDyn. The SubDyn theory is covered in Section 5. Section 6 outlines future work, and Section 7 contains a list of references. Example input files are shown in Appendixes A and B. A summary of available output channels is found in Appendix C. Instructions for compiling the stand-alone SubDyn program are detailed in Appendix D. Appendix E tracks the major changes we have made to SubDyn for each public release.

  3. Theory, Modeling and Simulation Annual Report 2000

    SciTech Connect

    Dixon, David A; Garrett, Bruce C; Straatsma, TP; Jones, Donald R; Studham, Scott; Harrison, Robert J; Nichols, Jeffrey A

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM and S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  4. Theory, Modeling and Simulation Annual Report 2000

    SciTech Connect

    Dixon, David A.; Garrett, Bruce C.; Straatsma, Tp; Jones, Donald R.; Studham, Ronald S.; Harrison, Robert J.; Nichols, Jeffrey A.

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  5. The Personalized Information Retrieval Model Based on User Interest

    NASA Astrophysics Data System (ADS)

    Gong, Songjie

    Personalized information retrieval systems can help users cope with information overload by determining which items are relevant to their interests. One type of information retrieval is content-based filtering. In content-based filtering, items contain words in natural language. Meanings of words in natural language are often ambiguous. The problem of word meaning disambiguation is often reduced to determining the semantic similarity of words. In this paper, the architecture of personalized information retrieval based on user interest is presented. The architecture includes a user interface model, a user interest model, an interest detection model, and an update model. It establishes a user model for personalized information retrieval based on a user-interest keyword list maintained on the client, and it supplies a personalized retrieval service to the user through the communication and collaboration of all modules of the architecture.
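
    A minimal version of such content-based filtering keeps a keyword-weight profile for the user and ranks documents by their similarity to that profile. The sketch below uses simple term counts and cosine similarity; the profile contents and scoring choices are illustrative assumptions, not the paper's implementation.

        from collections import Counter
        import math

        def cosine(a, b):
            dot = sum(a[t] * b.get(t, 0) for t in a)
            na = math.sqrt(sum(v * v for v in a.values()))
            nb = math.sqrt(sum(v * v for v in b.values()))
            return dot / (na * nb) if na and nb else 0.0

        # Hypothetical interest profile kept on the client: keyword -> weight
        interest = Counter({"recommender": 3, "filtering": 2, "semantics": 1})

        documents = {
            "doc1": "content based filtering for recommender systems",
            "doc2": "weather report for the coming week",
        }

        def rank(documents, interest):
            scored = []
            for name, text in documents.items():
                terms = Counter(text.lower().split())
                scored.append((cosine(interest, terms), name))
            return sorted(scored, reverse=True)

        print(rank(documents, interest))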

  6. USER'S GUIDE FOR THE PHOTOCHEMICAL BOX MODEL (PBM)

    EPA Science Inventory

    The User's Guide for the Photochemical Box Model (PBM) attempts to describe the structure and operation of the model and its preprocessors as well as provide the potential user with guidance in setting up input data. The PBM is a simple stationary single-cell model with a variabl...

  7. Evaluation Theory, Models, and Applications

    ERIC Educational Resources Information Center

    Stufflebeam, Daniel L.; Shinkfield, Anthony J.

    2007-01-01

    "Evaluation Theory, Models, and Applications" is designed for evaluators and students who need to develop a commanding knowledge of the evaluation field: its history, theory and standards, models and approaches, procedures, and inclusion of personnel as well as program evaluation. This important book shows how to choose from a growing array of…

  8. Towards a Ubiquitous User Model for Profile Sharing and Reuse

    PubMed Central

    de Lourdes Martinez-Villaseñor, Maria; Gonzalez-Mendoza, Miguel; Hernandez-Gress, Neil

    2012-01-01

    People interact with systems and applications through several devices and are willing to share information about preferences, interests and characteristics. Social networking profiles, data from advanced sensors attached to personal gadgets, and semantic web technologies such as FOAF and microformats are valuable sources of personal information that could provide a fair understanding of the user, but profile information is scattered over different user models. Some researchers in the ubiquitous user modeling community envision the need to share user model information from heterogeneous sources. In this paper, we address the syntactic and semantic heterogeneity of user models in order to enable user modeling interoperability. We present a dynamic user profile structure based on the Simple Knowledge Organization System (SKOS) to provide knowledge representation for a ubiquitous user model. We propose a two-tier matching strategy for concept schema alignment to enable user modeling interoperability. Our proposal is demonstrated in the application scenario of sharing and reusing data in order to deal with overweight and obesity. PMID:23201995

  9. Macro System Model (MSM) User Guide, Version 1.3

    SciTech Connect

    Ruth, M.; Diakov, V.; Sa, T.; Goldsby, M.

    2011-09-01

    This user guide describes the macro system model (MSM). The MSM has been designed to allow users to analyze the financial, environmental, transitional, geographical, and R&D issues associated with the transition to a hydrogen economy. Basic end users can use the MSM to answer cross-cutting questions that were previously difficult to answer in a consistent and timely manner due to various assumptions and methodologies among different models.

  10. JEDI Marine and Hydrokinetic Model: User Reference Guide

    SciTech Connect

    Goldberg, M.; Previsic, M.

    2011-04-01

    The Jobs and Economic Development Impact Model (JEDI) for Marine and Hydrokinetics (MHK) is a user-friendly spreadsheet-based tool designed to demonstrate the economic impacts associated with developing and operating MHK power systems in the United States. The JEDI MHK User Reference Guide was developed to assist users in using and understanding the model. This guide provides information on the model's underlying methodology, as well as the sources and parameters used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted.

  11. Flexible dynamic models for user interfaces

    NASA Astrophysics Data System (ADS)

    Vogelsang, Holger; Brinkschulte, Uwe; Siormanolakis, Marios

    1997-04-01

    This paper describes an approach to a platform- and implementation-independent design of user interfaces using the UIMS idea. It is the result of a detailed examination of object-oriented techniques for program specification and implementation. This analysis leads to a description of the requirements for man-machine interaction from the software developer's point of view. The final user of the whole system, on the other hand, has a different view of this system: he needs metaphors of his own world to fulfill his tasks. It is the job of the user interface designer to bring these views together. The approach described in this paper helps bring both kinds of developers together, using a well-defined interface with minimal communication overhead.

  12. The 3DGRAPE book: Theory, users' manual, examples

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.

    1989-01-01

    A users' manual for a new three-dimensional grid generator called 3DGRAPE is presented. The program, written in FORTRAN, is capable of making zonal (blocked) computational grids in or about almost any shape. Grids are generated by the solution of Poisson's differential equations in three dimensions. The program automatically finds its own values for inhomogeneous terms which give near-orthogonality and controlled grid cell height at boundaries. Grids generated by 3DGRAPE have been applied to both viscous and inviscid aerodynamic problems, and to problems in other fluid-dynamic areas. The smoothness for which elliptic methods are known is seen here, including smoothness across zonal boundaries. An introduction giving the history, motivation, capabilities, and philosophy of 3DGRAPE is presented first. Then follows a chapter on the program itself. The input is then described in detail. A chapter on reading the output and debugging follows. Three examples are then described, including sample input data and plots of output. Last is a chapter on the theoretical development of the method.

  13. Modeling User Behavior and Attention in Search

    ERIC Educational Resources Information Center

    Huang, Jeff

    2013-01-01

    In Web search, query and click log data are easy to collect but they fail to capture user behaviors that do not lead to clicks. As search engines reach the limits inherent in click data and are hungry for more data in a competitive environment, mining cursor movements, hovering, and scrolling becomes important. This dissertation investigates how…

  14. Users matter: multi-agent systems model of high performance computing cluster users.

    SciTech Connect

    North, M. J.; Hood, C. S.; Decision and Information Sciences; IIT

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.

  15. Users/consumers differences regarding ergonomics and design theory and practice.

    PubMed

    Dejean, Pierre-Henri; Wagstaff, Peter

    2012-01-01

    This paper presents the concept of direct and indirect users, a key issue for cooperation between ergonomists, designers and managers involved in a sustainable approach to design. What issues for Ergonomics and Design are raised by this concept? User/consumer differences should be approached taking into account Ergonomics and Design theory and practice. What dialogue and tools could help the ergonomist, designer and manager respond to all the requirements of the future clients of the product? PMID:22317276

  16. A Computational Theory of Modelling

    NASA Astrophysics Data System (ADS)

    Rossberg, Axel G.

    2003-04-01

    A metatheory is developed which characterizes the relationship between a modelled system, which complies with some "basic theory", and a model, which does not, and yet reproduces important aspects of the modelled system. A model is represented by an (in a certain sense, see below) optimal algorithm which generates data that describe the model's state or evolution complying with a "reduced theory". Theories are represented by classes of (in a similar sense) optimal algorithms that test if their input data comply with the theory. The metatheory does not prescribe the formalisms (data structure, language) to be used for the description of states or evolutions. Transitions to other formalisms and loss of accuracy, common to theory reduction, are explicitly accounted for. The basic assumption of the theory is that resources such as the code length (~ programming time) and the computation time for modelling and testing are costly, but the relative cost of each resource is unknown. Thus, if there is an algorithm a for which there is no other algorithm b solving the same problem but using less of each resource, then a is considered optimal. For tests (theories), the set X of wrongly admitted inputs is treated as another resource. It is assumed that X1 is cheaper than X2 when X1 ⊂ X2 (X1 ≠ X2). Depending on the problem, the algorithmic complexity of a reduced theory can be smaller or larger than that of the basic theory. The theory might help to distinguish actual properties of complex systems from mere mental constructs. An application to complex spatio-temporal patterns is discussed.

  17. Treatment motivation in drug users: a theory-based analysis.

    PubMed

    Longshore, Douglas; Teruya, Cheryl

    2006-02-01

    Motivation for drug use treatment is widely regarded as crucial to a client's engagement in treatment and success in quitting drug use. Motivation is typically measured with items reflecting high treatment readiness (e.g., perceived need for treatment and commitment to participate) and low treatment resistance (e.g., skepticism regarding benefits of treatment). Building upon reactance theory and the psychotherapeutic construct of resistance, we conceptualized these two aspects of treatment motivation - readiness and resistance - as distinct constructs and examined their predictive power in a sample of 1295 drug-using offenders referred to treatment while on probation. The sample was 60.7% African Americans, 33.5% non-Hispanic Whites, and 21.2% women; their ages ranged from 16 to 63 years old. Interviews occurred at treatment entry and 6 months later. Readiness (but not resistance) predicted treatment retention during the 6-month period. Resistance (but not readiness) predicted drug use, especially among offenders for whom the treatment referral was coercive. These findings suggest that readiness and resistance should both be assessed among clients entering treatment, especially when the referral is coercive. Intake and counseling protocols should address readiness and resistance separately. PMID:16051447

  18. Characterizing Drug Non-Users as Distinctive in Prevention Messages: Implications of Optimal Distinctiveness Theory

    PubMed Central

    Comello, Maria Leonora G.

    2011-01-01

    Optimal Distinctiveness Theory posits that highly valued groups are those that can simultaneously satisfy needs to belong and to be different. The success of drug-prevention messages with a social-identity theme should therefore depend on the extent to which the group is portrayed as capable of meeting these needs. Specifically, messages that portray non-users as a large and undifferentiated majority may not be as successful as messages that emphasize uniqueness of non-users. This prediction was examined using marijuana prevention messages that depicted non-users as a distinctive or a majority group. Distinctiveness characterization lowered behavioral willingness to use marijuana among non-users (Experiment 1) and served as a source of identity threat (contingent on gender) among users (Experiment 2). PMID:21409672

  19. Artificial intelligence techniques for modeling database user behavior

    NASA Technical Reports Server (NTRS)

    Tanner, Steve; Graves, Sara J.

    1990-01-01

    The design and development of the adaptive modeling system are described. This system models how a user accesses a relational database management system in order to improve its performance by discovering user access patterns. In the current system, these patterns are used to improve the user interface and may be used to speed data retrieval, support query optimization and support a more flexible data representation. The system models both syntactic and semantic information about the user's access and employs both procedural and rule-based logic to manipulate the model.

  20. A Driving Behaviour Model of Electrical Wheelchair Users

    PubMed Central

    Hamam, Y.; Djouani, K.; Daachi, B.; Steyn, N.

    2016-01-01

    In spite of the availability of powered wheelchairs, some users still experience steering challenges and manoeuvring difficulties that limit their capacity to navigate effectively. For such users, steering support and assistive systems may be very necessary. For the assistance to be appreciated, the assistive control must be adaptable to the user's steering behaviour. This paper contributes to wheelchair steering improvement by modelling the steering behaviour of powered wheelchair users for integration into the control system. More precisely, the modelling is based on the improved Directed Potential Field (DPF) method for trajectory planning. The method has facilitated the formulation of a simple behaviour model that is also linear in parameters. To obtain steering data for parameter identification, seven individuals drove the wheelchair in different virtual worlds on the augmented platform. The data obtained facilitated the estimation of user parameters, using the ordinary least squares method, with satisfactory regression-analysis results. PMID:27148362
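
    Because the behaviour model is linear in its parameters, identification from logged driving data reduces to ordinary least squares. The sketch below fits a hypothetical steering law from synthetic data; the regressors, gains, and noise level are our assumptions, not the paper's DPF formulation.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500

        # Hypothetical regressors logged while driving: heading error, lateral offset,
        # and a repulsive term from the nearest obstacle (all synthetic here).
        heading_err = rng.normal(size=n)
        lateral_off = rng.normal(size=n)
        repulsion = rng.exponential(size=n)

        true_theta = np.array([1.2, 0.5, -0.8])          # "user" gains to recover
        X = np.column_stack([heading_err, lateral_off, repulsion])
        omega = X @ true_theta + rng.normal(scale=0.1, size=n)   # measured steering command

        theta_hat, *_ = np.linalg.lstsq(X, omega, rcond=None)    # ordinary least squares
        print("estimated gains:", theta_hat.round(3))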

  1. Jieke theory and logistic model

    SciTech Connect

    Cao, H.; Feng, G.

    1996-06-01

    The concept of a shell, or jieke (in Chinese), is introduced first; a jieke is a sort of system boundary. Starting from jieke theory, a new logistic model that takes account of the switch effect of the jieke is suggested. The model is analyzed and a nonlinear mapping of the model is made. The results show that the features of the switch logistic model differ greatly from those of the original logistic model. © 1996 American Institute of Physics.

  2. Quantify uncertain emergency search techniques (QUEST) -- Theory and user's guide

    SciTech Connect

    Johnson, M.M.; Goldsby, M.E.; Plantenga, T.D.; Porter, T.L.; West, T.H.; Wilcox, W.B.; Hensley, W.K.

    1998-01-01

    As recent world events show, criminal and terrorist access to nuclear materials is a growing national concern. The national laboratories are taking the lead in developing technologies to counter these potential threats to national security. Sandia National Laboratories, with support from Pacific Northwest National Laboratory and the Bechtel Nevada Remote Sensing Laboratory, has developed QUEST (a model to Quantify Uncertain Emergency Search Techniques) to enhance the performance of organizations in the search for lost or stolen nuclear material. In addition, QUEST supports a wide range of other applications, such as environmental monitoring, nuclear facilities inspections, and searcher training. QUEST simulates the search for nuclear materials and calculates detector response for various source types and locations. The probability of detecting a radioactive source during a search is a function of many different variables, including source type, search location and structure geometry (including shielding), search dynamics (path and speed), and detector type and size. Through calculation of dynamic detector response, QUEST makes possible quantitative comparisons of various sensor technologies and search patterns. The QUEST model can be used as a tool to examine the impact of new detector technologies, explore alternative search concepts, and provide interactive search/inspector training.

  3. How Homeless Sector Workers Deal with the Death of Service Users: A Grounded Theory Study

    ERIC Educational Resources Information Center

    Lakeman, Richard

    2011-01-01

    Homeless sector workers often encounter the deaths of service users. A modified grounded theory methodology project was used to explore how workers make sense of, respond to, and cope with sudden death. In-depth interviews were undertaken with 16 paid homeless sector workers who had experienced the death of someone with whom they worked.…

  4. Involving service users in interprofessional education narrowing the gap between theory and practice.

    PubMed

    Cooper, Helen; Spencer-Dawe, Eileen

    2006-12-01

    Calls for greater collaboration between professionals in health and social care have led to pressures to move toward interprofessional education (IPE) at both pre- and post-registration levels. Whilst this move has evolved out of "common sense" demands, such a multiple systems approach to education does not fit easily into existing traditional educational frameworks and there is, as yet, no proven theoretical framework to guide its development. A research study of an IPE intervention at the University of Liverpool in the UK drew on complexity theory to conceptualize the intervention and to evaluate its impact on a group of approximately 500 students studying physiotherapy, medicine, occupational therapy, nursing and social work. The intervention blended a multidisciplinary (non-interactive) plenary with self-directed e-learning and a series of interdisciplinary (interactive) workshops. Two evaluations took place: the first when the workshops were facilitated by trained practitioners; the second when the practitioners co-facilitated with trained service users. This paper reports findings from the second evaluation which focused on narrowing the gap between theory and practice. A multi-stakeholder evaluation was used including: students' reflective narratives, a focus group with practitioners and individual semi-structured interviews with service users. Findings showed that service users can make an important contribution to IPE for health and social care students in the early stages of their training. By exposure to a service user perspective, first year students can begin to learn and apply the principles of team work, to place the service user at the centre of the care process, to make connections between theory and "real life" experiences, and to narrow the gap between theory and practice. Findings also revealed benefits for facilitators and service users. PMID:17095439

  5. FEM3C, An improved three-dimensional heavy-gas dispersion model: User's manual

    SciTech Connect

    Chan, S.T.

    1994-03-01

    FEM3C is another upgraded version of FEM3 (a three-dimensional Finite Element Model), which was developed primarily for simulating the atmospheric dispersion of heavier-than-air gas (or heavy gas) releases. It is based on solving the fully three-dimensional, time-dependent conservation equations of mass, momentum, energy, and species of an inert gas or a pollutant in the form of vapor/droplets. A generalized anelastic approximation, together with the ideal gas law for the density of the gas/air mixture, is invoked to preclude sound waves and allow large density variations in both space and time. The numerical algorithm utilizes a modified Galerkin finite element method to discretize spatially the time-dependent conservation equations of mass, momentum, energy, and species. A consistent pressure Poisson equation is formed and solved separately from the time-dependent equations, which are sequentially solved and integrated in time via a modified forward Euler method. The model can handle instantaneous, finite-duration, and continuous releases. It is also capable of treating terrain and obstructions. Besides a K-theory model using similarity functions, an advanced turbulence model based on solving the k-ε transport equations is available as well. Also embedded in the code are options for solving the Boussinesq equations. In this report, an overview of the model is given, user's guides for using the model are provided, and example problems are presented to illustrate the usage of the model.

  6. Modeling the behavior of the computer-assisted instruction user

    SciTech Connect

    Stoddard, M.L.

    1983-01-01

    The field of computer-assisted instruction (CAI) contains abundant studies on the effectiveness of particular programs or systems. However, the nature of the field is such that the computer is the focus of research, not the users. Few research studies have focused on the behavior of the individual CAI user. Morgan (1981) stated that descriptive studies are needed to clarify what the important phenomena of user behavior are. The need for such studies is particularly acute in computer-assisted instruction. Building a behavioral model would enable us to understand the problem-solving strategies and rules applied by the user during a CAI experience. Also, courseware developers could use this information to design tutoring systems that are more responsive to individual differences than present CAI is. This paper proposes a naturalistic model for evaluating both affective and cognitive characteristics of the CAI user. It begins with a discussion of features of user behavior, followed by a description of an evaluation methodology that can lead to modeling user behavior. The paper concludes with a discussion of how implementation of this model can contribute to the fields of CAI and cognitive psychology.

  7. Randomized Item Response Theory Models

    ERIC Educational Resources Information Center

    Fox, Jean-Paul

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by an item response theory (IRT) model. The RR…
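
    In a generic forced-response RR design, the observed probability of a "yes" combines the IRT-modeled probability of a true positive with the randomization probabilities. The sketch below shows that composition with a Rasch model; the design probabilities and parameter values are assumed for illustration and are not taken from the article.

        import math

        def rasch_prob(theta, b):
            """Rasch/IRT probability of a true positive response for ability theta
            and item difficulty b."""
            return 1.0 / (1.0 + math.exp(-(theta - b)))

        def rr_prob_yes(theta, b, p_truth=0.8, p_forced_yes=0.1):
            """Probability of observing 'yes' under a forced-response randomized
            response design: with p_truth the respondent answers honestly (per the
            IRT model), with p_forced_yes answers 'yes' regardless, and otherwise
            answers 'no'."""
            return p_truth * rasch_prob(theta, b) + p_forced_yes

        print(rr_prob_yes(theta=0.5, b=0.0))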

  8. An Investigation of the Integrated Model of User Technology Acceptance: Internet User Samples in Four Countries

    ERIC Educational Resources Information Center

    Fusilier, Marcelline; Durlabhji, Subhash; Cucchi, Alain

    2008-01-01

    National background of users may influence the process of technology acceptance. The present study explored this issue with the new, integrated technology use model proposed by Sun and Zhang (2006). Data were collected from samples of college students in India, Mauritius, Reunion Island, and United States. Questionnaire methodology and…

  9. REGIONAL OXIDANT MODEL (ROM) USER'S GUIDE - PART 4: THE ROM SYSTEM USER TUTORIAL

    EPA Science Inventory

    This volume of the Regional Oxidant Model (ROM) User's Guide is intended to be a "cookbook" for unloading the ROM system code and benchmark (test case) data from the 19 distribution tapes. he ROM runs on the following computer systems: 1) VAX hardware for the preprocessors and th...

  10. User's instructions for the cardiovascular Walters model

    NASA Technical Reports Server (NTRS)

    Croston, R. C.

    1973-01-01

    The model is a combined steady-state cardiovascular and thermal model. It was originally developed for interactive use but was converted to batch-mode simulation for the Sigma 3 computer. The purpose of the model is to compute steady-state circulatory and thermal variables in response to exercise work loads and environmental factors. During a computer simulation run, several selected variables are printed at each time step. End conditions are also printed at the completion of the run.

  11. HYDROCARBON SPILL SCREENING MODEL (HSSM) VOLUME 1: USER'S GUIDE

    EPA Science Inventory

    This users guide describes the Hydrocarbon Spill Screening Model (HSSM). The model is intended for simulation of subsurface releases of light nonaqueous phase liquids (LNAPLs). The model consists of separate modules for LNAPL flow through the vadose zone, spreading in the capil...

  12. USERS MANUAL: LANDFILL GAS EMISSIONS MODEL - VERSION 2.0

    EPA Science Inventory

    The document is a user's guide for a computer model, Version 2.0 of the Landfill Gas Emissions Model (LandGEM), for estimating air pollution emissions from municipal solid waste (MSW) landfills. The model can be used to estimate emission rates for methane, carbon dioxide, nonmet...
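
    LandGEM-style models estimate gas generation with a first-order decay law applied to each year's waste acceptance. The sketch below implements that general form; the parameter values and the omission of LandGEM's sub-annual waste subdivision are simplifying assumptions on our part, so consult the manual for the exact equation and defaults.

        import math

        def methane_generation(acceptance, k=0.05, L0=170.0, year=2025):
            """First-order decay estimate of methane generation (m^3/yr) in `year`
            from a dict {acceptance_year: waste_mass_Mg}.  k is the decay rate (1/yr)
            and L0 the methane generation potential (m^3/Mg); values are illustrative."""
            total = 0.0
            for yr, mass in acceptance.items():
                age = year - yr
                if age > 0:
                    total += k * L0 * mass * math.exp(-k * age)
            return total

        acceptance = {2015: 50_000, 2016: 55_000, 2017: 60_000}   # Mg of waste per year
        print(f"{methane_generation(acceptance):,.0f} m^3 CH4 per year")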

  13. Do recommender systems benefit users? a modeling approach

    NASA Astrophysics Data System (ADS)

    Yeung, Chi Ho

    2016-04-01

    Recommender systems are present in many web applications to guide purchase choices. They increase sales and benefit sellers, but whether they benefit customers by providing relevant products remains less explored. While in many cases the recommended products are relevant to users, in other cases customers may be tempted to purchase the products only because they are recommended. Here we introduce a model to examine the benefit of recommender systems for users, and find that recommendations from the system can be equivalent to random draws if one always follows the recommendations and seldom purchases according to his or her own preference. Nevertheless, with sufficient information about user preferences, recommendations become accurate and an abrupt transition to this accurate regime is observed for some of the studied algorithms. On the other hand, we find that high estimated accuracy indicated by common accuracy metrics is not necessarily equivalent to high real accuracy in matching users with products. This disagreement between estimated and real accuracy serves as an alarm for operators and researchers who evaluate recommender systems merely with accuracy metrics. We tested our model with a real dataset and observed similar behaviors. Finally, a recommendation approach with improved accuracy is suggested. These results imply that recommender systems can benefit users, but the more frequently a user purchases the recommended products, the less relevant the recommended products are in matching user taste.
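
    The trade-off described above can be illustrated with a toy simulation in which a user either follows a popularity-based recommendation or buys the item best matching his or her own taste. The sketch below is our simplification, not the paper's model; the relevance measure, browsing set size, and parameters are assumptions.

        import random

        def simulate(follow_prob, n_items=200, n_rounds=500, seed=0):
            """Average relevance of purchased items when the user follows a
            popularity-based recommendation with probability `follow_prob`."""
            rng = random.Random(seed)
            taste = [rng.random() for _ in range(n_items)]       # user's true preferences
            popularity = [1] * n_items                           # global purchase counts
            relevance = []
            for _ in range(n_rounds):
                if rng.random() < follow_prob:
                    item = max(range(n_items), key=popularity.__getitem__)
                else:
                    candidates = rng.sample(range(n_items), 10)  # items the user browses
                    item = max(candidates, key=taste.__getitem__)
                popularity[item] += 1
                relevance.append(taste[item])
            return sum(relevance) / len(relevance)

        for p in (0.0, 0.5, 1.0):
            print(f"follow_prob={p:.1f}  mean relevance={simulate(p):.2f}")

    With follow_prob near 1 the purchases track whatever happens to be popular, which is the "equivalent to random draws" regime described in the abstract.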

  14. Theory of hadronic nonperturbative models

    SciTech Connect

    Coester, F.; Polyzou, W.N.

    1995-08-01

    As more data probing hadron structure become available, hadron models based on nonperturbative relativistic dynamics will be increasingly important for their interpretation. Relativistic Hamiltonian dynamics of few-body systems (constituent-quark models) and many-body systems (parton models) provides a precisely defined approach and a useful phenomenology. However, such models lack a quantitative foundation in quantum field theory. The specification of a quantum field theory by a Euclidean action provides a basis for the construction of nonperturbative models designed to maintain essential features of the field theory. For finite systems it is possible to satisfy axioms which guarantee the existence of a Hilbert space with a unitary representation of the Poincare group and the spectral condition which ensures that the spectrum of the four-momentum operator is in the forward light cone. The separate axiom which guarantees locality of the field operators can be weakened for the construction of few-body models. In this context we are investigating algebraic and analytic properties of model Schwinger functions. This approach promises insight into the relations between hadronic models based on relativistic Hamiltonian dynamics on one hand and Bethe-Salpeter Green's-function equations on the other.

  15. METAPHOR (version 1): Users guide. [performability modeling

    NASA Technical Reports Server (NTRS)

    Furchtgott, D. G.

    1979-01-01

    General information concerning METAPHOR, an interactive software package to facilitate performability modeling and evaluation, is presented. Example systems are studied and their performabilities are calculated. Each available METAPHOR command and array generator is described. Complete METAPHOR sessions are included.

  16. Users of middle atmosphere models remarks

    NASA Technical Reports Server (NTRS)

    Gamble, Joe

    1987-01-01

    The procedure followed for shuttle operations is to calculate descent trajectories for each potential shuttle landing site, using the Global Reference Atmosphere Model (GRAM) to interactively compute density along the flight path 100 times to bound the statistics. The purpose is to analyze the flight dynamics, along with calculations of heat loads during reentry. The analysis program makes use of the modified version of the Jacchia-70 atmosphere, which includes helium bulges over the poles and seasonal latitude variations at lower altitudes. For the troposphere, the 4-D Model is used up to 20 km and Groves from 30 km up to 90 km; this is extrapolated over the globe and faired into the Jacchia atmosphere between 90 and 115 km. Since data on the Southern Hemisphere were lacking, the Northern Hemisphere data were flipped over and lagged by 6 months. Sometimes, when winds are calculated from pressure data in the model, there appear to be discontinuities. Modelers indicated that the GRAM was not designed to produce winds, but good wind data are needed for the landing phase of shuttle operations. Use of atmospheric models during reentry is one application where it is obvious that a single integrated atmosphere model is required.

  17. Utilizing Vector Space Models for User Modeling within e-Learning Environments

    ERIC Educational Resources Information Center

    Mangina, E.; Kilbride, J.

    2008-01-01

    User modeling has been found to enhance the effectiveness and/or usability of software systems through the representation of certain properties of a particular user. This paper presents the research and the results of the development of a user modeling system for the implementation of student models within e-learning environments, utilizing vector…

  18. Building integral projection models: a user's guide

    PubMed Central

    Rees, Mark; Childs, Dylan Z; Ellner, Stephen P; Coulson, Tim

    2014-01-01

    In order to understand how changes in individual performance (growth, survival or reproduction) influence population dynamics and evolution, ecologists are increasingly using parameterized mathematical models. For continuously structured populations, where some continuous measure of individual state influences growth, survival or reproduction, integral projection models (IPMs) are commonly used. We provide a detailed description of the steps involved in constructing an IPM, explaining how to: (i) translate your study system into an IPM; (ii) implement your IPM; and (iii) diagnose potential problems with your IPM. We emphasize how the study organism's life cycle, and the timing of censuses, together determine the structure of the IPM kernel and important aspects of the statistical analysis used to parameterize an IPM using data on marked individuals. An IPM based on population studies of Soay sheep is used to illustrate the complete process of constructing, implementing and evaluating an IPM fitted to sample data. We then look at very general approaches to parameterizing an IPM, using a wide range of statistical techniques (e.g. maximum likelihood methods, generalized additive models, nonparametric kernel density estimators). Methods for selecting models for parameterizing IPMs are briefly discussed. We conclude with key recommendations and a brief overview of applications that extend the basic model. The online Supporting Information provides commented R code for all our analyses. PMID:24219157
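
    Numerically, an IPM kernel is usually discretized with the midpoint rule into a large matrix whose dominant eigenvalue approximates the asymptotic population growth rate. The sketch below builds such a matrix for a hypothetical size-structured population; the survival, growth, and fecundity functions and their parameters are invented for illustration and are not taken from the Soay sheep example.

        import numpy as np

        # Hypothetical vital-rate functions of size z (e.g. log body mass).
        def survival(z):
            return 1.0 / (1.0 + np.exp(-(z - 1.0)))

        def growth(z1, z):       # probability density of growing from size z to z1
            return np.exp(-0.5 * ((z1 - (0.3 + 0.9 * z)) / 0.2) ** 2) / (0.2 * np.sqrt(2 * np.pi))

        def fecundity(z1, z):    # offspring of size z1 produced by a parent of size z
            recruits = 0.5 * np.exp(0.5 * z)
            offspring_size = np.exp(-0.5 * ((z1 - 0.8) / 0.25) ** 2) / (0.25 * np.sqrt(2 * np.pi))
            return recruits * offspring_size

        # Midpoint-rule discretization of the kernel K(z1, z) = s(z) g(z1, z) + f(z1, z).
        n, lo, hi = 100, 0.0, 4.0
        h = (hi - lo) / n
        z = lo + (np.arange(n) + 0.5) * h                 # mesh midpoints
        Z1, Z = np.meshgrid(z, z, indexing="ij")
        K = h * (survival(Z) * growth(Z1, Z) + fecundity(Z1, Z))

        lam = np.max(np.abs(np.linalg.eigvals(K)))        # dominant eigenvalue
        print(f"asymptotic growth rate lambda ~ {lam:.3f}")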

  19. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  20. User's instructions for the erythropoiesis regulatory model

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The model provides a method to analyze some of the events that could account for the decrease in red cell mass observed in crewmen returning from space missions. The model is based on the premise that erythrocyte production is governed by the balance between oxygen supply and demand at a renal sensing site. Oxygen supply is taken to be a function of arterial oxygen tension, mean corpuscular hemoglobin concentration, oxy-hemoglobin carrying capacity, hematocrit, and blood flow. Erythrocyte destruction is based on the law of mass action. The instantaneous hematocrit value is derived by integrating changes in production and destruction rates and accounting for the degree of plasma dilution.
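
    The production/destruction balance described above can be written as a simple feedback ODE: production responds to the gap between oxygen demand and supply, while destruction is first-order in red cell mass (law of mass action). The sketch below integrates such a toy balance with forward Euler; the functional form and constants are our assumptions, not the NASA model's equations.

        def simulate_rbc(days=120, dt=0.1, rbc0=1.0, o2_supply_scale=1.0):
            """Toy erythropoiesis regulator: production rises when oxygen supply
            (taken proportional to red cell mass here) falls short of a fixed demand;
            destruction is first-order in red cell mass."""
            demand = 1.0
            k_prod, k_dest = 0.05, 0.02          # illustrative rate constants (1/day)
            rbc = rbc0
            t = 0.0
            while t < days:
                supply = o2_supply_scale * rbc
                production = k_prod * max(demand - supply, 0.0)
                destruction = k_dest * rbc
                rbc += dt * (production - destruction)
                t += dt
            return rbc

        # A higher oxygen supply per unit red cell mass (as hypothesized in
        # weightlessness) drives the equilibrium red cell mass down.
        print(simulate_rbc(o2_supply_scale=1.0), simulate_rbc(o2_supply_scale=1.2))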

  1. Snowmelt Runoff Model (SRM) User's Manual

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This 2008 edition of the User’s Manual presents a new computer program, the Windows Version 1.11 of the Snowmelt Runoff Model (WinSRM). The popular Version 4 is also preserved in the Appendix because it is still in demand to be used within its limits. The Windows version adds new capabilities: it ac...

  2. Simple Thermal Environment Model (STEM) User's Guide

    NASA Technical Reports Server (NTRS)

    Justus, C.G.; Batts, G. W.; Anderson, B. J.; James, B. F.

    2001-01-01

    This report presents a Simple Thermal Environment Model (STEM) for determining appropriate engineering design values to specify the thermal environment of Earth-orbiting satellites. The thermal environment of a satellite consists of three components: (1) direct solar radiation, (2) Earth-atmosphere reflected shortwave radiation, as characterized by Earth's albedo, and (3) Earth-atmosphere-emitted outgoing longwave radiation (OLR). This report, together with a companion "guidelines" report, provides methodology and guidelines for selecting "design points" for thermal environment parameters for satellites and spacecraft systems. The methods and models reported here are outgrowths of Earth Radiation Budget Experiment (ERBE) satellite data analysis and thermal environment specifications discussed by Anderson and Smith (1994). In large part, this report is intended to update (and supersede) those results.
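
    To make the three components concrete, here is a minimal back-of-the-envelope absorbed-flux estimate for a nadir-facing flat plate. All numbers are illustrative placeholders, not STEM design points or ERBE-derived values.

```python
# Illustrative absorbed-flux estimate from the three environment components.
solar_constant = 1366.0   # W/m^2, direct solar (placeholder value)
albedo = 0.30             # Earth-reflected shortwave fraction (placeholder)
olr = 237.0               # W/m^2, outgoing longwave radiation (placeholder)
view_factor = 0.8         # Earth view factor for a nadir-facing surface (placeholder)
alpha_s = 0.25            # surface solar absorptance
emissivity = 0.85         # surface IR emittance

q_solar = alpha_s * solar_constant
q_albedo = alpha_s * albedo * solar_constant * view_factor
q_olr = emissivity * olr * view_factor

print(f"absorbed flux: solar {q_solar:.1f}, albedo {q_albedo:.1f}, "
      f"OLR {q_olr:.1f}, total {q_solar + q_albedo + q_olr:.1f} W/m^2")
```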

  3. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE (trademark) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  4. Supplement to wellbore models GWELL, GWNACL, and HOLA User's Guide

    SciTech Connect

    Hadgu, T.; Bodvarsson, G.S.

    1992-09-01

    A study was made on improving the applicability and ease of use of the wellbore simulators HOLA, GWELL and GWNACL (Bjornsson, 1987; Aunzo et al., 1991). The study concentrated mainly on the usage of Option 2 (please refer to the User's Guide; Aunzo et al., 1991) and on modeling flow of superheated steam when using these computer codes. Amendments were made to the simulators to allow implementation of a variety of input data. A wide range of input data was used to test the modifications to the codes. The study did not attempt to modify or improve the physics or formulations used in the models. It showed that a careful check of the input data is required. This report addresses these two areas of interest: usage of Option 2, and simulation of wellbore flow of superheated steam.

  5. Solid rocket booster performance evaluation model. Volume 2: Users manual

    NASA Technical Reports Server (NTRS)

    1974-01-01

    This users manual for the solid rocket booster performance evaluation model (SRB-II) contains descriptions of the model, the program options, the required program inputs, the program output format and the program error messages. SRB-II is written in FORTRAN and is operational on both the IBM 370/155 and the MSFC UNIVAC 1108 computers.

  6. Modeling mutual feedback between users and recommender systems

    NASA Astrophysics Data System (ADS)

    Zeng, An; Yeung, Chi Ho; Medo, Matúš; Zhang, Yi-Cheng

    2015-07-01

    Recommender systems daily influence our decisions on the Internet. While considerable attention has been given to issues such as recommendation accuracy and user privacy, the long-term mutual feedback between a recommender system and the decisions of its users has been neglected so far. We propose here a model of network evolution which allows us to study the complex dynamics induced by this feedback, including the hysteresis effect which is typical for systems with non-linear dynamics. Despite the popular belief that recommendation helps users to discover new things, we find that the long-term use of recommendation can contribute to the rise of extremely popular items and thus ultimately narrow the user choice. These results are supported by measurements of the time evolution of item popularity inequality in real systems. We show that this adverse effect of recommendation can be tamed by sacrificing part of short-term recommendation accuracy.

  7. Earthquake Early Warning Beta Users: Java, Modeling, and Mobile Apps

    NASA Astrophysics Data System (ADS)

    Strauss, J. A.; Vinci, M.; Steele, W. P.; Allen, R. M.; Hellweg, M.

    2014-12-01

    Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds warning prior to ground shaking at a user's location. The goal and purpose of such a system is to reduce, or minimize, the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, University of Washington, the USGS, and beta users in California and the Pacific Northwest. The beta users receive earthquake information very rapidly in real-time and are providing feedback on their experiences of performance and potential uses within their organization. Beta user interactions allow the ShakeAlert team to discern: which alert delivery options are most effective, what changes would make the UserDisplay more useful in a pre-disaster situation, and most importantly, what actions users plan to take for various scenarios. Actions could include: personal safety approaches, such as drop, cover, and hold on; automated processes and procedures, such as opening elevator or fire station doors; or situational awareness. Users are beginning to determine which policy and technological changes may need to be enacted, and funding requirements to implement their automated controls. The use of models and mobile apps is beginning to augment the basic Java desktop applet. Modeling allows beta users to test their early warning responses against various scenarios without having to wait for a real event. Mobile apps are also changing the possible response landscape, providing other avenues for people to receive information. All of these combine to improve business continuity and resiliency.

  8. Modeling Users, Context and Devices for Ambient Assisted Living Environments

    PubMed Central

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-01-01

    The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions and devices' capabilities helps smart environments personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give some meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices, considering different approaches in the Ambient Assisted Living domain. We also highlight different ongoing standardization efforts in this area, discuss the techniques used, the characteristics modeled, and the advantages and drawbacks of each approach, and finally draw several conclusions about the reviewed works. PMID:24643006

  9. Geothermal loan guaranty cash flow model: description and users' manual

    SciTech Connect

    Keimig, M.A.; Rosenberg, J.I.; Entingh, D.J.

    1980-11-01

    This is the user's guide for the Geothermal Loan Guaranty Cash Flow Model (GCFM). GCFM is a Fortran code which designs and costs geothermal fields and electric power plants. It contains a financial analysis module which performs life-cycle costing analysis, taking into account various types of taxes, costs and financial structures. The financial module includes a discounted cash flow feature which calculates a levelized breakeven price for each run. The user's guide contains descriptions of the data requirements and instructions for using the model.
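
    A minimal sketch of the levelized breakeven idea behind the discounted cash flow feature: find the constant energy price at which discounted revenues equal the capital investment plus discounted operating costs. The cash-flow inputs below are hypothetical and omit the tax and financing structures GCFM handles.

```python
import numpy as np

# Hypothetical cash-flow inputs, for illustration of the levelized breakeven idea only.
years = 30
discount_rate = 0.10
capital_cost = 200e6                       # up-front investment, $
annual_om = 8e6                            # O&M cost per year, $
annual_energy = 400e6                      # kWh sold per year

t = np.arange(1, years + 1)
disc = 1.0 / (1.0 + discount_rate) ** t    # discount factors for years 1..N

# Levelized breakeven price: constant $/kWh at which discounted revenues
# exactly cover capital plus discounted O&M.
breakeven = (capital_cost + np.sum(annual_om * disc)) / np.sum(annual_energy * disc)
print(f"levelized breakeven price ~ {breakeven * 100:.2f} cents/kWh")
```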

  10. Understanding Deep Representations Learned in Modeling Users Likes.

    PubMed

    Guntuku, Sharath Chandra; Zhou, Joey Tianyi; Roy, Sujoy; Lin, Weisi; Tsang, Ivor W

    2016-08-01

    Automatically understanding and discriminating different users' liking for an image is a challenging problem. This is because the relationship between image features (even semantic ones extracted by existing tools, viz., faces, objects, and so on) and users' likes is non-linear, influenced by several subtle factors. This paper presents a deep bi-modal knowledge representation of images based on their visual content and associated tags (text). A mapping step between the different levels of visual and textual representations allows for the transfer of semantic knowledge between the two modalities. Feature selection is applied before learning deep representation to identify the important features for a user to like an image. The proposed representation is shown to be effective in discriminating users based on images they like and also in recommending images that a given user likes, outperforming the state-of-the-art feature representations by ∼15%-20%. Beyond this test-set performance, an attempt is made to qualitatively understand the representations learned by the deep architecture used to model user likes. PMID:27295666

  11. Tree Theory: A Theory-Generative Measurement Model.

    ERIC Educational Resources Information Center

    Airasian, Peter W.; Bart, William M.

    The inadequacies in present measurement models are indicated and a description is given of how tree theory, a theory-generative model, overcomes these inadequacies. Among the weaknesses cited in many measurement models are their untested assumptions of linear order and unidimensionality and their inability to generate non-associational…

  12. A mixing evolution model for bidirectional microblog user networks

    NASA Astrophysics Data System (ADS)

    Yuan, Wei-Guo; Liu, Yun

    2015-08-01

    Microblogs have been widely used as a new form of online social networking. Based on user profile data collected from Sina Weibo, we find that the number of a microblog user's bidirectional friends approximately follows a lognormal distribution. We then build two microblog user networks with real bidirectional relationships, both of which exhibit not only small-world and scale-free properties but also some special ones, such as a double power-law degree distribution, disassortativity, and hierarchical and rich-club structure. Moreover, by detecting the community structure of the two real networks, we find that both of their community sizes follow an exponential distribution. Based on this empirical analysis, we present a novel evolving network model with mixed connection rules, including lognormal-fitness preferential attachment and random attachment, nearest-neighbor interconnection within the same community, and global random association across communities. The simulation results show that our model is consistent with the real networks in many topological features.

  13. Designing visual displays and system models for safe reactor operations based on the user's perspective of the system

    SciTech Connect

    Brown-VanHoozer, S.A.

    1995-12-31

    Most designers are not schooled in the area of human-interaction psychology and therefore tend to rely on the traditional ergonomic aspects of human factors when designing complex human-interactive workstations related to reactor operations. They do not take into account the differences in user information processing behavior and how these behaviors may affect individual and team performance when accessing visual displays or utilizing system models in process and control room areas. Unfortunately, by ignoring the importance of integrating the user interface at the information processing level, the result can be sub-optimization and inherently error- and failure-prone systems. Therefore, to minimize or eliminate failures in human-interactive systems, it is essential that designers understand how each user's processing characteristics affect how the user gathers information and how the user communicates that information to the designer and other users. A different type of approach to achieving this understanding is Neuro Linguistic Programming (NLP). The material presented in this paper is based on two studies involving the design of visual displays, NLP, and the user's perspective model of a reactor system. The studies involve the methodology known as NLP and its use in expanding design choices from the user's "model of the world" in the areas of virtual reality, workstation design, team structure, decision and learning style patterns, safety operations, pattern recognition, and much more.

  14. H2A Production Model, Version 2 User Guide

    SciTech Connect

    Steward, D.; Ramsden, T.; Zuboy, J.

    2008-09-01

    The H2A Production Model analyzes the technical and economic aspects of central and forecourt hydrogen production technologies. Using a standard discounted cash flow rate of return methodology, it determines the minimum hydrogen selling price, including a specified after-tax internal rate of return from the production technology. Users have the option of accepting default technology input values--such as capital costs, operating costs, and capacity factor--from established H2A production technology cases or entering custom values. Users can also modify the model's financial inputs. This new version of the H2A Production Model features enhanced usability and functionality. Input fields are consolidated and simplified. New capabilities include performing sensitivity analyses and scaling analyses to various plant sizes. This User Guide helps users already familiar with the basic tenets of H2A hydrogen production cost analysis get started using the new version of the model. It introduces the basic elements of the model and then describes the function and use of each of its worksheets.

  15. An information theory criteria based blind method for enumerating active users in DS-CDMA system

    NASA Astrophysics Data System (ADS)

    Samsami Khodadad, Farid; Abed Hodtani, Ghosheh

    2014-11-01

    In this paper, a new blind algorithm for active user enumeration in asynchronous direct-sequence code division multiple access (DS-CDMA) in a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator with better performance at higher signal-to-noise ratios (SNR), while AIC is preferred at lower SNRs. We therefore propose an SNR-adaptive method, based on subspace analysis and a trained genetic algorithm, that attains the performance of both. Moreover, our method uses only a single antenna, in contrast to previous methods, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, and demonstrate the efficiency of the method.
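
    For orientation, the sketch below implements the textbook eigenvalue-based AIC and MDL detectors (the Wax-Kailath form) that the penalty-function discussion above refers to; the paper's subspace/genetic refinement and multipath handling are not reproduced.

```python
import numpy as np

def enumerate_sources(eigvals, n_snapshots):
    """Classic eigenvalue-based AIC/MDL detectors (Wax-Kailath form).

    This is the textbook baseline the abstract builds on, not the paper's
    SNR-adaptive refinement."""
    eigvals = np.sort(eigvals)[::-1]
    p = len(eigvals)
    aic, mdl = [], []
    for k in range(p):
        noise = eigvals[k:]
        m = p - k
        # log ratio of geometric to arithmetic mean of the assumed noise eigenvalues
        log_ratio = np.sum(np.log(noise)) - m * np.log(np.mean(noise))
        free = k * (2 * p - k)                       # number of free parameters
        aic.append(-2 * n_snapshots * log_ratio + 2 * free)
        mdl.append(-n_snapshots * log_ratio + 0.5 * free * np.log(n_snapshots))
    return int(np.argmin(aic)), int(np.argmin(mdl))

# Toy example: 3 strong "signal" eigenvalues plus noise-level ones.
rng = np.random.default_rng(0)
eigs = np.concatenate([[10.0, 7.5, 5.0], 1.0 + 0.05 * rng.standard_normal(5)])
print(enumerate_sources(eigs, n_snapshots=500))    # both should report 3
```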

  16. Shawnee flue gas desulfurization computer model users manual

    SciTech Connect

    Sudhoff, F.A.; Torstrick, R.L.

    1985-03-01

    In conjunction with the US Environmental Protection Agency-sponsored Shawnee test program, Bechtel National, Inc., and the Tennessee Valley Authority jointly developed a computer model capable of projecting preliminary design and economics for lime- and limestone-scrubbing flue gas desulfurization systems. The model is capable of projecting relative economics for spray tower, turbulent contact absorber, and venturi-spray tower scrubbing options. It may be used to project the effect on system design and economics of variations in required SO2 removal, scrubber operating parameters (gas velocity, liquid-to-gas (L/G) ratio, alkali stoichiometry, liquor hold time in slurry recirculation tanks), reheat temperature, and scrubber bypass. It may also be used to evaluate the effect of alternative waste disposal methods or additives (MgO or adipic acid) on costs for the selected process. Although the model is not intended to project the economics of an individual system to a high degree of accuracy, it allows prospective users to quickly project comparative design and costs for limestone and lime case variations on a common design and cost basis. The users manual provides a general description of the Shawnee FGD computer model and detailed instructions for its use. It describes and explains the required user-supplied input data, such as boiler size, coal characteristics, and SO2 removal requirements. Output includes a material balance, equipment list, and detailed capital investment and annual revenue requirements. The users manual provides information concerning the use of the overall model as well as sample runs to serve as a guide to prospective users in identifying applications. The FORTRAN-based model is maintained by TVA, from whom copies or individual runs are available. 25 refs., 3 figs., 36 tabs.

  17. Using Partial Credit and Response History to Model User Knowledge

    ERIC Educational Resources Information Center

    Van Inwegen, Eric G.; Adjei, Seth A.; Wang, Yan; Heffernan, Neil T.

    2015-01-01

    User modelling algorithms such as Performance Factors Analysis and Knowledge Tracing seek to determine a student's knowledge state by analyzing (among other features) right and wrong answers. Anyone who has ever graded an assignment by hand knows that some answers are "more wrong" than others; i.e. they display less of an understanding…

  18. Dynamic User Modeling within a Game-Based ITS

    ERIC Educational Resources Information Center

    Snow, Erica L.

    2015-01-01

    Intelligent tutoring systems are adaptive learning environments designed to support individualized instruction. The adaptation embedded within these systems is often guided by user models that represent one or more aspects of students' domain knowledge, actions, or performance. The proposed project focuses on the development and testing of user…

  19. USER-FRIENDLY DATA ENTRY ROUTINE FOR THE ESP MODEL

    EPA Science Inventory

    The report is a user's manual for an interactive data entry program that greatly simplifies the creation and modification of electrostatic precipitator (ESP) model data files. outine use of the interactive program, written for IBM PC-compatible computers, will eliminate a major s...

  20. ABAREX -- A neutron spherical optical-statistical-model code -- A user's manual

    SciTech Connect

    Smith, A.B.; Lawson, R.D.

    1998-06-01

    The contemporary version of the neutron spherical optical-statistical-model code ABAREX is summarized with the objective of providing detailed operational guidance for the user. The physical concepts involved are very briefly outlined. The code is described in some detail and a number of explicit examples are given. With this document one should very quickly become fluent with the use of ABAREX. While the code has operated on a number of computing systems, this version is specifically tailored for the VAX/VMS work station and/or the IBM-compatible personal computer.

  1. User interface for ground-water modeling: Arcview extension

    USGS Publications Warehouse

    Tsou, M.-S.; Whittemore, D.O.

    2001-01-01

    Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software that provides a robust user-friendly interface to facilitate data handling and display. An extension is an add-on program to ArcView that provides additional specialized functions. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension for facilitating modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the ArcView interface by external linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.

  2. Plotting program for aerodynamic lifting surface theory. [user manual for FORTRAN computer program

    NASA Technical Reports Server (NTRS)

    Medan, R. T.; Ray, K. S.

    1973-01-01

    A description of and users manual for a USA FORTRAN IV computer program which plots the planform and control points of a wing are presented. The program also plots some of the configuration data such as the aspect ratio. The planform data is stored on a disc file which is created by a geometry program. This program, the geometry program, and several other programs are used together in the analysis of lifting, thin wings in steady, subsonic flow according to a kernel function lifting surface theory.

  3. Airport-Noise Levels and Annoyance Model (ALAMO) user's guide

    NASA Technical Reports Server (NTRS)

    Deloach, R.; Donaldson, J. L.; Johnson, M. J.

    1986-01-01

    A guide for the use of the Airport-Noise Level and Annoyance MOdel (ALAMO) at the Langley Research Center computer complex is provided. This document is divided into five primary sections: the introduction, the purpose of the model, and in-depth descriptions of the following subsystems: baseline, noise reduction simulation, and track analysis. For each subsystem, the user is provided with a description of the architecture, an explanation of subsystem use, sample results, and a case runner's check list. It is assumed that the user is familiar with the operations at the Langley Research Center (LaRC) computer complex, the Network Operating System (NOS 1.4) and CYBER Control Language. Incorporated within the ALAMO model is a census database system called SITE II.

  4. The Snowmelt-Runoff Model (SRM) user's manual

    NASA Technical Reports Server (NTRS)

    Martinec, J.; Rango, A.; Major, E.

    1983-01-01

    A manual is presented to provide a means by which a user may apply the snowmelt runoff model (SRM) unaided. Model structure, conditions of application, and data requirements, including remote sensing, are described. Guidance is given for determining the various model variables and parameters. Possible sources of error are discussed, and conversion of the snowmelt runoff model (SRM) from the simulation mode to the operational forecasting mode is explained. A computer program for running SRM is presented that is easily adaptable to most systems used by water resources agencies.
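
    A minimal sketch of the standard SRM daily recursion for a single elevation zone, with illustrative parameter values; the manual's guidance should be followed to determine the runoff coefficients and degree-day factor for a real basin.

```python
# Simplified single-zone form of the standard SRM daily recursion (illustrative
# parameter values only).
def srm_step(q_today, temp, delta_t, snow_cover, precip,
             c_s=0.6, c_r=0.6, a=0.45, k=0.9, area_km2=500.0):
    """Return the next day's discharge in m^3/s."""
    melt_depth = a * (temp + delta_t) * snow_cover      # cm of melt over the basin
    runoff_cm = c_s * melt_depth + c_r * precip          # snowmelt + rain contributions
    inflow = runoff_cm * area_km2 * 10000.0 / 86400.0    # cm*km^2/day -> m^3/s
    return inflow * (1.0 - k) + q_today * k              # recession of previous discharge

q = 20.0
for day in range(5):
    q = srm_step(q, temp=4.0, delta_t=1.0, snow_cover=0.8, precip=0.2)
    print(f"day {day + 1}: Q = {q:.1f} m^3/s")
```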

  5. A modeling framework for resource-user-infrastructure systems

    NASA Astrophysics Data System (ADS)

    Muneepeerakul, R.; Qubbaj, M.; Anderies, J. M.; Aggarwal, R.; Janssen, M.

    2012-12-01

    A compact modeling framework is developed to supplement a conceptual framework of coupled natural-human systems. The framework consists of four components: resource (R), users (U), public infrastructure (PI), and public infrastructure providers (PIP), the last two of which have not been adequately addressed in many existing modeling studies. The modeling approach employed here is a set of replicator equations describing the dynamical frequencies of social strategies (of U and PIP), whose payoffs are explicit and dynamical functions of the biophysical components (R and PI). Model development and preliminary results from a specific implementation will be reported and discussed.
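
    A minimal sketch of the replicator-equation idea, with a two-strategy user population coupled to a single resource state; the payoff functions and parameters are placeholders, not the published four-component R-U-PI-PIP system.

```python
# Two-strategy replicator dynamics with payoffs that depend on a resource state.
# Functional forms and parameters are placeholders for illustration only.
def step(x, R, dt=0.01):
    payoff_coop = 2.0 * R - 1.0          # cooperators do better when the resource is healthy
    payoff_defect = 1.5 * R              # defectors free-ride
    mean = x * payoff_coop + (1 - x) * payoff_defect
    dx = x * (payoff_coop - mean)        # replicator equation for the cooperator share x
    dR = 0.5 * R * (1 - R) - 0.4 * (1 - x) * R   # resource regrows, defectors deplete it
    return x + dt * dx, R + dt * dR

x, R = 0.5, 0.8
for _ in range(5000):
    x, R = step(x, R)
print(f"long-run cooperator share {x:.2f}, resource level {R:.2f}")
```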

  6. SN_GUI: a graphical user interface for snowpack modeling

    NASA Astrophysics Data System (ADS)

    Spreitzhofer, G.; Fierz, C.; Lehning, M.

    2004-10-01

    SNOWPACK is a physical snow cover model. The model not only serves as a valuable research tool, but also runs operationally on a network of high Alpine automatic weather and snow measurement sites. In order to facilitate the operation of SNOWPACK and the interpretation of the results obtained by this model, a user-friendly graphical user interface for snowpack modeling, named SN_GUI, was created. This Java-based and thus platform-independent tool can be operated in two modes, one designed to fulfill the requirements of avalanche warning services (e.g. by providing information about critical layers within the snowpack that are closely related to the avalanche activity), and the other one offering a variety of additional options satisfying the needs of researchers. The user of SN_GUI is graphically guided through the entire process of creating snow cover simulations. The starting point is the efficient creation of input parameter files for SNOWPACK, followed by the launching of SNOWPACK with a variety of parameter settings. Finally, after the successful termination of the run, a number of interactive display options may be used to visualize the model output. Among these are vertical profiles and time profiles for many parameters. Besides other features, SN_GUI allows the use of various color, time and coordinate scales, and the comparison of measured and observed parameters.

  7. Effectiveness of Anabolic Steroid Preventative Intervention among Gym Users: Applying Theory of Planned Behavior

    PubMed Central

    Jalilian, Farzad; Allahverdipour, Hamid; Moeini, Babak; Moghimbeigi, Abbas

    2011-01-01

    Background: Use of anabolic androgenic steroids (AAS) has been associated with adverse physical and psychiatric effects, and AAS use is a rising problem among young people. This study was conducted to evaluate the efficiency of an anabolic steroid preventive intervention among gym users in Iran, with the theory of planned behaviour applied as the theoretical framework. Methods: Overall, 120 male gym users participated in this study as intervention and control groups. This was a longitudinal, randomized, pretest-posttest control group panel study implementing a behaviour-modification-based intervention to prevent AAS use. Cross-tabulation and t-tests, using the SPSS statistical package (version 13), were used for the statistical analysis. Results: Significant improvements were found in average responses for knowledge about the side effects of AAS (P<0.001), attitude toward AAS, and intention not to use AAS. Additionally, after the intervention, the rate of AAS and supplement use decreased in the intervention group. Conclusion: Comprehensive interventions against AAS abuse among gym users and adolescents would be effective in improving adolescents' healthy behaviours and their intention not to use AAS. PMID:24688897

  8. Agile IT: Thinking in User-Centric Models

    NASA Astrophysics Data System (ADS)

    Margaria, Tiziana; Steffen, Bernhard

    We advocate a new teaching direction for modern CS curricula: extreme model-driven development (XMDD), a new development paradigm designed to continuously involve the customer/application expert throughout the whole systems' life cycle. Based on the `One-Thing Approach', which works by successively enriching and refining one single artifact, system development becomes in essence a user-centric orchestration of intuitive service functionality. XMDD differs radically from classical software development, which, in our opinion is no longer adequate for the bulk of application programming - in particular when it comes to heterogeneous, cross organizational systems which must adapt to rapidly changing market requirements. Thus there is a need for new curricula addressing this model-driven, lightweight, and cooperative development paradigm that puts the user process in the center of the development and the application expert in control of the process evolution.

  9. EpiPOD : community vaccination and dispensing model user's guide.

    SciTech Connect

    Berry, M.; Samsa, M.; Walsh, D.; Decision and Information Sciences

    2009-01-09

    EpiPOD is a modeling system that enables local, regional, and county health departments to evaluate and refine their plans for mass distribution of antiviral and antibiotic medications and vaccines. An intuitive interface requires users to input as few or as many plan specifics as are available in order to simulate a mass treatment campaign. Behind the input interface, a system dynamics model simulates pharmaceutical supply logistics, hospital and first-responder personnel treatment, population arrival dynamics and treatment, and disease spread. When the simulation is complete, users have estimates of the number of illnesses in the population at large, the number of ill persons seeking treatment, and queuing and delays within the mass treatment system--all metrics by which the plan can be judged.

  10. Simplified analytical model of penetration with lateral loading -- User's guide

    SciTech Connect

    Young, C.W.

    1998-05-01

    The SAMPLL (Simplified Analytical Model of Penetration with Lateral Loading) computer code was originally developed in 1984 to realistically yet economically predict penetrator/target interactions. Since the code's inception, its use has spread throughout the conventional and nuclear penetrating weapons community. During the penetrator/target interaction, the resistance of the material being penetrated imparts both lateral and axial loads on the penetrator. These loads cause changes to the penetrator's motion (kinematics). SAMPLL uses empirically based algorithms, formulated from an extensive experimental data base, to replicate the loads the penetrator experiences during penetration. The lateral loads resulting from angle of attack and trajectory angle of the penetrator are explicitly treated in SAMPLL. The loads are summed and the kinematics calculated at each time step. SAMPLL has been continually improved, and the current version, Version 6.0, can handle cratering and spall effects, multiple target layers, penetrator damage/failure, and complex penetrator shapes. Version 6 uses the latest empirical penetration equations, and also automatically adjusts the penetrability index for certain target layers to account for layer thickness and confinement. This report describes the SAMPLL code, including assumptions and limitations, and includes a user's guide.

  11. Regional Ionospheric Modelling for Single-Frequency Users

    NASA Astrophysics Data System (ADS)

    Boisits, Janina; Joldzic, Nina; Weber, Robert

    2016-04-01

    Ionospheric signal delays are a main error source in GNSS-based positioning. Thus, single-frequency receivers, which are frequently used nowadays, require additional ionospheric information to mitigate these effects. Within the Austrian Research Promotion Agency (FFG) project Regiomontan (Regional Ionospheric Modelling for Single-Frequency Users), a new and as realistic as possible model is used to obtain precise GNSS ionospheric signal delays. These delays will be provided to single-frequency users to significantly increase positioning accuracy. The computational basis is the Thin-Shell Model. For regional modelling, the thin electron layer of the underlying model is approximated by a Taylor series up to degree two. The network used includes 22 GNSS reference stations in Austria and nearby. First results were calculated from smoothed code observations by forming the geometry-free linear combination. Satellite and station DCBs were applied. In a least squares adjustment the model parameters, consisting of the VTEC0 at the origin of the investigated area as well as the first and second derivatives of the electron content in longitude and latitude, were obtained with a temporal resolution of 1 hour. The height of the layer was kept fixed. The formal errors of the model parameters suggest an accuracy of the VTEC slightly better than 1 TECU for a user location within Austria. In a further step, the model parameters were derived from sole phase observations by using a levelling approach to mitigate common range biases. The formal errors of this model approach suggest an accuracy of about a few tenths of a TECU. For validation, the Regiomontan VTEC was compared to IGS TEC maps, showing very good agreement. Further, a comparison of pseudoranges has been performed to calculate the 'true' error by forming the ionosphere-free linear combination on the one hand, and by applying the Regiomontan model to L1 pseudoranges on the other hand. The resulting differences are mostly
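
    A minimal sketch of the degree-two Taylor-series fit described above, using synthetic VTEC "observations" and ignoring DCB estimation, mapping functions, and the levelling of phase observations.

```python
import numpy as np

# Fit VTEC(dlat, dlon) ~ a0 + a1*dlat + a2*dlon + a3*dlat^2 + a4*dlat*dlon + a5*dlon^2
# by least squares, using synthetic observations (all numbers illustrative).
rng = np.random.default_rng(1)
dlat = rng.uniform(-3, 3, 200)            # degrees of latitude from the area's origin
dlon = rng.uniform(-5, 5, 200)            # degrees of longitude from the origin
true = np.array([12.0, 0.8, -0.3, 0.05, 0.02, -0.04])   # VTEC0 (TECU) and gradients
A = np.column_stack([np.ones_like(dlat), dlat, dlon,
                     dlat**2, dlat*dlon, dlon**2])
vtec_obs = A @ true + rng.normal(0, 0.5, 200)            # 0.5 TECU observation noise

coeffs, *_ = np.linalg.lstsq(A, vtec_obs, rcond=None)
print("estimated VTEC0 and gradients:", np.round(coeffs, 3))
```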

  12. Five-Factor Model personality profiles of drug users

    PubMed Central

    Terracciano, Antonio; Löckenhoff, Corinna E; Crum, Rosa M; Bienvenu, O Joseph; Costa, Paul T

    2008-01-01

    Background Personality traits are considered risk factors for drug use, and, in turn, the psychoactive substances impact individuals' traits. Furthermore, there is increasing interest in developing treatment approaches that match an individual's personality profile. To advance our knowledge of the role of individual differences in drug use, the present study compares the personality profile of tobacco, marijuana, cocaine, and heroin users and non-users using the wide spectrum Five-Factor Model (FFM) of personality in a diverse community sample. Method Participants (N = 1,102; mean age = 57) were part of the Epidemiologic Catchment Area (ECA) program in Baltimore, MD, USA. The sample was drawn from a community with a wide range of socio-economic conditions. Personality traits were assessed with the Revised NEO Personality Inventory (NEO-PI-R), and psychoactive substance use was assessed with a systematic interview. Results Compared to never smokers, current cigarette smokers score lower on Conscientiousness and higher on Neuroticism. Similar, but more extreme, is the profile of cocaine/heroin users, who score very high on Neuroticism, especially Vulnerability, and very low on Conscientiousness, particularly Competence, Achievement-Striving, and Deliberation. By contrast, marijuana users score high on Openness to Experience, average on Neuroticism, but low on Agreeableness and Conscientiousness. Conclusion In addition to confirming high levels of negative affect and impulsive traits, this study highlights the links between drug use and low Conscientiousness. These links provide insight into the etiology of drug use and have implications for public health interventions. PMID:18405382

  13. Halo modelling in chameleon theories

    SciTech Connect

    Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu E-mail: kazuya.koyama@port.ac.uk

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.

  14. Stochastic models: theory and simulation.

    SciTech Connect

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.

  15. User's guide for the Photochemical Box Model (PBM)

    NASA Astrophysics Data System (ADS)

    Schere, K. L.; Demerjian, K. L.

    1984-11-01

    The user's guide for the photochemical box model (PBM) describes the structure and operation of the model and its preprocessors and provides the potential user with guidance in setting up input data. The PBM is a simple stationary single-cell model with a variable height lid designed to provide volume-integrated hour averages of O3 and other photochemical smog pollutants of interest for an urban area for a single day of simulation. The PBM is most appropriate for application in air stagnation conditions with light and variable winds. The PBM assumes that emission sources are homogeneously distributed across the surface face of the box volume and that the volume is well mixed at all times. The user must provide the PBM with initial species concentrations, hourly inputs of wind speed, source emission fluxes of CO, NOx, THC, and hydrocarbon reactivity classes, and boundary species concentrations. Values of measured solar radiation and mixed layer depth may be specified at subhourly intervals throughout a simulation.
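
    A skeleton of the well-mixed, variable-lid box idea (not the PBM code itself): surface emissions dilute into the box, background air is entrained as the lid rises, and a single first-order loss term stands in for the full photochemical mechanism. All values are illustrative.

```python
# Skeleton of a single-cell box model with a rising lid (illustrative values only).
dt = 60.0                      # s, integration step
emission_flux = 2.0e-3         # ppb m/s, area-averaged surface emission
c_background = 40.0            # ppb, concentration above the lid
k_chem = 1.0e-5                # 1/s, placeholder first-order loss rate

c, h = 60.0, 300.0             # initial concentration (ppb) and mixing height (m)
for step in range(12 * 60):    # 12 hours of simulation
    dh_dt = 0.05               # m/s, growing mixed layer
    entrain = max(dh_dt, 0.0) / h * (c_background - c)   # dilution by entrained air
    c += dt * (emission_flux / h + entrain - k_chem * c)
    h += dt * dh_dt

print(f"hour-12 mixed-layer height {h:.0f} m, concentration {c:.1f} ppb")
```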

  16. Supercell thunderstorm modeling and theory

    NASA Astrophysics Data System (ADS)

    Rotunno, Richard

    Tornadoes occur in thunderstorms. Ferrel [1889] theorized that tornadoes form when the thunderstorm updraft encounters a preexisting "gyratory" wind field. Only lately has it been found that tornadoes/waterspouts can be produced by nonrotating thunderstorms forming in environments with a preexisting low-level gyratory wind field [Wakimoto and Wilson, 1989]. However, the most intense, and long-lived, tornadoes occur in a special type of thunderstorm known as the "supercell," which generates its own gyratory wind field. That it does so is interesting, but perhaps the most fascinating aspect of rotation in the supercell, which became clear in the past decade or so, is the rotating wind field's vital role in producing the supercell's extraordinary properties of long life and deviate motion. Thus the present review will focus on what was learned from modeling and theory about the rotation and propagation of, and the relation of tornadoes to, supercell thunderstorms.

  17. NASA AVOSS Fast-Time Wake Prediction Models: User's Guide

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew

    2014-01-01

    The National Aeronautics and Space Administration (NASA) is developing and testing fast-time wake transport and decay models to safely enhance the capacity of the National Airspace System (NAS). The fast-time wake models are empirical algorithms used for real-time predictions of wake transport and decay based on aircraft parameters and ambient weather conditions. The aircraft dependent parameters include the initial vortex descent velocity and the vortex pair separation distance. The atmospheric initial conditions include vertical profiles of temperature or potential temperature, eddy dissipation rate, and crosswind. The current distribution includes the latest versions of the APA (3.4) and the TDP (2.1) models. This User's Guide provides detailed information on the model inputs, file formats, and the model output. An example of a model run and a brief description of the Memphis 1995 Wake Vortex Dataset is also provided.

  18. Design of personalized search engine based on user-webpage dynamic model

    NASA Astrophysics Data System (ADS)

    Li, Jihan; Li, Shanglin; Zhu, Yingke; Xiao, Bo

    2013-12-01

    A personalized search engine focuses on establishing a user-webpage dynamic model. In this model, users' personalized factors are introduced so that the search engine is better able to provide the user with targeted feedback. This paper constructs user and webpage dynamic vector tables, introduces singular value decomposition analysis in the process of topic categorization, and extends the traditional PageRank algorithm.
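
    For context, a common way to personalize PageRank is to bias the teleport vector toward a user's preferred pages; the sketch below shows that mechanism on a tiny hypothetical graph and may differ in detail from the paper's extension.

```python
import numpy as np

def personalized_pagerank(adj, preference, damping=0.85, iters=100):
    """Power iteration with a user-specific teleport vector (a common way to
    personalize PageRank; the paper's extension may differ in detail)."""
    n = adj.shape[0]
    out_deg = adj.sum(axis=1, keepdims=True)
    # Row-normalize; dangling pages jump to the preference vector. Transpose to
    # get a column-stochastic transition matrix.
    trans = np.where(out_deg > 0, adj / np.maximum(out_deg, 1), preference).T
    rank = np.full(n, 1.0 / n)
    for _ in range(iters):
        rank = damping * trans @ rank + (1 - damping) * preference
    return rank

adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]], dtype=float)
pref = np.array([0.7, 0.1, 0.1, 0.1])     # this user cares mostly about page 0
print(np.round(personalized_pagerank(adj, pref), 3))
```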

  19. Quiver gauge theories and integrable lattice models

    NASA Astrophysics Data System (ADS)

    Yagi, Junya

    2015-10-01

    We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.

  20. The European ALMA Regional Centre: a model of user support

    NASA Astrophysics Data System (ADS)

    Andreani, P.; Stoehr, F.; Zwaan, M.; Hatziminaoglou, E.; Biggs, A.; Diaz-Trigo, M.; Humphreys, E.; Petry, D.; Randall, S.; Stanke, T.; van Kampen, E.; Bárta, M.; Brand, J.; Gueth, F.; Hogerheijde, M.; Bertoldi, F.; Muxlow, T.; Richards, A.; Vlemmings, W.

    2014-08-01

    The ALMA Regional Centres (ARCs) form the interface between the ALMA observatory and the user community from the proposal preparation stage to the delivery of data and their subsequent analysis. The ARCs provide critical services to both the ALMA operations in Chile and to the user community. These services were split by the ALMA project into core and additional services. The core services are financed by the ALMA operations budget and are critical to the successful operation of ALMA. They are contractual obligations and must be delivered to the ALMA project. The additional services are not funded by the ALMA project and are not contractual obligations, but are critical to achieve ALMA full scientific potential. A distributed network of ARC nodes (with ESO being the central ARC) has been set up throughout Europe at the following seven locations: Bologna, Bonn-Cologne, Grenoble, Leiden, Manchester, Ondrejov, Onsala. These ARC nodes are working together with the central node at ESO and provide both core and additional services to the ALMA user community. This paper presents the European ARC, and how it operates in Europe to support the ALMA community. This model, although complex in nature, is turning into a very successful one, providing a service to the scientific community that has been so far highly appreciated. The ARC could become a reference support model in an age where very large collaborations are required to build large facilities, and support is needed for geographically and culturally diverse communities.

  1. HIGHWAY, a transportation routing model: program description and users' manual

    SciTech Connect

    Joy, D.S.; Johnson, P.E.; Gibson, S.M.

    1982-12-01

    A computerized transportation routing model has been developed at the Oak Ridge National Laboratory to be used for predicting likely routes for shipping radioactive materials. The HIGHWAY data base is a computerized road atlas containing descriptions of the entire interstate highway system, the federal highway system, and most of the principal state roads. In addition to its prediction of the most likely commercial route, options incorporated in the HIGHWAY model can allow for maximum use of interstate highways or routes that will bypass urbanized areas containing populations > 100,000. The user may also interactively modify the data base to predict routes that bypass any particular state, city, town, or specific highway segment.
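
    One simple way to mimic the bypass options described above is to add a weight penalty to road segments passing through large urbanized areas and run an ordinary shortest-path search; the tiny network below is hypothetical, and this is not the HIGHWAY model's actual algorithm or data base.

```python
import heapq

# Tiny hypothetical road network; each edge is (neighbor, miles, passes_large_city).
roads = {
    "A": [("B", 50, False), ("C", 40, True)],
    "B": [("D", 60, False)],
    "C": [("D", 30, True)],
    "D": [],
}

def route(start, goal, city_penalty=0.0):
    """Dijkstra shortest path; city_penalty > 0 discourages urban segments."""
    best = {start: 0.0}
    heap = [(0.0, start, [start])]
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == goal:
            return cost, path
        for nxt, miles, urban in roads[node]:
            new_cost = cost + miles + (city_penalty if urban else 0.0)
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(heap, (new_cost, nxt, path + [nxt]))
    return None

print("shortest:", route("A", "D"))                 # favors the urban route A-C-D
print("bypass urban areas:", route("A", "D", 100))  # penalty pushes it onto A-B-D
```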

  2. User-friendly graph editing for procedural modeling of buildings.

    PubMed

    Patow, Gustavo

    2012-01-01

    A proposed rule-based editing metaphor intuitively lets artists create buildings without changing their workflow. It's based on the realization that the rule base represents a directed acyclic graph and on a shift in the development paradigm from product-based to rule-based representations. Users can visually add or edit rules, connect them to control the workflow, and easily create commands that expand the artist's toolbox (for example, Boolean operations or local controlling operators). This approach opens new possibilities, from model verification to model editing through graph rewriting. PMID:24804948

  3. An optimization program based on the method of feasible directions: Theory and users guide

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

    1994-01-01

    The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed focusing on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.

  4. ACE2 Global Digital Elevation Model : User Analysis

    NASA Astrophysics Data System (ADS)

    Smith, R. G.; Berry, P. A. M.; Benveniste, J.

    2013-12-01

    Altimeter Corrected Elevations 2 (ACE2), first released in October 2009, is the Global Digital Elevation Model (GDEM) created by fusing the high accuracy of over 100 million altimeter retracked height estimates, derived primarily from the ERS-1 Geodetic Mission, with the high frequency content available within the near-global Shuttle Radar Topography Mission. This novel ACE2 GDEM is freely available at 3”, 9”, 30” and 5' and has been distributed via the web to over 680 subscribers. This paper presents the results of a detailed analysis of geographical distribution of subscribed users, along with fields of study and potential uses. Investigations have also been performed to determine the most popular spatial resolutions and the impact these have on the scope of data downloaded. The analysis has shown that, even though the majority of users have come from Europe and America, a significant number of website hits have been received from South America, Africa and Asia. Registered users also vary widely, from research institutions and major companies down to individual hobbyists looking at data for single projects.

  5. Engine structures modeling software system: Computer code. User's manual

    NASA Technical Reports Server (NTRS)

    1992-01-01

    ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.

  6. Groundwater modeling and remedial optimization design using graphical user interfaces

    SciTech Connect

    Deschaine, L.M.

    1997-05-01

    The ability to accurately predict the behavior of chemicals in groundwater systems under natural flow circumstances or remedial screening and design conditions is the cornerstone of the environmental industry. The ability to do this efficiently and to effectively communicate the information to the client and regulators is what differentiates effective consultants from ineffective consultants. Recent advances in groundwater modeling graphical user interfaces (GUIs) are doing for numerical modeling what Windows (trademark) did for DOS (trademark). A GUI facilitates both the modeling process and the information exchange. This Test Drive evaluates the performance of two GUIs--Groundwater Vistas and ModIME--on an actual groundwater model calibration and remedial design optimization project. In the early days of numerical modeling, data input consisted of large arrays of numbers that required intensive labor to input and troubleshoot. Model calibration was also manual, as was interpreting the reams of computer output for each of the tens or hundreds of simulations required to calibrate and perform optimal groundwater remedial design. During this period, the majority of the modeler's effort (and budget) was spent just getting the model running, as opposed to solving the environmental challenge at hand. GUIs take the majority of the grunt work out of the modeling process, thereby allowing the modeler to focus on designing optimal solutions.

  7. Care 3 model overview and user's guide, first revision

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Petersen, P. L.

    1985-01-01

    A manual was written to introduce the CARE III (Computer-Aided Reliability Estimation) capability to reliability and design engineers who are interested in predicting the reliability of highly reliable fault-tolerant systems. It was also structured to serve as a quick-look reference manual for more experienced users. The guide covers CARE III modeling and reliability predictions for execution on the CDC Cyber 170 series computers, the DEC VAX-11/700 series computers, and most machines that compile ANSI Standard FORTRAN 77.

  8. Workstation Modelling and Development: Clinical Definition of a Picture Archiving and Communications System (PACS) User Interface

    NASA Astrophysics Data System (ADS)

    Braudes, Robert E.; Mun, Seong K.; Sibert, John L.; Schnizlein, John; Horii, Steven C.

    1989-05-01

    A PACS must provide a user interface which is acceptable to all potential users of the system. Observations and interviews have been conducted with six radiology services at the Georgetown University Medical Center, Department of Radiology, in order to evaluate user interface requirements for a PACS system. Based on these observations, a conceptual model of radiology has been developed. These discussions have also revealed some significant differences in the user interface requirements between the various services. Several underlying factors have been identified which may be used as initial predictors of individual user interface styles. A user model has been developed which incorporates these factors into the specification of a tailored PACS user interface.

  9. User-friendly software for modeling collective spin wave excitations

    NASA Astrophysics Data System (ADS)

    Hahn, Steven; Peterson, Peter; Fishman, Randy; Ehlers, Georg

    There exists a great need for user-friendly, integrated software that assists in the scientific analysis of collective spin wave excitations measured with inelastic neutron scattering. SpinWaveGenie is a C++ software library that simplifies the modeling of collective spin wave excitations, allowing scientists to analyze neutron scattering data with sophisticated models quickly and efficiently. Furthermore, one can calculate the four-dimensional scattering function S(Q,E) to directly compare and fit calculations to experimental measurements. Its generality has been both enhanced and verified through successful modeling of a wide array of magnetic materials. Recently, we have spent considerable effort transforming SpinWaveGenie from an early prototype to a high quality free open source software package for the scientific community. S.E.H. acknowledges support by the Laboratory's Director's fund, ORNL. Work was sponsored by the Division of Scientific User Facilities, Office of Basic Energy Sciences, US Department of Energy, under Contract No. DE-AC05-00OR22725 with UT-Battelle, LLC.

  10. Users manual for a one-dimensional Lagrangian transport model

    USGS Publications Warehouse

    Schoellhamer, D.H.; Jobson, H.E.

    1986-01-01

    A Users Manual for the Lagrangian Transport Model (LTM) is presented. The LTM uses Lagrangian calculations that are based on a reference frame moving with the river flow. The Lagrangian reference frame eliminates the need to numerically solve the convective term of the convection-diffusion equation and provides significant numerical advantages over the more commonly used Eulerian reference frame. When properly applied, the LTM can simulate riverine transport and decay processes within the accuracy required by most water quality studies. The LTM is applicable to steady or unsteady one-dimensional unidirectional flows in fixed channels with tributary and lateral inflows. Application of the LTM is relatively simple and optional capabilities improve the model's convenience. Appendices give file formats and three example LTM applications that include the incorporation of the QUAL II water quality model's reaction kinetics into the LTM. (Author's abstract)
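
    A minimal sketch of the Lagrangian idea: parcels ride the flow, so advection is handled exactly and only decay needs to be integrated along each parcel path. Values are illustrative, and the LTM's dispersion, tributary, and lateral-inflow handling are omitted.

```python
import numpy as np

# Minimal Lagrangian parcel scheme: parcels move with the flow (no numerical
# advection error), and first-order decay is integrated along each parcel path.
reach_length = 20000.0          # m
velocity = 0.5                  # m/s, steady unidirectional flow
k_decay = 1.0e-5                # 1/s, first-order decay rate
dt = 600.0                      # s, time step

x = np.array([0.0, 2000.0, 4000.0])        # parcel positions at release (m)
c = np.array([10.0, 8.0, 6.0])             # parcel concentrations (mg/L)

t = 0.0
while np.any(x < reach_length):
    x = x + velocity * dt                  # advect parcels with the flow
    c = c * np.exp(-k_decay * dt)          # decay along each parcel path
    t += dt

print(f"after {t/3600:.1f} h, parcel concentrations: {np.round(c, 2)}")
```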

  11. COSTEAM, an industrial steam generation cost model: updated users' manual

    SciTech Connect

    Murphy, Mary; Reierson, James; Lethi, Minh-Triet

    1980-10-01

    COSTEAM is a tool for designers and managers faced with choosing among alternative systems for generating process steam, whether for new or replacement applications. Such a decision requires a series of choices among overall system concepts, component characteristics, fuel types and financial assumptions, all of which are interdependent and affect the cost of steam. COSTEAM takes the user's input on key characteristics of a proposed process steam generation facility, and computes its capital, operating and maintenance costs. Versatility and simplicity of operation are major goals of the COSTEAM system. As a user, you can work to almost any level of detail necessary and appropriate to a given stage of planning. Since the values you specify are retained and used by the computer throughout each terminal session, you can set up a hypothetical steam generation system fixed in all characteristics but one or two of special interest. It is then quick and easy to obtain a series of results by changing only those one or two values between computer runs. This updated version of the Users' Manual contains instructions for using the expanded and improved COSTEAM model. COSTEAM has three technology submodels which address conventional coal, conventional oil and atmospheric fluidized bed combustion. The structure and calculation methods of COSTEAM are not discussed in this guide, and need not be understood in order to use the model. However, you may consult the companion volume of this report, COSTEAM Expansion and Improvements: Design of a Coal-Fired Atmospheric Fluidized Bed Submodel, an Oil-Fired Submodel, and Input/Output Improvements, MTR80W00048, which presents the design details.

  12. Theory, modeling, and simulation annual report, 1992

    SciTech Connect

    Not Available

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  13. Beliefs and Attitudes Regarding Drug Treatment: Application of the Theory of Planned Behavior in African American Cocaine Users

    PubMed Central

    Booth, Brenda M.; Stewart, Katharine E.; Curran, Geoffrey M.; Cheney, Ann M.; Borders, Tyrone F.

    2014-01-01

    Background The Theory of Planned Behavior (TPB) can provide insights into perceived need for cocaine treatment among African American cocaine users. Methods A cross-sectional community sample of 400 (50% rural) not-in-treatment African American cocaine users was identified through respondent-driven sampling in one urban and two rural counties in Arkansas. Measures included self-reports of attitudes and beliefs about cocaine treatment, perceived need and perceived effectiveness of treatment, and positive and negative cocaine expectancies. Normative beliefs were measured by perceived stigma and consequences of stigma regarding drug use and drug treatment. Perceived control was measured by readiness for treatment, prior drug treatment, and perceived ability to cut down on cocaine use without treatment. Findings Multiple regression analysis found that older age (standardized regression coefficient β = 0.15, P < 0.001), rural residence (β = −0.09, P = 0.025), effectiveness of treatment (β = 0.39, P < 0.001), negative cocaine expectancies (β = 0.138, P = 0.003), experiences of rejection (β = 0.18, P < 0.001), need for secrecy (β = 0.12, P = 0.002), and readiness for treatment (β = 0.15, P < 0.001) were independently associated with perceived need for cocaine treatment. Conclusions TPB is a relevant model for understanding perceived need for treatment among African American cocaine users. Research has shown perceived need to be a major correlate of treatment participation. Study results should be applicable for designing interventions to encourage treatment participation. PMID:24930051

  14. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Benjamin A. Baker; Joseph Grimm

    2009-08-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating “what if” scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time-varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., “reactor types” not individual reactors and “separation types” not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU is designated as waste. VISION is comprised of several

  15. Intelligent User Interfaces for Information Analysis: A Cognitive Model

    SciTech Connect

    Schwarting, Irene S.; Nelson, Rob A.; Cowell, Andrew J.

    2006-01-29

    Intelligent user interfaces (IUIs) for information analysis (IA) need to be designed with an intrinsic understanding of the analytical objectives and the dimensions of the information space. These analytical objectives are oriented around the requirement to provide decision makers with courses of action. Most tools available to support analysis barely skim the surface of the dimensions and categories of information used in analysis, and almost none are designed to address the ultimate requirement of decision support. This paper presents a high-level model of the cognitive framework of information analysts in the context of doing their jobs. It is intended that this model will enable the derivation of design requirements for advanced IUIs for IA.

  16. SIMULATION MODEL FOR WATERSHED MANAGEMENT PLANNING. VOLUME 2. MODEL USER MANUAL

    EPA Science Inventory

    This report provides a user manual for the hydrologic, nonpoint source pollution simulation of the generalized planning model for evaluating forest and farming management alternatives. The manual contains an explanation of application of specific code and indicates changes that s...

  17. Hanford Soil Inventory Model (SIM) Rev. 1 Users Guide

    SciTech Connect

    Simpson, Brett C.; Corbin, Rob A.; Anderson, Michael J.; Kincaid, Charles T.

    2006-09-25

    The focus of the development and application of a soil inventory model as part of the Remediation and Closure Science (RCS) Project managed by PNNL was to develop a probabilistic approach to estimate comprehensive, mass balance-based contaminant inventories for the Hanford Site post-closure setting. The outcome of this effort was the Hanford Soil Inventory Model (SIM). This document is a user's guide for the Hanford SIM. The principal project requirement for the SIM was to provide comprehensive quantitative estimates of contaminant inventory and its uncertainty for the various liquid waste sites, unplanned releases, and past tank farm leaks as a function of time and location at Hanford. The majority, but not all, of these waste sites are in the 200 Areas of Hanford where chemical processing of spent fuel occurred. A computer model capable of performing these calculations and providing quantitative output representing a robust description of contaminant inventory and uncertainty for use in other subsequent models was determined to be sufficient to address the needs of the RCS Project. The ability to use familiar, commercially available software on high-performance personal computers for data input, modeling, and analysis, rather than custom software on a workstation or mainframe computer for modeling, was desired.
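
    The SIM itself is a Hanford-specific tool; the following hypothetical Python sketch only illustrates the general idea of a probabilistic, mass-balance-style inventory estimate with uncertainty, using made-up distributions for a single waste site.

    ```python
    # Minimal sketch of a probabilistic inventory estimate (illustrative only;
    # not the Hanford SIM). Distributions and parameters are hypothetical.
    import numpy as np

    rng = np.random.default_rng(seed=1)
    n_trials = 10_000

    # Hypothetical waste site: discharged volume (L) and contaminant
    # concentration (g/L), each treated as an uncertain lognormal variable.
    volume = rng.lognormal(mean=np.log(1e6), sigma=0.2, size=n_trials)
    concentration = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n_trials)

    inventory = volume * concentration   # grams of contaminant at the site

    print(f"median inventory: {np.median(inventory):.3e} g")
    print(f"95% interval: {np.percentile(inventory, 2.5):.3e} "
          f"to {np.percentile(inventory, 97.5):.3e} g")
    ```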

  18. User-Defined Material Model for Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)

    2006-01-01

    An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criterion, the maximum strain criterion, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.
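
    The actual UMAT is a Fortran subroutine called by ABAQUS/Standard; the hypothetical Python sketch below only illustrates the two ingredients named above, a maximum-stress initiation check followed by ply-discounting degradation of the local constitutive coefficients. The allowables and degradation factor are invented for the example.

    ```python
    # Minimal sketch of a maximum-stress failure check with ply discounting
    # (illustrative only; the paper's UMAT is Fortran code for ABAQUS).
    ALLOWABLES = {"s11_t": 1500.0, "s22_t": 40.0, "s12": 70.0}   # MPa, hypothetical
    DEGRADATION = 1e-3   # stiffness knock-down factor applied after failure

    def max_stress_failed(s11, s22, s12):
        """Maximum-stress initiation criterion (tension/shear terms only here)."""
        return (s11 > ALLOWABLES["s11_t"] or
                s22 > ALLOWABLES["s22_t"] or
                abs(s12) > ALLOWABLES["s12"])

    def degrade(moduli, failed):
        """Ply discounting: degrade the local constitutive coefficients."""
        if failed:
            return {name: value * DEGRADATION for name, value in moduli.items()}
        return moduli

    ply = {"E1": 140e3, "E2": 10e3, "G12": 5e3}   # MPa
    ply = degrade(ply, max_stress_failed(s11=1600.0, s22=20.0, s12=30.0))
    print(ply)   # moduli knocked down because s11 exceeded its allowable
    ```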

  19. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model are described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter is suited for small surface wave heights compared to depth because surface elevation was removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  20. A Markov Chain Model for Changes in Users' Assessment of Search Results.

    PubMed

    Zhitomirsky-Geffet, Maayan; Bar-Ilan, Judit; Levene, Mark

    2016-01-01

    Previous research shows that users tend to change their assessment of search results over time. This is a first study that investigates the factors and reasons for these changes, and describes a stochastic model of user behaviour that may explain these changes. In particular, we hypothesise that most of the changes are local, i.e. between results with similar or close relevance to the query, and thus belong to the same "coarse" relevance category. According to the theory of coarse beliefs and categorical thinking, humans tend to divide the range of values under consideration into coarse categories, and are thus able to distinguish only between cross-category values but not within them. To test this hypothesis we conducted five experiments with about 120 subjects divided into 3 groups. Each student in every group was asked to rank and assign relevance scores to the same set of search results over two or three rounds, with a period of three to nine weeks between each round. The subjects of the last three-round experiment were then exposed to the differences in their judgements and were asked to explain them. We make use of a Markov chain model to measure change in users' judgments between the different rounds. The Markov chain demonstrates that the changes converge, and that a majority of the changes are local to a neighbouring relevance category. We found that most of the subjects were satisfied with their changes, and did not perceive them as mistakes but rather as a legitimate phenomenon, since they believe that time has influenced their relevance assessment. Both our quantitative analysis and user comments support the hypothesis of the existence of coarse relevance categories resulting from categorical thinking in the context of user evaluation of search results. PMID:27171426
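
    As a rough illustration of the Markov chain analysis described above (with invented judgments, not the study's data), the sketch below estimates a transition matrix between coarse relevance categories from two rounds of assessments; the diagonal and near-diagonal entries correspond to the "local" changes the authors report.

    ```python
    # Minimal sketch (hypothetical data): estimate a Markov transition matrix
    # between coarse relevance categories from two rounds of user judgments.
    import numpy as np

    categories = ["low", "medium", "high"]
    round1 = ["high", "medium", "high", "low", "medium", "medium", "low", "high"]
    round2 = ["high", "medium", "medium", "low", "high", "medium", "low", "high"]

    index = {c: i for i, c in enumerate(categories)}
    counts = np.zeros((3, 3))
    for first, second in zip(round1, round2):
        counts[index[first], index[second]] += 1

    # Row-normalise to get P(category in round 2 | category in round 1).
    transition = counts / counts.sum(axis=1, keepdims=True)
    print(np.round(transition, 2))
    ```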

  1. User's guide to the MESOI diffusion model: Version 1.1 (for Data General Eclipse S/230 with AFOS)

    SciTech Connect

    Athey, G.F.; Ramsdell, J.V.

    1982-09-01

    MESOI is an interactive, Lagrangian puff trajectory model. The model theory is documented separately (Ramsdell and Athey, 1981). Version 1.1 is a modified form of the original Version 1.0. It is designed to run on a Data General Eclipse computer. The model has improved support features which make it useful as an emergency response tool. This report is intended to provide the user with the information necessary to successfully conduct model simulations using MESOI Version 1.1 and to use the support programs STAPREP and EXPLT. The user is also provided information on the use of the data file maintenance and review program UPDATE. Examples are given for the operation of the program. Test data sets are described which allow the user to practice with the programs and to confirm proper implementation and execution.

  2. WASP7 BENTHIC ALGAE - MODEL THEORY AND USER'S GUIDE

    EPA Science Inventory

    The standard WASP7 eutrophication module includes nitrogen and phosphorus cycling, dissolved oxygen-organic matter interactions, and phytoplankton kinetics. In many shallow streams and rivers, however, the attached algae (benthic algae, or periphyton, attached to submerged substr...

  3. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  4. Modeling of rolling element bearing mechanics. Computer program user's manual

    NASA Technical Reports Server (NTRS)

    Greenhill, Lyn M.; Merchant, David H.

    1994-01-01

    This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It is comprised of two main programs: the Preprocessor for Bearing Analysis (PREBAN) which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  5. GCFM Users Guide Revision for Model Version 5.0

    SciTech Connect

    Keimig, Mark A.; Blake, Coleman

    1981-08-10

    This paper documents alterations made to the MITRE/DOE Geothermal Cash Flow Model (GCFM) in the period of September 1980 through September 1981. Version 4.0 of GCFM was installed on the computer at the DOE San Francisco Operations Office in August 1980. This Version has also been distributed to about a dozen geothermal industry firms, for examination and potential use. During late 1980 and 1981, a few errors detected in the Version 4.0 code were corrected, resulting in Version 4.1. If you are currently using GCFM Version 4.0, it is suggested that you make the changes to your code that are described in Section 2.0. User's manual changes listed in Section 3.0 and Section 4.0 should then also be made.

  6. CORCON-MOD3: An integrated computer model for analysis of molten core-concrete interactions. User's manual

    SciTech Connect

    Bradley, D.R.; Gardner, D.R.; Brockmann, J.E.; Griffith, R.O.

    1993-10-01

    The CORCON-Mod3 computer code was developed to mechanistically model the important core-concrete interaction phenomena, including those phenomena relevant to the assessment of containment failure and radionuclide release. The code can be applied to a wide range of severe accident scenarios and reactor plants. The code represents the current state of the art for simulating core debris interactions with concrete. This document comprises the user's manual and gives a brief description of the models and the assumptions and limitations in the code. Also discussed are the input parameters and the code output. Two sample problems are also given.

  7. Dimer models and quiver gauge theories

    NASA Astrophysics Data System (ADS)

    Pichai, Ramadevi

    2013-12-01

    N = 1 quiver gauge theories on coincident D3 branes placed at the tip of a Calabi-Yau singularity C are dual to string theories on AdS_5 × X_5, where X_5 are Sasaki-Einstein spaces. We present a neat combinatorial approach, called the dimer model, to understand the interrelations between toric quiver gauge theories and the toric data representing the Calabi-Yau singularities.

  8. Proliferation Risk Characterization Model Prototype Model - User and Programmer Guidelines

    SciTech Connect

    Dukelow, J.S.; Whitford, D.

    1998-12-01

    A model for the estimation of the risk of diversion of weapons-capable materials was developed. It represents both the threat of diversion and site vulnerability as a product of a small number of variables (two to eight), each of which can take on a small number (two to four) of qualitatively defined (but quantitatively implemented) values. The values of the overall threat and vulnerability variables are then converted to threat and vulnerability categories. The threat and vulnerability categories are used to define the likelihood of diversion, also defined categorically. The evaluator supplies an estimate of the consequences of a diversion, defined categorically, but with the categories based on the IAEA Attractiveness levels. Likelihood and Consequences categories are used to define the Risk, also defined categorically. The threat, vulnerability, and consequences input provided by the evaluator contains a representation of his/her uncertainty in each variable assignment which is propagated all the way through to the calculation of the Risk categories. [Appendix G available on diskette only.]
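
    As a purely hypothetical illustration of the categorical combination logic described above (the report's actual category definitions and lookup tables are not reproduced here), the sketch below maps threat and vulnerability to a likelihood category and then combines likelihood with consequences to obtain a risk category.

    ```python
    # Minimal sketch of categorical risk combination; the lookup tables are
    # hypothetical and do not reproduce the report's actual categories.
    LIKELIHOOD = {   # (threat, vulnerability) -> likelihood category
        ("high", "high"): "high", ("high", "low"): "medium",
        ("low", "high"): "medium", ("low", "low"): "low",
    }
    RISK = {   # (likelihood, consequences) -> risk category
        ("high", "severe"): "high", ("high", "minor"): "medium",
        ("medium", "severe"): "medium", ("medium", "minor"): "low",
        ("low", "severe"): "low", ("low", "minor"): "low",
    }

    def risk_category(threat, vulnerability, consequences):
        likelihood = LIKELIHOOD[(threat, vulnerability)]
        return RISK[(likelihood, consequences)]

    print(risk_category("high", "low", "severe"))   # -> "medium"
    ```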

  9. Crisis in Context Theory: An Ecological Model

    ERIC Educational Resources Information Center

    Myer, Rick A.; Moore, Holly B.

    2006-01-01

    This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…

  10. Graphical User Interface for Simulink Integrated Performance Analysis Model

    NASA Technical Reports Server (NTRS)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne) in the Upper Stage of the Ares I Crew Launch Vehicle will only start within a certain range of temperature and pressure for Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible in order to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow for test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI will construct a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared to other values.

  11. Long Fibre Composite Modelling Using Cohesive User's Element

    NASA Astrophysics Data System (ADS)

    Kozák, Vladislav; Chlup, Zdeněk

    2010-09-01

    The development of glass matrix composites reinforced by unidirectional long ceramic fibres has resulted in a family of very promising structural materials. The only disadvantage of such materials is their relatively high brittleness at room temperature. The main micromechanisms acting as toughening mechanisms are pull-out, crack bridging, and matrix cracking. There are other mechanisms, such as crack deflection, but the primary mechanism is the aforementioned pull-out, which is governed by the interface between fibre and matrix. The contribution shows a way to predict and/or optimise the mechanical behaviour of the composite by applying the cohesive zone method and writing a user's cohesive element for the FEM numerical package Abaqus. The presented results from numerical calculations are compared with experimental data. Crack extension is simulated by means of element extinction algorithms. The principal effort is concentrated on the application of the cohesive zone model with a special traction-separation (bridging) law and on the cohesive zone modelling. Determination of the micro-mechanical parameters is based on the combination of static tests, microscopic observations, and numerical calibration procedures.
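
    The paper's own bridging law and calibration live inside an Abaqus user element; as a generic illustration only, the sketch below evaluates a bilinear traction-separation law of the kind commonly used in cohesive zone models, with invented strength and separation parameters.

    ```python
    # Minimal sketch of a generic bilinear traction-separation law (parameters
    # hypothetical; not the paper's specific bridging law or Abaqus element).
    def bilinear_traction(delta, sigma_max=50.0, delta_0=1e-3, delta_f=1e-2):
        """Traction (MPa) as a function of opening displacement delta (mm)."""
        if delta <= 0.0:
            return 0.0
        if delta <= delta_0:                    # linear elastic ramp-up
            return sigma_max * delta / delta_0
        if delta <= delta_f:                    # linear softening (damage growth)
            return sigma_max * (delta_f - delta) / (delta_f - delta_0)
        return 0.0                              # fully separated, no traction

    for d in (0.0, 5e-4, 1e-3, 5e-3, 1e-2, 2e-2):
        print(f"delta = {d:.1e} mm -> traction = {bilinear_traction(d):.1f} MPa")
    ```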

  12. Propeller aircraft interior noise model: User's manual for computer program

    NASA Astrophysics Data System (ADS)

    Wilby, E. G.; Pope, L. D.

    1985-01-01

    A computer program entitled PAIN (Propeller Aircraft Interior Noise) has been developed to permit calculation of the sound levels in the cabin of a propeller-driven airplane. The fuselage is modeled as a cylinder with a structurally integral floor, the cabin sidewall and floor being stiffened by ring frames, stringers and floor beams of arbitrary configurations. The cabin interior is covered with acoustic treatment and trim. The propeller noise consists of a series of tones at harmonics of the blade passage frequency. Input data required by the program include the mechanical and acoustical properties of the fuselage structure and sidewall trim. Also, the precise propeller noise signature must be defined on a grid that lies in the fuselage skin. The propeller data are generated with a propeller noise prediction program such as the NASA Langley ANOPP program. The program PAIN permits the calculation of the space-average interior sound levels for the first ten harmonics of a propeller rotating alongside the fuselage. User instructions for PAIN are given in the report. Development of the analytical model is presented in NASA CR 3813.

  13. BPACK -- A computer model package for boiler reburning/co-firing performance evaluations. User's manual, Volume 1

    SciTech Connect

    Wu, K.T.; Li, B.; Payne, R.

    1992-06-01

    This manual presents and describes a package of computer models uniquely developed for boiler thermal performance and emissions evaluations by the Energy and Environmental Research Corporation. The model package permits boiler heat transfer, fuels combustion, and pollutant emissions predictions related to a number of practical boiler operations such as fuel-switching, fuels co-firing, and reburning NOx reductions. The models are adaptable to most boiler/combustor designs and can handle burner fuels in solid, liquid, gaseous, and slurried forms. The models are also capable of performing predictions for combustion applications involving gaseous-fuel reburning, and co-firing of solid/gas, liquid/gas, gas/gas, slurry/gas fuels. The model package is conveniently named BPACK (Boiler Package) and consists of six computer codes, of which three are main computational codes and the other three are input codes. The three main codes are: (a) a two-dimensional furnace heat-transfer and combustion code; (b) a detailed chemical-kinetics code; and (c) a boiler convective passage code. This user's manual presents the computer model package in two volumes. Volume 1 describes in detail a number of topics which are of general users' interest, including the physical and chemical basis of the models, a complete description of the model applicability, options, input/output, and the default inputs. Volume 2 contains a detailed record of the worked examples to assist users in applying the models, and to illustrate the versatility of the codes.

  14. Theories of addiction: methamphetamine users' explanations for continuing drug use and relapse.

    PubMed

    Newton, Thomas F; De La Garza, Richard; Kalechstein, Ari D; Tziortzis, Desey; Jacobsen, Caitlin A

    2009-01-01

    A variety of preclinical models have been constructed to emphasize unique aspects of addiction-like behavior. These include Negative Reinforcement ("Pain Avoidance"), Positive Reinforcement ("Pleasure Seeking"), Incentive Salience ("Craving"), Stimulus Response Learning ("Habits"), and Inhibitory Control Dysfunction ("Impulsivity"). We used a survey to better understand why methamphetamine-dependent research volunteers (N = 73) continue to use methamphetamine, or relapse to methamphetamine use after a period of cessation of use. All participants met DSM-IV criteria for methamphetamine abuse or dependence, and did not meet criteria for other current Axis I psychiatric disorders or dependence on other drugs of abuse, other than nicotine. The questionnaire consisted of a series of face-valid questions regarding drug use, which in this case referred to methamphetamine use. Examples of questions include: "Do you use drugs mostly to make bad feelings like boredom, loneliness, or apathy go away?", "Do you use drugs mostly because you want to get high?", "Do you use drugs mostly because of cravings?", "Do you find yourself getting ready to take drugs without thinking about it?", and "Do you impulsively take drugs?". The scale was anchored at 1 (not at all) and 7 (very much). For each question, the numbers of participants rating each question negatively (1 or 2), neither negatively or affirmatively (3-5), and affirmatively (6 or 7) were tabulated. The greatest number of respondents (56%) affirmed that they used drugs due to "pleasure seeking." The next highest categories selected were "impulsivity" (27%) and "habits"(25%). Surprisingly, many participants reported that "pain avoidance" (30%) and "craving" (30%) were not important for their drug use. Results from this study support the contention that methamphetamine users (and probably other drug users as well) are more heterogeneous than is often appreciated, and imply that treatment development might be more successful if

  15. User Acceptance of Long-Term Evolution (LTE) Services: An Application of Extended Technology Acceptance Model

    ERIC Educational Resources Information Center

    Park, Eunil; Kim, Ki Joon

    2013-01-01

    Purpose: The aim of this paper is to propose an integrated path model in order to explore user acceptance of long-term evolution (LTE) services by examining potential causal relationships between key psychological factors and user intention to use the services. Design/methodology/approach: Online survey data collected from 1,344 users are analysed…

  16. Towards Model-Driven End-User Development in CALL

    ERIC Educational Resources Information Center

    Farmer, Rod; Gruba, Paul

    2006-01-01

    The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…

  17. Supersymmetric F-theory GUT models

    NASA Astrophysics Data System (ADS)

    Chung, Yu-Chieh

    F-theory is a twelve-dimensional geometric version of string theory and is believed to be a natural framework for GUT model building. The aim of this dissertation is to study how gauge theories realized by F-theory can accommodate GUT models. In this dissertation, we focus on local and semi-local GUT model building in F-theory. For local GUT models, we build SU(5) GUTs by using abelian U(1) fluxes via the SU(6) gauge group. Doing so, we obtain non-minimal spectra of the MSSM with doublet-triplet splitting by switching on abelian U(1)^2 fluxes. We also classify all supersymmetric U(1)^2 fluxes by requiring an exotic-free bulk spectrum. For semi-local GUT models, we start with an E8 singularity and obtain lower rank gauge groups by unfolding the singularity governed by spectral covers. In this framework, the spectra can be calculated by the intersection numbers of spectral covers and matter curves. In particular, we use SU(4) spectral covers and abelian U(1)_X fluxes to build flipped SU(5) models. We show that three-generation spectra of flipped SU(5) models can be achieved by turning on suitable fluxes. To construct E6 GUTs, we consider SU(3) spectral covers breaking E8 down to E6. Also, a three-generation extended MSSM can be obtained by using non-abelian SU(2) x U(1)^2 fluxes.

  18. Physician's information customizer (PIC): using a shareable user model to filter the medical literature.

    PubMed

    Pratt, W; Sim, I

    1995-01-01

    The practice of medicine is information-intensive. From reviewing the literature to formulating therapeutic plans, each physician handles information differently. Yet rarely does a representation of the user's information needs and preferences--a user model--get incorporated into information management tools, even though we might reasonably expect better acceptance and effectiveness if the tools' presentation and processing were customized to the user. We developed the Physician's Information Customizer (PIC), which generates a shareable user model that can be used in any medical information-management application. PIC elicits the stable, long-term attributes of a physician through simple questions about her specialty, research focus, areas of interest, patient characteristics (e.g., ages), and practice locale. To show the utility of this user model in customizing a medical informatics application, PIC custom-filters and ranks articles from Medline, using the user model to determine what would be most interesting to the user. Preliminary evaluation on all 99 unselected articles from a recent issue of six prominent medical journals shows that PIC ranks 66% of the articles as the user would. This demonstrates the feasibility of using easily acquired physician attributes to develop a user model that can successfully filter articles of interest from a large undifferentiated collection. Further testing and development is required to optimize the custom filter and to determine which characteristics should be included in the shareable user model and which should be obtained by individual applications. PMID:8591472
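
    PIC's own elicitation questions and ranking algorithm are not reproduced here; the hypothetical Python sketch below only illustrates the general pattern of scoring articles by overlap between their keywords and a physician profile, with made-up profile fields and articles.

    ```python
    # Minimal sketch (not PIC itself): rank articles by the overlap between
    # their keywords and a physician's profile terms. All data are invented.
    profile = {
        "specialty": "cardiology",
        "interests": {"heart failure", "statins"},
        "patient_ages": {"adult", "geriatric"},
    }

    articles = [
        {"title": "Statins in elderly heart failure",
         "keywords": {"statins", "heart failure", "geriatric"}},
        {"title": "Pediatric asthma management",
         "keywords": {"asthma", "pediatric"}},
    ]

    def score(article):
        terms = profile["interests"] | profile["patient_ages"] | {profile["specialty"]}
        return len(article["keywords"] & terms)

    for article in sorted(articles, key=score, reverse=True):
        print(score(article), article["title"])
    ```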

  19. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
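
    The package itself is a GUI-driven modelling system; the hypothetical sketch below only illustrates the modular idea it describes, with interchangeable process functions (snow, potential evapotranspiration, soil moisture accounting) assembled into one time-stepping loop. The process equations and parameters are deliberately simplistic placeholders.

    ```python
    # Minimal sketch of a modular hydrologic model (hypothetical, highly
    # simplified): swap any process function without touching the others.
    def pet_placeholder(temp_c):
        """Placeholder potential-evapotranspiration module (mm/day)."""
        return max(0.0, 0.2 * temp_c)

    def snow_module(precip, temp_c, snowpack, melt_rate=2.0):
        """Degree-day style snow accumulation and melt (mm)."""
        if temp_c <= 0.0:
            return 0.0, snowpack + precip              # all precipitation stored as snow
        melt = min(snowpack, melt_rate * temp_c)
        return precip + melt, snowpack - melt          # rain plus melt reaches the soil

    def soil_bucket(inflow, pet, storage, capacity=150.0, k=0.05):
        """Single-bucket soil moisture accounting; returns (runoff, new storage)."""
        storage = max(0.0, storage + inflow - pet)
        runoff = k * storage + max(0.0, storage - capacity)
        return runoff, storage - runoff

    snowpack, storage = 0.0, 50.0
    for precip, temp in [(10.0, -2.0), (5.0, 1.0), (0.0, 8.0), (20.0, 12.0)]:
        water_in, snowpack = snow_module(precip, temp, snowpack)
        runoff, storage = soil_bucket(water_in, pet_placeholder(temp), storage)
        print(f"runoff = {runoff:.1f} mm")
    ```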

  20. A User Modeling System for Personalized Interaction and Tailored Retrieval in Interactive IR.

    ERIC Educational Resources Information Center

    Kelly, Diane; Belkin, Nicholas J.

    2002-01-01

    Presents a user modeling system for personalized interaction and tailored retrieval that tracks interactions over time, represents multiple information needs, allows for changes in information needs, acquires and updates the user model automatically, and accounts for contextual factors. Describes three models: general behavioral, personal…

  1. Graphical Model Theory for Wireless Sensor Networks

    SciTech Connect

    Davis, William B.

    2002-12-08

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.
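
    The junction tree algorithm itself is beyond a short sketch; the hypothetical example below only illustrates the local operation it organizes across a network, namely multiplying probabilistic factors (a prior and per-sensor likelihoods) over a discrete state and renormalizing, as in simple two-sensor fusion. All probabilities are invented.

    ```python
    # Tiny sketch of the local factor operation that a junction tree organizes
    # network-wide: multiply a prior by per-sensor likelihoods and renormalize.
    import numpy as np

    states = ["no_event", "event"]
    prior = np.array([0.9, 0.1])                  # hypothetical prior over the state

    # Likelihood of each sensor's observed reading given the true state.
    sensor_a_likelihood = np.array([0.2, 0.8])    # sensor A fired strongly
    sensor_b_likelihood = np.array([0.4, 0.6])    # sensor B fired weakly

    posterior = prior * sensor_a_likelihood * sensor_b_likelihood
    posterior /= posterior.sum()

    for state, p in zip(states, posterior):
        print(f"P({state} | readings) = {p:.3f}")
    ```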

  2. Self Modeling: Expanding the Theories of Learning

    ERIC Educational Resources Information Center

    Dowrick, Peter W.

    2012-01-01

    Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…

  3. Understanding the Impact of User Frustration Intensities on Task Performance Using the OCC Theory of Emotions

    NASA Technical Reports Server (NTRS)

    Washington, Gloria

    2012-01-01

    Have you heard the saying "frustration is written all over your face"? Well, this saying is true, but that is not the only place. Frustration is written all over your face and your body. The human body has various means to communicate an emotion without the utterance of a single word. The Media Equation says that people interact with computers as if they are human: this includes experiencing frustration. This research measures frustration by monitoring human body-based measures such as heart rate, posture, skin temperature, and respiration. The OCC Theory of Emotions is used to separate frustration into different levels or intensities. The results of this study showed that individual intensities of frustration exist for which task performance is not degraded. Results from this study can be used by usability testers to model how much frustration is needed before task performance measures start to decrease.

  4. A User-Centered Approach to Adaptive Hypertext Based on an Information Relevance Model

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Chen, James

    1994-01-01

    Rapid and effective access to information in large electronic documentation systems can be facilitated if information relevant in an individual user's context can be automatically supplied to this user. However, most of this knowledge about contextual relevance is not found within the contents of documents; rather, it is established incrementally by users during information access. We propose a new model for interactively learning contextual relevance during information retrieval, and incrementally adapting retrieved information to individual user profiles. The model, called a relevance network, records the relevance of references based on user feedback for specific queries and user profiles. It also generalizes such knowledge to later derive relevant references for similar queries and profiles. The relevance network lets users filter information by context of relevance. Compared to other approaches, it does not require any prior knowledge or training. More importantly, our approach to adaptivity is user-centered. It facilitates acceptance and understanding by users by giving them shared control over the adaptation without disturbing their primary task. Users easily control when to adapt and when to use the adapted system. Lastly, the model is independent of the particular application used to access information, and supports sharing of adaptations among users.
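
    The relevance network described above is more general than any toy example; the hypothetical sketch below only illustrates its two basic moves, recording feedback per (profile, query) and generalizing to similar queries, here crudely by token overlap. The profile names, queries, and documents are invented.

    ```python
    # Minimal sketch (not the paper's relevance network): record relevance
    # feedback per (profile, query) and generalize to similar queries by
    # simple token overlap. All data are hypothetical.
    from collections import defaultdict

    feedback = defaultdict(dict)   # (profile, query tokens) -> {reference: score}

    def record(profile, query, reference, score):
        feedback[(profile, frozenset(query.split()))][reference] = score

    def suggest(profile, query):
        """Return references from past similar queries, weighted by overlap."""
        tokens = set(query.split())
        scores = defaultdict(float)
        for (past_profile, past_tokens), refs in feedback.items():
            if past_profile != profile:
                continue
            overlap = len(tokens & past_tokens) / max(len(tokens | past_tokens), 1)
            for ref, score in refs.items():
                scores[ref] += overlap * score
        return sorted(scores, key=scores.get, reverse=True)

    record("pilot", "engine start procedure", "doc-42", 1.0)
    record("pilot", "engine shutdown checklist", "doc-17", 0.5)
    print(suggest("pilot", "engine start checklist"))   # doc-42 ranked first
    ```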

  5. User's guide to the Yucca Mountain Integrating Model (YMIM) Version 2.1

    SciTech Connect

    Gansemer, J.; Lamont, A.

    1995-04-01

    The Yucca Mountain Integrating Model (YMIM) is an integrated model of the engineered barrier system. It contains models of the processes of waste container failure and nuclide release from the fuel rods. YMIM is driven by scenarios of container and rod temperature, near-field chemistry, and near-field hydrology provided by other modules. It is designed to be highly modular so that a model of an individual process can be easily modified or replaced without interfering with the models of other processes. This manual describes the process models and provides instructions for setting up and running YMIM Version 2.1.

  6. APEX user's guide - (Argonne production, expansion, and exchange model for electrical systems), version 3.0

    SciTech Connect

    VanKuiken, J.C.; Veselka, T.D.; Guziel, K.A.; Blodgett, D.W.; Hamilton, S.; Kavicky, J.A.; Koritarov, V.S.; North, M.J.; Novickas, A.A.; Paprockas, K.R.

    1994-11-01

    This report describes operating procedures and background documentation for the Argonne Production, Expansion, and Exchange Model for Electrical Systems (APEX). This modeling system was developed to provide the U.S. Department of Energy, Division of Fossil Energy, Office of Coal and Electricity with in-house capabilities for addressing policy options that affect electrical utilities. To meet this objective, Argonne National Laboratory developed a menu-driven programming package that enables the user to develop and conduct simulations of production costs, system reliability, spot market network flows, and optimal system capacity expansion. The APEX system consists of three basic simulation components, supported by various databases and data management software. The components include (1) the investigation of Costs and Reliability in Utility Systems (ICARUS) model, (2) the Spot Market Network (SMN) model, and (3) the Production and Capacity Expansion (PACE) model. The ICARUS model provides generating-unit-level production-cost and reliability simulations with explicit recognition of planned and unplanned outages. The SMN model addresses optimal network flows with recognition of marginal costs, wheeling charges, and transmission constraints. The PACE model determines long-term (e.g., longer than 10 years) capacity expansion schedules on the basis of candidate expansion technologies and load growth estimates. In addition, the Automated Data Assembly Package (ADAP) and case management features simplify user-input requirements. The ADAP, ICARUS, and SMN modules are described in detail. The PACE module is expected to be addressed in a future publication.

  7. User's guide for the CALPUFF dispersion model. Final report

    SciTech Connect

    1995-07-01

    This report describes the CALPUFF dispersion model and associated processing programs. The CALPUFF model described in this report reflects improvements to the model, including (1) new modules to treat buoyant rise and dispersion from area sources (such as forest fires), buoyant line sources, and volume sources, (2) an improved treatment of complex terrain, (3) additional model switches to facilitate its use in regulatory applications, (4) an enhanced treatment of wind shear through puff splitting, and (5) an optional PC-based GUI. CALPUFF has been coupled to the Emissions Production Model (EPM) developed by the Forest Service through an interface processor. EPM provides time-dependent emissions and heat release data for use in modeling controlled burns and wildfires.

  8. Satellite services system analysis study. Volume 2: Satellite and services user model

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Satellite services needs are analyzed. Topics include methodology: a satellite user model; representative servicing scenarios; potential service needs; manned, remote, and automated involvement; and inactive satellites/debris. Satellite and services user model development is considered. Groundrules and assumptions, servicing, events, and sensitivity analysis are included. Selection of references satellites is also discussed.

  9. INDUSTRIAL COMBUSTION EMISSIONS (ICE) MODEL, VERSION 6.0. USER'S MANUAL

    EPA Science Inventory

    The report is a user's manual for the Industrial Combustion Emissions (ICE) model. It summarizes user options and software characteristics, and describes both the input data files and procedures for operating the model. It discusses proper formatting of files and creation of job ...

  10. EPA third-generation air quality modeling system: Models-3 user manual. Standard tutorial

    SciTech Connect

    1998-09-01

    Models-3 is a flexible software system designed to simplify the development and use of environmental assessment and other decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheric chemistry and physics. The initial version of Models-3 contains a Community Multi-scale Air Quality (CMAQ) modeling system for urban to regional scale air quality simulation of tropospheric ozone, acid deposition, visibility, and fine particles. Models-3 and CMAQ in combination form a powerful third generation air quality modeling and assessment system that enables a user to execute air quality simulation models and visualize their results. Models-3/CMAQ also assists the model developer to assemble, test, and evaluate science process components and their impact on chemistry-transport model predictions by facilitating the interchange of science codes, transparent use of multiple computing platforms, and access of data across the network. The Models-3/CMAQ provides flexibility to change key model specifications such as grid resolution and chemistry mechanism without rewriting the code. Models-3/CMAQ is intended to serve as a community framework for continual advancement and use of environmental assessment tools. This User Manual Tutorial serves as a guide to show the steps necessary to implement an application in Models-3/CMAQ.

  11. User-Centered Innovation: A Model for "Early Usability Testing."

    ERIC Educational Resources Information Center

    Sugar, William A.; Boling, Elizabeth

    The goal of this study is to show how some concepts and techniques from disciplines outside Instructional Systems Development (ISD) have the potential to extend and enhance the traditional view of ISD practice when they are employed very early in the ISD process. The concepts and techniques employed were user-centered in design and usability, and…

  12. Model for a fundamental theory with supersymmetry

    NASA Astrophysics Data System (ADS)

    Yokoo, Seiichiro

    Physics in the year 2006 is tightly constrained by experiment, observation, and mathematical consistency. The Standard Model provides a remarkably precise description of particle physics, and general relativity is quite successful in describing gravitational phenomena. At the same time, it is clear that a more fundamental theory is needed for several distinct reasons. Here we consider a new approach, which begins with the unusually ambitious point of view that a truly fundamental theory should aspire to explaining the origins of Lorentz invariance, gravity, gauge fields and their symmetry, supersymmetry, fermionic fields, bosonic fields, quantum mechanics and spacetime. The present dissertation is organized so that it starts with the most conventional ideas for extending the Standard Model and ends with a microscopic statistical picture, which is actually the logical starting point of the theory, but which is also the most remote excursion from conventional physics. One motivation for the present work is the fact that a Euclidean path integral in quantum physics is equivalent to a partition function in statistical physics. This suggests that the most fundamental description of nature may be statistical. This dissertation may be regarded as an attempt to see how far one can go with this premise in explaining the observed phenomena, starting with the simplest statistical picture imaginable. It may be that nature is richer than the model assumed here, but the present results are quite suggestive, because, with a set of assumptions that are not unreasonable, one recovers the phenomena listed above. At the end, the present theory leads back to conventional physics, except that Lorentz invariance and supersymmetry are violated at extremely high energy. To be more specific, one obtains local Lorentz invariance (at low energy compared to the Planck scale), an SO(N) unified gauge theory (with N = 10 as the simplest possibility), supersymmetry of Standard Model fermions and

  13. Incorporation of Decision and Game Theories in Early-Stage Complex Product Design to Model End-Use

    NASA Astrophysics Data System (ADS)

    Mesmer, Bryan L.

    The need for design models that accurately capture the complexities of products increases as products grow ever more complicated. The accuracies of these models depend upon the inputs and the methods used on those inputs to determine an output. Product designers must determine the dominant inputs and make assumptions concerning inputs that have less of an effect. Properly capturing the important inputs in the early design stages, where designs are being simulated, allows for modifications of the design at a relatively low cost. In this dissertation, an input that has a high impact on product performance but is usually neglected until later design stages is examined. The end-users of a product interact with the product and with each other in ways that affect the performance of that product. End-users are typically brought in at the later design stages, or as representations on the design team. They are rarely used as input variables in the product models. By incorporating the end-users in the early models and simulations, the end-users' impact on performance is captured when modifications to the designs are cheaper. The methodology of capturing end-user decision making in product models, developed in this dissertation, is created using the methods of decision and game theory. These theories give a mathematical basis for decision making based on the end-users' beliefs and preferences. Due to the variations that are present in end-users' preferences, their interactions with the product cause variations in the performance. This dissertation shows that capturing the end-user interactions in simulations enables the designer to create products that are more robust to the variations of the end-users. The manipulation of a game that an individual plays to drive an outcome desired by a designer is referred to as mechanism design. This dissertation also shows how a designer can influence the end-users' decisions to optimize the designer's goals. How product controlled

  14. Recursive renormalization group theory based subgrid modeling

    NASA Technical Reports Server (NTRS)

    Zhou, YE

    1991-01-01

    Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  15. Engaging Theories and Models to Inform Practice

    ERIC Educational Resources Information Center

    Kraus, Amanda

    2012-01-01

    Helping students prepare for the complex transition to life after graduation is an important responsibility shared by those in student affairs and others in higher education. This chapter explores theories and models that can inform student affairs practitioners and faculty in preparing students for life after college. The focus is on roles,…

  16. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  17. A Catastrophe Theory Model of Attitude Change.

    ERIC Educational Resources Information Center

    Flay, Brian R.

    Within the large body of literature on attitude change, many diverse and sometimes apparently conflicting findings have been reported. A catastrophe theory model of attitude change that attempts to synthesize many of these diverse findings is proposed. Attitude change is usually monotonic with message content or the strength of the persuasion…

  18. Theory and modeling of stellar atmospheres

    NASA Astrophysics Data System (ADS)

    Hubeny, Ivan

    2010-08-01

    I will briefly outline basic concepts of the stellar atmospheres theory. After summarizing the basic structural equations describing a stellar atmosphere, an emphasis is given to describing efficient numerical methods developed to deal with the stellar atmosphere problem, namely the method of complete linearization and its recent variants, and the whole class of methods known as Accelerated Lambda Iteration. In the next part of the lectures I will briefly summarize existing computer codes, with an emphasis on our code TLUSTY, and list some of the most useful grids of model atmospheres that are publicly available. Next, I will show how the model atmospheres and synthetic spectra are used in quantitative stellar spectroscopy in order to determine basic stellar parameters and chemical abundances. Finally, I will briefly describe an application of model atmosphere theory and models to related objects, such as accretion disks around various accretors, and atmospheres of substellar-mass objects: extrasolar giant planets and brown dwarfs.

  19. A Study of Context-Awareness RBAC Model Using User Profile on Ubiquitous Computing

    NASA Astrophysics Data System (ADS)

    Jang, Bokman; Park, Sungdo; Chang, Hyokyung; Ahn, Hyosik; Choi, Euiin

    With the recent growth of IT, computing is shifting to a ubiquitous environment in which users can access information everywhere and at any time through various devices, and the computer can decide which useful services to provide to users. In this environment, however, many devices are connected over wireless networks, and reckless access to information resources can cause trouble for the system, so managing access authority to both information resources and the system through a well-founded security policy is a very important issue. Conventional RBAC models have the problem that they do not consider the user's context information, such as the user's profile. In this paper, we propose a context-awareness RBAC model based on user profiles that provides efficient access control through active classification, inference, and judgment about the users who access the system and its resources.
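
    As a rough, hypothetical illustration of the idea (not the paper's model or policy language), the sketch below extends a plain role-permission check with a context rule drawn from the user's profile; the roles, permissions, and context rule are invented.

    ```python
    # Minimal sketch of role-based access control extended with a context
    # check based on the user's profile (all rules here are hypothetical).
    ROLE_PERMISSIONS = {
        "doctor": {"read_record", "write_record"},
        "nurse": {"read_record"},
    }

    def context_allows(profile, context):
        """Hypothetical rule: access only from the user's assigned ward and
        only during working hours."""
        return (context["location"] == profile["ward"]
                and 8 <= context["hour"] < 20)

    def access_granted(user, permission, context):
        return (permission in ROLE_PERMISSIONS.get(user["role"], set())
                and context_allows(user["profile"], context))

    user = {"role": "nurse", "profile": {"ward": "ICU"}}
    print(access_granted(user, "read_record", {"location": "ICU", "hour": 14}))    # True
    print(access_granted(user, "write_record", {"location": "ICU", "hour": 14}))   # False
    ```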

  20. LDA-Based Unified Topic Modeling for Similar TV User Grouping and TV Program Recommendation.

    PubMed

    Pyo, Shinjee; Kim, Eunhui; Kim, Munchurl

    2015-08-01

    Social TV is a social media service via TV and social networks through which TV users exchange their experiences about TV programs that they are viewing. For social TV service, two technical aspects are envisioned: grouping of similar TV users to create social TV communities and recommending TV programs based on group and personal interests for personalizing TV. In this paper, we propose a unified topic model based on grouping of similar TV users and recommending TV programs as a social TV service. The proposed unified topic model employs two latent Dirichlet allocation (LDA) models. One is a topic model of TV users, and the other is a topic model of the description words for viewed TV programs. The two LDA models are then integrated via a topic proportion parameter for TV programs, which enforces the grouping of similar TV users and associated description words for watched TV programs at the same time in a unified topic modeling framework. The unified model identifies the semantic relation between TV user groups and TV program description word groups so that more meaningful TV program recommendations can be made. The unified topic model also overcomes an item ramp-up problem such that new TV programs can be reliably recommended to TV users. Furthermore, from the topic model of TV users, TV users with similar tastes can be grouped as topics, which can then be recommended as social TV communities. To verify our proposed method of unified topic-modeling-based TV user grouping and TV program recommendation for social TV services, in our experiments, we used real TV viewing history data and electronic program guide data from a seven-month period collected by a TV poll agency. The experimental results show that the proposed unified topic model yields an average 81.4% precision for 50 topics in TV program recommendation and its performance is an average of 6.5% higher than that of the topic model of TV users only. For TV user prediction with new TV programs, the average
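
    The paper's unified model couples two LDA models through a shared topic-proportion parameter; as a much rougher, hypothetical illustration, the sketch below fits a single LDA to a user-by-program viewing-count matrix with scikit-learn and reads off each user's dominant topic as a crude grouping. The programs and counts are invented.

    ```python
    # Rough sketch (not the paper's unified two-LDA model): fit one LDA to a
    # user-by-program viewing-count matrix and group users by dominant topic.
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    programs = ["news", "drama", "sports", "cooking"]   # column labels
    viewing_counts = np.array([   # rows = users, columns = programs (hypothetical)
        [9, 0, 1, 0],
        [8, 1, 0, 1],
        [0, 7, 0, 5],
        [1, 6, 1, 4],
    ])

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    user_topics = lda.fit_transform(viewing_counts)   # per-user topic proportions

    for i, proportions in enumerate(user_topics):
        print(f"user {i}: dominant topic = {int(np.argmax(proportions))}")
    ```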

  1. Theory, Modeling, and Simulation of Semiconductor Lasers

    NASA Technical Reports Server (NTRS)

    Ning, Cun-Zheng; Saini, Subbash (Technical Monitor)

    1998-01-01

    Semiconductor lasers play very important roles in many areas of information technology. In this talk, I will first give an overview of semiconductor laser theory. This will be followed by a description of different models and their shortcomings in modeling and simulation. Our recent efforts in constructing a fully space and time resolved simulation model will then be described. Simulation results based on our model will be presented. Finally the effort towards a self-consistent and comprehensive simulation capability for the opto-electronics integrated circuits (OEICs) will be briefly reviewed.

  2. Viscous wing theory development. Volume 2: GRUMWING computer program user's manual

    NASA Technical Reports Server (NTRS)

    Chow, R. R.; Ogilvie, P. L.

    1986-01-01

    This report is a user's manual which describes the operation of the computer program GRUMWING. The program computes the viscous transonic flow over three-dimensional wings using a boundary-layer-type viscid-inviscid interaction approach. The inviscid solution is obtained by an approximate factorization (AFZ) method for the full potential equation. The boundary layer solution is based on integral entrainment methods.

  3. Informatic system for a global tissue-fluid biorepository with a graph theory-oriented graphical user interface.

    PubMed

    Butler, William E; Atai, Nadia; Carter, Bob; Hochberg, Fred

    2014-01-01

    The Richard Floor Biorepository supports collaborative studies of extracellular vesicles (EVs) found in human fluids and tissue specimens. The current emphasis is on biomarkers for central nervous system neoplasms but its structure may serve as a template for collaborative EV translational studies in other fields. The informatic system provides specimen inventory tracking with bar codes assigned to specimens and containers and projects, is hosted on globalized cloud computing resources, and embeds a suite of shared documents, calendars, and video-conferencing features. Clinical data are recorded in relation to molecular EV attributes and may be tagged with terms drawn from a network of externally maintained ontologies thus offering expansion of the system as the field matures. We fashioned the graphical user interface (GUI) around a web-based data visualization package. This system is now in an early stage of deployment, mainly focused on specimen tracking and clinical, laboratory, and imaging data capture in support of studies to optimize detection and analysis of brain tumour-specific mutations. It currently includes 4,392 specimens drawn from 611 subjects, the majority with brain tumours. As EV science evolves, we plan biorepository changes which may reflect multi-institutional collaborations, proteomic interfaces, additional biofluids, changes in operating procedures and kits for specimen handling, novel procedures for detection of tumour-specific EVs, and for RNA extraction and changes in the taxonomy of EVs. We have used an ontology-driven data model and web-based architecture with a graph theory-driven GUI to accommodate and stimulate the semantic web of EV science. PMID:25317275

  4. Application of model search to lattice theory.

    SciTech Connect

    Rose, M.; Wilkinson, K.; Mathematics and Computer Science

    2001-08-01

    We have used the first-order model-searching programs MACE and SEM to study various problems in lattice theory. First, we present a case study in which the two programs are used to examine the differences between the stages along the way from lattice theory to Boolean algebra. Second, we answer several questions posed by Norman Megill and Mladen Pavicic on ortholattices and orthomodular lattices. The questions from Megill and Pavicic arose in their study of quantum logics, which are being investigated in connection with proposed computing devices based on quantum mechanics. Previous questions of a similar nature were answered by McCune and MACE in [2].

  5. The Oak Ridge National Laboratory automobile heat pump model: User`s guide

    SciTech Connect

    Kyle, D.M.

    1993-05-01

    A computer program has been developed to predict the steady-state performance of vapor compression automobile air conditioners and heat pumps. The code is based on the residential heat pump model developed at Oak Ridge National Laboratory. Most calculations are based on fundamental physical principles, in conjunction with generalized correlations available in the research literature. Automobile air conditioning components that can be specified as inputs to the program include open and hermetic compressors; finned tube condensers; finned tube and plate-fin style evaporators; thermal expansion valve, capillary tube and short tube expansion devices; refrigerant mass; evaporator pressure regulator; and all interconnecting tubing. The program can be used with a variety of refrigerants, including R134a. Methodologies are discussed for using the model as a tool for designing all new systems or, alternatively, as a tool for simulating a known system for a variety of operating conditions.

  6. Density Functional Theory Models for Radiation Damage

    NASA Astrophysics Data System (ADS)

    Dudarev, S. L.

    2013-07-01

    Density functional theory models developed over the past decade provide unique information about the structure of nanoscale defects produced by irradiation and about the nature of short-range interaction between radiation defects, clustering of defects, and their migration pathways. These ab initio models, involving no experimental input parameters, appear to be as quantitatively accurate and informative as the most advanced experimental techniques developed for the observation of radiation damage phenomena. Density functional theory models have effectively created a new paradigm for the scientific investigation and assessment of radiation damage effects, offering new insight into the origin of temperature- and dose-dependent response of materials to irradiation, a problem of pivotal significance for applications.

  7. Crack propagation modeling using Peridynamic theory

    NASA Astrophysics Data System (ADS)

    Hafezi, M. H.; Alebrahim, R.; Kundu, T.

    2016-04-01

    Crack propagation and branching are modeled using nonlocal peridynamic theory. One major advantage of this nonlocal-theory-based analysis tool is its unified treatment of material behavior, irrespective of whether or not a crack has formed in the material; no separate damage law is needed for crack initiation and propagation. The theory overcomes the weaknesses of existing continuum-mechanics-based numerical tools (e.g., FEM, XFEM) for identifying fracture modes and does not require simplifying assumptions: cracks grow autonomously and not necessarily along a prescribed path. In some special situations, however, such as ductile fracture, damage evolution and failure depend on parameters characterizing the local stress state rather than on the peridynamic damage modeling technique developed for brittle fracture; for brittle fracture modeling, a bond is simply broken when the failure criterion is satisfied. This work helps in designing a more reliable modeling tool for crack propagation and branching in both brittle and ductile materials. Peridynamic analysis has been found to be computationally very demanding, particularly for real-world structures (e.g., vehicles and aircraft), and it also requires an expensive visualization process. The goal of this paper is to make researchers aware of the impact of this simulation tool for better understanding the response of cracked materials. A computer code has been developed to implement the peridynamic-theory-based modeling tool for two-dimensional analysis. Good agreement between our predictions and previously published results is observed, and some interesting new results that have not been reported earlier by others are also obtained and presented in this paper. The final objective of this investigation is to increase the mechanics knowledge of self-similar and self-affine cracks.
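
    For readers unfamiliar with the bond-based formulation, the sketch below shows the core loop of a one-dimensional bond-based peridynamic calculation with brittle bond breaking. It is only a schematic under invented parameters (a simplified micromodulus, unit cross-section, an ad hoc velocity loading), not the two-dimensional code described in the paper.

        import numpy as np

        # 1D bond-based peridynamic sketch: a bond breaks when its stretch exceeds s0.
        n, dx = 200, 1.0e-3                 # material points and spacing (m)
        horizon = 3.015 * dx                # peridynamic horizon
        E, rho = 200e9, 7850.0              # assumed elastic modulus (Pa) and density (kg/m^3)
        s0 = 0.01                           # assumed critical bond stretch
        c = 2.0 * E / horizon**2            # simplified 1D micromodulus (unit cross-section)

        x = np.arange(n) * dx
        u = np.zeros(n)
        v = np.where(x > x.mean(), 30.0, -30.0)   # pull the two halves apart (m/s)

        bonds = [(i, j) for i in range(n) for j in range(i + 1, n)
                 if x[j] - x[i] <= horizon]
        intact = np.ones(len(bonds), dtype=bool)

        dt = 0.2 * dx / np.sqrt(E / rho)    # conservative explicit time step
        for _ in range(500):
            f = np.zeros(n)
            for k, (i, j) in enumerate(bonds):
                if not intact[k]:
                    continue
                s = (u[j] - u[i]) / (x[j] - x[i])   # bond stretch
                if abs(s) > s0:
                    intact[k] = False               # brittle failure: no separate damage law
                    continue
                f[i] += c * s * dx                  # pairwise force density times volume
                f[j] -= c * s * dx
            v += (f / rho) * dt
            u += v * dt

        print("fraction of broken bonds:", round(1.0 - intact.mean(), 4))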

  8. Vulnerability and the intention to anabolic steroids use among Iranian gym users: an application of the theory of planned behavior.

    PubMed

    Allahverdipour, Hamid; Jalilian, Farzad; Shaghaghi, Abdolreza

    2012-02-01

    This correlational study explored the psychological antecedents of 253 Iranian gym users' intentions to use the anabolic-androgenic steroids (AAS), based on the Theory of Planned Behavior (TPB). The three predictor variables of (1) attitude, (2) subjective norms, and (3) perceived behavioral control accounted for 63% of the variation in the outcome measure of the intention to use the AAS. There is some support to use the TPB to design and implement interventions to modify and/or improve individuals' beliefs that athletic goals are achievable without the use of the AAS. PMID:22217129
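
    The statistical core of such a TPB analysis is a regression of intention on the three predictors. The sketch below illustrates this with simulated data; the coefficients and scores are invented, and only the sample size echoes the study.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        n = 253                                   # sample size taken from the abstract
        attitude = rng.normal(size=n)
        subjective_norms = rng.normal(size=n)
        perceived_control = rng.normal(size=n)
        # Simulated intention scores; the weights are arbitrary, for illustration only.
        intention = (0.5 * attitude + 0.3 * subjective_norms
                     + 0.4 * perceived_control + rng.normal(scale=0.5, size=n))

        X = np.column_stack([attitude, subjective_norms, perceived_control])
        model = LinearRegression().fit(X, intention)
        print("variance explained (R^2):", round(model.score(X, intention), 2))
        print("coefficients:", model.coef_.round(2))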

  9. The partonic interpretation of reggeon theory models

    NASA Astrophysics Data System (ADS)

    Boreskov, K. G.; Kaidalov, A. B.; Khoze, V. A.; Martin, A. D.; Ryskin, M. G.

    2005-12-01

    We review the physical content of the two simplest models of reggeon field theory: namely the eikonal and the Schwimmer models. The AGK cutting rules are used to obtain the inclusive, the inelastic and the diffractive cross sections. The system of non-linear equations for these cross sections is written down and analytic expressions for its solution are obtained. We derive the rapidity gap dependence of the differential cross sections for diffractive dissociation in the Schwimmer model and in its eikonalized extension. The results are interpreted from the partonic viewpoint of the interaction at high energies.
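
    For orientation, the single-channel eikonal relations that underlie the simplest of these models can be written in impact-parameter space as below (a textbook form; the paper's Schwimmer and multi-channel expressions generalize it).

        \sigma_{\rm tot}(s) = 2\int d^2b \,\left[1 - e^{-\Omega(s,b)/2}\right], \qquad
        \sigma_{\rm el}(s) = \int d^2b \,\left[1 - e^{-\Omega(s,b)/2}\right]^2, \qquad
        \sigma_{\rm inel}(s) = \sigma_{\rm tot} - \sigma_{\rm el}
                             = \int d^2b \,\left[1 - e^{-\Omega(s,b)}\right],

    where \Omega(s,b) is the eikonal opacity, driven in the simplest case by single-Pomeron exchange.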

  10. Tracking and Analysis Framework (TAF) model documentation and user`s guide

    SciTech Connect

    Bloyd, C.; Camp, J.; Conzelmann, G.

    1996-12-01

    With passage of the 1990 Clean Air Act Amendments, the United States embarked on a policy for controlling acid deposition that has been estimated to cost at least $2 billion. Title IV of the Act created a major innovation in environmental regulation by introducing market-based incentives - specifically, by allowing electric utility companies to trade allowances to emit sulfur dioxide (SO2). The National Acid Precipitation Assessment Program (NAPAP) has been tasked by Congress to assess what Senator Moynihan has termed this "grand experiment." Such a comprehensive assessment of the economic and environmental effects of this legislation has been a major challenge. To help NAPAP face this challenge, the U.S. Department of Energy (DOE) has sponsored development of an integrated assessment model, known as the Tracking and Analysis Framework (TAF). This section summarizes TAF's objectives and its overall design.

  11. Solid Waste Projection Model: Database User`s Guide. Version 1.4

    SciTech Connect

    Blackburn, C.L.

    1993-10-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for using Version 1.4 of the SWPM database: system requirements and preparation, entering and maintaining data, and performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions and does not provide instruction in the use of Paradox, the database management system in which the SWPM database is established.

  12. Unique Metadata Schemas: A Model for User-Centric Design of a Performance Support System

    ERIC Educational Resources Information Center

    Schatz, Steven C.

    2005-01-01

    Learning object technology is viewed as a method for fast retrieval. This effort is on developing unique schemas for a targeted group to aid efficient retrieval. In this article, I study a user-centric model for developing tags for K-12 educators that is based on user needs, expectations, and problems. I use a combination of techniques from human…

  13. UNAMAP: user's network for applied modeling of air pollution, Version 6. Model

    SciTech Connect

    Turner, D.B.; Busse, A.D.

    1986-08-01

    UNAMAP (Version 6) represents the 1986 update to the user's network for applied modeling of air pollution. UNAMAP consists of an ASCII magnetic tape containing FORTRAN codes and test data for 25 air-quality simulation models (AQSMs) as well as associated documentation. AQSMs and supporting programs and data are arranged in six sections: (1) Guideline (appendix A) models (files 2 through 9); (2) other models or processors (new models) (files 10 through 19 and 33); (3) other models and processors (revised) (files 20 through 27 and 32); (4) additional models for regulatory use (files 28 through 31); (5) data files (files 34 through 39); and (6) output print files (files 40 through 68). There are 68 files on the tape. Software Description: The system is written in FORTRAN for implementation on a UNIVAC 1100/82 using the 39R2 operating system.

  14. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    NASA Astrophysics Data System (ADS)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    The developments in computer technology over the last decade have changed the ways computers are used. The emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform and environment) makes the development of such ubiquitous applications for smart environments, and especially of their user interfaces, a challenging and time-consuming task. We propose a model-based approach which allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language defining the fundamentals and constraints of the user interface, a runtime architecture exploits the description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting the applications also to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  15. User's guide to the MESOI diffusion model and to the utility programs UPDATE and LOGRVU

    SciTech Connect

    Athey, G.F.; Allwine, K.J.; Ramsdell, J.V.

    1981-11-01

    MESOI is an interactive, Lagrangian puff trajectory diffusion model. The model is documented separately (Ramsdell and Athey, 1981); this report is intended to provide MESOI users with the information needed to successfully conduct model simulations. The user is also provided with guidance in the use of the data file maintenance and review programs, UPDATE and LOGRVU. Complete examples are given for the operation of all three programs, and an appendix documents UPDATE and LOGRVU.

  16. Physics models and user`s guide for the neutral beam module of the SUPERCODE

    SciTech Connect

    Mandrekas, J.

    1992-08-01

    This report contains a description of the neutral beam heating and current drive module, Beams, that was developed at Georgia Tech for the SUPERCODE, the new systems and operations code for the ITER EDA. The NB module calculates profiles of the neutral beam deposition, fast ion pressure, beam heating power, and neutral beam driven current density. It also computes global parameters such as current drive efficiencies, beam shinethrough, fast beam ion beta, and the fusion power and neutron production due to beam-plasma interactions. The most important consideration during the development of this module was to make it run reasonably fast without compromising physical accuracy. We believe that, through careful selection of physical models and optimized coding, these conflicting requirements have been largely met. As a result, the SUPERCODE now has the ability to perform self-consistent calculations involving NB heating and current drive. This capability is very important for the study of sub-ignited, hybrid, or steady-state ITER and post-TFIR reactor operating scenarios. It is also the first time that a systems code has had such capabilities, which are usually found only in 1-1/2D plasma transport codes.

  17. Topos models for physics and topos theory

    SciTech Connect

    Wolters, Sander

    2014-08-15

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.

  18. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia

    PubMed Central

    Segkouli, Sofia; Paliokas, Ioannis; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos

    2015-01-01

    Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting to their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing ICT products accessibility for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied at cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multitasks, and monitored the performance of infotainment related tasks to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interfaces' design supported by increased tasks' complexity to capture a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data to be used for more reliable interfaces' evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282

  19. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia.

    PubMed

    Segkouli, Sofia; Paliokas, Ioannis; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos

    2015-01-01

    Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting to their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing ICT products accessibility for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied at cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multitasks, and monitored the performance of infotainment related tasks to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interfaces' design supported by increased tasks' complexity to capture a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data to be used for more reliable interfaces' evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282

  20. Prospects for Advanced RF Theory and Modeling

    SciTech Connect

    Batchelor, D.B.

    1999-04-12

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need for RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed.

  1. Communications network design and costing model users manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.

  2. Quantum mechanical model in gravity theory

    NASA Astrophysics Data System (ADS)

    Losyakov, V. V.

    2016-05-01

    We consider a model of a real massive scalar field defined as homogeneous on a d-dimensional sphere such that the sphere radius, time scale, and scalar field are related by the equations of the general theory of relativity. We quantize this system with three degrees of freedom, define the observables, and find dynamical mean values of observables in the regime where the scalar field mass is much less than the Planck mass.

  3. Conceptual Models and Theory-Embedded Principles on Effective Schooling.

    ERIC Educational Resources Information Center

    Scheerens, Jaap

    1997-01-01

    Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…

  4. Modeling Integrated Water-User Decisions with Intermittent Supplies

    NASA Astrophysics Data System (ADS)

    Lund, J. R.; Rosenberg, D.

    2006-12-01

    We present an economic-engineering method to estimate urban water use demands with intermittent water supplies. A two-stage, probabilistic optimization formulation includes a wide variety of water supply enhancement and conservation actions that individual households can adopt to meet multiple water quality uses with uncertain water availability. We embed the optimization in Monte-Carlo simulations to show aggregate effects at a utility (citywide) scale for a population of user conditions and decisions. Parametric analysis provides derivations of supply curves to subsidize conservation, demand responses to alternative pricing, and customer willingness-to-pay to avoid shortages. Results show a good empirical fit for the average and distribution of billed residential water use in Amman, Jordan. Additional outputs give likely market penetration rates for household conservation actions, associated water savings, and subsidies required to entice further adoption. We discuss new insights to size, target, market, and finance conservation programs and interpret a demand curve with block pricing.
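
    A toy version of the two-stage logic reads as follows: choose a conservation action before the week's supply is known, then pay for any remaining shortfall (for example from vendors). All prices, actions, and the supply distribution below are invented; the paper's formulation is richer (multiple quality uses, supply enhancement options, citywide Monte Carlo aggregation).

        import numpy as np

        rng = np.random.default_rng(1)
        base_demand = 10.0                              # m^3 per week (assumed)
        supply = rng.uniform(2.0, 10.0, 5000)           # sampled intermittent piped supply

        shortage_price = 4.0                            # $/m^3 of vendor water (assumed)
        actions = {                                     # hypothetical first-stage actions
            "do nothing":        (0.0, 0.00),           # ($/week, fraction of demand saved)
            "low-flow fixtures": (1.0, 0.10),
            "greywater reuse":   (2.5, 0.25),
        }

        def expected_cost(weekly_cost, saving):
            demand = base_demand * (1.0 - saving)
            shortage = np.maximum(demand - supply, 0.0)  # second-stage recourse per scenario
            return weekly_cost + shortage_price * shortage.mean()

        for name, (cost, saving) in actions.items():
            print(f"{name:18s} expected weekly cost = ${expected_cost(cost, saving):.2f}")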

  5. OCD: The offshore and coastal dispersion model. Volume 1. User's guide

    SciTech Connect

    DiCristofaro, D.C.; Hanna, S.R.

    1989-11-01

    The Offshore and Coastal Dispersion (OCD) Model has been developed to simulate the effect of offshore emissions from point, area, or line sources on the air quality of coastal regions. The OCD model was adapted from the EPA guideline model MPTER (EPA, 1980). Modifications were made to incorporate overwater plume transport and dispersion as well as changes that occur as the plume crosses the shoreline. This is a revised OCD model, the fourth version to date. This volume is the User's Guide, which includes a model overview, technical description, user's instructions, and notes on model evaluation and results.

  6. Users guide for the hydroacoustic coverage assessment model (HydroCAM)

    SciTech Connect

    Farrell, T., LLNL

    1997-12-01

    A model for predicting the detection and localization performance of hydroacoustic monitoring networks has been developed. The model accounts for major factors affecting global-scale acoustic propagation in the ocean, including horizontal refraction, travel time variability due to spatial and temporal fluctuations in the ocean, and detailed characteristics of the source. Graphical user interfaces are provided to set up the models and visualize the results. The model produces maps of network detection coverage and localization area of uncertainty, as well as intermediate results such as predicted path amplitudes, travel time, and travel time variance. This Users Guide for the model is organized into three sections. First, a summary of the functionality available in the model is presented, including example output products. The second section provides detailed descriptions of each of the models contained in the system. The last section describes how to run the model, including a summary of each data input form in the user interface.

  7. Theory, modeling and simulation: Annual report 1993

    SciTech Connect

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  8. Empirical Analysis and Modeling of Users' Topic Interests in Online Forums

    PubMed Central

    Xiong, Fei; Liu, Yun

    2012-01-01

    Bulletin Board Systems (BBSs) have demonstrated their usefulness in spreading information. In BBS forums, a few posts that address currently popular social topics attract a lot of attention, and different users are interested in many different discussion topics. We investigate topic cluster features and user interests of an actual BBS forum, analyzing user posting and replying behavior. Based on the growth process of a BBS, we propose a network model in which each agent replies only to posts that belong to its specific topics of interest; a post that is replied to is immediately assigned the highest priority on the post list. Simulation results show that the characteristics of our model are similar to those of the real BBS. The model with heterogeneous user interests promotes the occurrence of popular posts, and the user relationship network possesses a large clustering coefficient. Bursts and long waiting times exist in user replying behavior, leading to non-Poisson user activity patterns. In addition, the model produces an evolving trend of Gini coefficients for posts' and clusters' participants analogous to that of BBS forums. PMID:23251401
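
    The replying rule described above is easy to prototype. The sketch below is a stripped-down version under assumed parameters (one interest topic per user, a fixed probability of posting, a single visible page); it only reproduces the qualitative effect that replied-to posts accumulate attention, not the paper's quantitative results.

        import random

        random.seed(0)
        n_topics, n_users, n_steps = 10, 200, 5000
        interest = [random.randrange(n_topics) for _ in range(n_users)]  # one topic per user
        posts = []                  # list of [topic, reply_count]; index 0 = top of the list
        p_new_post = 0.2
        visible = 20                # users scan only the first "page" of posts

        for _ in range(n_steps):
            u = random.randrange(n_users)
            if not posts or random.random() < p_new_post:
                posts.insert(0, [interest[u], 0])       # publish a post on the user's topic
                continue
            for k, post in enumerate(posts[:visible]):  # reply to the first matching post
                if post[0] == interest[u]:
                    post[1] += 1
                    posts.insert(0, posts.pop(k))       # a replied-to post moves to the top
                    break

        replies = sorted((p[1] for p in posts), reverse=True)
        print("posts:", len(posts), "most-replied counts:", replies[:5])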

  9. USER'S GUIDE TO THE MESOPUFF II MODEL AND RELATED PROCESSOR PROGRAMS

    EPA Science Inventory

    A complete set of user instructions are provided for the MESOPUFF II regional-scale air quality modeling package. The MESOPUFF II model is a Lagrangian variable-trajectory puff superposition model suitable for modeling the transport, diffusion, and removal of air pollutants from ...

  10. General topology meets model theory, on p and t

    PubMed Central

    Malliaris, Maryanthe; Shelah, Saharon

    2013-01-01

    Cantor proved in 1874 [Cantor G (1874) J Reine Angew Math 77:258–262] that the continuum is uncountable, and Hilbert’s first problem asks whether it is the smallest uncountable cardinal. A program arose to study cardinal invariants of the continuum, which measure the size of the continuum in various ways. By Gödel [Gödel K (1939) Proc Natl Acad Sci USA 25(4):220–224] and Cohen [Cohen P (1963) Proc Natl Acad Sci USA 50(6):1143–1148], Hilbert’s first problem is independent of ZFC (Zermelo-Fraenkel set theory with the axiom of choice). Much work both before and since has been done on inequalities between these cardinal invariants, but some basic questions have remained open despite Cohen’s introduction of forcing. The oldest and perhaps most famous of these is whether “p = t,” which was proved in a special case by Rothberger [Rothberger F (1948) Fund Math 35:29–46], building on Hausdorff [Hausdorff (1936) Fund Math 26:241–255]. In this paper we explain how our work on the structure of Keisler’s order, a large-scale classification problem in model theory, led to the solution of this problem in ZFC as well as of an a priori unrelated open question in model theory. PMID:23836659

  11. Algorithm for model validation: Theory and applications

    PubMed Central

    Sornette, D.; Davis, A. B.; Ide, K.; Vixie, K. R.; Pisarenko, V.; Kamm, J. R.

    2007-01-01

    Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. Thus, we replace static claims on the impossibility of validating a given model by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the methodology first with the maturation of quantum mechanics as the arguably best established physics theory and then with several concrete examples drawn from some of our primary scientific interests: a cellular automaton model for earthquakes, a multifractal random walk model for financial time series, an anomalous diffusion model for solar radiation transport in the cloudy atmosphere, and a computational fluid dynamics code for the Richtmyer–Meshkov instability. PMID:17420476

  12. Algorithm for model validation: theory and applications.

    PubMed

    Sornette, D; Davis, A B; Ide, K; Vixie, K R; Pisarenko, V; Kamm, J R

    2007-04-17

    Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. Thus, we replace static claims on the impossibility of validating a given model by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the methodology first with the maturation of quantum mechanics as the arguably best established physics theory and then with several concrete examples drawn from some of our primary scientific interests: a cellular automaton model for earthquakes, a multifractal random walk model for financial time series, an anomalous diffusion model for solar radiation transport in the cloudy atmosphere, and a computational fluid dynamics code for the Richtmyer-Meshkov instability. PMID:17420476

  13. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    EPA Science Inventory

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  14. User's manual for the Human Exposure Model (HEM). Interim report

    SciTech Connect

    Not Available

    1986-06-01

    This document describes the Human Exposure Model, furnishes contact personnel to establish access to the UNIVAC System, and provides step-by-step instructions for operating both the SHED and SHEAR portions of the model. The manual also lists caveats that should be considered when using the HEM and criteria to distinguish situations that are appropriately modeled by each portion of HEM. The intended audience ranges from someone with limited knowledge of modeling to someone well acquainted with the UNIVAC.

  15. USER GUIDE FOR THE ENHANCED HYDRODYNAMICAL-NUMERICAL MODEL

    EPA Science Inventory

    This guide provides the documentation required for use of the Enhanced Hydrodynamical-Numerical Model on operational problems. The enhanced model is a multilayer Hansen type model extended to handle near-shore processes by including: Non-linear term extension to facilitate small...

  16. Jobs and Economic Development Impact (JEDI) Model Geothermal User Reference Guide

    SciTech Connect

    Johnson, C.; Augustine, C.; Goldberg, M.

    2012-09-01

    The Geothermal Jobs and Economic Development Impact (JEDI) model, developed through the National Renewable Energy Laboratory (NREL), is an Excel-based, user-friendly tool that estimates the local economic impacts of constructing and operating hydrothermal and Enhanced Geothermal System (EGS) power generation projects; JEDI models are available for a range of conventional and renewable energy technologies. The JEDI Model Geothermal User Reference Guide was developed to assist users in using and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted.

  17. User's manual for interactive LINEAR: A FORTRAN program to derive linear aircraft models

    NASA Technical Reports Server (NTRS)

    Antoniewicz, Robert F.; Duke, Eugene L.; Patterson, Brian P.

    1988-01-01

    An interactive FORTRAN program that provides the user with a powerful and flexible tool for the linearization of aircraft aerodynamic models is documented in this report. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied linear or nonlinear aerodynamic model. The nonlinear equations of motion used are six-degree-of-freedom equations with stationary atmosphere and flat, nonrotating earth assumptions. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.

  18. Standard Model as a Double Field Theory.

    PubMed

    Choi, Kang-Sin; Park, Jeong-Hyuck

    2015-10-23

    We show that, without any extra physical degree introduced, the standard model can be readily reformulated as a double field theory. Consequently, the standard model can couple to an arbitrary stringy gravitational background in an O(4,4) T-duality covariant manner and manifest two independent local Lorentz symmetries, Spin(1,3)×Spin(3,1). While the diagonal gauge fixing of the twofold spin groups leads to the conventional formulation on the flat Minkowskian background, the enhanced symmetry makes the standard model more rigid, and also stringy, than it appeared. The CP violating θ term may no longer be allowed by the symmetry, and hence the strong CP problem can be solved. There are now stronger constraints imposed on the possible higher order corrections. We speculate that the quarks and the leptons may belong to the two different spin classes. PMID:26551099

  19. Queuing theory models for computer networks

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

    A set of simple queuing theory models which can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. The impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed using them because of the lack of fine detail in the network traffic rates, traffic patterns, and the hardware used to implement the networks. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. This Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
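
    The kind of closed-form estimate such spreadsheet models return can be illustrated with the simplest case, an M/M/1 queue for a single channel. This is a generic illustration of the approach, not the memo's specific models; the rates are invented.

        # M/M/1 sketch: average behavior of one channel under Poisson traffic.
        def mm1_metrics(arrival_rate, service_rate):
            """Return utilization, mean number in system, and mean response time."""
            if arrival_rate >= service_rate:
                raise ValueError("queue is unstable: arrival rate must be below service rate")
            rho = arrival_rate / service_rate          # utilization
            mean_in_system = rho / (1.0 - rho)         # mean number of messages in the system
            mean_response = 1.0 / (service_rate - arrival_rate)   # mean response time
            return rho, mean_in_system, mean_response

        # Example: 80 messages/s offered to a channel that can serve 100 messages/s.
        print(mm1_metrics(80.0, 100.0))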

  20. Standard Model as a Double Field Theory

    NASA Astrophysics Data System (ADS)

    Choi, Kang-Sin; Park, Jeong-Hyuck

    2015-10-01

    We show that, without any extra physical degree introduced, the standard model can be readily reformulated as a double field theory. Consequently, the standard model can couple to an arbitrary stringy gravitational background in an O(4,4) T-duality covariant manner and manifest two independent local Lorentz symmetries, Spin(1,3)×Spin(3,1). While the diagonal gauge fixing of the twofold spin groups leads to the conventional formulation on the flat Minkowskian background, the enhanced symmetry makes the standard model more rigid, and also stringy, than it appeared. The CP violating θ term may no longer be allowed by the symmetry, and hence the strong CP problem can be solved. There are now stronger constraints imposed on the possible higher order corrections. We speculate that the quarks and the leptons may belong to the two different spin classes.

  1. HIGHWAY 3.1: An enhanced HIGHWAY routing model: Program description, methodology, and revised user`s manual

    SciTech Connect

    Johnson, P.E.; Joy, D.S.; Clarke, D.B.; Jacobi, J.M.

    1993-03-01

    The HIGHWAY program provides a flexible tool for predicting highway routes for transporting radioactive materials in the United States. The HIGHWAY data base is essentially a computerized road atlas that currently describes over 240,000 miles of highways. Complete descriptions of all Interstate System and most US highways (except those that parallel a nearby Interstate highway) are included in the data base. Many of the principal state highways and a number of local and county highways are also identified. The data base also includes locations of nuclear facilities and major airports. Several different types of routes may be calculated, depending on a set of user-supplied constraints. Routes are calculated by minimizing the total impedance between the origin and the destination. Basically, the impedance is defined as a function of distance and driving time along a particular highway segment. Several routing constraints can be imposed during the calculations. One of the special features of the HIGHWAY model is its ability to calculate routes that maximize use of Interstate System highways. This feature allows the user to predict routes for shipments of radioactive materials that conform to the US Department of Transportation routing regulations. Other features of the model include the ability to predict routes that bypass a specific state, city, town, or highway segment. Two special features have been incorporated in HIGHWAY, version 3.1. The first is the ability to automatically calculate alternative routes. Frequently, there are a number of routes between the source and destination that vary slightly in distance and estimated driving time. The HIGHWAY program offers a selection of different but nearly equal routes. The second special feature is the capability to calculate route-specific population density statistics. The population density distribution is calculated for each highway segment in the route and is reported on a state-by-state basis.
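
    The core routing step, minimizing a distance-and-time impedance over a road graph, can be sketched with a generic graph library. The tiny network, the 50/50 impedance weighting, and the segment data below are assumptions for illustration; they are not the HIGHWAY data base or its actual impedance function.

        import networkx as nx

        G = nx.Graph()
        # (from, to, miles, hours) for a tiny hypothetical road network
        segments = [
            ("A", "B", 120, 2.0), ("B", "C", 100, 1.6),
            ("A", "D", 150, 2.2), ("D", "C", 90, 1.4),
        ]
        for u, v, miles, hours in segments:
            # Assumed impedance: equal blend of distance and driving time (in minutes).
            G.add_edge(u, v, impedance=0.5 * miles + 0.5 * 60.0 * hours)

        route = nx.shortest_path(G, "A", "C", weight="impedance")
        print(route)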

  2. Solar Advisor Model User Guide for Version 2.0

    SciTech Connect

    Gilman, P.; Blair, N.; Mehos, M.; Christensen, C.; Janzou, S.; Cameron, C.

    2008-08-01

    The Solar Advisor Model (SAM) provides a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies and markets, from photovoltaic systems for residential and commercial markets to concentrating solar power and large photovoltaic systems for utility markets. This manual describes Version 2.0 of the software, which can model photovoltaic and concentrating solar power technologies for electric applications for several markets. The current version of the Solar Advisor Model does not model solar heating and lighting technologies.

  3. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    USGS Publications Warehouse

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable

  4. Compass models: Theory and physical motivations

    NASA Astrophysics Data System (ADS)

    Nussinov, Zohar; van den Brink, Jeroen

    2015-01-01

    Compass models are theories of matter in which the couplings between the internal spin (or other relevant field) components are inherently spatially (typically, direction) dependent. A simple illustrative example is furnished by the 90° compass model on a square lattice in which only couplings of the form τ_i^x τ_j^x (where {τ_i^a} denote Pauli operators at site i) are associated with nearest-neighbor sites i and j separated along the x axis of the lattice while τ_i^y τ_j^y couplings appear for sites separated by a lattice constant along the y axis. Similar compass-type interactions can appear in diverse physical systems. For instance, compass models describe Mott insulators with orbital degrees of freedom where interactions sensitively depend on the spatial orientation of the orbitals involved as well as the low-energy effective theories of frustrated quantum magnets, and a host of other systems such as vacancy centers, and cold atomic gases. The fundamental interdependence between internal (spin, orbital, or other) and external (i.e., spatial) degrees of freedom which underlies compass models generally leads to very rich behaviors, including the frustration of (semi-)classical ordered states on nonfrustrated lattices, and to enhanced quantum effects, prompting, in certain cases, the appearance of zero-temperature quantum spin liquids. As a consequence of these frustrations, new types of symmetries and their associated degeneracies may appear. These intermediate symmetries lie midway between the extremes of global symmetries and local gauge symmetries and lead to effective dimensional reductions. In this article, compass models are reviewed in a unified manner, paying close attention to exact consequences of these symmetries and to thermal and quantum fluctuations that stabilize orders via order-out-of-disorder effects. This is complemented by a survey of numerical results. In addition to reviewing past works, a number of other models are introduced and new results

  5. Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation.

    NASA Astrophysics Data System (ADS)

    Lawley, Russell; Barron, Mark; Lee, Katy

    2014-05-01

    The boundaries mapped in a traditional field geological survey are subject to a wide range of inherent uncertainties. A map at a survey scale of 1:10,000 is created by a combination of terrain interpretation, direct observations from boreholes and exposures (often sparsely distributed), and indirect interpretation of proxy variables such as soil properties, vegetation and remotely sensed images. A critical factor influencing the quality of the final map is the skill and experience of the surveyor in bringing this information together in a coherent conceptual model. The users of geological data comprising or based on mapped boundaries are increasingly aware of these uncertainties, and want to know how to manage them. The growth of 3D modelling, which takes 2D surveys as a starting point, adds urgency to the need for a better understanding of survey uncertainties, particularly where 2D mapping of variable vintage has been compiled into a national coverage. Previous attempts to apply confidence on the basis of metrics such as data density, survey age or survey techniques have proved useful for isolating single, critical factors but do not generally succeed in evaluating geological mapping 'in the round', because they cannot account for the 'conceptual' skill set of the surveyor. The British Geological Survey (BGS) is using expert elicitation methods to gain a better understanding of uncertainties within the national geological map of Great Britain. The expert elicitation approach starts with the assumption that experienced surveyors have an intuitive sense of the uncertainty of the boundaries that they map, based on a tacit model of geology and its complexity and the nature of the surveying process. The objective of

  6. Polarimetric clutter modeling: Theory and application

    NASA Technical Reports Server (NTRS)

    Kong, J. A.; Lin, F. C.; Borgeaud, M.; Yueh, H. A.; Swartz, A. A.; Lim, H. H.; Shim, R. T.; Novak, L. M.

    1988-01-01

    The two-layer anisotropic random medium model is used to investigate fully polarimetric scattering properties of earth terrain media. The polarization covariance matrices for the untilted and tilted uniaxial random medium are evaluated using the strong fluctuation theory and distorted Born approximation. In order to account for the azimuthal randomness in the growth direction of leaves in tree and grass fields, an averaging scheme over the azimuthal direction is also applied. It is found that characteristics of terrain clutter can be identified through the analysis of each element of the covariance matrix. Theoretical results are illustrated by the comparison with experimental data provided by MIT Lincoln Laboratory for tree and grass fields.

  7. STORM WATER MANAGEMENT MODEL, VERSION 4. PART A: USER'S MANUAL

    EPA Science Inventory

    The EPA Storm Water Management Model (SWMM) is a comprehensive mathematical model for simulation of urban runoff water quality and quantity in storm and combined sewer systems. All aspects of the urban hydrologic and quality cycles are simulated, including surface and subsurface ...

  8. USER'S GUIDE FOR PEM-2: POLLUTION EPISODIC MODEL (VERSION 2)

    EPA Science Inventory

    The Pollution Episodic Model Version 2 (PEM-2) is an urban-scale model designed to predict short term average ground-level concentrations and deposition fluxes of one or two gaseous or particulate pollutants at multiple receptors. The two pollutants may be non-reactive, or chemic...

  9. FABRIC FILTER MODEL FORMAT CHANGE; VOLUME II. USER'S GUIDE

    EPA Science Inventory

    The report describes an improved mathematical model for use by control personnel to determine the adequacy of existing or proposed filter systems designed to minimize coal fly ash emissions. Several time-saving steps have been introduced to facilitate model application by Agency ...

  10. PESTICIDE ORCHARD ECOSYSTEM MODEL (POEM): A USER'S GUIDE

    EPA Science Inventory

    A mathematical model was developed to predict the transport and effects of a pesticide in an orchard ecosystem. The environmental behavior of azinphosmethyl was studied over a two-year period in a Michigan apple orchard. Data were gathered for the model on initial distribution wi...

  11. SHAWNEE FLUE GAS DESULFURIZATION COMPUTER MODEL USERS MANUAL

    EPA Science Inventory

    The manual describes a Shawnee flue gas desulfurization (FGD) computer model and gives detailed instructions for its use. The model, jointly developed by Bechtel National, Inc. and TVA (in conjunction with the EPA-sponsored Shawnee test program), is capable of projecting prelimin...

  12. STORM WATER MANAGEMENT MODEL USER'S MANUAL VERSION 5.0

    EPA Science Inventory

    The EPA Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. SWMM was first developed in 1971 and has undergone several major upgrade...

  13. LIME SPRAY DRYER FLUE GAS DESULFURIZATION COMPUTER MODEL USERS MANUAL

    EPA Science Inventory

    The report describes a lime spray dryer/baghouse (FORTRAN) computer model that simulates SO2 removal and permits study of related impacts on design and economics as functions of design parameters and operating conditions for coal-fired electric generating units. The model allows ...

  14. RELMAP: A REGIONAL LAGRANGIAN MODEL OF AIR POLLUTION - USER'S GUIDE

    EPA Science Inventory

    The regional Lagrangian Model of Air Pollution (RELMAP) is a mass conserving, Lagrangian model that simulates ambient concentrations and wet and dry depositions of SO2, SO4=, and fine and coarse particulate matter over the eastern United States and southeastern Canada (default do...

  15. Using chemical organization theory for model checking

    PubMed Central

    Kaleta, Christoph; Richter, Stephan; Dittrich, Peter

    2009-01-01

    Motivation: The increasing number and complexity of biomodels makes automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for derivation of valuable information. In formalisms like Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. Results: First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations drastically changes when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT is able to identify those species and reactions more accurately [in 26 cases (14%)] that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. Availability: All data and a JAVA applet to check SBML-models is available from http://www.minet.uni-jena.de/csb/prj/ot/tools Contact: dittrich@minet.uni-jena.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19468053

  16. Optimization model for UDWDM-PON deployment based on physical restrictions and asymmetric user's clustering

    NASA Astrophysics Data System (ADS)

    Arévalo, Germán. V.; Hincapié, Roberto C.; Sierra, Javier E.

    2015-09-01

    UDWDM PON is a leading technology for providing ultra-high bandwidth to end users while exploiting the full capacity of the physical channels. One of the main drawbacks of the UDWDM technique is that nonlinear effects, such as FWM, become stronger because of the close spectral proximity among channels. This work proposes a model for the optimal deployment of this type of network, taking into account the fiber-length limitations imposed by physical restrictions on data transmission over the fiber as well as the asymmetric distribution of users in a given region. The proposed model treats the transmission-related effects in UDWDM PON as constraints in the optimization problem and also considers the asymmetric clustering of users and the subdivision of the user region through a Voronoi geometric partition. The dual graph of the Voronoi partition, that is, the Delaunay triangulation, is used as the planar graph for solving the minimum-weight problem for the fiber links.
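
    The Delaunay-based side of such a formulation can be prototyped with standard tools; the sketch below (hypothetical cluster centroids, plain Euclidean edge weights) builds the Delaunay graph and a minimum-weight spanning layout over it, and is only loosely inspired by the optimization model described above.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(0)
centroids = rng.random((20, 2)) * 10.0          # hypothetical user-cluster centers, km

tri = Delaunay(centroids)                       # dual graph of the Voronoi partition
n = len(centroids)
weights = lil_matrix((n, n))
for simplex in tri.simplices:                   # collect unique Delaunay edges
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        weights[a, b] = np.linalg.norm(centroids[a] - centroids[b])

mst = minimum_spanning_tree(weights.tocsr())    # minimum-weight fiber layout over Delaunay edges
print("total fiber length (km):", round(float(mst.sum()), 2))
```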

  17. A Neural Network Approach to Intention Modeling for User-Adapted Conversational Agents.

    PubMed

    Griol, David; Callejas, Zoraida

    2016-01-01

    Spoken dialogue systems have been proposed to enable a more natural and intuitive interaction with the environment and human-computer interfaces. In this contribution, we present a framework based on neural networks that allows modeling of the user's intention during the dialogue and uses this prediction to dynamically adapt the dialogue model of the system taking into consideration the user's needs and preferences. We have evaluated our proposal to develop a user-adapted spoken dialogue system that facilitates tourist information and services and provide a detailed discussion of the positive influence of our proposal in the success of the interaction, the information and services provided, and the quality perceived by the users. PMID:26819592

  18. A Neural Network Approach to Intention Modeling for User-Adapted Conversational Agents

    PubMed Central

    Griol, David

    2016-01-01

    Spoken dialogue systems have been proposed to enable a more natural and intuitive interaction with the environment and human-computer interfaces. In this contribution, we present a framework based on neural networks that allows modeling of the user's intention during the dialogue and uses this prediction to dynamically adapt the dialogue model of the system taking into consideration the user's needs and preferences. We have evaluated our proposal to develop a user-adapted spoken dialogue system that facilitates tourist information and services and provide a detailed discussion of the positive influence of our proposal in the success of the interaction, the information and services provided, and the quality perceived by the users. PMID:26819592

  19. Application of Chaos Theory to Psychological Models

    NASA Astrophysics Data System (ADS)

    Blackerby, Rae Fortunato

    This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. In some cases, these fluctuations converge to one value; in other cases, they diverge in
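
    A toy iteration of the kind described (nonlinear feedback whose output re-enters as input) can be written in a few lines; the parameter r below is a hypothetical stand-in for the combined individual, family, and community influences and is not taken from the dissertation.

```python
def iterate_behavior(m0, r, steps=60):
    """Logistic-style feedback: the next value depends nonlinearly on the last."""
    m, series = m0, [m0]
    for _ in range(steps):
        m = r * m * (1.0 - m)      # output feeds back into the next iteration
        series.append(m)
    return series

print([round(v, 3) for v in iterate_behavior(0.3, 2.8)[-3:]])  # settles to one value
print([round(v, 3) for v in iterate_behavior(0.3, 3.9)[-3:]])  # keeps fluctuating (chaotic)
```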

  20. Users guide for SAMM: A prototype southeast Alaska multiresource model. Forest Service general technical report

    SciTech Connect

    Weyermann, D.L.; Fight, R.D.; Garrett, F.D.

    1991-08-01

    This paper instructs resource analysts on using the southeast Alaska multiresource model (SAMM). SAMM is an interactive microcomputer program that allows users to explore relations among several resources in southeast Alaska (timber, anadromous fish, deer, and hydrology) and the effects of timber management activities (logging, thinning, and road building) on those relations and resources. This guide assists users in installing SAMM on a microcomputer, developing input data files, making simulation runs, and storing output data for external analysis and graphic display.

  1. PARFUME Theory and Model basis Report

    SciTech Connect

    Darrell L. Knudson; Gregory K. Miller; D.A. Petti; J.T. Maki

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  2. User-owned utility models for rural electrification

    SciTech Connect

    Waddle, D.

    1997-12-01

    The author discusses the history of rural electric cooperatives (RECs) in the United States and the broader question of whether such organizations can serve as a model for rural electrification in other countries. He points out the features of such cooperatives that have given them stability and strength, and emphasizes that many of these same features must be present for such programs to succeed. He maintains that the cooperative model is not outdated, but that it needs strong local support and a governmental structure that is supportive, or at least not hostile.

  3. Mars Global Reference Atmospheric Model 2010 Version: Users Guide

    NASA Technical Reports Server (NTRS)

    Justh, H. L.

    2014-01-01

    This Technical Memorandum (TM) presents the Mars Global Reference Atmospheric Model 2010 (Mars-GRAM 2010) and its new features. Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Additionally, this TM includes instructions on obtaining the Mars-GRAM source code and data files as well as running Mars-GRAM. It also contains sample Mars-GRAM input and output files and an example of how to incorporate Mars-GRAM as an atmospheric subroutine in a trajectory code.

  4. Offshore and coastal dispersion (OCD) model. Users guide

    SciTech Connect

    Hanna, S.R.; Schulman, L.L.; Paine, R.J.; Pleim, J.E.

    1984-09-01

    The Offshore and Coastal Dispersion (OCD) model was adapted from the EPA guideline model MPTER to simulate the effect of offshore emissions from point sources in coastal regions. Modifications were made to incorporate overwater plume transport and dispersion as well as changes that occur as the plume crosses the shoreline. Hourly meteorological data are needed from overwater and overland locations. Turbulence intensities are used but are not mandatory. For overwater dispersion, the turbulence intensities are parameterized from boundary-layer similarity relationships if they are not measured. Specifications of emission characteristics and receptor locations are the same as for MPTER; 250 point sources and 180 receptors may be used.

  5. Offshore and coastal dispersion (OCD) model. User's guide

    SciTech Connect

    Hanna, S.R.; Schulman, L.L.; Paine, R.J.; Pleim, J.E.

    1984-09-01

    The Offshore and Coastal Dispersion (OCD) model was adapted from the EPA guideline model MPTER to simulate the effect of offshore emissions from point sources in coastal regions. Modifications were made to incorporate overwater plume transport and dispersion as well as changes that occur as the plume crosses the shoreline. Hourly meteorological data are needed from overwater and overland locations. For overwater dispersion, the turbulence intensities are parameterized from boundary layer similarity relationships if they are not measured. A virtual source technique is used to change the rate of plume growth as the overwater plume intercepts the overland internal boundary layer.

  6. Petroleum Refinery Jobs and Economic Development Impact (JEDI) Model User Reference Guide

    SciTech Connect

    Goldberg, M.

    2013-12-31

    The Jobs and Economic Development Impact (JEDI) models, developed through the National Renewable Energy Laboratory (NREL), are user-friendly tools utilized to estimate the economic impacts at the local level of constructing and operating fuel and power generation projects for a range of conventional and renewable energy technologies. The JEDI Petroleum Refinery Model User Reference Guide was developed to assist users in employing and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted. Based on project-specific inputs from the user, the model estimates job creation, earning and output (total economic activity) for a given petroleum refinery. This includes the direct, indirect and induced economic impacts to the local economy associated with the refinery's construction and operation phases. Project cost and job data used in the model are derived from the most current cost estimations available. Local direct and indirect economic impacts are estimated using economic multipliers derived from IMPLAN software. By determining the regional economic impacts and job creation for a proposed refinery, the JEDI Petroleum Refinery model can be used to field questions about the added value refineries may bring to the local community.
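
    The multiplier arithmetic behind input-output tools of this kind can be sketched in a few lines; all figures below are hypothetical placeholders, not IMPLAN-derived values or JEDI defaults.

```python
# Hypothetical inputs for a refinery construction phase (illustrative only).
direct_spending = 250.0e6        # USD of direct, locally captured spending
direct_jobs = 800                # direct construction jobs

output_multiplier = 1.6          # total output per dollar of direct output (hypothetical)
employment_multiplier = 1.9      # total jobs per direct job (hypothetical)

total_output = direct_spending * output_multiplier
indirect_induced_output = total_output - direct_spending
total_jobs = direct_jobs * employment_multiplier

print(f"total economic output:      ${total_output:,.0f}")
print(f"indirect + induced output:  ${indirect_induced_output:,.0f}")
print(f"total jobs supported:       {total_jobs:,.0f}")
```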

  7. Supporting user-defined granularities in a spatiotemporal conceptual model

    USGS Publications Warehouse

    Khatri, V.; Ram, S.; Snodgrass, R.T.; O'Brien, G. M.

    2002-01-01

    Granularities are integral to spatial and temporal data. A large number of applications require storage of facts along with their temporal and spatial context, which needs to be expressed in terms of appropriate granularities. For many real-world applications, a single granularity in the database is insufficient. In order to support any type of spatial or temporal reasoning, the semantics related to granularities needs to be embedded in the database. Specifying granularities related to facts is an important part of conceptual database design because under-specifying the granularity can restrict an application, affect the relative ordering of events and impact the topological relationships. Closely related to granularities is indeterminacy, i.e., an occurrence time or location associated with a fact that is not known exactly. In this paper, we present an ontology for spatial granularities that is a natural analog of temporal granularities. We propose an upward-compatible, annotation-based spatiotemporal conceptual model that can comprehensively capture the semantics related to spatial and temporal granularities, and indeterminacy without requiring new spatiotemporal constructs. We specify the formal semantics of this spatiotemporal conceptual model via translation to a conventional conceptual model. To underscore the practical focus of our approach, we describe an on-going case study. We apply our approach to a hydrogeologic application at the United States Geologic Survey and demonstrate that our proposed granularity-based spatiotemporal conceptual model is straightforward to use and is comprehensive.

  8. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    EPA Science Inventory

    The Exposure Analysis Modeling System (EXAMS) was designed for rapid evaluation of the behavior of synthetic organic chemicals in aquatic ecosystems. From the chemistry of a compound and the relevant transport and physical/chemical characteristics of the ecosystem, EXAMS computes...

  9. COAL PREPARATION PLANT COMPUTER MODEL: VOLUME I. USER DOCUMENTATION

    EPA Science Inventory

    The two-volume report describes a steady state modeling system that simulates the performance of coal preparation plants. The system was developed originally under the technical leadership of the U.S. Bureau of Mines and the sponsorship of the EPA. The modified form described in ...

  10. VOC (VOLATILE ORGANIC COMPOUND) FUGITIVE EMISSION PREDICTIVE MODEL - USER'S GUIDE

    EPA Science Inventory

    The report discusses a mathematical model that can be used to evaluate the effectiveness of various leak detection and repair (LDAR) programs on controlling volatile organic compound (VOC) fugitive emissions from chemical, petroleum, and other process units. The report also descr...

  11. Policy Building--An Extension to User Modeling

    ERIC Educational Resources Information Center

    Yudelson, Michael V.; Brunskill, Emma

    2012-01-01

    In this paper we combine a logistic regression student model with an exercise selection procedure. As opposed to the body of prior work on strategies for selecting practice opportunities, we work under the assumption of a finite number of opportunities to teach the student. Our goal is to prescribe activities that would maximize the amount…

  12. COMPUTERIZED SHAWNEE LIME/LIMESTONE SCRUBBING MODEL USERS MANUAL

    EPA Science Inventory

    The manual gives a general description of a computerized model for estimating design and cost of lime or limestone scrubber systems for flue gas desulfurization (FGD). It supplements PB80-123037 by extending the number of scrubber options which can be evaluated. It includes spray...

  13. User's manual for LINEAR, a FORTRAN program to derive linear aircraft models

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.; Patterson, Brian P.; Antoniewicz, Robert F.

    1987-01-01

    This report documents a FORTRAN program that provides a powerful and flexible tool for the linearization of aircraft models. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied nonlinear aerodynamic model. The system model determined by LINEAR consists of matrices for both state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.
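
    The core operation, numerically extracting state and control matrices from nonlinear equations of motion, can be illustrated with a finite-difference Jacobian; the toy dynamics below are hypothetical and stand in for the user-supplied aerodynamic model.

```python
import numpy as np

def f(x, u):
    """Toy nonlinear dynamics xdot = f(x, u): a damped pendulum with a torque input."""
    return np.array([x[1], -np.sin(x[0]) - 0.1 * x[1] + u[0]])

def linearize(f, x0, u0, eps=1e-6):
    """Central-difference Jacobians A = df/dx and B = df/du at the trim point (x0, u0)."""
    n, m = len(x0), len(u0)
    A, B = np.zeros((n, n)), np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

A, B = linearize(f, np.array([0.0, 0.0]), np.array([0.0]))
print("A =\n", A, "\nB =\n", B)
```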

  14. IoT-based user-driven service modeling environment for a smart space management system.

    PubMed

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-01-01

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service. PMID:25420153

  15. IoT-Based User-Driven Service Modeling Environment for a Smart Space Management System

    PubMed Central

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-01-01

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service. PMID:25420153

  16. Future Air Traffic Growth and Schedule Model User's Guide

    NASA Technical Reports Server (NTRS)

    Kimmel, William M. (Technical Monitor); Smith, Jeremy C.; Dollyhigh, Samuel M.

    2004-01-01

    The Future Air Traffic Growth and Schedule Model was developed as an implementation of the Fratar algorithm to project future traffic flow between airports in a system and then to schedule the additional flights to reflect current passenger time-of-travel preferences. The methodology produces an unconstrained future schedule from a current (or baseline) schedule and the airport operations growth rates. As an example of the use of the model, future schedules are projected for 2010 and 2022 for all flights arriving at, departing from, or flying between all continental United States airports that had commercial scheduled service for May 17, 2002. Intercontinental US traffic and airports are included, and that traffic is also grown with the Fratar methodology to account for arrivals and departures at the continental US airports. Input data sets derived from the Official Airline Guide (OAG) data and FAA Terminal Area Forecast (TAF) are included in the examples of the computer code execution.
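
    The Fratar step is essentially a biproportional balancing of the origin-destination table against forecast airport totals; the sketch below uses a made-up baseline table and growth factors, not OAG or TAF data.

```python
import numpy as np

T = np.array([[0.0, 120.0,  80.0],
              [110.0, 0.0,  60.0],
              [90.0,  70.0,  0.0]])       # baseline daily flights (hypothetical)
growth = np.array([1.25, 1.10, 1.40])     # per-airport operations growth factors (hypothetical)

row_target = T.sum(axis=1) * growth       # forecast departures per airport
col_target = T.sum(axis=0) * growth       # forecast arrivals per airport

# Alternate row/column scaling (Fratar/Furness style); exact convergence needs
# mutually consistent row and column targets, otherwise a compromise is reached.
for _ in range(50):
    T *= (row_target / T.sum(axis=1))[:, None]
    T *= (col_target / T.sum(axis=0))[None, :]

print(np.round(T, 1))                     # unconstrained future flight table
```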

  17. Targeting Parents for Childhood Weight Management: Development of a Theory-Driven and User-Centered Healthy Eating App

    PubMed Central

    Lahiri, Sudakshina; Brown, Katherine Elizabeth

    2015-01-01

    Background The proliferation of health promotion apps along with mobile phones' array of features supporting health behavior change offers a new and innovative approach to childhood weight management. However, despite the critical role parents play in children’s weight related behaviors, few industry-led apps aimed at childhood weight management target parents. Furthermore, industry-led apps have been shown to lack a basis in behavior change theory and evidence. Equally important remains the issue of how to maximize users’ engagement with mobile health (mHealth) interventions where there is growing consensus that inputs from the commercial app industry and the target population should be an integral part of the development process. Objective The aim of this study is to systematically design and develop a theory and evidence-driven, user-centered healthy eating app targeting parents for childhood weight management, and clearly document this for the research and app development community. Methods The Behavior Change Wheel (BCW) framework, a theoretically-based approach for intervention development, along with a user-centered design (UCD) philosophy and collaboration with the commercial app industry, guided the development process. Current evidence, along with a series of 9 focus groups (total of 46 participants) comprised of family weight management case workers, parents with overweight and healthy weight children aged 5-11 years, and consultation with experts, provided data to inform the app development. Thematic analysis of focus groups helped to extract information related to relevant theoretical, user-centered, and technological components to underpin the design and development of the app. Results Inputs from parents and experts working in the area of childhood weight management helped to identify the main target behavior: to help parents provide appropriate food portion sizes for their children. To achieve this target behavior, the behavioral diagnosis

  18. Surface matching for correlation of virtual models: Theory and application

    NASA Technical Reports Server (NTRS)

    Caracciolo, Roberto; Fanton, Francesco; Gasparetto, Alessandro

    1994-01-01

    Virtual reality can enable a robot user to generate and test off-line, in a virtual environment, a sequence of operations to be executed by the robot in an assembly cell. Virtual models of objects are to be correlated to the real entities they represent by means of a suitable transformation. A solution to the correlation problem, which is basically a problem of 3-dimensional adjusting, has been found by exploiting surface matching theory. An iterative algorithm has been developed, which matches the geometric surface representing the shape of the virtual model of an object with a set of points measured on the surface in the real world. A notable feature of the algorithm is that it also works when there is no one-to-one correspondence between the measured points and those representing the surface model. Furthermore, the problem of avoiding convergence to local minima is solved by defining a starting point of states ensuring convergence to the global minimum. The developed algorithm has been tested by simulation. Finally, this paper proposes a specific application, i.e., correlating a robot cell equipped for biomedical use with its virtual representation.
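
    A generic iterative-closest-point style alignment (not the authors' specific algorithm) captures the idea of matching measured points to a model surface without one-to-one correspondences; the point clouds below are synthetic.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(measured, model_pts, iters=30):
    """Rigidly align 'measured' points to the sampled model surface 'model_pts'."""
    src = measured.copy()
    tree = cKDTree(model_pts)
    for _ in range(iters):
        _, idx = tree.query(src)                  # closest model point for each measurement
        tgt = model_pts[idx]
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        H = (src - mu_s).T @ (tgt - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T                            # best-fit rotation (Kabsch)
        if np.linalg.det(R) < 0:                  # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t
    return src

rng = np.random.default_rng(0)
model_pts = rng.random((500, 3))                  # synthetic samples of the virtual-model surface
ang = np.deg2rad(8.0)                             # small misalignment to be recovered
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0, 0.0, 1.0]])
measured = model_pts[:80] @ R_true.T + np.array([0.05, -0.03, 0.02])
aligned = icp(measured, model_pts)
print("mean point-to-model distance:", cKDTree(model_pts).query(aligned)[0].mean())
```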

  19. Measurement of Multiple Nicotine Dependence Domains Among Cigarette, Non-cigarette and Poly-tobacco Users: Insights from Item Response Theory*

    PubMed Central

    Strong, David R; Messer, Karen; Hartman, Sheri J.; Conway, Kevin P.; Hoffman, Allison; Pharris-Ciurej, Nikolas; White, Martha; Green, Victoria R.; Compton, Wilson M.; Pierce, John

    2015-01-01

    Background Nicotine dependence (ND) is a key construct that organizes physiological and behavioral symptoms associated with persistent nicotine intake. Measurement of ND has focused primarily on cigarette smokers. Thus, validation of brief instruments that apply to a broad spectrum of tobacco product users is needed. Methods We examined multiple domains of ND in a longitudinal national study of the United States population, the United States National Epidemiological Survey of Alcohol and Related Conditions (NESARC). We used methods based on item response theory to identify and validate increasingly brief measures of ND that included symptoms to assess ND similarly among cigarette, cigar, smokeless, and poly-tobacco users. Results Confirmatory factor analytic models supported a single, primary dimension underlying symptoms of ND across tobacco use groups. Differential Item Functioning (DIF) analysis generated little support for systematic differences in response to symptoms of ND across tobacco use groups. We established significant concurrent and predictive validity of brief 3- and 5-symptom indices for measuring ND. Conclusions Measuring ND across tobacco use groups with a common set of symptoms facilitates evaluation of tobacco use in an evolving marketplace of tobacco and nicotine products. PMID:26005043

  20. Spring-Model-Based Wireless Localization in Cooperative User Environments

    NASA Astrophysics Data System (ADS)

    Ke, Wei; Wu, Lenan; Qi, Chenhao

    To overcome the shortcomings of conventional cellular positioning, a novel cooperative location algorithm that uses the available peer-to-peer communication between the mobile terminals (MTs) is proposed. The main idea behind the proposed approach is to incorporate the long- and short-range location information to improve the estimation of the MT's coordinates. Since short-range communications among MTs are characterized by high line-of-sight (LOS) probability, an improved spring-model-based cooperative location method can be exploited to provide low-cost improvement for cellular-based location in the non-line-of-sight (NLOS) environments.
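
    A bare-bones spring relaxation conveys the idea: each range measurement acts as a spring between two nodes, and the estimated positions are moved along the net spring force until the network settles. The anchors, initial guesses, and measured ranges below are hypothetical, and this is not the paper's improved method.

```python
import numpy as np

anchors = {0: np.array([0.0, 0.0]), 1: np.array([100.0, 0.0])}     # base stations (known)
est = {2: np.array([40.0, 60.0]), 3: np.array([70.0, 30.0])}       # initial MT position guesses
ranges = {(0, 2): 70.0, (1, 2): 80.0, (0, 3): 75.0, (1, 3): 45.0, (2, 3): 42.0}

def pos(i):
    return anchors[i] if i in anchors else est[i]

for _ in range(200):                             # relax the spring network
    force = {i: np.zeros(2) for i in est}
    for (i, j), d in ranges.items():
        delta = pos(j) - pos(i)
        cur = np.linalg.norm(delta) + 1e-9
        pull = (cur - d) * delta / cur           # stretched link pulls its ends together
        if i in force:
            force[i] += 0.5 * pull
        if j in force:
            force[j] -= 0.5 * pull
    for i in est:
        est[i] = est[i] + 0.2 * force[i]

print({i: np.round(p, 1) for i, p in est.items()})
```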

  1. Nonsingular models of universes in teleparallel theories.

    PubMed

    de Haro, Jaume; Amoros, Jaume

    2013-02-15

    Different models of universes are considered in the context of teleparallel theories. Assuming that the universe is filled with a fluid with an equation of state P=-ρ-f(ρ), we study its dynamics for different teleparallel theories and different equations of state. Two particular cases are studied in detail: in the first one we consider a function f with two zeros (two de Sitter solutions) that mimics a huge cosmological constant at early times and a pressureless fluid at late times; in the second one, in the context of loop quantum cosmology with a small cosmological constant, we consider a pressureless fluid (P=0⇔f(ρ)=-ρ), which means there are de Sitter and anti-de Sitter solutions. In both cases one obtains a nonsingular universe that at early times is in an inflationary phase; after leaving this phase, it passes through a matter-dominated phase and finally at late times it expands in an accelerated way. PMID:25166366
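
    For orientation (this inference is standard FRW kinematics rather than a quotation from the paper), with the stated equation of state the continuity equation reduces to

```latex
\dot{\rho} + 3H(\rho + P) = 0
\quad\Longrightarrow\quad
\dot{\rho} = 3H\, f(\rho) \qquad \text{for } P = -\rho - f(\rho),
```

    so the zeros of f are fixed points of the density evolution, which is why a function f with two zeros yields the two de Sitter-type solutions mentioned in the first case above.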

  2. SWAT Check: A screening tool to assist users in the identification of potential model application problems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Soil and Water Assessment Tool (SWAT) is a basin scale hydrologic model developed by the US Department of Agriculture-Agricultural Research Service. SWAT's broad applicability, user friendly model interfaces, and automatic calibration software have led to a rapid increase in the number of new u...

  3. DESCRIPTION OF UNAMAP (USER'S NETWORK FOR APPLIED MODELING OF AIR POLLUTION) (VERSION 6)

    EPA Science Inventory

    UNAMAP (VERSION 6) represents the 1986 update to the User's Network for Applied Modeling of Air Pollution. UNAMAP consists of an ASCII magnetic tape containing FORTRAN codes and test data for 25 air quality simulation models as well as associated documentation. The tape and docum...

  4. STORM WATER MANAGEMENT MODEL USER'S MANUAL, VERSION 3. ADDENDUM 1: EXTRAN (EXTENDED TRANSPORT)

    EPA Science Inventory

    This report contains the documentation and user's manual for the Extended Transport (EXTRAN) Block of the Storm Water Management Model (SWMM). EXTRAN is a dynamic flow routing model used to compute backwater profiles in open channel and/or closed conduit systems experiencing unst...

  5. USER'S MANUAL FOR THE INSTREAM SEDIMENT-CONTAMINANT TRANSPORT MODEL SERATRA

    EPA Science Inventory

    This manual guides the user in applying the sediment-contaminant transport model SERATRA. SERATRA is an unsteady, two-dimensional code that uses the finite element computation method with the Galerkin weighted residual technique. The model has general convection-diffusion equatio...

  6. A Model for Determining the Costs of Vocational Education Programs and Courses. User's Manual.

    ERIC Educational Resources Information Center

    Hale, James A.; Starnes, Paul M.

    One of a three-volume series concerning the development and testing of a model for determining the costs of vocational education programs and courses, this user's manual presents an overall conceptualization of the model and briefly describes each dimension. The data elements, organized into data collection instruments and recommended procedures,…

  7. Digital Avionics Information System (DAIS): Training Requirements Analysis Model Users Guide. Final Report.

    ERIC Educational Resources Information Center

    Czuchry, Andrew J.; And Others

    This user's guide describes the functions, logical operations and subroutines, input data requirements, and available outputs of the Training Requirements Analysis Model (TRAMOD), a computerized analytical life cycle cost modeling system for use in the early stages of system design. Operable in a stand-alone mode, TRAMOD can be used for the…

  8. STORMWATER AND WATER QUALITY MODEL USERS GROUP MEETING - PROCEEDINGS HELD ON JANUARY 27-28, 1983

    EPA Science Inventory

    This report includes 17 papers on topics related to the development and application of computer-based mathematical models for water quality and quantity management presented at the semi-annual meeting of the Joint U.S. Canadian Storm-water and Water Quality Model Users Group held...

  9. ESPVI 4.0 ELECTROSTATIC PRECIPITATOR V-I AND PERFORMANCE MODEL: USER'S MANUAL

    EPA Science Inventory

    The manual is the companion document for the microcomputer program ESPVI 4.0, Electrostatic Precipitator V-I and Performance Model. The program was developed to provide a user-friendly interface to an advanced model of electrostatic precipitation (ESP) performance. The program i...

  10. ESPVI 4.0 ELECTROSTATIC PRECIPITATOR V-I AND PERFORMANCE MODEL: USER'S MANUAL

    EPA Science Inventory

    The manual is the companion document for the microcomputer program ESPVI 4.0, Electrostatic Precipitator V-I and Performance Model. The program was developed to provide a user-friendly interface to an advanced model of electrostatic precipitation (ESP) performance. The program i...

  11. User's instructions for the 41-node thermoregulatory model (steady state version)

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1974-01-01

    A user's guide for the steady-state thermoregulatory model is presented. The model was modified to provide conversational interaction on a remote terminal, greater flexibility for parameter estimation, increased efficiency of convergence, greater choice of output variables, and more realistic equations for respiratory and skin diffusion water losses.

  12. Disconfirming User Expectations of the Online Service Experience: Inferred versus Direct Disconfirmation Modeling.

    ERIC Educational Resources Information Center

    O'Neill, Martin; Palmer, Adrian; Wright, Christine

    2003-01-01

    Disconfirmation models of online service measurement seek to define service quality as the difference between user expectations of the service to be received and perceptions of the service actually received. Two such models, inferred and direct disconfirmation, for measuring quality of the online experience are compared (WebQUAL, SERVQUAL). Findings…

  13. Location Contexts of User Check-Ins to Model Urban Geo Life-Style Patterns

    PubMed Central

    Hasan, Samiul; Ukkusuri, Satish V.

    2015-01-01

    Geo-location data from social media offer us new ways to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e.g., location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) to characterize the patterns of user interests in different types of places and ii) to characterize the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins of the users from New York City. The co-existence of several location contexts and the corresponding probabilities in a given pattern provide useful information about user interests and choices. It is found that geo life-style patterns have similar items—either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior. PMID:25970430

  14. User's manual for ADAM (Advanced Dynamic Airfoil Model)

    SciTech Connect

    Oler, J.W.; Strickland, J.H.; Im, B.J.

    1987-06-01

    The computer code for an advanced dynamic airfoil model (ADAM) is described. The code is capable of calculating steady or unsteady flow over two-dimensional airfoils with allowances for boundary layer separation. Specific types of airfoil motions currently installed are steady rectilinear motion, impulsively started rectilinear motion, constant rate pitching, sinusoidal pitch oscillations, sinusoidal lateral plunging, and simulated Darrieus turbine motion. Other types of airfoil motion may be analyzed through simple modifications of a single subroutine. The code has a built-in capability to generate the geometric parameters for a cylinder, the NACA four-digit series of airfoils, and a NASA NLF-0416 laminar airfoil. Other types of airfoils are easily incorporated. The code ADAM is currently in a state of development. It is theoretically consistent and complete. However, further work is needed on the numerical implementation of the method.

  15. Theory and modelling of nanocarbon phase stability.

    SciTech Connect

    Barnard, A. S.

    2006-01-01

    The transformation of nanodiamonds into carbon-onions (and vice versa) has been observed experimentally and has been modeled computationally at various levels of sophistication. Also, several analytical theories have been derived to describe the size, temperature and pressure dependence of this phase transition. However, in most cases a pure carbon-onion or nanodiamond is not the final product. More often than not an intermediary is formed, known as a bucky-diamond, with a diamond-like core encased in an onion-like shell. This has prompted a number of studies investigating the relative stability of nanodiamonds, bucky-diamonds, carbon-onions and fullerenes, in various size regimes. Presented here is a review outlining results of numerous theoretical studies examining the phase diagrams and phase stability of carbon nanoparticles, to clarify the complicated relationship between fullerenic and diamond structures at the nanoscale.

  16. Intersecting brane models and F-theory in six dimensions

    NASA Astrophysics Data System (ADS)

    Nagaoka, Satoshi

    2012-11-01

    We analyze six-dimensional supergravity theories coming from intersecting brane models on the toroidal orbifold T4/Z2. We use recently developed tools for mapping general 6D supergravity theories to F-theory to identify F-theory constructions dual to the intersecting brane models. The F-theory picture illuminates several aspects of these models. In particular, we have some new insight into the matter spectrum on intersecting branes, and analyze gauge group enhancement as branes approach orbifold points. These novel features of intersecting brane models are also relevant in four dimensions, and are confirmed in 6D using more standard Chan-Paton methods.

  17. Modeling missing data in knowledge space theory.

    PubMed

    de Chiusole, Debora; Stefanutti, Luca; Anselmi, Pasquale; Robusto, Egidio

    2015-12-01

    Missing data are a well known issue in statistical inference, because some responses may be missing, even when data are collected carefully. The problem that arises in these cases is how to deal with missing data. In this article, the missingness is analyzed in knowledge space theory, and in particular when the basic local independence model (BLIM) is applied to the data. Two extensions of the BLIM to missing data are proposed: The former, called ignorable missing BLIM (IMBLIM), assumes that missing data are missing completely at random; the latter, called missing BLIM (MissBLIM), introduces specific dependencies of the missing data on the knowledge states, thus assuming that the missing data are missing not at random. The IMBLIM and the MissBLIM modeled the missingness in a satisfactory way, in both a simulation study and an empirical application, depending on the process that generates the missingness: If the missing data-generating process is of type missing completely at random, then either IMBLIM or MissBLIM provide adequate fit to the data. However, if the pattern of missingness is functionally dependent upon unobservable features of the data (e.g., missing answers are more likely to be wrong), then only a correctly specified model of the missingness distribution provides an adequate fit to the data. PMID:26651988
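
    For reference, the basic local independence model that both extensions build on can be written in its standard textbook form (with careless-error probabilities, lucky-guess probabilities, and knowledge-state probabilities; this is common knowledge space theory notation, not a formula quoted from the article):

```latex
P(R) \;=\; \sum_{K \in \mathcal{K}} \pi_K
\prod_{q \in K \setminus R} \beta_q
\prod_{q \in K \cap R} \bigl(1 - \beta_q\bigr)
\prod_{q \in R \setminus K} \eta_q
\prod_{q \notin R \cup K} \bigl(1 - \eta_q\bigr).
```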

  18. Modeling active memory: Experiment, theory and simulation

    NASA Astrophysics Data System (ADS)

    Amit, Daniel J.

    2001-06-01

    Neuro-physiological experiments on cognitively performing primates are described to argue that strong evidence exists for localized, non-ergodic (stimulus specific) attractor dynamics in the cortex. The specific phenomena are delay activity distributions-enhanced spike-rate distributions resulting from training, which we associate with working memory. The anatomy of the relevant cortex region and the physiological characteristics of the participating elements (neural cells) are reviewed to provide a substrate for modeling the observed phenomena. Modeling is based on the properties of the integrate-and-fire neural element in presence of an input current of Gaussian distribution. Theory of stochastic processes provides an expression for the spike emission rate as a function of the mean and the variance of the current distribution. Mean-field theory is then based on the assumption that spike emission processes in different neurons in the network are independent, and hence the input current to a neuron is Gaussian. Consequently, the dynamics of the interacting network is reduced to the computation of the mean and the variance of the current received by a cell of a given population in terms of the constitutive parameters of the network and the emission rates of the neurons in the different populations. Within this logic we analyze the stationary states of an unstructured network, corresponding to spontaneous activity, and show that it can be stable only if locally the net input current of a neuron is inhibitory. This is then tested against simulations and it is found that agreement is excellent down to great detail. A confirmation of the independence hypothesis. On top of stable spontaneous activity, keeping all parameters fixed, training is described by (Hebbian) modification of synapses between neurons responsive to a stimulus and other neurons in the module-synapses are potentiated between two excited neurons and depressed between an excited and a quiescent neuron
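
    The single-neuron ingredient of the mean-field picture, the spike rate of a leaky integrate-and-fire neuron driven by Gaussian current of given mean and variance, can be estimated numerically; the parameters below are illustrative, not those of the cortical model described above.

```python
import numpy as np

def lif_rate(mu, sigma, tau=0.020, theta=20.0, v_reset=10.0,
             t_ref=0.002, dt=1e-4, t_sim=20.0, seed=0):
    """Estimate the firing rate (Hz) of tau*dV = (mu - V)*dt + sigma*sqrt(tau)*dW."""
    rng = np.random.default_rng(seed)
    v, spikes, refractory = v_reset, 0, 0.0
    for _ in range(int(t_sim / dt)):
        if refractory > 0.0:
            refractory -= dt
            continue
        v += (dt / tau) * (mu - v) + sigma * np.sqrt(dt / tau) * rng.standard_normal()
        if v >= theta:                       # threshold crossing emits a spike
            spikes += 1
            v = v_reset
            refractory = t_ref
    return spikes / t_sim

print(lif_rate(mu=18.0, sigma=5.0))          # the rate rises with either the mean or the variance
```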

  19. A mathematical model of vowel identification by users of cochlear implants

    PubMed Central

    Sagi, Elad; Meyer, Ted A.; Kaiser, Adam R.; Teoh, Su Wooi; Svirsky, Mario A.

    2010-01-01

    A simple mathematical model is presented that predicts vowel identification by cochlear implant users based on these listeners' resolving power for the mean locations of first, second, and/or third formant energies along the implanted electrode array. This psychophysically based model provides hypotheses about the mechanism cochlear implant users employ to encode and process the input auditory signal to extract information relevant for identifying steady-state vowels. Using one free parameter, the model predicts most of the patterns of vowel confusions made by users of different cochlear implant devices and stimulation strategies who show widely different levels of speech perception (from near chance to near perfect). Furthermore, the model can predict results from the literature, such as Skinner et al.'s [(1995). Ann. Otol. Rhinol. Laryngol. 104, 307–311] frequency mapping study, and the general trend in the vowel results of Zeng and Galvin's [(1999). Ear Hear. 20, 60–74] studies of output electrical dynamic range reduction. The implementation of the model presented here is specific to vowel identification by cochlear implant users, but the framework of the model is more general. Computational models such as the one presented here can be useful for advancing knowledge about speech perception in hearing impaired populations, and for providing a guide for clinical research and clinical practice. PMID:20136228

  20. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is wrapped in a simple graphical user interface, which facilitates improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
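
    The frequency-ratio calculation at the heart of such a BSA workflow is simple enough to show directly; the pixel counts below are hypothetical, not the Malaysian pilot data.

```python
# Frequency ratio (FR) per factor class: the share of hazard pixels falling in
# the class divided by the class's share of all pixels. Counts are hypothetical.
class_pixels  = {"slope_0_10": 50000, "slope_10_25": 30000, "slope_25_plus": 20000}
hazard_pixels = {"slope_0_10": 150,   "slope_10_25": 450,   "slope_25_plus": 400}

total_pixels = sum(class_pixels.values())
total_hazard = sum(hazard_pixels.values())

frequency_ratio = {
    c: (hazard_pixels[c] / total_hazard) / (class_pixels[c] / total_pixels)
    for c in class_pixels
}
print(frequency_ratio)   # FR > 1 means the class is more hazard-prone than average
```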

  1. UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.

    PubMed

    Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun

    2013-12-01

    Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely used methods based on probabilistic modeling have drawbacks in terms of consistency from multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets. PMID:24051765
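
    The underlying NMF topic-modeling step (without UTOPIAN's semi-supervised, interactive extensions) can be reproduced with scikit-learn on a toy corpus:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

docs = [
    "interactive visual analytics of document collections",
    "probabilistic topic models for text mining",
    "nonnegative matrix factorization clusters documents by topic",
    "user feedback steers the topic modeling result",
]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)

nmf = NMF(n_components=2, init="nndsvd", random_state=0)
W = nmf.fit_transform(X)        # document-topic weights
H = nmf.components_             # topic-term weights

terms = vec.get_feature_names_out()
for k, topic in enumerate(H):
    top_terms = [terms[i] for i in topic.argsort()[-3:][::-1]]
    print(f"topic {k}: {top_terms}")
```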

  2. Gravothermal Star Clusters - Theory and Computer Modelling

    NASA Astrophysics Data System (ADS)

    Spurzem, Rainer

    2010-11-01

    In the George Darwin lecture, delivered to the British Royal Astronomical Society in 1960 by Viktor A. Ambartsumian he wrote on the evolution of stellar systems that it can be described by the "dynamic evolution of a gravitating gas" complemented by "a statistical description of the changes in the physical states of stars". This talk will show how this physical concept has inspired theoretical modeling of star clusters in the following decades up to the present day. The application of principles of thermodynamics shows, as Ambartsumian argued in his 1960 lecture, that there is no stable state of equilibrium of a gravitating star cluster. The trend to local thermodynamic equilibrium is always disturbed by escaping stars (Ambartsumian), as well as by gravothermal and gravogyro instabilities, as it was detected later. Here the state-of-the-art of modeling the evolution of dense stellar systems based on principles of thermodynamics and statistical mechanics (Fokker-Planck approximation) will be reviewed. Recent progress including rotation and internal correlations (primordial binaries) is presented. The models have also very successfully been used to study dense star clusters around massive black holes in galactic nuclei and even (in a few cases) relativistic supermassive dense objects in centres of galaxies (here again briefly touching one of the many research fields of V.A. Ambartsumian). For the modern present time of high-speed supercomputing, where we are tackling direct N-body simulations of star clusters, we will show that such direct modeling supports and proves the concept of the statistical models based on the Fokker-Planck theory, and that both theoretical concepts and direct computer simulations are necessary to support each other and make scientific progress in the study of star cluster evolution.

  3. The NATA code: Theory and analysis, volume 1. [user manuals (computer programming) - gas dynamics, wind tunnels

    NASA Technical Reports Server (NTRS)

    Bade, W. L.; Yos, J. M.

    1975-01-01

    A computer program for calculating quasi-one-dimensional gas flow in axisymmetric and two-dimensional nozzles and rectangular channels is presented. Flow is assumed to start from a state of thermochemical equilibrium at a high temperature in an upstream reservoir. The program provides solutions based on frozen chemistry, chemical equilibrium, and nonequilibrium flow with finite reaction rates. Electronic nonequilibrium effects can be included using a two-temperature model. An approximate laminar boundary layer calculation is given for the shear and heat flux on the nozzle wall. Boundary layer displacement effects on the inviscid flow are considered also. Chemical equilibrium and transport property calculations are provided by subroutines. The code contains precoded thermochemical, chemical kinetic, and transport cross section data for high-temperature air, CO2-N2-Ar mixtures, helium, and argon. It provides calculations of the stagnation conditions on axisymmetric or two-dimensional models, and of the conditions on the flat surface of a blunt wedge. The primary purpose of the code is to describe the flow conditions and test conditions in electric arc heated wind tunnels.

  4. PRay - A graphical user interface for interactive visualization and modification of rayinvr models

    NASA Astrophysics Data System (ADS)

    Fromm, T.

    2016-01-01

    PRay is a graphical user interface for interactive displaying and editing of velocity models for seismic refraction. It is optimized for editing rayinvr models but can also be used as a dynamic viewer for ray tracing results from other software. The main features are the graphical editing of nodes and fast adjusting of the display (stations and phases). It can be extended by user-defined shell scripts and links to phase picking software. PRay is open source software written in the scripting language Perl, runs on Unix-like operating systems including Mac OS X and provides a version controlled source code repository for community development.

  5. User-defined Material Model for Thermo-mechanical Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2008-01-01

    Previously a user-defined material model for orthotropic bimodulus materials was developed for linear and nonlinear stress analysis of composite structures using either shell or solid finite elements within a nonlinear finite element analysis tool. Extensions of this user-defined material model to thermo-mechanical progressive failure analysis are described, and the required input data are documented. The extensions include providing for temperature-dependent material properties, archival of the elastic strains, and a thermal strain calculation for materials exhibiting a stress-free temperature.

  6. Emergent User Behavior on Twitter Modelled by a Stochastic Differential Equation

    PubMed Central

    Mollgaard, Anders; Mathiesen, Joachim

    2015-01-01

    Data from the social-media site, Twitter, is used to study the fluctuations in tweet rates of brand names. The tweet rates are the result of a strongly correlated user behavior, which leads to bursty collective dynamics with a characteristic 1/f noise. Here we use the aggregated "user interest" in a brand name to model collective human dynamics by a stochastic differential equation with multiplicative noise. The model is supported by a detailed analysis of the tweet rate fluctuations and it reproduces both the exact bursty dynamics found in the data and the 1/f noise. PMID:25955783
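
    A generic Euler-Maruyama integration of a mean-reverting SDE with multiplicative noise conveys the modeling idea; the drift and noise coefficients below are illustrative and are not the values fitted to the Twitter data.

```python
import numpy as np

def simulate(x0=1.0, a=0.1, b=0.4, dt=0.01, steps=10_000, seed=1):
    """Euler-Maruyama for dx = a*(1 - x)*dt + b*x*dW (noise amplitude scales with x)."""
    rng = np.random.default_rng(seed)
    x = np.empty(steps + 1)
    x[0] = x0
    for t in range(steps):
        dW = np.sqrt(dt) * rng.standard_normal()
        x[t + 1] = x[t] + a * (1.0 - x[t]) * dt + b * x[t] * dW
    return x

series = simulate()
print("mean:", round(series.mean(), 3), "std:", round(series.std(), 3))
```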

  7. Using the Theory of Planned Behavior to predict implementation of harm reduction strategies among MDMA/ecstasy users.

    PubMed

    Davis, Alan K; Rosenberg, Harold

    2016-06-01

    This prospective study was designed to test whether the variables proposed by the Theory of Planned Behavior (TPB) were associated with baseline intention to implement and subsequent use of 2 MDMA/ecstasy-specific harm reduction interventions: preloading/postloading and pill testing/pill checking. Using targeted Facebook advertisements, an international sample of 391 recreational ecstasy users were recruited to complete questionnaires assessing their ecstasy consumption history, and their attitudes, subjective norms, perceived behavioral control, habit strength (past strategy use), and intention to use these two strategies. Attitudes, subjective norms, and perceived behavioral control were significantly associated with baseline intention to preload/postload and pill test/pill check. Out of the 391 baseline participants, 100 completed the two-month follow-up assessment. Baseline habit strength and frequency of ecstasy consumption during the three months prior to baseline were the only significant predictors of how often participants used the preloading/postloading strategy during the follow-up. Baseline intention to pill test/pill check was the only significant predictor of how often participants used this strategy during the follow-up. These findings provide partial support for TPB variables as both correlates of baseline intention to implement and predictors of subsequent use of these two strategies. Future investigations could assess whether factors related to ecstasy consumption (e.g., subjective level of intoxication, craving, negative consequences following consumption), and environmental factors (e.g., accessibility and availability of harm reduction resources) improve the prediction of how often ecstasy users employ these and other harm reduction strategies. PMID:27322805

  8. User's manual for the REEDM (Rocket Exhaust Effluent Diffusion Model) computer program

    NASA Technical Reports Server (NTRS)

    Bjorklund, J. R.; Dumbauld, R. K.; Cheney, C. S.; Geary, H. V.

    1982-01-01

    The REEDM computer program predicts concentrations, dosages, and depositions downwind from normal and abnormal launches of rocket vehicles at NASA's Kennedy Space Center. The atmospheric dispersion models, cloud-rise models, and other formulas used in the REEDM model are described mathematically. Vehicle and source parameters, other pertinent physical properties of the rocket exhaust cloud, and meteorological layering techniques are presented, as well as user's instructions for REEDM. Worked example problems are included.

  9. Theory and Modeling in Support of Tether

    NASA Technical Reports Server (NTRS)

    Chang, C. L.; Bergeron, G.; Drobot, A. D.; Papadopoulos, K.; Riyopoulos, S.; Szuszczewicz, E.

    1999-01-01

    This final report summarizes the work performed by SAIC's Applied Physics Operation on the modeling and support of Tethered Satellite System missions (TSS-1 and TSS-1R). The SAIC team, known as the Theory and Modeling in Support of Tether (TMST) investigation, was one of the original twelve teams selected in July 1985 for the first TSS mission. The accomplishments described in this report cover the period December 19, 1985 to September 31, 1999 and are the result of a continuous effort aimed at supporting the TSS missions in the following major areas. During the contract period, SAIC's TMST investigation acted to: Participate in the planning and execution of both TSS missions; Provide scientific understanding of the issues involved in the electrodynamic tether system operation prior to the TSS missions; Predict ionospheric conditions encountered during the re-flight mission (TSS-1R) based on real-time global ionosonde data; Perform post-mission analyses to enhance our understanding of the TSS results. Specifically, we have 1) constructed and improved current collection models and enhanced our understanding of the current-voltage data; 2) investigated the effects of neutral gas in the current collection processes; 3) conducted laboratory experiments to study the discharge phenomena during and after tether-break; and 4) performed numerical simulations to understand data collected by plasma instruments SPES onboard the TSS satellite; Design and produce a multimedia CD that highlights TSS mission achievements and conveys the knowledge of tether technology to the general public. Along with discussions of this work, a list of publications and presentations derived from the TMST investigation spanning the reporting period is compiled.

  10. Modeling the heterogeneity of human dynamics based on the measurements of influential users in Sina Microblog

    NASA Astrophysics Data System (ADS)

    Wang, Chenxu; Guan, Xiaohong; Qin, Tao; Yang, Tao

    2015-06-01

    Online social networks have become indispensable communication tools in the information age. The development of microblogging also provides a great opportunity to study human dynamics, which play a crucial role in the design of efficient communication systems. In this paper we study the characteristics of tweeting behavior based on data collected from Sina Microblog. The user activity level is measured to characterize how often a user posts a tweet. We find that the user activity level follows a bimodal distribution. That is, microblog users tend to be either active or inactive. The inter-tweeting time distribution is then measured at both the aggregate and individual levels. We find that the inter-tweeting time follows a piecewise power law distribution with two tails. Furthermore, the exponents of the two tails have different correlations with the user activity level. These findings demonstrate that the dynamics of the tweeting behavior are heterogeneous on different time scales. We then develop a dynamic model co-driven by the memory and interest mechanisms to characterize this heterogeneity. The numerical simulations validate the model and verify that the short-time-interval tweeting behavior is driven by the memory mechanism, while the long-time-interval behavior is driven by the interest mechanism.
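
    As a brief, hedged illustration of the kind of measurement described in this record (not the authors' code), the Python sketch below computes per-user activity levels and pooled inter-tweeting times from timestamped tweets and bins the intervals logarithmically so heavy tails can be inspected; the input structure tweets_by_user and the span_days argument are hypothetical conventions introduced here.

        # Illustrative sketch only: measure activity levels and inter-tweeting times
        # from timestamped tweets (hypothetical input format, not the paper's dataset).
        import numpy as np

        def activity_and_intervals(tweets_by_user, span_days):
            """tweets_by_user: dict user_id -> sorted list of UNIX timestamps (seconds)."""
            activity = {}   # tweets per day, per user
            intervals = []  # pooled inter-tweeting times (seconds)
            for user, times in tweets_by_user.items():
                activity[user] = len(times) / span_days
                if len(times) > 1:
                    intervals.extend(np.diff(times))
            return activity, np.asarray(intervals, dtype=float)

        def log_binned_pdf(samples, n_bins=30):
            """Histogram on logarithmic bins, useful for inspecting power-law-like tails."""
            samples = samples[samples > 0]
            edges = np.logspace(np.log10(samples.min()), np.log10(samples.max()), n_bins + 1)
            density, edges = np.histogram(samples, bins=edges, density=True)
            centers = np.sqrt(edges[:-1] * edges[1:])
            return centers, density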

  11. An Introduction to Polytomous Item Response Theory Models.

    ERIC Educational Resources Information Center

    De Ayala, R. J.

    1993-01-01

    Notes that polytomous item response theory (IRT) models are appropriate for Likert scale and other polytomous item types. Presents polytomous IRT models, including graded response, nominal response, partial credit, and rating scale models. (Author/NB)

  12. Users of withdrawal method in the Islamic Republic of Iran: are they intending to use oral contraceptives? Applying the theory of planned behaviour.

    PubMed

    Rahnama, P; Hidarnia, A; Shokravi, F A; Kazemnejad, A; Montazeri, A; Najorkolaei, F R; Saburi, A

    2013-09-01

    Many couples in the Islamic Republic of Iran rely on coital withdrawal for contraception. The purpose of this cross-sectional study was to use the theory of planned behaviour to explore factors that influence withdrawal users' intent to switch to oral contraception (OC). Participants were 336 sexually active, married women who were current users of withdrawal, recruited from 5 public family planning clinics in Tehran. A questionnaire included measures of the theory of planned behaviour: attitude (behavioural beliefs, outcome evaluations), subjective norms (normative beliefs, motivation to comply), perceived behavioural control, past behaviour and behavioural intention. Linear regression analyses showed that past behaviour, perceived behavioural control, attitude and subjective norms accounted for the highest percentage of total variance observed for intention to use OC (36%). Belief-based family planning education and counselling should be designed for users of the withdrawal method. PMID:24313039

  13. Theory and Modeling of Asymmetric Catalytic Reactions.

    PubMed

    Lam, Yu-Hong; Grayson, Matthew N; Holland, Mareike C; Simon, Adam; Houk, K N

    2016-04-19

    Modern density functional theory and powerful contemporary computers have made it possible to explore complex reactions of value in organic synthesis. We describe recent explorations of mechanisms and origins of stereoselectivities with density functional theory calculations. The specific functionals and basis sets that are routinely used in computational studies of stereoselectivities of organic and organometallic reactions in our group are described, followed by our recent studies that uncovered the origins of stereocontrol in reactions catalyzed by (1) vicinal diamines, including cinchona alkaloid-derived primary amines, (2) vicinal amidophosphines, and (3) organo-transition-metal complexes. Two common cyclic models account for the stereoselectivity of aldol reactions of metal enolates (Zimmerman-Traxler) or those catalyzed by the organocatalyst proline (Houk-List). Three other models were derived from computational studies described in this Account. Cinchona alkaloid-derived primary amines and other vicinal diamines are venerable asymmetric organocatalysts. For α-fluorinations and a variety of aldol reactions, vicinal diamines form enamines at one terminal amine and activate electrophilically with NH(+) or NF(+) at the other. We found that the stereocontrolling transition states are cyclic and that their conformational preferences are responsible for the observed stereoselectivity. In fluorinations, the chair seven-membered cyclic transition state is highly favored, just as the Zimmerman-Traxler chair six-membered aldol transition state controls stereoselectivity. In aldol reactions with vicinal diamine catalysts, the crown transition states are favored, both in the prototype and in an experimental example, shown in the graphic. We found that low-energy conformations of cyclic transition states occur and control stereoselectivities in these reactions. Another class of bifunctional organocatalysts, the vicinal amidophosphines, catalyzes the (3 + 2) annulation

  14. Big bang models in string theory

    NASA Astrophysics Data System (ADS)

    Craps, Ben

    2006-11-01

    These proceedings are based on lectures delivered at the 'RTN Winter School on Strings, Supergravity and Gauge Theories', CERN, 16-20 January 2006. The school was mainly aimed at PhD students and young postdocs. The lectures start with a brief introduction to spacetime singularities and the string theory resolution of certain static singularities. Then they discuss attempts to resolve cosmological singularities in string theory, mainly focusing on two specific examples: the Milne orbifold and the matrix big bang.

  15. TERSSE: Definition of the Total Earth Resources System for the Shuttle Era. Volume 7: User Models: A System Assessment

    NASA Technical Reports Server (NTRS)

    1974-01-01

    User models, defined as any explicit process or procedure used to transform information extracted from remotely sensed data into a form useful as a resource management information input, are discussed. The role of the user models as information, technological, and operations interfaces between the TERSSE and the resource managers is emphasized. It is recommended that guidelines and management strategies be developed for a systems approach to user model development.

  16. Transmission Line Jobs and Economic Development Impact (JEDI) Model User Reference Guide

    SciTech Connect

    Goldberg, M.; Keyser, D.

    2013-10-01

    The Jobs and Economic Development Impact (JEDI) models, developed through the National Renewable Energy Laboratory (NREL), are freely available, user-friendly tools that estimate the potential economic impacts of constructing and operating power generation projects for a range of conventional and renewable energy technologies. The Transmission Line JEDI model can be used to field questions about the economic impacts of transmission lines in a given state, region, or local community. This Transmission Line JEDI User Reference Guide was developed to provide basic instruction on operating the model and understanding the results. This guide also provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data contained in the model.

  17. ModelMuse: A U.S. Geological Survey Open-Source, Graphical User Interface for Groundwater Models

    NASA Astrophysics Data System (ADS)

    Winston, R. B.

    2013-12-01

    ModelMuse is a free publicly-available graphical preprocessor used to generate the input and display the output for several groundwater models. It is written in Object Pascal and the source code is available on the USGS software web site. Supported models include the MODFLOW family of models, PHAST (version 1), and SUTRA version 2.2. With MODFLOW and PHAST, the user generates a grid and uses 'objects' (points, lines, and polygons) to define boundary conditions and the spatial variation in aquifer properties. Because the objects define the spatial variation, the grid can be changed without the user needing to re-enter spatial data. The same paradigm is used with SUTRA except that the user generates a quadrilateral finite-element mesh instead of a rectangular grid. The user interacts with the model in a top view and in a vertical cross section. The cross section can be at any angle or location. There is also a three-dimensional view of the model. For SUTRA, a new method of visualizing the permeability and related properties has been introduced. In three dimensional SUTRA models, the user specifies the permeability tensor by specifying permeability in three mutually orthogonal directions that can be oriented in space in any direction. Because it is important for the user to be able to check both the magnitudes and directions of the permeabilities, ModelMuse displays the permeabilities as either a two-dimensional or a three-dimensional vector plot. Color is used to differentiate the maximum, middle, and minimum permeability vectors. The magnitude of the permeability is shown by the vector length. The vector angle shows the direction of the maximum, middle, or minimum permeability. Contour and color plots can also be used to display model input and output data.

  18. A Leadership Identity Development Model: Applications from a Grounded Theory

    ERIC Educational Resources Information Center

    Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.

    2006-01-01

    This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…

  19. Visual imagery and the user model applied to fuel handling at EBR-II

    SciTech Connect

    Brown-VanHoozer, S.A.

    1995-06-01

    The material presented in this paper is based on two studies involving visual display designs and the user's perspective model of a system. The studies involved a methodology known as Neuro-Linguistic Programming (NLP) and its use in expanding design choices, which included the "comfort parameters" and "perspective reality" of the user's model of the world. In developing visual displays for the EBR-II fuel handling system, the focus would be to incorporate the comfort parameters that overlap across the representation systems (visual, auditory, and kinesthetic), then incorporate the comfort parameters of the most prominent group of the population, and last, blend in the comfort parameters of the other two representational systems. The focus of this informal study was to use the techniques of meta-modeling and synesthesia to develop a virtual environment that closely resembled the operator's perspective of the fuel handling system of Argonne's Experimental Breeder Reactor-II. An informal study was conducted using NLP as the behavioral model in a virtual reality (VR) setting.

  20. A Monthly Water-Balance Model Driven By a Graphical User Interface

    USGS Publications Warehouse

    McCabe, Gregory J.; Markstrom, Steven L.

    2007-01-01

    This report describes a monthly water-balance model driven by a graphical user interface, referred to as the Thornthwaite monthly water-balance program. Computations of monthly water-balance components of the hydrologic cycle are made for a specified location. The program can be used as a research tool, an assessment tool, and a tool for classroom instruction.
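
    For readers unfamiliar with monthly water-balance accounting, the following minimal Python sketch shows the month-by-month bookkeeping such a program performs (precipitation partitioned among actual evapotranspiration, soil-moisture storage, and surplus); it is an illustrative simplification, not the USGS Thornthwaite program itself, and it assumes potential evapotranspiration (PET) values are supplied by the user.

        # Minimal monthly water-balance bookkeeping (illustrative only; PET supplied externally).
        def monthly_water_balance(precip, pet, capacity=150.0, storage=0.0):
            """precip, pet: sequences of monthly values (mm); capacity: soil-moisture capacity (mm)."""
            results = []
            for p, e in zip(precip, pet):
                available = p + storage                          # rain plus stored soil moisture
                aet = min(e, available)                          # actual ET limited by supply
                storage = min(capacity, available - aet)         # refill storage up to capacity
                surplus = max(0.0, available - aet - capacity)   # excess water becomes runoff
                results.append({"AET": aet, "storage": storage,
                                "surplus": surplus, "deficit": e - aet})
            return results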

  1. A Multilayer Naïve Bayes Model for Analyzing User's Retweeting Sentiment Tendency.

    PubMed

    Wang, Mengmeng; Zuo, Wanli; Wang, Ying

    2015-01-01

    Today microblogging has increasingly become a means of information diffusion via users' retweeting behavior. Since retweeting content, as context information of a microblog, reflects an understanding of that microblog, users' retweeting sentiment tendency analysis has gradually become a hot research topic. Targeting online microblogging, a dynamic social network, we investigate how to exploit dynamic retweeting sentiment features in retweeting sentiment tendency analysis. On the basis of time series of users' network structure information and published text information, we first model dynamic retweeting sentiment features. Then we build Naïve Bayes models from profile-, relationship-, and emotion-based dimensions, respectively. Finally, we build a multilayer Naïve Bayes model based on these multidimensional Naïve Bayes models to analyze a user's retweeting sentiment tendency towards a microblog. Experiments on a real-world dataset demonstrate the effectiveness of the proposed framework. Further experiments are conducted to understand the importance of dynamic retweeting sentiment features and temporal information in retweeting sentiment tendency analysis. Moreover, this work provides a new line of thought for retweeting sentiment tendency analysis in dynamic social networks. PMID:26417367

  2. A Multilayer Naïve Bayes Model for Analyzing User's Retweeting Sentiment Tendency

    PubMed Central

    Wang, Mengmeng; Zuo, Wanli; Wang, Ying

    2015-01-01

    Today microblogging has increasingly become a means of information diffusion via users' retweeting behavior. Since retweeting content, as context information of a microblog, reflects an understanding of that microblog, users' retweeting sentiment tendency analysis has gradually become a hot research topic. Targeting online microblogging, a dynamic social network, we investigate how to exploit dynamic retweeting sentiment features in retweeting sentiment tendency analysis. On the basis of time series of users' network structure information and published text information, we first model dynamic retweeting sentiment features. Then we build Naïve Bayes models from profile-, relationship-, and emotion-based dimensions, respectively. Finally, we build a multilayer Naïve Bayes model based on these multidimensional Naïve Bayes models to analyze a user's retweeting sentiment tendency towards a microblog. Experiments on a real-world dataset demonstrate the effectiveness of the proposed framework. Further experiments are conducted to understand the importance of dynamic retweeting sentiment features and temporal information in retweeting sentiment tendency analysis. Moreover, this work provides a new line of thought for retweeting sentiment tendency analysis in dynamic social networks. PMID:26417367
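
    As a rough, hedged sketch of the layered idea described in the two records above (not the authors' implementation), one can train separate Naïve Bayes classifiers on profile-, relationship-, and emotion-based feature blocks and then feed their class-probability outputs into a second-layer classifier; the feature arrays here are hypothetical placeholders.

        # Illustrative two-layer Naive Bayes sketch (not the paper's exact model).
        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        def fit_multilayer_nb(feature_blocks, y):
            """feature_blocks: list of 2-D arrays (e.g., profile, relationship, emotion features)."""
            base_models = [GaussianNB().fit(X, y) for X in feature_blocks]
            # The second layer consumes the per-dimension posterior probabilities.
            meta_X = np.hstack([m.predict_proba(X) for m, X in zip(base_models, feature_blocks)])
            meta_model = GaussianNB().fit(meta_X, y)
            return base_models, meta_model

        def predict_multilayer_nb(base_models, meta_model, feature_blocks):
            meta_X = np.hstack([m.predict_proba(X) for m, X in zip(base_models, feature_blocks)])
            return meta_model.predict(meta_X)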

  3. Object relations theory and activity theory: a proposed link by way of the procedural sequence model.

    PubMed

    Ryle, A

    1991-12-01

    An account of object relations theory (ORT), represented in terms of the procedural sequence model (PSM), is compared to the ideas of Vygotsky and activity theory (AT). The two models are seen to be compatible and complementary and their combination offers a satisfactory account of human psychology, appropriate for the understanding and integration of psychotherapy. PMID:1786224

  4. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    ERIC Educational Resources Information Center

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  5. Chaos Theory as a Model for Managing Issues and Crises.

    ERIC Educational Resources Information Center

    Murphy, Priscilla

    1996-01-01

    Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…

  6. Modeling Diagnostic Reasoning: A Summary of Parsimonious Covering Theory

    PubMed Central

    Reggia, James A.; Peng, Yun

    1986-01-01

    Parsimonious covering theory is a formal model of diagnostic reasoning. Diagnostic knowledge is represented in the theory as a network of causal associations, and problem-solving is represented in algorithms that support a hypothesize-and-test inference process. This paper summarizes in informal terms the basic ideas in parsimonious covering theory.
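
    To make the hypothesize-and-test idea concrete, here is a small, hedged Python sketch (a toy version, not Reggia and Peng's algorithms) that enumerates the minimum-cardinality sets of disorders whose causal associations cover all observed manifestations.

        # Toy parsimonious-covering sketch: smallest disorder sets that cover all findings.
        from itertools import combinations

        def minimal_covers(causes, observed):
            """causes: dict disorder -> set of manifestations it can cause; observed: set of findings."""
            disorders = list(causes)
            for size in range(1, len(disorders) + 1):
                covers = [set(combo) for combo in combinations(disorders, size)
                          if observed <= set().union(*(causes[d] for d in combo))]
                if covers:
                    return covers   # all minimum-cardinality explanations
            return []

        # Example: one single-disorder explanation suffices for {fever, rash}.
        causes = {"d1": {"fever", "rash"}, "d2": {"fever", "cough"}, "d3": {"rash"}}
        print(minimal_covers(causes, {"fever", "rash"}))   # [{'d1'}]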

  7. Theory and modeling of active brazing.

    SciTech Connect

    van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.

    2013-09-01

    Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex, nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, and the nonequilibrium, composition-dependent surface tension as well as the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.

  8. Jobs and Economic Development Impact (JEDI) User Reference Guide: Fast Pyrolysis Biorefinery Model

    SciTech Connect

    Zhang, Y.; Goldberg, M.

    2015-02-01

    This guide -- the JEDI Fast Pyrolysis Biorefinery Model User Reference Guide -- was developed to assist users in operating and understanding the JEDI Fast Pyrolysis Biorefinery Model. The guide provides information on the model's underlying methodology, as well as the parameters and data sources used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features and a discussion of how the results should be interpreted. Based on project-specific inputs from the user, the JEDI Fast Pyrolysis Biorefinery Model estimates local (e.g., county- or state-level) job creation, earnings, and output from total economic activity for a given fast pyrolysis biorefinery. These estimates include the direct, indirect and induced economic impacts to the local economy associated with the construction and operation phases of biorefinery projects. Local revenue and supply chain impacts as well as induced impacts are estimated using economic multipliers derived from the IMPLAN software program. By determining the local economic impacts and job creation for a proposed biorefinery, the JEDI Fast Pyrolysis Biorefinery Model can be used to field questions about the added value biorefineries might bring to a local community.

  9. BEN: A model to calculate the economic benefit of noncompliance. User's manual

    SciTech Connect

    Not Available

    1992-10-01

    The Agency developed the BEN computer model to calculate the economic benefit a violator derives from delaying or avoiding compliance with environmental statutes. In general, the Agency uses the BEN computer model to assist its own staff in developing settlement penalty figures. While the primary purpose of the BEN model is to calculate the economic benefit of noncompliance, the model may also be used to calculate the after-tax net present value of a pollution prevention or mitigation project and to calculate 'cash outs' in Superfund cases. The document, the BEN User's Manual, contains all the formulas that make up the BEN computer model and is freely available to the public upon request.
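
    The core calculation behind such a model is discounted-cash-flow arithmetic. The hedged Python sketch below shows a generic after-tax net-present-value comparison of the kind mentioned above, with made-up numbers and a simplified tax treatment; it is not the BEN model's actual formula set.

        # Generic after-tax NPV sketch (illustrative only; not BEN's formulas).
        def after_tax_npv(cash_flows, discount_rate, tax_rate):
            """cash_flows: pre-tax annual amounts, year 0 first (negative = expenditure)."""
            return sum(cf * (1.0 - tax_rate) / (1.0 + discount_rate) ** year
                       for year, cf in enumerate(cash_flows))

        # Benefit of delay: paying a deductible compliance cost in year 3 instead of year 0.
        cost = -100_000.0
        benefit = after_tax_npv([0, 0, 0, cost], 0.07, 0.30) - after_tax_npv([cost], 0.07, 0.30)
        print(round(benefit, 2))   # positive: deferring the outlay has value in present terms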

  10. Development and implementation of (Q)SAR modeling within the CHARMMing web-user interface.

    PubMed

    Weidlich, Iwona E; Pevzner, Yuri; Miller, Benjamin T; Filippov, Igor V; Woodcock, H Lee; Brooks, Bernard R

    2015-01-01

    Recent availability of large publicly accessible databases of chemical compounds and their biological activities (PubChem, ChEMBL) has inspired us to develop a web-based tool for structure-activity relationship and quantitative structure-activity relationship modeling to add to the services provided by CHARMMing (www.charmming.org). This new module implements some of the most recent advances in modern machine learning algorithms: Random Forest, Support Vector Machine, Stochastic Gradient Descent, Gradient Tree Boosting, and so forth. A user can import training data from PubChem BioAssay data collections directly from our interface or upload his or her own SD files, which contain structures and activity information, to create new models (either categorical or numerical). A user can then track the model generation process and run models on new data to predict activity. PMID:25362883

  11. Development and implementation of (Q)SAR modeling within the CHARMMing Web-user interface

    PubMed Central

    Weidlich, Iwona E.; Pevzner, Yuri; Miller, Benjamin T.; Filippov, Igor V.; Woodcock, H. Lee; Brooks, Bernard R.

    2014-01-01

    Recent availability of large publicly accessible databases of chemical compounds and their biological activities (PubChem, ChEMBL) has inspired us to develop a Web-based tool for SAR and QSAR modeling to add to the services provided by CHARMMing (www.charmming.org). This new module implements some of the most recent advances in modern machine learning algorithms – Random Forest, Support Vector Machine (SVM), Stochastic Gradient Descent, Gradient Tree Boosting etc. A user can import training data from Pubchem Bioassay data collections directly from our interface or upload his or her own SD files which contain structures and activity information to create new models (either categorical or numerical). A user can then track the model generation process and run models on new data to predict activity. PMID:25362883
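
    As a hedged illustration of the kind of categorical model such a module builds (not the CHARMMing code itself), the scikit-learn sketch below fits a Random Forest active/inactive classifier from a table of precomputed molecular descriptors; the file name and column names are hypothetical.

        # Illustrative (Q)SAR sketch with scikit-learn (not the CHARMMing implementation).
        import pandas as pd
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import balanced_accuracy_score

        data = pd.read_csv("descriptors.csv")            # hypothetical: one row per compound
        X = data.drop(columns=["compound_id", "active"]) # precomputed molecular descriptors
        y = data["active"]                               # 1 = active, 0 = inactive

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
        model = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)
        print("balanced accuracy:", balanced_accuracy_score(y_test, model.predict(X_test)))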

  12. The European ALMA Regional Centre Network: A Geographically Distributed User Support Model

    NASA Astrophysics Data System (ADS)

    Hatziminaoglou, E.; Zwaan, M.; Andreani, P.; Barta, M.; Bertoldi, F.; Brand, J.; Gueth, F.; Hogerheijde, M.; Maercker, M.; Massardi, M.; Muehle, S.; Muxlow, Th.; Richards, A.; Schilke, P.; Tilanus, R.; Vlemmings, W.; Afonso, J.; Messias, H.

    2015-12-01

    In recent years there has been a paradigm shift from centralised to geographically distributed resources. Individual entities are no longer able to host or afford the necessary expertise in-house, and, as a consequence, society increasingly relies on widespread collaborations. Although such collaborations are now the norm for scientific projects, more technical structures providing support to a distributed scientific community without direct financial or other material benefits are scarce. The network of European ALMA Regional Centre (ARC) nodes is an example of such an internationally distributed user support network. It is an organised effort to provide the European ALMA user community with uniform expert support to enable optimal usage and scientific output of the ALMA facility. The network model for the European ARC nodes is described in terms of its organisation, communication strategies and user support.

  13. Support for significant evolutions of the user data model in ROOT files

    SciTech Connect

    Canal, P.; Brun, R.; Fine, V.; Janyst, L.; Lauret, J.; Russo, P.; /Fermilab

    2010-01-01

    One of the main strengths of ROOT input and output (I/O) is its inherent support for schema evolution. Two distinct modes are supported, one manual via a hand-coded streamer function and one fully automatic via the ROOT StreamerInfo. One drawback of the streamer functions is that they are not usable by TTree objects in split mode. Until now, the user could not customize the automatic schema evolution mechanism, and the only way to go beyond the default rules was to revert to using the streamer function. In ROOT 5.22/00, we introduced a new mechanism which allows user-provided extensions of the automatic schema evolution that can be used in object-wise, member-wise and split modes. This paper describes the many possibilities, ranging from the simple assignment of transient members to the complex reorganization of the user's object model.

  14. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) user manual (version 3), volume 1

    NASA Astrophysics Data System (ADS)

    Kadlec, D. L.; Coffey, E. L.

    1983-09-01

    GEMACS solves electromagnetic radiation and scattering problems. The Method of Moments (MOM) and Geometrical Theory of Diffraction (GTD) are used. MOM is formalized with the Electric Field Integral Equation (EFIE) for wires and the Magnetic Field Integral Equation (MFIE) for patches. The code employs both full matrix decomposition and Banded Matrix Iteration (BMI) solution techniques. The MOM, GTD and hybrid MOM/GTD techniques in the code are used to solve electrically small object problems, electrically large object problems and combination sized object problems. Volume I of this report is the User Manual. The code execution requirements, input language and output are discussed.

  15. Utility of the PRECEDE model in differentiating users and nonusers of smokeless tobacco.

    PubMed

    Polcyn, M M; Price, J H; Jurs, S G; Roberts, S M

    1991-04-01

    The utility of the PRECEDE model in identifying factors associated with smokeless tobacco use among male adolescents was investigated. Users and triers were more likely than nonusers to be white, to be older than the average age for their grade, to receive below-average grades in school, and to have had no classroom instruction on the adverse effects of smokeless tobacco use. The standard score means of users, triers, and nonusers differed significantly on six of seven PRECEDE model components: attitudes, beliefs, values, perceptions, reinforcing factors, and enabling factors. The standard score means of triers and nonusers, only, differed significantly for the knowledge component. A stepwise multiple regression analysis using all seven model components as predictor variables of smokeless tobacco use accounted for 47.9% of explained variance among users, triers, and nonusers. The values component was the most powerful predictor of smokeless tobacco use (35% of the explained variance). Discriminant function analysis demonstrated that the seven components of the PRECEDE model instrument accurately classified 93.1% of users and nonusers. PMID:1857107

  16. Jobs and Economic Development Impact (JEDI) Model: Offshore Wind User Reference Guide

    SciTech Connect

    Lantz, E.; Goldberg, M.; Keyser, D.

    2013-06-01

    The Offshore Wind Jobs and Economic Development Impact (JEDI) model, developed by NREL and MRG & Associates, is a spreadsheet based input-output tool. JEDI is meant to be a user friendly and transparent tool to estimate potential economic impacts supported by the development and operation of offshore wind projects. This guide describes how to use the model as well as technical information such as methodology, limitations, and data sources.

  17. PCB/transformer techno-economic analysis model: User manual, volume 2

    NASA Astrophysics Data System (ADS)

    Plum, Martin M.; Geimer, Ray M.

    1989-02-01

    This model is designed to evaluate the economic viability of replacing or retrofilling a PCB transformer with one of numerous non-PCB transformer options. Replacement options include conventional, amorphous, amorphous-liquid-filled, or amorphous-liquid-filled high-performance transformers. The retrofill option is the process that removes and disposes of the PCB dielectric and replaces it with a non-PCB dielectric. Depending on the data available, the skills of the user, and the intent of the analysis, there are three model options available to the user. For practical purposes, Level 1 requires the least amount of input data from the user while Level 3 requires the greatest quantity of data. This manual is designed for people who have minimal experience with Lotus 1-2-3 and are familiar with electric transformers. This manual covers system requirements, how to install the model on your system, how to get started, how to move around in the model, how to make changes in the model data, how to print information, how to save your work, and how to exit from the model.

  18. Polyelectrolyte brushes: theory, modelling, synthesis and applications.

    PubMed

    Das, Siddhartha; Banik, Meneka; Chen, Guang; Sinha, Shayandev; Mukherjee, Rabibrata

    2015-11-28

    Polyelectrolyte (PE) brushes are a special class of polymer brushes (PBs) containing charges. Polymer chains attain a "brush"-like configuration when they are grafted or get localized at an interface (solid-fluid or liquid-fluid) with sufficiently close proximity between two adjacent grafted polymer chains - such proximity triggers a particular type of interaction between the adjacent polymer molecules, forcing them to stretch orthogonally to the grafting interface instead of adopting a random-coil arrangement. In this review, we discuss the theory, synthesis, and applications of PE brushes. The theoretical discussion starts with the standard scaling concepts for polymer and PE brushes; following that, we shed light on the state of the art in continuum modelling approaches for polymer and PE brushes directed towards analysis beyond the scaling calculations. Special emphasis is placed on pinpointing the cases for which the PE electrostatic effects can be de-coupled from the PE entropic and excluded volume effects; such de-coupling is necessary to appropriately probe the complicated electrostatic effects arising from pH-dependent charging of the PE brushes and the use of these effects for driving liquid and ion transport at the interfaces covered with PE brushes. We also discuss the atomistic simulation approaches for polymer and PE brushes. Next we provide a detailed review of the existing approaches for the synthesis of polymer and PE brushes on interfaces, nanoparticles, and nanochannels, including mixed brushes and patterned brushes. Finally, we discuss some of the possible applications and future developments of polymer and PE brushes grafted on a variety of interfaces. PMID:26399305

  19. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool for the BSA technique, the bivariate statistical modeler (BSM), is proposed. Three popular BSA techniques - the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models - are implemented in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested using the proposed program. Area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
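
    To illustrate one of the three techniques, the NumPy sketch below computes frequency-ratio weights for a single categorical conditioning factor (each class's share of hazard occurrences divided by its share of the study area); it is a minimal, hedged example, not the BSM ArcMAP tool itself.

        # Minimal frequency-ratio sketch for one categorical factor (illustrative only).
        import numpy as np

        def frequency_ratio(factor_classes, hazard_mask):
            """factor_classes: 1-D array of class labels per pixel; hazard_mask: boolean array."""
            fr = {}
            n_total = factor_classes.size
            n_hazard = hazard_mask.sum()
            for cls in np.unique(factor_classes):
                in_class = factor_classes == cls
                pct_hazard = (hazard_mask & in_class).sum() / n_hazard   # share of hazard pixels
                pct_area = in_class.sum() / n_total                      # share of the study area
                fr[cls] = pct_hazard / pct_area
            return fr   # values > 1 flag classes positively associated with the hazard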

  20. ModelMuse - A Graphical User Interface for MODFLOW-2005 and PHAST

    USGS Publications Warehouse

    Winston, Richard B.

    2009-01-01

    ModelMuse is a graphical user interface (GUI) for the U.S. Geological Survey (USGS) models MODFLOW-2005 and PHAST. This software package provides a GUI for creating the flow and transport input file for PHAST and the input files for MODFLOW-2005. In ModelMuse, the spatial data for the model is independent of the grid, and the temporal data is independent of the stress periods. Being able to input these data independently allows the user to redefine the spatial and temporal discretization at will. This report describes the basic concepts required to work with ModelMuse. These basic concepts include the model grid, data sets, formulas, objects, the method used to assign values to data sets, and model features. The ModelMuse main window has a top, front, and side view of the model that can be used for editing the model, and a 3-D view of the model that can be used to display properties of the model. ModelMuse has tools to generate and edit the model grid. It also has a variety of interpolation methods and geographic functions that can be used to help define the spatial variability of the model. ModelMuse can be used to execute both MODFLOW-2005 and PHAST and can also display the results of MODFLOW-2005 models. An example of using ModelMuse with MODFLOW-2005 is included in this report. Several additional examples are described in the help system for ModelMuse, which can be accessed from the Help menu.

  1. A Quantitative Causal Model Theory of Conditional Reasoning

    ERIC Educational Resources Information Center

    Fernbach, Philip M.; Erb, Christopher D.

    2013-01-01

    The authors propose and test a causal model theory of reasoning about conditional arguments with causal content. According to the theory, the acceptability of modus ponens (MP) and affirming the consequent (AC) reflect the conditional likelihood of causes and effects based on a probabilistic causal model of the scenario being judged. Acceptability…

  2. Program evaluation models and related theories: AMEE guide no. 67.

    PubMed

    Frye, Ann W; Hemmer, Paul A

    2012-01-01

    This Guide reviews theories of science that have influenced the development of common educational evaluation models. Educators can be more confident when choosing an appropriate evaluation model if they first consider the model's theoretical basis against their program's complexity and their own evaluation needs. Reductionism, system theory, and (most recently) complexity theory have inspired the development of models commonly applied in evaluation studies today. This Guide describes experimental and quasi-experimental models, Kirkpatrick's four-level model, the Logic Model, and the CIPP (Context/Input/Process/Product) model in the context of the theories that influenced their development and that limit or support their ability to do what educators need. The goal of this Guide is for educators to become more competent and confident in being able to design educational program evaluations that support intentional program improvement while adequately documenting or describing the changes and outcomes-intended and unintended-associated with their programs. PMID:22515309

  3. Theory of stellar convection II: first stellar models

    NASA Astrophysics Data System (ADS)

    Pasetto, S.; Chiosi, C.; Chiosi, E.; Cropper, M.; Weiss, A.

    2016-04-01

    We present here the first stellar models on the Hertzsprung-Russell diagram (HRD), in which convection is treated according to the new scale-free convection theory (SFC theory) by Pasetto et al. (2014). The aim is to compare the results of the new theory with those from the classical, calibrated mixing-length (ML) theory to examine differences and similarities. We integrate the equations describing the structure of the atmosphere from the stellar surface down to a few percent of the stellar mass using both ML theory and SFC theory. The key temperature over pressure gradients, the energy fluxes, and the extension of the convective zones are compared in both theories. The analysis is first made for the Sun and then extended to other stars of different mass and evolutionary stage. The results are adequate: the SFC theory yields convective zones, temperature gradients ∇ and ∇e, and energy fluxes that are very similar to those derived from the "calibrated" ML theory for main sequence stars. We conclude that the old scale-dependent ML theory can now be replaced with a self-consistent scale-free theory able to predict correct results, as it is more physically grounded than the ML theory. Fundamentally, the SFC theory offers a deeper insight into the underlying physics than numerical simulations.

  4. Theory of stellar convection - II. First stellar models

    NASA Astrophysics Data System (ADS)

    Pasetto, S.; Chiosi, C.; Chiosi, E.; Cropper, M.; Weiss, A.

    2016-07-01

    We present here the first stellar models on the Hertzsprung-Russell diagram, in which convection is treated according to the new scale-free convection theory (SFC theory) by Pasetto et al. The aim is to compare the results of the new theory with those from the classical, calibrated mixing-length (ML) theory to examine differences and similarities. We integrate the equations describing the structure of the atmosphere from the stellar surface down to a few per cent of the stellar mass using both ML theory and SFC theory. The key temperature over pressure gradients, the energy fluxes, and the extension of the convective zones are compared in both theories. The analysis is first made for the Sun and then extended to other stars of different mass and evolutionary stage. The results are adequate: the SFC theory yields convective zones, temperature gradients ∇ and ∇e, and energy fluxes that are very similar to those derived from the `calibrated' ML theory for main-sequence stars. We conclude that the old scale-dependent ML theory can now be replaced with a self-consistent scale-free theory able to predict correct results, as it is more physically grounded than the ML theory. Fundamentally, the SFC theory offers a deeper insight into the underlying physics than numerical simulations.

  5. Large field inflation models from higher-dimensional gauge theories

    NASA Astrophysics Data System (ADS)

    Furuuchi, Kazuyuki; Koyama, Yoji

    2015-02-01

    Motivated by the recent detection of B-mode polarization of the CMB by BICEP2, which is possibly of primordial origin, we study large field inflation models which can be obtained from higher-dimensional gauge theories. The constraints from CMB observations on the gauge theory parameters are given, and their naturalness is discussed. Among the models analyzed, Dante's Inferno model turns out to be the most preferred model in this framework.

  6. Large field inflation models from higher-dimensional gauge theories

    SciTech Connect

    Furuuchi, Kazuyuki; Koyama, Yoji

    2015-02-23

    Motivated by the recent detection of B-mode polarization of the CMB by BICEP2, which is possibly of primordial origin, we study large field inflation models which can be obtained from higher-dimensional gauge theories. The constraints from CMB observations on the gauge theory parameters are given, and their naturalness is discussed. Among the models analyzed, Dante's Inferno model turns out to be the most preferred model in this framework.

  7. The Woodcock-Johnson Tests of Cognitive Abilities III's Cognitive Performance Model: Empirical Support for Intermediate Factors within CHC Theory

    ERIC Educational Resources Information Center

    Taub, Gordon E.; McGrew, Kevin S.

    2014-01-01

    The Woodcock-Johnson Tests of Cognitive Ability Third Edition is developed using the Cattell-Horn-Carroll (CHC) measurement-theory test design as the instrument's theoretical blueprint. The instrument provides users with cognitive scores based on the Cognitive Performance Model (CPM); however, the CPM is not a part of CHC theory. Within the…

  8. User modeling techniques for enhanced usability of OPSMODEL operations simulation software

    NASA Technical Reports Server (NTRS)

    Davis, William T.

    1991-01-01

    The PC-based OPSMODEL operations software for modeling and simulation of space station crew activities supports engineering and cost analyses and operations planning. Using top-down modeling, the level of detail required in the database can be limited to be commensurate with the results required of any particular analysis. To perform a simulation, a resource environment consisting of locations, crew definition, equipment, and consumables is first defined. Activities to be simulated are then defined as operations and scheduled as desired. These operations are defined within a 1000-level priority structure. The simulation on OPSMODEL, then, consists of the following: user-defined, user-scheduled operations executing within an environment of user-defined resource and priority constraints. Techniques for prioritizing operations to realistically model a representative daily scenario of on-orbit space station crew activities are discussed. The large number of priority levels allows priorities to be assigned commensurate with the detail necessary for a given simulation. Several techniques for realistic modeling of day-to-day work carryover are also addressed.
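
    As a hedged illustration of priority-driven scheduling in general (a toy example, not OPSMODEL itself), the Python sketch below assigns a single pooled resource to operations in priority order and carries over whatever cannot be scheduled; the field names are hypothetical.

        # Toy priority-driven scheduler with carryover (illustrative; not OPSMODEL).
        def schedule(operations, crew_hours_available):
            """operations: list of dicts with 'name', 'priority' (lower = more urgent), 'crew_hours'."""
            scheduled, carryover = [], []
            for op in sorted(operations, key=lambda o: o["priority"]):
                if op["crew_hours"] <= crew_hours_available:
                    crew_hours_available -= op["crew_hours"]
                    scheduled.append(op["name"])
                else:
                    carryover.append(op["name"])   # deferred to the next day's schedule
            return scheduled, carryover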

  9. A new approach in climate modelling strategies to provide climate information based on user needs

    NASA Astrophysics Data System (ADS)

    Dell'Aquila, Alessandro; Somot, Samuel; Dubois, Clotilde; Nabat, Pierre; Coppola, Erika

    2014-05-01

    In the framework of the CLIMRUN EU FP7 project, a new approach to planning climate modelling activities has been proposed and applied. In particular, a bottom-up approach mainly driven by the specific needs of end users has been adopted. In this perspective, the new climate information for the Mediterranean region provided by the new modelling activities has been tailored to the user needs raised in several stakeholder workshops organized in the early stages of the project. At the beginning of the project, several different options for possible developments of new modelling tools were proposed by the climate researchers involved in the project. By carefully taking into account the ranking of priorities suggested by end users, the climate researchers could set a more focused research line fitting the expectations of stakeholders. New modelling tools to improve the representation and projection of surface wind speed, surface solar radiation, trends of extreme events, and the temperature of lakes and islands of the Mediterranean have been successfully developed. Here we report some of the major outcomes from the new tools and, more generally, some recommendations about the future role of climate researchers in developing climate services. The results reported here could also be useful for other ongoing climate services efforts, such as the SPECS and EUPORIAS projects.

  10. Hiding the system from the user: Moving from complex mental models to elegant metaphors

    SciTech Connect

    Curtis W. Nielsen; David J. Bruemmer

    2007-08-01

    In previous work, increased complexity of robot behaviors and the accompanying interface design often led to operator confusion and/or a fight for control between the robot and operator. We believe the reason for the conflict was that the design of the interface and interactions presented too much of the underlying robot design model to the operator. Since the design model includes the implementation of sensors, behaviors, and sophisticated algorithms, the result was that the operator’s cognitive efforts were focused on understanding the design of the robot system as opposed to focusing on the task at hand. This paper illustrates how this very problem emerged at the INL and how the implementation of new metaphors for interaction has allowed us to hide the design model from the user and allow the user to focus more on the task at hand. Supporting the user’s focus on the task rather than on the design model allows increased use of the system and significant performance improvement in a search task with novice users.

  11. A catastrophe theory model of planar orientation

    SciTech Connect

    Wright, M.W.; Deacon, G.E.

    2000-06-01

    The manipulation of planar objects using linear fences is of interest in robotics and parts feeding applications. The global behavior of such systems can be characterized graphically using Brost's push stability diagram (PSD). Previously, the authors have shown specifically under what conditions this representation undergoes qualitative, topological transitions corresponding to globally distinct behavioral regimes. In this paper, they show that these insights form a unified whole when viewed from the perspective of catastrophe theory. The key result is that a planar object being pushed by a fence under the assumption of Coulomb friction is functionally equivalent to a gravitational catastrophe machine. Qualitative changes in global behavior are thus explained as catastrophes as singularities are encountered on a discriminant surface due to smooth changes in parameters. Catastrophe theory thus forms part of a computational theory of planar orientation, the aim of which is to understand such systems and make predictions about their behavior.

  12. General autocatalytic theory and simple model of financial markets

    NASA Astrophysics Data System (ADS)

    Thuy Anh, Chu; Lan, Nguyen Tri; Viet, Nguyen Ai

    2015-06-01

    The concept of autocatalytic theory has become a powerful tool for understanding evolutionary processes in complex systems. A generalization of autocatalytic theory is proposed by assuming that the initial element follows some distribution instead of taking a constant value as in the traditional theory. This initial condition implies that the final element may follow some distribution too. A simple physics model for financial markets is proposed using this general autocatalytic theory. Some general behaviours of the evolution process and the risk moment of a financial market are also investigated within the framework of this simple model.
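
    As a loose, heavily hedged illustration (one common minimal autocatalytic setup, not necessarily the authors' exact model), the NumPy sketch below evolves an ensemble of multiplicative growth paths whose initial element is drawn from a distribution rather than fixed, and inspects the spread of the resulting final distribution.

        # Generic multiplicative autocatalytic process with a distributed initial element
        # (illustrative only; not necessarily the paper's model).
        import numpy as np

        rng = np.random.default_rng(0)
        n_paths, n_steps = 10_000, 250
        x = rng.lognormal(mean=0.0, sigma=0.5, size=n_paths)        # distributed initial element
        for _ in range(n_steps):
            x *= rng.lognormal(mean=0.0, sigma=0.02, size=n_paths)  # random growth factors

        print("median final value:", np.median(x))
        print("95th percentile  :", np.percentile(x, 95))           # heavy right tail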

  13. Non-linear sigma-models and string theories

    SciTech Connect

    Sen, A.

    1986-10-01

    The connection between sigma-models and string theories is discussed, as well as how the sigma-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs. (LEW)

  14. User's manual for a parameter identification technique. [with options for model simulation for fixed input forcing functions and identification from wind tunnel and flight measurements

    NASA Technical Reports Server (NTRS)

    Kanning, G.

    1975-01-01

    A digital computer program written in FORTRAN is presented that implements the system identification theory for deterministic systems using input-output measurements. The user supplies programs simulating the mathematical model of the physical plant whose parameters are to be identified. The user may choose any one of three options. The first option allows for a complete model simulation for fixed input forcing functions. The second option identifies up to 36 parameters of the model from wind tunnel or flight measurements. The third option performs a sensitivity analysis for up to 36 parameters. The use of each option is illustrated with an example using input-output measurements for a helicopter rotor tested in a wind tunnel.
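
    As a hedged sketch of output-error parameter identification in general (not the FORTRAN program documented above), the Python snippet below estimates model parameters by minimizing the mismatch between simulated and measured outputs with SciPy; the first-order plant in simulate is a stand-in for the user-supplied model the manual describes.

        # Generic output-error parameter identification sketch (illustrative only).
        import numpy as np
        from scipy.optimize import least_squares

        def simulate(params, t, u):
            """Stand-in plant model: first-order response y' = -a*y + b*u, Euler-integrated."""
            a, b = params
            y = np.zeros(len(t))
            for k in range(1, len(t)):
                dt = t[k] - t[k - 1]
                y[k] = y[k - 1] + dt * (-a * y[k - 1] + b * u[k - 1])
            return y

        def identify(t, u, y_measured, initial_guess=(1.0, 1.0)):
            residuals = lambda p: simulate(p, t, u) - y_measured
            return least_squares(residuals, initial_guess).x   # estimated (a, b)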

  15. Surface water management: a user's guide to calculate a water balance using the CREAMS model

    SciTech Connect

    Lane, L.J.

    1984-11-01

    The hydrologic component of the CREAMS model is described and discussed in terms of calculating a surface water balance for shallow land burial systems used for waste disposal. Parameter estimates and estimation procedures are presented in detail in the form of a user's guide. Use of the model is illustrated with three examples based on analysis of data from Los Alamos, New Mexico and Rock Valley, Nevada. Use of the model in design of trench caps for shallow land burial systems is illustrated with the example applications at Los Alamos.

  16. User's Manual for Data for Validating Models for PV Module Performance

    SciTech Connect

    Marion, W.; Anderberg, A.; Deline, C.; Glick, S.; Muller, M.; Perrin, G.; Rodriguez, J.; Rummel, S.; Terwilliger, K.; Silverman, T. J.

    2014-04-01

    This user's manual describes performance data measured for flat-plate photovoltaic (PV) modules installed in Cocoa, Florida, Eugene, Oregon, and Golden, Colorado. The data include PV module current-voltage curves and associated meteorological data for approximately one-year periods. These publicly available data are intended to facilitate the validation of existing models for predicting the performance of PV modules, and for the development of new and improved models. For comparing different modeling approaches, using these public data will provide transparency and more meaningful comparisons of the relative benefits.

  17. NETPATH-WIN: an interactive user version of the mass-balance model, NETPATH

    USGS Publications Warehouse

    El-Kadi, A. I.; Plummer, L.N.; Aggarwal, P.

    2011-01-01

    NETPATH-WIN is an interactive user version of NETPATH, an inverse geochemical modeling code used to find mass-balance reaction models that are consistent with the observed chemical and isotopic composition of waters from aquatic systems. NETPATH-WIN was constructed to migrate NETPATH applications into the Microsoft WINDOWS® environment. The new version facilitates model utilization by eliminating difficulties in data preparation and results analysis of the DOS version of NETPATH, while preserving all of the capabilities of the original version. Through example applications, the note describes some of the features of NETPATH-WIN as applied to adjustment of radiocarbon data for geochemical reactions in groundwater systems.

  18. Reliability and Maintainability Model (RAM): User and Maintenance Manual. Part 2; Improved Supportability Analysis

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1996-01-01

    This report documents the procedures for utilizing and maintaining the Reliability & Maintainability Model (RAM) developed by the University of Dayton for the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). The purpose of the grant is to provide support to NASA in establishing operational and support parameters and costs of proposed space systems. As part of this research objective, the model described here was developed. This Manual updates and supersedes the 1995 RAM User and Maintenance Manual. Changes and enhancements from the 1995 version of the model are primarily a result of the addition of more recent aircraft and shuttle R&M data.

  19. Using the Theory of Planned Behavior to investigate condom use behaviors among female injecting drug users who are also sex workers in China.

    PubMed

    Gu, Jing; Lau, Joseph T F; Chen, Xi; Liu, Chuliang; Liu, Jun; Chen, Hongyao; Wang, Renfan; Lei, Zhangquan; Li, Zhenglin

    2009-08-01

    Female injecting drug users who are sex workers (IDUFSWs) are a strategic "bridge population" for HIV transmission. The goals of the study were to investigate condom use behaviors during commercial sex among IDUFSWs using the Theory of Planned Behavior (TPB), and to investigate moderating effects that modify the strength of associations between the TPB-related variables and inconsistent condom use during commercial sex. A total of 281 non-institutionalized IDUFSWs were recruited using a snowball sampling method. Anonymous face-to-face interviews were administered by trained doctors. The results showed that the prevalence of inconsistent condom use during commercial sex in the last six months was 64%. After adjusting for some significant background variables (e.g., main venue of sex work), all associations between the five TPB-related variables and the studied condom use variable were statistically significant (odds ratio (OR) = 0.43-0.68, p<0.001). In the hierarchical nested models, three background variables (age, venue of sex work, and ever having used HIV-related interventions) were entered in the first step (-2LL = 294.98, p<0.001), and the Social Norm Scale, the Perceived Behavioral Control Scale, and the Behavioral Intention Scale were selected in the second step (OR = 0.67-0.72, p<0.01; -2LL = 160.99, p<0.001). Significant moderating effects between some TPB-related variables (Positive Condom Use Attitude Scale and Behavioral Intention Scale) and duration of sex work and duration of drug use were also reported. The results highlight the potential of using the TPB to better understand condom use behaviors among IDUFSWs in China. Theory-based research and intervention work should be developed in China in the future. PMID:20024752

  20. Micromechanics of metal matrix composites using the Generalized Method of Cells model (GMC) user's guide

    NASA Technical Reports Server (NTRS)

    Aboudi, Jacob; Pindera, Marek-Jerzy

    1992-01-01

    A user's guide for the program gmc.f is presented. The program is based on the generalized method of cells model (GMC), which is capable, via a micromechanical analysis, of predicting the overall, inelastic behavior of unidirectional, multi-phase composites from knowledge of the properties of the viscoplastic constituents. In particular, the program is sufficiently general to predict the response of unidirectional composites having variable fiber shapes and arrays.

  1. Streamflow forecasting using the modular modeling system and an object-user interface

    USGS Publications Warehouse

    Jeton, A.E.

    2001-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Bureau of Reclamation (BOR), developed a computer program to provide a general framework needed to couple disparate environmental resource models and to manage the necessary data. The Object-User Interface (OUI) is a map-based interface for models and modeling data. It provides a common interface to run hydrologic models and acquire, browse, organize, and select spatial and temporal data. One application is to assist river managers in utilizing streamflow forecasts generated with the Precipitation-Runoff Modeling System running in the Modular Modeling System (MMS), a distributed-parameter watershed model, and the National Weather Service Extended Streamflow Prediction (ESP) methodology.

  2. Strong coupling theory for interacting lattice models

    NASA Astrophysics Data System (ADS)

    Stanescu, Tudor D.; Kotliar, Gabriel

    2004-11-01

    We develop a strong coupling approach for a general lattice problem. We argue that this strong coupling perspective represents the natural framework for a generalization of the dynamical mean field theory (DMFT). The main result of this analysis is twofold: (1) It provides the tools for a unified treatment of any nonlocal contribution to the Hamiltonian. Within our scheme, nonlocal terms such as hopping terms, spin-spin interactions, or nonlocal Coulomb interactions are treated on equal footing. (2) By performing a detailed strong-coupling analysis of a generalized lattice problem, we establish the basis for possible clean and systematic extensions beyond DMFT. To this end, we study the problem using three different perspectives. First, we develop a generalized expansion around the atomic limit in terms of the coupling constants for the nonlocal contributions to the Hamiltonian. By analyzing the diagrammatics associated with this expansion, we establish the equations for a generalized dynamical mean-field theory. Second, we formulate the theory in terms of a generalized strong coupling version of the Baym-Kadanoff functional. Third, following Pairault, Sénéchal, and Tremblay [Phys. Rev. Lett. 80, 5389 (1998)], we present our scheme in the language of a perturbation theory for canonical fermionic and bosonic fields and we establish the interpretation of various strong coupling quantities within a standard perturbative picture.

  3. Hawaii demand-side management resource assessment. Final report, Reference Volume 5: The DOETRAN user`s manual; The DOE-2/DBEDT DSM forecasting model interface

    SciTech Connect

    1995-04-01

    The DOETRAN model is a DSM database manager, developed to act as an intermediary between the whole building energy simulation model, DOE-2, and the DBEDT DSM Forecasting Model. DOETRAN accepts output data from DOE-2 and TRANslates it into the format required by the forecasting model. DOETRAN operates in the Windows environment and was developed using the relational database management software, Paradox 5.0 for Windows. It is not necessary to have any knowledge of Paradox to use DOETRAN. DOETRAN utilizes the powerful database manager capabilities of Paradox through a series of customized user-friendly windows displaying buttons and menus with simple and clear functions. The DOETRAN model performs three basic functions, with an optional fourth. The first function is to configure the user`s computer for DOETRAN. The second function is to import DOE-2 files with energy and loadshape data for each building type. The third main function is to process the data into the forecasting model format. As DOETRAN processes the DOE-2 data, graphs of the total electric monthly impacts for each DSM measure appear, providing the user with a visual means of inspecting DOE-2 data as well as following program execution. DOETRAN provides three tables for each building type for the forecasting model: one each for electric measures, gas measures, and base cases. The optional fourth function provided by DOETRAN is to view graphs of total electric annual impacts by measure. This last option allows a comparative view of how one measure rates against another. A section in this manual is devoted to each of the four functions mentioned above, as well as to computer requirements and exiting DOETRAN.

  4. Modeling transonic aerodynamic response using nonlinear systems theory for use with modern control theory

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    1993-01-01

    The presentation begins with a brief description of the motivation and approach that has been taken for this research. This will be followed by a description of the Volterra Theory of Nonlinear Systems and the CAP-TSD code which is an aeroelastic, transonic CFD (Computational Fluid Dynamics) code. The application of the Volterra theory to a CFD model and, more specifically, to a CAP-TSD model of a rectangular wing with a NACA 0012 airfoil section will be presented.

  5. MFIX documentation: User`s manual

    SciTech Connect

    Syamlal, M.

    1994-11-01

    MFIX (Multiphase Flow with Interphase exchanges) is a general-purpose hydro-dynamic model for describing chemical reactions and heat transfer in dense or dilute fluid-solids flows, which typically occur in energy conversion and chemical processing reactors. MFIX calculations give time-dependent information on pressure, temperature, composition, and velocity distributions in the reactors. The theoretical basis of the calculations is described in the MFIX Theory Guide. This report, which is the MFIX User`s Manual, gives an overview of the numerical technique, and describes how to install the MFIX code and post-processing codes, set up data files and run MFIX, graphically analyze MFIX results, and retrieve data from the output files. Two tutorial problems that highlight various features of MFIX are also discussed.

  6. Psycholinguistic Theory of Learning to Read Compared to the Traditional Theory Model.

    ERIC Educational Resources Information Center

    Murphy, Robert F.

    A comparison of two models of the reading process--the psycholinguistic model, in which learning to read is seen as a top-down, holistic procedure, and the traditional theory model, in which learning to read is seen as a bottom-up, atomistic procedure--is provided in this paper. The first part of the paper provides brief overviews of the following…

  7. Labor Market Projections Model: a user's guide to the population, labor force, and unemployment projections model at Lawrence Berkeley Laboratory

    SciTech Connect

    Schroeder, E.

    1980-08-01

    In an effort to assist SESA analysts and CETA prime sponsor planners in the development of labor-market information suitable to their annual plans, the Labor Market Projections Model (LMPM) was initiated. The purpose of LMPM is to provide timely information on the demographic characteristics of local populations, labor supply, and unemployment. In particular, the model produces short-term projections of the distributions of population, labor force, and unemployment by age, sex, and race. LMPM was designed to carry out these projections at various geographic levels - counties, prime-sponsor areas, SMSAs, and states. While LMPM can project population distributions for areas without user input, the labor force and unemployment projections rely upon inputs from analysts or planners familiar with the economy of the area of interest. Thus, LMPM utilizes input from the SESA analysts. This User's Guide to LMPM was specifically written as an aid to SESA analysts and other users in improving their understanding of LMPM. The basic method of LMPM is a demographic cohort aging model that relies upon 1970 Census data. LMPM integrates data from several sources in order to produce current projections from the 1970 baseline for all the local areas of the nation. This User's Guide documents the procedures, data, and output of LMPM. 11 references.
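
    The demographic core of such a model, a cohort-component step that ages the population, applies survival rates, and adds new entrants, can be written in a few lines. The sketch below uses invented numbers and is a generic illustration of the technique, not LMPM code.

        # Hedged sketch of one cohort-aging step; survival rates and entrants
        # are invented for illustration only.
        import numpy as np

        population = np.array([100.0, 95.0, 90.0, 80.0, 60.0])    # persons per age group
        survival   = np.array([0.999, 0.998, 0.995, 0.985, 0.95]) # per-period survival rates
        entrants   = 12.0                                          # new members of youngest group

        def age_one_period(pop, surv, new_entrants):
            """Shift each cohort up one age group with survival applied;
            the oldest group is open-ended; entrants fill the youngest group."""
            aged = np.zeros_like(pop)
            aged[1:] = pop[:-1] * surv[:-1]
            aged[-1] += pop[-1] * surv[-1]
            aged[0] = new_entrants
            return aged

        print(age_one_period(population, survival, entrants).round(1))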

  8. Freight Network Modeling System. Volume IV. Shortest-Path Analysis and Display user's guide

    SciTech Connect

    Not Available

    1985-04-01

    The Freight Network Modeling System (FNEM) is a general and flexible modeling system designed to have wide applicability to a variety of freight transportation analyses. The system consists of compatible network data bases, data management software, models of freight transportation, report generators, and graphics output. In many studies, a model as comprehensive as FNEM is not required. The second model, Shortest-Path Analysis and Display (SPAD), is a simpler model that optimizes routings of single shipments. The routing criteria that can be used are numerous - including minimizing cost, minimizing delay, minimizing population exposure (useful when considering shipments of hazardous materials), and minimizing accident risk. In addition to the above criteria, the routes can also be restricted to those with clearance for oversized loads or with sufficient load capabilities. SPAD can be used interactively and the routes can be displayed graphically. This volume contains a user's guide for SPAD including preprocessor programs and SPAD execution. 7 figs., 19 tabs.
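
    Routing a single shipment under different criteria amounts to running a shortest-path search with different edge weights. The sketch below is a generic illustration of that idea on an invented network using the networkx library; it is not SPAD code, and the weights are placeholders.

        # Hedged sketch: one invented network, two routing criteria.
        import networkx as nx

        G = nx.Graph()
        #                            cost ($K)  exposure (people near link)
        G.add_edge("origin", "A",    cost=4,    exposure=1000)
        G.add_edge("origin", "B",    cost=2,    exposure=9000)
        G.add_edge("A",      "dest", cost=3,    exposure=1500)
        G.add_edge("B",      "dest", cost=3,    exposure=8000)

        cheapest = nx.shortest_path(G, "origin", "dest", weight="cost")
        safest   = nx.shortest_path(G, "origin", "dest", weight="exposure")

        print("minimum-cost route:    ", cheapest)  # via B, total cost 5
        print("minimum-exposure route:", safest)    # via A, total exposure 2500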

  9. Theory and model use in social marketing health interventions.

    PubMed

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmarks criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions. PMID:22934539

  10. Mechanical Regulation of Bone Regeneration: Theories, Models, and Experiments

    PubMed Central

    Betts, Duncan Colin; Müller, Ralph

    2014-01-01

    How mechanical forces influence the regeneration of bone remains an open question. Their effect has been demonstrated experimentally, which has allowed mathematical theories of mechanically driven tissue differentiation to be developed. Many simulations driven by these theories have been presented; however, validation of these models has remained difficult due to the number of independent parameters considered. An overview of these theories and models is presented along with a review of experimental studies and the factors they consider. Finally, limitations of current experimental data and their influence on modeling are discussed, and potential solutions are proposed. PMID:25540637

  11. Mars Global Reference Atmospheric Model 2000 Version (Mars-GRAM 2000): Users Guide

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; James, B. F.

    2000-01-01

    This report presents Mars Global Reference Atmospheric Model 2000 Version (Mars-GRAM 2000) and its new features. All parameterizations for temperature, pressure, density, and winds versus height, latitude, longitude, time of day, and L(sub s) have been replaced by input data tables from NASA Ames Mars General Circulation Model (MGCM) for the surface through 80-km altitude and the University of Arizona Mars Thermospheric General Circulation Model (MTGCM) for 80 to 170 km. A modified Stewart thermospheric model is still used for higher altitudes and for dependence on solar activity. "Climate factors" to tune for agreement with GCM data are no longer needed. Adjustment of exospheric temperature is still an option. Consistent with observations from Mars Global Surveyor, a new longitude-dependent wave model is included with user input to specify waves having 1 to 3 wavelengths around the planet. A simplified perturbation model has been substituted for the earlier one. An input switch allows users to select either East or West longitude positive. This memorandum includes instructions on obtaining Mars-GRAM source code and data files and for running the program. It also provides sample input and output and an example for incorporating Mars-GRAM as an atmospheric subroutine in a trajectory code.

  12. User Guide for VISION 3.4.7 (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Wendell D. Hintze

    2011-07-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters and options; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating 'what if' scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., 'reactor types' not individual reactors and 'separation types' not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separation or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. VISION is comprised of several Microsoft Excel input files, a Powersim Studio core, and several Microsoft Excel output files. All must be co-located in the same folder on a PC to function. You must use Powersim Studio 8 or better. We have tested VISION with the Studio 8 Expert, Executive, and Education versions. The Expert and Education

  13. User-driven Cloud Implementation of environmental models and data for all

    NASA Astrophysics Data System (ADS)

    Gurney, R. J.; Percy, B. J.; Elkhatib, Y.; Blair, G. S.

    2014-12-01

    Environmental data and models come from disparate sources over a variety of geographical and temporal scales with different resolutions and data standards, often including terabytes of data and model simulations. Unfortunately, these data and models tend to remain solely within the custody of the private and public organisations which create the data, and the scientists who build models and generate results. Although many models and datasets are theoretically available to others, the lack of ease of access tends to keep them out of reach of many. We have developed an intuitive web-based tool that utilises environmental models and datasets located in a cloud to produce results that are appropriate to the user. Storyboards showing the interfaces and visualisations have been created for each of several exemplars. A library of virtual machine images has been prepared to serve these exemplars. Each virtual machine image has been tailored to run computer models appropriate to the end user. Two approaches have been used: first, RESTful web services conforming to the Open Geospatial Consortium (OGC) Web Processing Service (WPS) interface standard, using the Python-based PyWPS; second, a MySQL database interrogated using PHP code. In all cases, the web client sends the server an HTTP GET request to execute the process with a number of parameter values and, once execution terminates, an XML or JSON response is sent back and parsed at the client side to extract the results. All web services are stateless, i.e. application state is not maintained by the server, reducing its operational overheads and simplifying infrastructure management tasks such as load balancing and failure recovery. A hybrid cloud solution has been used with models and data sited on both private and public clouds. The storyboards have been transformed into intuitive web interfaces at the client side using HTML, CSS and JavaScript, utilising plug-ins such as jQuery and Flot (for graphics), and Google Maps
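
    The request/response pattern described above (a stateless HTTP GET carrying parameter values, answered by XML or JSON parsed at the client) can be sketched as follows; the endpoint URL, process identifier and parameters are hypothetical placeholders rather than the project's actual services.

        # Hedged sketch of the stateless WPS-style exchange; all names are hypothetical.
        import requests

        params = {
            "service": "WPS",
            "version": "1.0.0",
            "request": "Execute",
            "identifier": "run_catchment_model",                # hypothetical process
            "datainputs": "rainfall_scenario=wet;year=2020",    # hypothetical inputs
        }
        response = requests.get("https://example.org/wps", params=params, timeout=60)
        response.raise_for_status()

        # The server may return XML (per the OGC WPS standard) or JSON; assume JSON here.
        results = response.json()
        print(results.get("status"), results.get("outputs"))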

  14. Scaling theory of depinning in the Sneppen model

    SciTech Connect

    Maslov, S.; Paczuski, M. (Department of Physics, State University of New York at Stony Brook, Stony Brook, New York 11790; The Isaac Newton Institute for Mathematical Sciences, 20 Clarkson Road, Cambridge CB4 0EH)

    1994-08-01

    We develop a scaling theory for the critical depinning behavior of the Sneppen interface model [Phys. Rev. Lett. 69, 3539 (1992)]. This theory is based on a "gap" equation that describes the self-organization process to a critical state of the depinning transition. All of the critical exponents can be expressed in terms of two independent exponents, ν∥
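
    For orientation, the "gap" equation governing self-organization to the critical state is commonly written for extremal models of this family (following Paczuski, Maslov, and Bak) in the form below; this is background, not necessarily the exact expression used in the cited paper.

        % Gap equation for the approach to the self-organized critical state:
        % G(s) is the gap after s update steps, N the system size, and
        % <S>_G the average avalanche size at gap G.
        \frac{dG(s)}{ds} \;=\; \frac{1 - G(s)}{N\,\langle S \rangle_{G(s)}}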

  15. Reframing Leadership Pedagogy through Model and Theory Building.

    ERIC Educational Resources Information Center

    Mello, Jeffrey A.

    1999-01-01

    Leadership theories formed the basis of a course assignment with four objectives: understanding complex factors affecting leadership dynamics, developing abilities to assess organizational factors influencing leadership, practicing model and theory building, and viewing leadership from a multicultural perspective. The assignment was to develop a…

  16. Spin Kinetic Models of Plasmas - Semiclassical and Quantum Mechanical Theory

    SciTech Connect

    Brodin, Gert; Marklund, Mattias; Zamanian, Jens

    2009-11-10

    In this work a recently published semiclassical spin kinetic model, generalizing those of previous authors, is discussed. Some previously described properties are reviewed, and a new example illustrating the theory is presented. The generalization to a fully quantum mechanical description is discussed, and the main features of such a theory are outlined. Finally, the main conclusions are presented.

  17. A user's manual for the method of moments Aircraft Modeling Code (AMC)

    NASA Technical Reports Server (NTRS)

    Peters, M. E.; Newman, E. H.

    1989-01-01

    This report serves as a user's manual for the Aircraft Modeling Code or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. The input command language is described and several examples which illustrate typical code inputs and outputs are also included.

  18. A users manual for the method of moments Aircraft Modeling Code (AMC), version 2

    NASA Technical Reports Server (NTRS)

    Peters, M. E.; Newman, E. H.

    1994-01-01

    This report serves as a user's manual for Version 2 of the 'Aircraft Modeling Code' or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. This report describes the input command language and also includes several examples which illustrate typical code inputs and outputs.

  19. Optimal water allocation in small hydropower plants between traditional and non-traditional water users: merging theory and existing practices.

    NASA Astrophysics Data System (ADS)

    Gorla, Lorenzo; Crouzy, Benoît; Perona, Paolo

    2014-05-01

    Water demand for hydropower production is increasing, together with awareness of the importance of riparian ecosystems and biodiversity. Some cantons in Switzerland and other alpine regions in Austria and Süd Tiröl (Italy) have started replacing the inadequate concept of Minimum Flow Requirement (MFR) with a dynamic one, releasing a fixed percentage of the total inflow (e.g. 25%) to the environment. Starting from a model proposed by Perona et al. (2013) and the need to include the environment as an actual water user, we arrived at similar qualitative results and better quantitative performance. In this paper we explore the space of non-proportional water repartition rules analysed by Gorla and Perona (2013), and we propose new ecological indicators which are directly derived from current ecological evaluation practices (fish habitat modelling and hydrological alteration). We demonstrate that both the MFR water redistribution policy and proportional repartition rules can be improved using nothing but available information. Furthermore, all water redistribution policies can be described by the model proposed by Perona et al. (2013) in terms of the Principle of Equal Marginal Utility (PEMU) and a suitable class of nonlinear functions. This is particularly useful for highlighting implicit assumptions and choosing best-compromise solutions, providing analytical reasons why efficiency cannot be attained by classic repartition rules. Each water repartition policy implies a monetization of the ecosystem, and a political choice always has to be made. We make explicit the value of ecosystem health underlying each policy by means of the PEMU under a few assumptions, and discuss how the theoretically efficient redistribution law obtained by our approach is feasible and does not imply high costs or advanced management tools. For small run-of-river power plants, this methodology answers the question "how much water should be left to the river?" and is therefore a
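
    The Principle of Equal Marginal Utility invoked above can be stated compactly. The formulation below is a generic statement of the optimality condition for splitting the instantaneous inflow between hydropower and the environment; the utility functions are illustrative and not the specific forms used by Perona et al. (2013).

        % Generic PEMU statement: split the inflow Q into a turbined share q_h and
        % an environmental release q_e so that marginal utilities are equal.
        \max_{q_h,\,q_e}\; U_h(q_h) + U_e(q_e)
        \quad \text{s.t.} \quad q_h + q_e = Q
        \qquad \Longrightarrow \qquad
        \frac{\partial U_h}{\partial q_h} \;=\; \frac{\partial U_e}{\partial q_e}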

  20. A Comprehensive and Systematic Model of User Evaluation of Web Search Engines: II. An Evaluation by Undergraduates.

    ERIC Educational Resources Information Center

    Su, Louise T.

    2003-01-01

    Presents an application of a model of user evaluation of four major Web search engines (Alta Vista, Excite, Infoseek, and Lycos) by undergraduates. Evaluation was based on 16 performance measures representing five evaluation criteria-relevance, efficiency, utility, user satisfaction, and connectivity. Content analysis of verbal data identified a…

  1. Case studies of mental models in home heat control: searching for feedback, valve, timer and switch theories.

    PubMed

    Revell, Kirsten M A; Stanton, Neville A

    2014-05-01

    An intergroup case study was undertaken to determine whether: 1) there exist distinct mental models of home heating function that differ significantly from the actual functioning of UK heating systems; and 2) mental models of thermostat function can be categorized according to Kempton's (1986) valve and feedback shared theories, and others from the literature. Distinct, inaccurate mental models of the heating system, as well as of thermostat devices in isolation, were described. It was possible to categorize thermostat models by Kempton's (1986) feedback shared theory, but other theories proved ambiguous. Alternate control devices could be categorized by Timer (Norman, 2002) and Switch (Peffer et al., 2011) theories. The need to consider mental models of the heating system in terms of an integrated set of control devices, and to consider users' goals and expectations of the system benefit, was highlighted. The value of discovering shared theories, and of understanding user mental models, of home heating is discussed with reference to their present-day relevance for reducing energy consumption. PMID:23731626

  2. A model of the measurement process in quantum theory

    NASA Astrophysics Data System (ADS)

    Diel, H. H.

    2015-07-01

    The so-called measurement problem of quantum theory (QT) is still lacking a satisfactory, or at least widely agreed upon, solution. A number of theories, known as interpretations of quantum theory, have been proposed and found differing acceptance among physicists. Most of the proposed theories try to explain what happens during a QT measurement using a modification of the declarative equations that define the possible results of a measurement of QT observables or by making assumptions outside the scope of falsifiable physics. This paper proposes a solution to the QT measurement problem in terms of a model of the process for the evolution of two QT systems that interact in a way that represents a measurement. The model assumes that the interactions between the measured QT object and the measurement apparatus are "normal" interactions which adhere to the laws of quantum field theory.

  3. Convergent perturbation theory for lattice models with fermions

    NASA Astrophysics Data System (ADS)

    Sazonov, V. K.

    2016-05-01

    The standard perturbation theory in QFT and lattice models leads to asymptotic expansions. However, an appropriate regularization of the path or lattice integrals allows one to construct convergent series with an infinite radius of convergence. In earlier studies, this approach was applied to purely bosonic systems. Here, using bosonization, we develop the convergent perturbation theory for a toy lattice model with interacting fermionic and bosonic fields.

  4. Conceptual development: an adaptive resonance theory model of polysemy

    NASA Astrophysics Data System (ADS)

    Dunbar, George L.

    1997-04-01

    Adaptive Resonance Theory provides a model of pattern classification that addresses the plasticity--stability dilemma and allows a neural network to detect when to construct a new category without the assistance of a supervisor. We show that Adaptive Resonance Theory can be applied to the study of natural concept development. Specifically, a model is presented which is able to categorize different usages of a common noun and group the polysemous senses appropriately.

  5. An information model to support user-centered design of medical devices.

    PubMed

    Hagedorn, Thomas J; Krishnamurty, Sundar; Grosse, Ian R

    2016-08-01

    The process of engineering design requires the product development team to balance the needs and limitations of many stakeholders, including those of the user, regulatory organizations, and the designing institution. This is particularly true in medical device design, where additional consideration must be given to a much more complex user base that can only be accessed on a limited basis. Given this inherent challenge, few projects exist that simultaneously consider design domain concepts, such as aspects of a detailed design, a detailed view of the various stakeholders and their capabilities, and the user needs. In this paper, we present a novel information model approach that combines a detailed model of design elements with a model of the design itself, customer requirements, and the capabilities of the customers themselves. The information model is used to facilitate knowledge capture and automated reasoning across domains with a minimal set of rules by adopting a terminology that treats customer- and design-specific factors identically, thus enabling straightforward assessments. A distinctive feature of this approach is that it systematically ties the key usability information that drives design decisions toward more universal or effective outcomes to the very design information affected by that usability information. This can lead to cost-efficient optimal designs based on a direct inclusion of the needs of customers alongside business, marketing, and engineering requirements. Two case studies are presented to show the method's potential as a more effective knowledge management tool with built-in automated inferences that provide design insight, as well as its overall effectiveness as a platform to develop and execute medical device design from a holistic perspective. PMID:27401857

  6. Consumer preference models: fuzzy theory approach

    NASA Astrophysics Data System (ADS)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).

  7. A Dynamic Systems Theory Model of Visual Perception Development

    ERIC Educational Resources Information Center

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  8. A Sharing Item Response Theory Model for Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Segall, Daniel O.

    2004-01-01

    A new sharing item response theory (SIRT) model is presented that explicitly models the effects of sharing item content between informants and test takers. This model is used to construct adaptive item selection and scoring rules that provide increased precision and reduced score gains in instances where sharing occurs. The adaptive item selection…
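
    As background for readers unfamiliar with the base machinery, adaptive testing models of this kind are typically built on a logistic item response function such as the standard three-parameter form below; the sharing-specific terms of the SIRT model itself are not reproduced here.

        % Standard three-parameter logistic (3PL) item response function:
        % probability of a correct response to item i for ability theta,
        % with discrimination a_i, difficulty b_i, and guessing parameter c_i.
        P_i(\theta) \;=\; c_i + \frac{1 - c_i}{1 + \exp[-a_i(\theta - b_i)]}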

  9. Bianchi class A models in Sàez-Ballester's theory

    NASA Astrophysics Data System (ADS)

    Socorro, J.; Espinoza-García, Abraham

    2012-08-01

    We apply the Sàez-Ballester (SB) theory to Bianchi class A models, with a barotropic perfect fluid in a stiff matter epoch. We obtain exact classical solutions à la Hamilton for the Bianchi type I, II and VI_{h=-1} models. We also find exact quantum solutions to all Bianchi class A models employing a particular ansatz for the wave function of the universe.

  10. Using the NIATx Model to Implement User-Centered Design of Technology for Older Adults.

    PubMed

    Gustafson, David H; Maus, Adam; Judkins, Julianne; Dinauer, Susan; Isham, Andrew; Johnson, Roberta; Landucci, Gina; Atwood, Amy K

    2016-01-01

    What models can effectively guide the creation of eHealth and mHealth technologies? This paper describes the use of the NIATx model as a framework for the user-centered design of a new technology for older adults. The NIATx model is a simple framework of process improvement based on the following principles derived from an analysis of decades of research from various industries about why some projects fail and others succeed: (1) Understand and involve the customer; (2) fix key problems; (3) pick an influential change leader; (4) get ideas from outside the field; (5) use rapid-cycle testing. This paper describes the use of these principles in technology development, the strengths and challenges of using this approach in this context, and lessons learned from the process. Overall, the NIATx model enabled us to produce a user-focused technology that the anecdotal evidence available so far suggests is engaging and useful to older adults. The first and fourth principles were especially important in developing the technology; the fourth proved the most challenging to use. PMID:27025985

  11. Using the NIATx Model to Implement User-Centered Design of Technology for Older Adults

    PubMed Central

    Maus, Adam; Judkins, Julianne; Dinauer, Susan; Isham, Andrew; Johnson, Roberta; Landucci, Gina; Atwood, Amy K

    2016-01-01

    What models can effectively guide the creation of eHealth and mHealth technologies? This paper describes the use of the NIATx model as a framework for the user-centered design of a new technology for older adults. The NIATx model is a simple framework of process improvement based on the following principles derived from an analysis of decades of research from various industries about why some projects fail and others succeed: (1) Understand and involve the customer; (2) fix key problems; (3) pick an influential change leader; (4) get ideas from outside the field; (5) use rapid-cycle testing. This paper describes the use of these principles in technology development, the strengths and challenges of using this approach in this context, and lessons learned from the process. Overall, the NIATx model enabled us to produce a user-focused technology that the anecdotal evidence available so far suggests is engaging and useful to older adults. The first and fourth principles were especially important in developing the technology; the fourth proved the most challenging to use. PMID:27025985

  12. The HADDOCK2.2 Web Server: User-Friendly Integrative Modeling of Biomolecular Complexes.

    PubMed

    van Zundert, G C P; Rodrigues, J P G L M; Trellet, M; Schmitz, C; Kastritis, P L; Karaca, E; Melquiond, A S J; van Dijk, M; de Vries, S J; Bonvin, A M J J

    2016-02-22

    The prediction of the quaternary structure of biomolecular macromolecules is of paramount importance for fundamental understanding of cellular processes and drug design. In the era of integrative structural biology, one way of increasing the accuracy of modeling methods used to predict the structure of biomolecular complexes is to include as much experimental or predictive information as possible in the process. This has been at the core of our information-driven docking approach HADDOCK. We present here the updated version 2.2 of the HADDOCK portal, which offers new features such as support for mixed molecule types, additional experimental restraints and improved protocols, all of this in a user-friendly interface. With well over 6000 registered users and 108,000 jobs served, an increasing fraction of which on grid resources, we hope that this timely upgrade will help the community to solve important biological questions and further advance the field. The HADDOCK2.2 Web server is freely accessible to non-profit users at http://haddock.science.uu.nl/services/HADDOCK2.2. PMID:26410586

  13. The monster sporadic group and a theory underlying superstring models

    SciTech Connect

    Chapline, G.

    1996-09-01

    The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13 dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs.

  14. ROADWAY: A NUMERICAL MODEL FOR PREDICTING AIR POLLUTANTS NEAR HIGHWAYS. USER'S GUIDE

    EPA Science Inventory

    ROADWAY is a finite-difference model which solves a conservation of species equation to predict pollutant concentrations within two hundred meters of a highway. It uses surface layer similarity theory to predict wind and eddy diffusion profiles from temperature at two heights and...

  15. A user-oriented and computerized model for estimating vehicle ride quality

    NASA Technical Reports Server (NTRS)

    Leatherwood, J. D.; Barker, L. M.

    1984-01-01

    A simplified empirical model and computer program for estimating passenger ride comfort within air and surface transportation systems are described. The model is based on subjective ratings from more than 3000 persons who were exposed to controlled combinations of noise and vibration in the passenger ride quality apparatus. This model has the capability of transforming individual elements of a vehicle's noise and vibration environment into subjective discomfort units and then combining the subjective units to produce a single discomfort index typifying passenger acceptance of the environment. The computational procedures required to obtain discomfort estimates are discussed, and a user oriented ride comfort computer program is described. Examples illustrating application of the simplified model to helicopter and automobile ride environments are presented.

  16. Mathematical Modelling and New Theories of Learning.

    ERIC Educational Resources Information Center

    Boaler, Jo

    2001-01-01

    Demonstrates the importance of expanding notions of learning beyond knowledge to the practices in mathematics classrooms. Considers a three-year study of students who learned through mathematical modeling. Shows that a modeling approach encouraged the development of a range of important practices in addition to knowledge that were useful in real…

  17. Baldrige Theory into Practice: A Generic Model

    ERIC Educational Resources Information Center

    Arif, Mohammed

    2007-01-01

    Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer centric system. Malcolm Baldrige Quality Award (MBQA) model happens to be one of the latest additions to the pull based models. The purpose of this paper is to develop a generic framework for MBQA that can be used by…

  18. The Family FIRO Model: The Integration of Group Theory and Family Theory.

    ERIC Educational Resources Information Center

    Colangelo, Nicholas; Doherty, William J.

    1988-01-01

    Presents the Family Fundamental Interpersonal Relations Orientation (Family FIRO) Model, an integration of small-group theory and family therapy. The model is offered as a framework for organizing family issues. Discusses three fundamental issues of human relatedness and their applicability to group dynamics. (Author/NB)

  19. Measurement Models for Reasoned Action Theory

    PubMed Central

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-01-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach. PMID:23243315

  20. User-Friendly Predictive Modeling of Greenhouse Gas (GHG) Fluxes and Carbon Storage in Tidal Wetlands

    NASA Astrophysics Data System (ADS)

    Ishtiaq, K. S.; Abdul-Aziz, O. I.

    2015-12-01

    We developed user-friendly empirical models to predict instantaneous fluxes of CO2 and CH4 from coastal wetlands based on a small set of dominant hydro-climatic and environmental drivers (e.g., photosynthetically active radiation, soil temperature, water depth, and soil salinity). The dominant predictor variables were systematically identified by applying a robust data-analytics framework to a wide range of possible environmental variables driving wetland greenhouse gas (GHG) fluxes. The method comprised a multi-layered data-analytics framework, including Pearson correlation analysis, explanatory principal component and factor analyses, and partial least squares regression modeling. The identified dominant predictors were finally utilized to develop power-law based non-linear regression models to predict CO2 and CH4 fluxes under different climatic, land use (nitrogen gradient), tidal hydrology and salinity conditions. Four different tidal wetlands of Waquoit Bay, MA were considered as the case study sites to identify the dominant drivers and evaluate model performance. The study sites were dominated by native Spartina alterniflora and characterized by frequent flooding and highly saline conditions. The model estimated the potential net ecosystem carbon balance (NECB) both in g C/m2 and metric tons C/hectare by up-scaling the instantaneous predicted fluxes to the growing season and accounting for the lateral C flux exchanges between the wetlands and the estuary. The entire model is presented in a single Excel spreadsheet as a user-friendly ecological engineering tool. The model can aid the development of appropriate GHG offset protocols for setting monitoring plans for tidal wetland restoration and maintenance projects. The model can also be used to estimate wetland GHG fluxes and potential carbon storage under various IPCC climate change and sea level rise scenarios, facilitating an appropriate management of carbon stocks in tidal wetlands and their incorporation into a
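
    A power-law regression of instantaneous fluxes on a small set of drivers, as described above, can be prototyped in a few lines. The sketch below fits such a model to synthetic data with illustrative variable names; it is not the published spreadsheet model or the Waquoit Bay data.

        # Hedged sketch: fit flux = a * PAR^b * Tsoil^c * depth^d to synthetic data.
        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(0)
        par   = rng.uniform(1, 100, 50)     # photosynthetically active radiation (arbitrary units)
        tsoil = rng.uniform(5, 30, 50)      # soil temperature (deg C)
        depth = rng.uniform(0.1, 1.0, 50)   # water depth (m)
        flux  = 0.5 * par**0.8 * tsoil**0.3 * depth**-0.2 * rng.lognormal(0, 0.1, 50)

        def power_law(X, a, b, c, d):
            p, t, h = X
            return a * p**b * t**c * h**d

        params, _ = curve_fit(power_law, (par, tsoil, depth), flux, p0=[1, 1, 1, 1])
        print("fitted coefficients (a, b, c, d):", np.round(params, 2))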

  1. Understanding Rasch Measurement: The Rasch Model, Additive Conjoint Measurement, and New Models of Probabilistic Measurement Theory.

    ERIC Educational Resources Information Center

    Karabatsos, George

    2001-01-01

    Describes similarities and differences between additive conjoint measurement and the Rasch model, and formalizes some new nonparametric item response models that are, in a sense, probabilistic measurement theory models. Applies these new models to published and simulated data. (SLD)

  2. SALMOD: a population model for salmonids: user's manual. Version W3

    USGS Publications Warehouse

    Bartholow, John; Heasley, John; Laake, Jeff; Sandelin, Jeff; Coughlan, Beth A.K.; Moos, Alan

    2002-01-01

    SALMOD is a computer model that simulates the dynamics of freshwater salmonid populations, both anadromous and resident. The conceptual model was developed in a workshop setting (Williamson et al. 1993) with fish experts concerned with Trinity River chinook restoration. The model builds on the foundation laid by similar models (see Cheslak and Jacobson 1990). The model's premise is that egg and fish mortality are directly related to spatially and temporally variable micro- and macrohabitat limitations, which themselves are related to the timing and amount of streamflow and other meteorological variables. Habitat quality and capacity are characterized by the hydraulic and thermal properties of individual mesohabitats, which we use as spatial "computation units" in the model. The model tracks a population of spatially distinct cohorts that originate as eggs and grow from one life stage to another as a function of local water temperature. Individual cohorts either remain in the computational unit in which they emerged or move, in whole or in part, to nearby units (see McCormick et al. 1998). Model processes include spawning (with redd superimposition and incubation losses), growth (including egg maturation), mortality, and movement (freshet-induced, habitat-induced, and seasonal). Model processes are implemented such that the user (modeler) has the ability to more-or-less program the model on the fly to create the dynamics thought to animate the population. SALMOD then tabulates the various causes of mortality and the whereabouts of fish.

  3. Documentation, User Support, and Verification of Wind Turbine and Plant Models

    SciTech Connect

    Robert Zavadil; Vadim Zheglov; Yuriy Kazachkov; Bo Gong; Juan Sanchez; Jun Li

    2012-09-18

    As part of the Utility Wind Energy Integration Group (UWIG) and EnerNex's Wind Turbine Modeling Project, EnerNex has received ARRA (federal stimulus) funding through the Department of Energy (DOE) to further the progress of wind turbine and wind plant models. Despite the large existing and planned wind generation deployment, industry-standard models for wind generation have not been formally adopted. Models commonly provided for interconnection studies are not adequate for use in general transmission planning studies, where public, non-proprietary, documented and validated models are needed. NERC MOD (North American Electric Reliability Corporation) reliability standards require that power flow and dynamics models be provided, in accordance with regional requirements and procedures. The goal of this project is to accelerate the appropriate use of generic wind turbine models for transmission network analysis by: (1) Defining proposed enhancements to the generic wind turbine model structures that would allow representation of more advanced capabilities; (2) Comparative testing of the generic models against more detailed (and sometimes proprietary) versions developed by turbine vendors; (3) Developing recommended parameters for the generic models to best mimic the performance of specific commercial wind turbines; (4) Documenting results of the comparative simulations in an application guide for users; (5) Conducting technology transfer activities in regional workshops for dissemination of knowledge and information gained, and to engage electric power and wind industry personnel in the project while underway; (6) Designing a "living" homepage to establish an online resource for transmission planners.

  4. The theory research of multi-user quantum access network with Measurement Device Independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Ji, Yi-Ming; Li, Yun-Xia; Shi, Lei; Meng, Wen; Cui, Shu-Min; Xu, Zhen-Yu

    2015-10-01

    A quantum access network cannot guarantee the absolute security of the multi-user detector, and an eavesdropper can gain access to key information through time-shift attacks and other means. Measurement-device-independent quantum key distribution is immune to all detection attacks and accomplishes the safe sharing of quantum keys. In this paper, the application of measurement-device-independent quantum key distribution to multi-user quantum access networks is investigated. By adopting time-division multiplexing to share the multi-user detector, the system structure is simplified and the security of quantum key sharing is ensured.

  5. Homogeneous cosmological models in Yang's gravitation theory

    NASA Technical Reports Server (NTRS)

    Fennelly, A. J.; Pavelle, R.

    1979-01-01

    We present a dynamic, spatially homogeneous solution of Yang's pure space gravitational field equations which is non-Einsteinian. The predictions of this cosmological model seem to be at variance with observations.

  6. Improving Vortex Models via Optimal Control Theory

    NASA Astrophysics Data System (ADS)

    Hemati, Maziar; Eldredge, Jeff; Speyer, Jason

    2012-11-01

    Flapping wing kinematics, common in biological flight, can allow for agile flight maneuvers. On the other hand, we currently lack sufficiently accurate low-order models that enable such agility in man-made micro air vehicles. Low-order point vortex models have had reasonable success in predicting the qualitative behavior of the aerodynamic forces resulting from such maneuvers. However, these models tend to over-predict the force response when compared to experiments and high-fidelity simulations, in part because they neglect small excursions of separation from the wing's edges. In the present study, we formulate a constrained minimization problem which allows us to relax the usual edge regularity conditions in favor of empirical determination of vortex strengths. The optimal vortex strengths are determined by minimizing the error with respect to empirical force data, while the vortex positions are constrained to evolve according to the impulse matching model developed in previous work. We consider a flat plate undergoing various canonical maneuvers. The optimized model leads to force predictions remarkably close to the empirical data. Additionally, we compare the optimized and original models in an effort to distill appropriate edge conditions for unsteady maneuvers.
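
    The constrained minimization described above, choosing vortex strengths to minimize the mismatch with empirical force data while the positions evolve under a dynamical model, can be sketched generically. The force model and data below are simple stand-ins, not the impulse-matching equations of the study.

        # Hedged sketch: least-squares choice of vortex strengths against a
        # placeholder force history; predicted_force() stands in for the real model.
        import numpy as np
        from scipy.optimize import least_squares

        t = np.linspace(0.0, 1.0, 50)
        empirical_force = 1.2 * np.sin(np.pi * t)          # stand-in for measured lift

        def predicted_force(strengths, time):
            """Placeholder: two prescribed force modes weighted by unknown strengths."""
            g1, g2 = strengths
            return g1 * np.sin(np.pi * time) + g2 * np.sin(2 * np.pi * time)

        def residuals(strengths):
            return predicted_force(strengths, t) - empirical_force

        fit = least_squares(residuals, x0=[1.0, 0.0])
        print("optimal strengths:", fit.x.round(3))         # approximately [1.2, 0.0]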

  7. MT3D: a 3 dimensional magnetotelluric modeling program (user's guide and documentation for Rev. 1)

    SciTech Connect

    Nutter, C.; Wannamaker, P.E.

    1980-11-01

    MT3D.REV1 is a non-interactive computer program written in FORTRAN to do 3-dimensional magnetotelluric modeling. A 3-D volume integral equation has been adapted to simulate the MT response of a 3D body in the earth. An integro-difference scheme has been incorporated to increase the accuracy. This is a user's guide for MT3D.REV1 on the University of Utah Research Institute's (UURI) PRIME 400 computer operating under PRIMOS IV, Rev. 17.

  8. Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1994-01-01

    An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.

  9. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) coding has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (Java, C++, Python, etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools, etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps that are carried out repeatedly for the same study region. The developed tool is therefore user-friendly and can be used efficiently for these repetitive processes, reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
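
    A minimal QGIS Python plugin entry point of the kind the article builds on looks roughly like the sketch below; the class name, the toolbar action, and the placeholder run() body are hypothetical, and the real tool's data-assimilation code is not shown.

        # Hedged sketch of a minimal QGIS (v3) Python plugin; names are hypothetical.
        # QGIS discovers a plugin through classFactory() in the package __init__.py.
        from qgis.PyQt.QtWidgets import QAction, QMessageBox

        def classFactory(iface):
            return DataAssimilationPlugin(iface)

        class DataAssimilationPlugin:
            def __init__(self, iface):
                self.iface = iface            # reference to the running QGIS interface
                self.action = None

            def initGui(self):
                # Add a toolbar button that triggers the (placeholder) assimilation step.
                self.action = QAction("Assimilate gridded data", self.iface.mainWindow())
                self.action.triggered.connect(self.run)
                self.iface.addToolBarIcon(self.action)

            def unload(self):
                self.iface.removeToolBarIcon(self.action)

            def run(self):
                # Placeholder for reading rainfall, temperature and soil grids and
                # writing model-ready input files.
                QMessageBox.information(self.iface.mainWindow(), "Demo",
                                        "Data assimilation step would run here.")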

  10. 3DFATMIC: THREE DIMENSIONAL SUBSURFACE FLOW, FATE AND TRANSPORT OF MICROBES AND CHEMICALS MODEL - USER'S MANUAL VERSION 1.0

    EPA Science Inventory

    This document is the user's manual of 3DFATMIC, a 3-Dimensional Subsurface Flow, Fate and Transport of Microbes and Chemicals Model using a Lagrangian-Eulerian adapted zooming and peak capturing (LEZOOMPC) algorithm.

  11. A user-friendly model for spray drying to aid pharmaceutical product development.

    PubMed

    Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L J; Frijlink, Henderik W

    2013-01-01

    The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach. PMID:24040240
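
    The kind of mass-and-energy balance behind such a spreadsheet model can be illustrated with a simplified calculation. The sketch below assumes adiabatic operation and complete evaporation of the feed water, with approximate property values; it is a generic illustration, not the published model.

        # Hedged sketch: lumped energy balance for outlet air temperature, assuming
        # all feed water evaporates and there are no heat losses.
        CP_AIR = 1.0e3      # J/(kg K), specific heat of drying air (approximate)
        H_VAP  = 2.26e6     # J/kg, latent heat of vaporization of water (approximate)

        def outlet_temperature(air_flow_kg_s, t_inlet_c, feed_water_kg_s):
            """Heat given up by the drying air equals heat used for evaporation."""
            evaporation_heat = feed_water_kg_s * H_VAP
            delta_t = evaporation_heat / (air_flow_kg_s * CP_AIR)
            return t_inlet_c - delta_t

        # Example: 0.01 kg/s of air at 150 C, feed delivering 0.0002 kg/s of water
        print(round(outlet_temperature(0.01, 150.0, 0.0002), 1), "deg C at the outlet")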

  12. A User-Friendly Model for Spray Drying to Aid Pharmaceutical Product Development

    PubMed Central

    Grasmeijer, Niels; de Waard, Hans; Hinrichs, Wouter L. J.; Frijlink, Henderik W.

    2013-01-01

    The aim of this study was to develop a user-friendly model for spray drying that can aid in the development of a pharmaceutical product, by shifting from a trial-and-error towards a quality-by-design approach. To achieve this, a spray dryer model was developed in commercial and open source spreadsheet software. The output of the model was first fitted to the experimental output of a Büchi B-290 spray dryer and subsequently validated. The predicted outlet temperatures of the spray dryer model matched the experimental values very well over the entire range of spray dryer settings that were tested. Finally, the model was applied to produce glassy sugars by spray drying, an often used excipient in formulations of biopharmaceuticals. For the production of glassy sugars, the model was extended to predict the relative humidity at the outlet, which is not measured in the spray dryer by default. This extended model was then successfully used to predict whether specific settings were suitable for producing glassy trehalose and inulin by spray drying. In conclusion, a spray dryer model was developed that is able to predict the output parameters of the spray drying process. The model can aid the development of spray dried pharmaceutical products by shifting from a trial-and-error towards a quality-by-design approach. PMID:24040240

  13. User's guide for a personal computer model of turbulence at a wind turbine rotor

    NASA Astrophysics Data System (ADS)

    Connell, J. R.; Powell, D. C.; Gower, G. L.

    1989-08-01

    This document is primarily: (1) a user's guide for the personal computer (PC) version of the code for the PNL computational model of the rotationally sampled wind speed (RODASIM11), and (2) a brief guide to the growing literature on the subject of rotationally sampled turbulence, from which the model is derived. The model generates values of turbulence experienced by single points fixed in the rotating frame of reference of an arbitrary wind turbine blade. The character of the turbulence depends on the specification of mean wind speed, the variance of turbulence, the crosswind and along-wind integral scales of turbulence, mean wind shear, and the hub height, radius, and angular speed of rotation of any point at which wind fluctuation is to be calculated.

  14. User's guide for a personal computer model of turbulence at a wind turbine rotor

    SciTech Connect

    Connell, J.R.; Powell, D.C.; Gower, G.L.

    1989-08-01

    This document is primarily (1) a user's guide for the personal computer (PC) version of the code for the PNL computational model of the rotationally sampled wind speed (RODASIM11) and (2) a brief guide to the growing literature on the subject of rotationally sampled turbulence, from which the model is derived. The model generates values of turbulence experienced by single points fixed in the rotating frame of reference of an arbitrary wind turbine blade. The character of the turbulence depends on the specification of mean wind speed, the variance of turbulence, the crosswind and along-wind integral scales of turbulence, mean wind shear, and the hub height, radius, and angular speed of rotation of any point at which wind fluctuation is to be calculated. 13 refs., 4 figs., 4 tabs.
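
    To make the idea of rotational sampling concrete, the sketch below evaluates a sheared mean wind field at a point that rotates with a blade. It is a generic construction under stated assumptions (power-law shear, uncorrelated random turbulence), not the RODASIM11 algorithm, which additionally uses the specified variance and integral scales of turbulence.

        import numpy as np

        # Illustrative rotational sampling of a sheared wind field by a point on a rotating blade.

        def rotationally_sampled_speed(t, hub_height_m=30.0, radius_m=10.0,
                                       omega_rad_s=4.0, u_hub_m_s=8.0, shear_exp=0.14,
                                       sigma_u=0.8, seed=0):
            """Mean wind (power-law shear) seen at a rotating point, plus a simple random component."""
            rng = np.random.default_rng(seed)
            z = hub_height_m + radius_m * np.cos(omega_rad_s * t)  # height of the blade point
            u_mean = u_hub_m_s * (z / hub_height_m) ** shear_exp   # power-law wind shear
            turbulence = rng.normal(0.0, sigma_u, size=t.shape)    # uncorrelated surrogate for turbulence
            return u_mean + turbulence

        t = np.linspace(0.0, 10.0, 1001)
        u = rotationally_sampled_speed(t)
        print(u.mean(), u.std())  # the sampled series contains a once-per-revolution component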

  15. Coordinating the Complexity of Tools, Tasks, and Users: On Theory-Based Approaches to Authoring Tool Usability

    ERIC Educational Resources Information Center

    Murray, Tom

    2016-01-01

    Intelligent Tutoring Systems authoring tools are highly complex educational software applications used to produce highly complex software applications (i.e. ITSs). How should our assumptions about the target users (authors) impact the design of authoring tools? In this article I first reflect on the factors leading to my original 1999 article on…

  16. Scaling Users' Perceptions of Library Service Quality Using Item Response Theory: A LibQUAL+ [TM] Study

    ERIC Educational Resources Information Center

    Wei, Youhua; Thompson, Bruce; Cook, C. Colleen

    2005-01-01

    LibQUAL+[TM] data to date have not been subjected to the modern measurement theory called polytomous item response theory (IRT). The data interpreted here were collected from 42,090 participants who completed the "American English" version of the 22 core LibQUAL+[TM] items, and 12,552 participants from Australia and Europe who completed the…

  17. BIGFLOW: A numerical code for simulating flow in variably saturated, heterogeneous geologic media. Theory and user's manual, Version 1.1

    SciTech Connect

    Ababou, R.; Bagtzoglou, A.C.

    1993-06-01

    This report documents BIGFLOW 1.1, a numerical code for simulating flow in variably saturated heterogeneous geologic media. It contains the underlying mathematical and numerical models, test problems, benchmarks, and applications of the BIGFLOW code. The BIGFLOW software package is composed of a simulation and an interactive data processing code (DATAFLOW). The simulation code solves linear and nonlinear porous media flow equations based on Darcy`s law, appropriately generalized to account for 3D, deterministic, or random heterogeneity. A modified Picard Scheme is used for linearizing unsaturated flow equations, and preconditioned iterative methods are used for solving the resulting matrix systems. The data processor (DATAFLOW) allows interactive data entry, manipulation, and analysis of 3D datasets. The report contains analyses of computational performance carried out using Cray-2 and Cray-Y/MP8 supercomputers. Benchmark tests include comparisons with other independently developed codes, such as PORFLOW and CMVSFS, and with analytical or semi-analytical solutions.
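
    For reference, the modified Picard scheme mentioned in the abstract above is commonly written (following Celia et al., 1990) as a mixed-form linearization of the unsaturated flow equation; the sketch below is the generic textbook form, not necessarily the exact discretization implemented in BIGFLOW. With $\delta^{m} = h^{n+1,m+1} - h^{n+1,m}$ the head increment at Picard iteration $m$ and $C = d\theta/dh$ the specific moisture capacity,

        \[
        \theta^{n+1,m+1} \approx \theta^{n+1,m} + C\!\left(h^{n+1,m}\right)\delta^{m},
        \]
        \[
        \frac{C\!\left(h^{n+1,m}\right)\delta^{m}}{\Delta t}
        - \nabla\cdot\!\left[K\!\left(h^{n+1,m}\right)\nabla\delta^{m}\right]
        = \nabla\cdot\!\left[K\!\left(h^{n+1,m}\right)\nabla\!\left(h^{n+1,m}+z\right)\right]
        - \frac{\theta^{n+1,m}-\theta^{n}}{\Delta t},
        \]

    and the iteration over $m$ is repeated until $\lVert\delta^{m}\rVert$ falls below a prescribed tolerance.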

  18. Modeling Developmental Transitions in Adaptive Resonance Theory

    ERIC Educational Resources Information Center

    Raijmakers, Maartje E. J.; Molenaar, Peter C. M.

    2004-01-01

    Neural networks are applied to a theoretical subject in developmental psychology: modeling developmental transitions. Two issues that are involved will be discussed: discontinuities and acquiring qualitatively new knowledge. We will argue that by the appearance of a bifurcation, a neural network can show discontinuities and may acquire…

  19. Microbial community modeling using reliability theory.

    PubMed

    Zilles, Julie L; Rodríguez, Luis F; Bartolerio, Nicholas A; Kent, Angela D

    2016-08-01

    Linking microbial community composition with the corresponding ecosystem functions remains challenging. Because microbial communities can differ in their functional responses, this knowledge gap limits ecosystem assessment, design and management. To develop models that explicitly incorporate microbial populations and guide efforts to characterize their functional differences, we propose a novel approach derived from reliability engineering. This reliability modeling approach is illustrated here using a microbial ecology dataset from denitrifying bioreactors. Reliability modeling is well-suited for analyzing the stability of complex networks composed of many microbial populations. It could also be applied to evaluate the redundancy within a particular biochemical pathway in a microbial community. Reliability modeling allows characterization of the system's resilience and identification of failure-prone functional groups or biochemical steps, which can then be targeted for monitoring or enhancement. The reliability engineering approach provides a new perspective for unraveling the interactions between microbial community diversity, functional redundancy and ecosystem services, as well as practical tools for the design and management of engineered ecosystems. PMID:26882268
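
    A minimal sketch of the series-parallel reliability idea behind this approach, assuming each pathway step is carried out by functionally redundant populations (parallel) and the pathway fails if any step fails (series); the reliabilities and pathway below are hypothetical.

        import math

        def step_reliability(population_reliabilities):
            """Probability that at least one redundant population performs the step."""
            failure = math.prod(1.0 - r for r in population_reliabilities)
            return 1.0 - failure

        def pathway_reliability(steps):
            """Probability that every step of the pathway functions (series of parallel groups)."""
            return math.prod(step_reliability(populations) for populations in steps)

        # Hypothetical denitrification pathway: each inner list is one biochemical step.
        denitrification = [
            [0.9, 0.8, 0.7],   # step 1: three redundant populations
            [0.95, 0.6],       # step 2: two redundant populations
            [0.99],            # step 3: a single population -- a failure-prone bottleneck
        ]
        print(round(pathway_reliability(denitrification), 3))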

  20. Time dependent turbulence modeling and analytical theories of turbulence

    NASA Technical Reports Server (NTRS)

    Rubinstein, R.

    1993-01-01

    By simplifying the direct interaction approximation (DIA) for turbulent shear flow, time dependent formulas are derived for the Reynolds stresses which can be included in two equation models. The Green's function is treated phenomenologically, however, following Smith and Yakhot, we insist on the short and long time limits required by DIA. For small strain rates, perturbative evaluation of the correlation function yields a time dependent theory which includes normal stress effects in simple shear flows. From this standpoint, the phenomenological Launder-Reece-Rodi model is obtained by replacing the Green's function by its long time limit. Eddy damping corrections to short time behavior initiate too quickly in this model; in contrast, the present theory exhibits strong suppression of eddy damping at short times. A time dependent theory for large strain rates is proposed in which large scales are governed by rapid distortion theory while small scales are governed by Kolmogorov inertial range dynamics. At short times and large strain rates, the theory closely matches rapid distortion theory, but at long times it relaxes to an eddy damping model.

  1. Industrial Source Complex (ISC) Dispersion Model User's Guide. Second edition. Volume 1 (revised). Final report

    SciTech Connect

    Wagner, C.P.

    1987-12-01

    The Second Edition (Revised) of the Industrial Source Complex Dispersion (ISC) Model User's Guide provides a detailed technical discussion of the updated ISC Model. The ISC Model was designed in response to the need for a comprehensive set of dispersion-model computer programs that could be used to evaluate the air-quality impact of emissions from large industrial source complexes. Air-quality impact analyses for source complexes often require consideration of factors such as fugitive emissions, aerodynamic building-wake effects, time-dependent exponential decay of pollutants, gravitational settling, and dry deposition. The ISC Model consists of two computer programs that are designed to consider these and other factors so as to meet the dispersion modeling needs of air-pollution-control agencies and others responsible for performing dispersion-modeling analyses. Major features in the revised model code include: (1) a regulatory default option; (2) a CALMS processing procedure; (3) a new Urban Mode 3; (4) revised sets of wind-speed profile exponents for rural and urban scenarios.
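
    The ISC model is built on the steady-state Gaussian plume formulation; the sketch below shows that textbook core with ground reflection. The dispersion coefficients are passed in directly here as illustrative values, whereas the actual ISC code derives them from downwind distance and stability class and adds building-wake, deposition, and decay corrections.

        import numpy as np

        def plume_concentration(q_g_s, u_m_s, y_m, z_m, stack_height_m, sigma_y_m, sigma_z_m):
            """Steady-state concentration (g/m^3) downwind of a continuous point source."""
            lateral = np.exp(-0.5 * (y_m / sigma_y_m) ** 2)
            vertical = (np.exp(-0.5 * ((z_m - stack_height_m) / sigma_z_m) ** 2)
                        + np.exp(-0.5 * ((z_m + stack_height_m) / sigma_z_m) ** 2))  # ground reflection
            return q_g_s / (2.0 * np.pi * u_m_s * sigma_y_m * sigma_z_m) * lateral * vertical

        # Ground-level centerline value for an assumed 100 g/s source and assumed sigmas.
        print(plume_concentration(q_g_s=100.0, u_m_s=5.0, y_m=0.0, z_m=0.0,
                                  stack_height_m=50.0, sigma_y_m=80.0, sigma_z_m=40.0))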

  2. Technical documentation and user's guide for City-County Allocation Model (CCAM). Version 1. 0

    SciTech Connect

    Clark, L.T. Jr.; Scott, M.J.; Hammer, P.

    1986-05-01

    The City-County Allocation Model (CCAM) was developed as part of the Monitored Retrievable Storage (MRS) Program. The CCAM model was designed to allocate population changes forecasted by the MASTER model to specific local communities within commuting distance of the MRS facility, and then to forecast the potential changes in demand for key community services, such as housing, police protection, and utilities, for these communities. The CCAM model uses a flexible on-line database of demand for community services that is based on a combination of local service levels and state and national service standards. The CCAM model can be used to quickly forecast the potential community service consequences of economic development for local communities anywhere in the country. The purpose of this manual is to assist the user in understanding and operating the CCAM model; it explains the data sources for the model and the code modifications, as well as the operational procedures.

  3. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  4. Dust in fusion plasmas: theory and modeling

    SciTech Connect

    Smirnov, R. D.; Pigarov, A. Yu.; Krasheninnikov, S. I.; Mendis, D. A.; Rosenberg, M.; Rudakov, D.; Tanaka, Y.; Rognlien, T. D.; Soboleva, T. K.; Shukla, P. K.; Bray, B. D.; West, W. P.; Roquemore, A. L.; Skinner, C. H.

    2008-09-07

    Dust may have a large impact on ITER-scale plasma experiments including both safety and performance issues. However, the physics of dust in fusion plasmas is very complex and multifaceted. Here, we discuss different aspects of dust dynamics including dust-plasma, and dust-surface interactions. We consider the models of dust charging, heating, evaporation/sublimation, dust collision with material walls, etc., which are suitable for the conditions of fusion plasmas. The physical models of all these processes have been incorporated into the DUST Transport (DUSTT) code. Numerical simulations demonstrate that dust particles are very mobile and accelerate to large velocities due to the ion drag force (cruise speed >100 m/s). Deep penetration of dust particles toward the plasma core is predicted. It is shown that DUSTT is capable of reproducing many features of recent dust-related experiments, but much more work is still needed.

  5. Landfill Gas Emissions Model, version 2.0., user`s manual. Final report, September 1993--September 1997

    SciTech Connect

    Pelt, R.; Bass, R.; Heaton, R.; White, C.; Blackard, A.

    1998-05-01

    Landfill Gas Emissions Model (LandGEM) estimates air pollutant emissions from municipal solid waste (MSW) landfills. The model can be used to estimate emission rates for methane, carbon dioxide, nonmethane organic compounds, and individual air pollutants from landfills. It can also be used by landfill owners and operators to determine if a landfill is subject to the control requirements of the federal New Source Performance Standard (NSPS) for new MSW landfills or the emission guidelines for existing MSW landfills. The model is based on a first-order decay equation and can be run using site-specific data or, where no site-specific data are available, using one of two sets of default values: one based on the requirements of the NSPS and emission guidelines, and the other based on emission factors in EPA's Compilation of Air Pollutant Emission Factors, AP-42.
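
    A simplified sketch of the first-order-decay calculation behind LandGEM, treating each year's waste acceptance as a single increment (the published equation sums over sub-annual increments); the decay rate and generation potential below are the commonly cited NSPS-style defaults, used here only as assumed example values.

        import math

        def methane_generation(year, acceptance_by_year, k=0.05, l0=170.0):
            """Methane generation rate (m^3/yr) in `year` from first-order decay.

            acceptance_by_year: {placement_year: waste accepted that year (Mg)}
            k:  first-order decay rate (1/yr), assumed default
            l0: methane generation potential (m^3 CH4 per Mg waste), assumed default
            """
            q = 0.0
            for placed_year, mass_mg in acceptance_by_year.items():
                age = year - placed_year
                if age >= 0:
                    q += k * l0 * mass_mg * math.exp(-k * age)
            return q

        acceptance = {2000 + i: 50_000.0 for i in range(10)}  # 50,000 Mg/yr accepted for ten years
        print(round(methane_generation(2020, acceptance)))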

  6. Applying learning theories and instructional design models for effective instruction.

    PubMed

    Khalil, Mohammed K; Elkhider, Ihsan A

    2016-06-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory. PMID:27068989

  7. Field theory as a tool to constrain new physics models

    NASA Astrophysics Data System (ADS)

    Maas, Axel

    2015-08-01

    One of the major problems in developing new physics scenarios is that very often the parameters can be adjusted such that in perturbation theory almost all experimental low-energy results can be accommodated. It is therefore desirable to have additional constraints. Field-theoretical considerations can provide such additional constraints on the low-lying spectrum and multiplicities of models. Especially for theories with elementary or composite Higgs particle the Fröhlich-Morchio-Strocchi (FMS) mechanism provides a route to create additional conditions, though showing it to be at work requires genuine non-perturbative calculations. The qualitative features of this procedure are discussed for generic 2-Higgs-doublet models (2HDMs), grand-unified theories (GUTs) and technicolor-type theories.

  8. Group theory and biomolecular conformation: I. Mathematical and computational models

    PubMed Central

    Chirikjian, Gregory S

    2010-01-01

    Biological macromolecules, and the complexes that they form, can be described in a variety of ways ranging from quantum mechanical and atomic chemical models, to coarser grained models of secondary structure and domains, to continuum models. At each of these levels, group theory can be used to describe both geometric symmetries and conformational motion. In this survey, a detailed account is provided of how group theory has been applied across computational structural biology to analyze the conformational shape and motion of macromolecules and complexes. PMID:20827378

  9. Automated Physico-Chemical Cell Model Development through Information Theory

    SciTech Connect

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  10. Item response theory modeling in health outcomes measurement.

    PubMed

    Reeve, Bryce B

    2003-04-01

    There is a great need in health outcomes research to develop instruments that accurately measure a person's health status with minimal response burden. This need for psychometrically sound and clinically meaningful measures calls for better analytical tools beyond the methods available from traditional measurement theory. Applications of item response theory (IRT) modeling have increased considerably because of its utility for instrument development and evaluation, scale scoring, assessment of cultural equivalence, instrument linking and computerized adaptive testing. IRT models the relationship between a person's response to a survey question and their standing on a health construct, such as fatigue or depression. This review will discuss the theory and basics of IRT models and applications of these models to health outcomes measurement. PMID:19807361
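
    As a concrete illustration of the relationship IRT models, the sketch below evaluates the two-parameter logistic (2PL) item response function for a dichotomous item; polytomous health-outcome items are typically handled with extensions such as the graded response model, which are not shown here.

        import numpy as np

        # 2PL item response function: P(X=1 | theta) = 1 / (1 + exp(-a * (theta - b)))
        #   theta: person's standing on the latent health construct (e.g., fatigue)
        #   a: item discrimination, b: item difficulty/severity (values below are assumed)

        def irt_2pl(theta, a, b):
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        theta = np.linspace(-3, 3, 7)
        print(np.round(irt_2pl(theta, a=1.5, b=0.5), 3))  # probability of endorsing the item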

  11. Incentivizing biodiversity conservation in artisanal fishing communities through territorial user rights and business model innovation.

    PubMed

    Gelcich, Stefan; Donlan, C Josh

    2015-08-01

    Territorial user rights for fisheries are being promoted to enhance the sustainability of small-scale fisheries. Using Chile as a case study, we designed a market-based program aimed at improving fishers' livelihoods while incentivizing the establishment and enforcement of no-take areas within areas managed with territorial user right regimes. Building on explicit enabling conditions (i.e., high levels of governance, participation, and empowerment), we used a place-based, human-centered approach to design a program that will have the necessary support and buy-in from local fishers to result in landscape-scale biodiversity benefits. Transactional infrastructure must be complex enough to capture the biodiversity benefits being created, but simple enough so that the program can be scaled up and is attractive to potential financiers. Biodiversity benefits created must be commoditized, and desired behavioral changes must be verified within a transactional context. Demand must be generated for fisher-created biodiversity benefits in order to attract financing and to scale the market model. Important design decisions around these 3 components-supply, transactional infrastructure, and demand-must be made based on local social-ecological conditions. Our market model, which is being piloted in Chile, is a flexible foundation on which to base scalable opportunities to operationalize a scheme that incentivizes local, verifiable biodiversity benefits via conservation behaviors by fishers that could likely result in significant marine conservation gains and novel cross-sector alliances. PMID:25737027

  12. Using Web 2.0 Techniques To Bring Global Climate Modeling To More Users

    NASA Astrophysics Data System (ADS)

    Chandler, M. A.; Sohl, L. E.; Tortorici, S.

    2012-12-01

    The Educational Global Climate Model has been used for many years in undergraduate courses and professional development settings to teach the fundamentals of global climate modeling and climate change simulation to students and teachers. While course participants have reported a high level of satisfaction in these courses and overwhelmingly claim that EdGCM projects are worth the effort, there is often a high level of frustration during the initial learning stages. Many of the problems stem from issues related to installation of the software suite and to the length of time it can take to run initial experiments. Two or more days of continuous run time may be required before enough data has been gathered to begin analyses. Asking users to download existing simulation data has not been a solution because the GCM data sets are several gigabytes in size, requiring substantial bandwidth and stable dedicated internet connections. As a means of getting around these problems we have been developing a Web 2.0 utility called EzGCM (Easy G-G-M) which emphasizes that participants learn the steps involved in climate modeling research: constructing a hypothesis, designing an experiment, running a computer model and assessing when an experiment has finished (reached equilibrium), using scientific visualization to support analysis, and finally communicating the results through social networking methods. We use classic climate experiments that can be "rediscovered" through exercises with EzGCM and are attempting to make this Web 2.0 tool an entry point into climate modeling for teachers with little time to cover the subject, users with limited computer skills, and for those who want an introduction to the process before tackling more complex projects with EdGCM.

  13. New theories of root growth modelling

    NASA Astrophysics Data System (ADS)

    Landl, Magdalena; Schnepf, Andrea; Vanderborght, Jan; Huber, Katrin; Javaux, Mathieu; Bengough, A. Glyn; Vereecken, Harry

    2016-04-01

    In dynamic root architecture models, root growth is represented by moving root tips whose line trajectory results in the creation of new root segments. Typically, the direction of root growth is calculated as the vector sum of various direction-affecting components. However, in our simulations this did not reproduce experimental observations of root growth in structured soil. We therefore developed a new approach to predict the root growth direction. In this approach we distinguish between, firstly, driving forces for root growth, i.e. the force exerted by the root which points in the direction of the previous root segment and gravitropism, and, secondly, the soil mechanical resistance to root growth or penetration resistance. The latter can be anisotropic, i.e. depending on the direction of growth, which leads to a difference between the direction of the driving force and the direction of the root tip movement. Anisotropy of penetration resistance can be caused either by microscale differences in soil structure or by macroscale features, including macropores. Anisotropy at the microscale is neglected in our model. To allow for this, we include a normally distributed random deflection angle α to the force which points in the direction of the previous root segment with zero mean and a standard deviation σ. The standard deviation σ is scaled, so that the deflection from the original root tip location does not depend on the spatial resolution of the root system model. Similarly to the water flow equation, the direction of the root tip movement corresponds to the water flux vector while the driving forces are related to the water potential gradient. The analogue of the hydraulic conductivity tensor is the root penetrability tensor. It is determined by the inverse of soil penetration resistance and describes the ease with which a root can penetrate the soil. By adapting the three dimensional soil and root water uptake model R-SWMS (Javaux et al., 2008) in this way
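
    The sketch below turns the growth-direction rule described in this abstract into a 2D toy calculation: the driving force is the previous growth direction perturbed by a normally distributed deflection angle plus a gravitropism term, and a penetrability tensor (the inverse of penetration resistance) maps the driving force onto the tip movement. The square-root scaling of the deflection standard deviation with segment length, the weights, and the tensor values are illustrative assumptions, not the authors' parameterization.

        import numpy as np

        def next_tip_direction(prev_dir, segment_length, sigma_per_length=0.3,
                               gravitropism_weight=0.4, penetrability=np.eye(2), rng=None):
            rng = rng or np.random.default_rng()
            # Scale the deflection spread with segment length so the tip's wander does not
            # depend on the spatial resolution of the root model (random-walk-style scaling).
            sigma = sigma_per_length * np.sqrt(segment_length)
            alpha = rng.normal(0.0, sigma)
            c, s = np.cos(alpha), np.sin(alpha)
            deflected = np.array([[c, -s], [s, c]]) @ prev_dir   # previous direction + random deflection
            gravity = np.array([0.0, -1.0])                      # gravitropism points downward
            driving_force = deflected + gravitropism_weight * gravity
            movement = penetrability @ driving_force             # anisotropic soil resistance
            return movement / np.linalg.norm(movement)

        # Anisotropic soil: easier to grow vertically (e.g., along a macropore) than horizontally.
        aniso = np.diag([0.3, 1.0])
        print(next_tip_direction(np.array([1.0, 0.0]), segment_length=0.5, penetrability=aniso))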

  14. User's manual for heat-pump seasonal-performance model (SPM) with selected parametric examples

    SciTech Connect

    Not Available

    1982-06-30

    The Seasonal Performance Model (SPM) was developed to provide an accurate source of seasonal energy consumption and cost predictions for the evaluation of heat pump design options. The program uses steady state heat pump performance data obtained from manufacturers' or Computer Simulation Model runs. The SPM was originally developed in two forms - a cooling model for central air conditioners and heat pumps and a heating model for heat pumps. The original models have undergone many modifications, which are described, to improve the accuracy of predictions and to increase flexibility for use in parametric evaluations. Insights are provided into the theory and construction of the major options, and into the use of the available options and output variables. Specific investigations provide examples of the possible applications of the model. (LEW)

  15. Classifying linearly shielded modified gravity models in effective field theory.

    PubMed

    Lombriser, Lucas; Taylor, Andy

    2015-01-23

    We study the model space generated by the time-dependent operator coefficients in the effective field theory of the cosmological background evolution and perturbations of modified gravity and dark energy models. We identify three classes of modified gravity models that reduce to Newtonian gravity on the small scales of linear theory. These general classes contain enough freedom to simultaneously admit a matching of the concordance model background expansion history. In particular, there exists a large model space that mimics the concordance model on all linear quasistatic subhorizon scales as well as in the background evolution. Such models also exist when restricting the theory space to operators introduced in Horndeski scalar-tensor gravity. We emphasize that whereas the partially shielded scenarios might be of interest to study in connection with tensions between large and small scale data, with conventional cosmological probes, the ability to distinguish the fully shielded scenarios from the concordance model on near-horizon scales will remain limited by cosmic variance. Novel tests of the large-scale structure remedying this deficiency and accounting for the full covariant nature of the alternative gravitational theories, however, might yield further insights on gravity in this regime. PMID:25658988

  16. The Use of Modelling for Theory Building in Qualitative Analysis

    ERIC Educational Resources Information Center

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  17. Minimal Pati-Salam model from string theory unification

    SciTech Connect

    Dent, James B.; Kephart, Thomas W.

    2008-06-01

    We provide what we believe is the minimal three family N=1 SUSY and conformal Pati-Salam model from type IIB superstring theory. This Z_3 orbifolded AdS x S^5 model has long lived protons and has potential phenomenological consequences for the LHC (Large Hadron Collider).

  18. Goodness-of-Fit Assessment of Item Response Theory Models

    ERIC Educational Resources Information Center

    Maydeu-Olivares, Alberto

    2013-01-01

    The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…

  19. Theory and Practice: An Integrative Model Linking Class and Field

    ERIC Educational Resources Information Center

    Lesser, Joan Granucci; Cooper, Marlene

    2006-01-01

    Social work has evolved over the years taking on the challenges of the times. The profession now espouses a breadth of theoretical approaches and treatment modalities. We have developed a model to help graduate social work students master the skill of integrating theory and social work practice. The Integrative Model has five components: (l) The…

  20. The Mapping Model: A Cognitive Theory of Quantitative Estimation

    ERIC Educational Resources Information Center

    von Helversen, Bettina; Rieskamp, Jorg

    2008-01-01

    How do people make quantitative estimations, such as estimating a car's selling price? Traditionally, linear-regression-type models have been used to answer this question. These models assume that people weight and integrate all information available to estimate a criterion. The authors propose an alternative cognitive theory for quantitative…

  1. Chiral field theories as models for hadron substructure

    SciTech Connect

    Kahana, S.H.

    1987-03-01

    A model for the nucleon as soliton of quarks interacting with classical meson fields is described. The theory, based on the linear sigma model, is renormalizable and capable of including sea quarks straightforwardly. Application to nuclear matter is made in a Wigner-Seitz approximation.

  2. Receptor model technical series. Volume 3 (1989 revision): CMB7 user's manual

    SciTech Connect

    Watson, J.G.; Henry, R.C.; Nguyen, Q.T.; Meyer, E.L.; Pace, T.G.

    1990-01-01

    The Chemical Mass Balance (CMB) receptor model uses chemical composition measured in the source and receptor samples to estimate the relative contributions of different source categories to ambient particulate concentration. The manual describes the CMB7 receptor model software. It is designed to allow users to use the CMB receptor model constructively within a few hours' learning time. Emphasizing rapid command of modeling procedures, the manual covers primarily the mechanical aspects of operating the model. Information on the basic theoretical principles of CMB receptor modeling is also briefly explained in the appendices. The manual is intended for wide use by State and local air pollution control agency personnel in developing State Implementation Plans for PM10. The U.S. Environmental Protection Agency has published a companion document to this manual that should be consulted for this application. The Protocol for Applying and Validating the CMB Model, EPA-450/4-87-010, provides guidance on applicability, assumptions and interpretation of results. This protocol provides a practical strategy for obtaining valid results.
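
    The core of a chemical mass balance calculation is solving the linear system C = F S for source contributions S; the sketch below uses ordinary (unweighted) least squares with made-up profiles, whereas CMB7 itself uses an effective-variance weighted solution with uncertainty estimates.

        import numpy as np

        # Rows = chemical species, columns = source categories (mass fractions in the source profiles).
        F = np.array([
            [0.05, 0.30, 0.01],   # species 1
            [0.20, 0.02, 0.10],   # species 2
            [0.01, 0.05, 0.40],   # species 3
            [0.10, 0.10, 0.05],   # species 4
        ])
        c_ambient = np.array([3.2, 4.1, 5.0, 2.6])   # measured ambient concentrations (ug/m^3)

        contributions, *_ = np.linalg.lstsq(F, c_ambient, rcond=None)
        print(np.round(contributions, 2))   # estimated mass contribution of each source category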

  3. User's guide for the VTRPE computer model. Final report, May 90-Sep 91

    SciTech Connect

    Ryan, F.J.

    1991-10-01

    This report is a user's guide to the VTRPE (variable terrain radio parabolic equation) computer model. It is designed to provide the reader with a summary of the physics and numerical methods used in the VTRPE model, along with detailed instructions on the model's use and operation. The VTRPE computer program is a range-dependent, tropospheric microwave propagation model that is based upon the split-step Fourier parabolic wave equation algorithm. The nominal applicable frequency range of the model is VHF to K-band. The VTRPE program is able to make predictions for microwave propagation over both land and water. The VTRPE code is a full-wave propagation model that solves the electromagnetic wave equations for the complex electric and magnetic radiation fields. The model accounts for the effects of nonuniform atmospheric refractivity fields, variable surface terrain, and varying surface dielectric properties on microwave propagation. The code is written in ANSI-77 FORTRAN with MILSPEC-1753 FORTRAN language extensions. The VTRPE program is currently configured to run under the UNIX operating system on SUN minicomputers and CONVEX supercomputers, and under MS-DOS on 80386/80486-based PC's.
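
    The split-step Fourier parabolic-equation march named in the abstract alternates a spectral free-space diffraction step with a refraction phase screen; the minimal 1-D sketch below shows only that core. The frequency, grid, source, and homogeneous refractivity profile are assumptions, and the terrain, surface-impedance, absorbing-layer, and wide-angle treatments of VTRPE are omitted.

        import numpy as np

        c0 = 3e8
        freq = 1e9                          # 1 GHz (assumed example frequency)
        k0 = 2 * np.pi * freq / c0

        nz, dz = 1024, 0.5                  # vertical grid (m)
        dx = 50.0                           # range step (m)
        z = np.arange(nz) * dz
        kz = 2 * np.pi * np.fft.fftfreq(nz, d=dz)

        u = np.exp(-((z - 50.0) / 10.0) ** 2).astype(complex)   # Gaussian source at 50 m height
        n_refr = np.ones(nz)                # refractive index profile (assumed homogeneous)

        diffraction = np.exp(-1j * kz**2 * dx / (2.0 * k0))     # spectral free-space propagator
        refraction = np.exp(1j * k0 * (n_refr - 1.0) * dx)      # refraction phase screen

        for _ in range(200):                # march 10 km in range
            u = np.fft.ifft(diffraction * np.fft.fft(u))        # diffraction step
            u = refraction * u                                  # refraction step

        print(np.abs(u).max())              # peak field magnitude after 10 km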

  4. CIRCE2/DEKGEN2: A software package for facilitated optical analysis of 3-D distributed solar energy concentrators. Theory and user manual

    SciTech Connect

    Romero, V.J.

    1994-03-01

    CIRCE2 is a computer code for modeling the optical performance of three-dimensional dish-type solar energy concentrators. Statistical methods are used to evaluate the directional distribution of reflected rays from any given point on the concentrator. Given concentrator and receiver geometries, sunshape (angular distribution of incident rays from the sun), and concentrator imperfections such as surface roughness and random deviation in slope, the code predicts the flux distribution and total power incident upon the target. Great freedom exists in the variety of concentrator and receiver configurations that can be modeled. Additionally, provisions for shading and receiver aperturing are included. DEKGEN2 is a preprocessor designed to facilitate input of geometry, error distributions, and sun models. This manual describes the optical model, user inputs, code outputs, and operation of the software package. A user tutorial is included in which several collectors are built and analyzed in step-by-step examples.

  5. Nanofluid Drop Evaporation: Experiment, Theory, and Modeling

    NASA Astrophysics Data System (ADS)

    Gerken, William James

    Nanofluids, stable colloidal suspensions of nanoparticles in a base fluid, have potential applications in the heat transfer, combustion and propulsion, manufacturing, and medical fields. Experiments were conducted to determine the evaporation rate of room temperature, millimeter-sized pendant drops of ethanol laden with varying amounts (0-3% by weight) of 40-60 nm aluminum nanoparticles (nAl). Time-resolved high-resolution drop images were collected for the determination of early-time evaporation rate (D2/D 02 > 0.75), shown to exhibit D-square law behavior, and surface tension. Results show an asymptotic decrease in pendant drop evaporation rate with increasing nAl loading. The evaporation rate decreases by approximately 15% at around 1% to 3% nAl loading relative to the evaporation rate of pure ethanol. Surface tension was observed to be unaffected by nAl loading up to 3% by weight. A model was developed to describe the evaporation of the nanofluid pendant drops based on D-square law analysis for the gas domain and a description of the reduction in liquid fraction available for evaporation due to nanoparticle agglomerate packing near the evaporating drop surface. Model predictions are in relatively good agreement with experiment, within a few percent of measured nanofluid pendant drop evaporation rate. The evaporation of pinned nanofluid sessile drops was also considered via modeling. It was found that the same mechanism for nanofluid evaporation rate reduction used to explain pendant drops could be used for sessile drops. That mechanism is a reduction in evaporation rate due to a reduction in available ethanol for evaporation at the drop surface caused by the packing of nanoparticle agglomerates near the drop surface. Comparisons of the present modeling predictions with sessile drop evaporation rate measurements reported for nAl/ethanol nanofluids by Sefiane and Bennacer [11] are in fairly good agreement. Portions of this abstract previously appeared as: W. J
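
    The D-square law mentioned in the abstract states that the squared drop diameter decreases linearly in time, D(t)^2 = D0^2 - K t, so the evaporation rate constant K is the negative slope of D^2 versus t. A small sketch with synthetic data (the assumed K and drop size are illustrative):

        import numpy as np

        def evaporation_rate_constant(time_s, diameter_m):
            """Least-squares slope of D^2 versus t gives the evaporation rate constant K (m^2/s)."""
            d2 = diameter_m ** 2
            slope, _intercept = np.polyfit(time_s, d2, 1)
            return -slope

        # Synthetic example: a 1 mm drop with an assumed K of 1e-8 m^2/s.
        t = np.linspace(0.0, 40.0, 50)
        d = np.sqrt((1e-3) ** 2 - 1e-8 * t)
        print(evaporation_rate_constant(t, d))   # recovers ~1e-8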

  6. A class of effective field theory models of cosmic acceleration

    NASA Astrophysics Data System (ADS)

    Bloomfield, Jolyon K.; Flanagan, Éanna É.

    2012-10-01

    We explore a class of effective field theory models of cosmic acceleration involving a metric and a single scalar field. These models can be obtained by starting with a set of ultralight pseudo-Nambu-Goldstone bosons whose couplings to matter satisfy the weak equivalence principle, assuming that one boson is lighter than all the others, and integrating out the heavier fields. The result is a quintessence model with matter coupling, together with a series of correction terms in the action in a covariant derivative expansion, with specific scalings for the coefficients. After eliminating higher derivative terms and exploiting the field redefinition freedom, we show that the resulting theory contains nine independent free functions of the scalar field when truncated at four derivatives. This is in contrast to the four free functions found in similar theories of single-field inflation, where matter is not present. We discuss several different representations of the theory that can be obtained using the field redefinition freedom. For perturbations to the quintessence field today on subhorizon lengthscales larger than the Compton wavelength of the heavy fields, the theory is weakly coupled and natural in the sense of t'Hooft. The theory admits a regime where the perturbations become modestly nonlinear, but very strong nonlinearities lie outside its domain of validity.

  7. Integrated Modeling Program, Applied Chemical Theory (IMPACT)

    PubMed Central

    BANKS, JAY L.; BEARD, HEGE S.; CAO, YIXIANG; CHO, ART E.; DAMM, WOLFGANG; FARID, RAMY; FELTS, ANTHONY K.; HALGREN, THOMAS A.; MAINZ, DANIEL T.; MAPLE, JON R.; MURPHY, ROBERT; PHILIPP, DEAN M.; REPASKY, MATTHEW P.; ZHANG, LINDA Y.; BERNE, BRUCE J.; FRIESNER, RICHARD A.; GALLICCHIO, EMILIO; LEVY, RONALD M.

    2009-01-01

    We provide an overview of the IMPACT molecular mechanics program with an emphasis on recent developments and a description of its current functionality. With respect to core molecular mechanics technologies we include a status report for the fixed charge and polarizable force fields that can be used with the program and illustrate how the force fields, when used together with new atom typing and parameter assignment modules, have greatly expanded the coverage of organic compounds and medicinally relevant ligands. As we discuss in this review, explicit solvent simulations have been used to guide our design of implicit solvent models based on the generalized Born framework and a novel nonpolar estimator that have recently been incorporated into the program. With IMPACT it is possible to use several different advanced conformational sampling algorithms based on combining features of molecular dynamics and Monte Carlo simulations. The program includes two specialized molecular mechanics modules: Glide, a high-throughput docking program, and QSite, a mixed quantum mechanics/molecular mechanics module. These modules employ the IMPACT infrastructure as a starting point for the construction of the protein model and assignment of molecular mechanics parameters, but have then been developed to meet specialized objectives with respect to sampling and the energy function. PMID:16211539

  8. Evaluation of custom energy expenditure models for SenseWear armband in manual wheelchair users.

    PubMed

    Tsang, KaLai; Hiremath, Shivayogi V; Cooper, Rory A; Ding, Dan

    2015-01-01

    Physical activity monitors are increasingly used to help the general population lead a healthy lifestyle by keeping track of their daily physical activity (PA) and energy expenditure (EE). However, none of the commercially available activity monitors can accurately estimate PA and EE in people who use wheelchairs as their primary means of mobility. Researchers have recently developed custom EE prediction models for manual wheelchair users (MWUs) with spinal cord injuries (SCIs) based on a commercial activity monitor--the SenseWear armband. This study evaluated the performance of two custom EE prediction models, including a general model and a set of activity-specific models among 45 MWUs with SCI. The estimated EE was obtained by using the two custom models and the default manufacturer's model, and it was compared with the gold standard measured by the K4b2 portable metabolic cart. The general, activity-specific, and default models had a mean signed percent error (mean +/- standard deviation) of -2.8 +/- 26.1%, -4.8 +/- 25.4%, and -39.6 +/- 37.8%, respectively. The intraclass correlation coefficient was 0.86 (95% confidence interval [CI] = 0.82 to 0.89) for the general model, 0.83 (95% CI = 0.79 to 0.87) for the activity-specific model, and 0.62 (95% CI = 0.16 to 0.81) for the default model. The custom models for the SenseWear armband significantly improved the EE estimation accuracy for MWUs with SCI. PMID:26745837
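
    The mean signed percent error reported above is straightforward to compute against the criterion measurement; a small sketch with hypothetical energy-expenditure values (not the study data) is shown below.

        import numpy as np

        def mean_signed_percent_error(estimated, criterion):
            """Mean and standard deviation of the signed percent error versus the criterion."""
            est, crit = np.asarray(estimated, float), np.asarray(criterion, float)
            errors = 100.0 * (est - crit) / crit
            return errors.mean(), errors.std()

        criterion_kcal = np.array([3.1, 4.5, 2.8, 5.2, 3.9])   # hypothetical K4b2 values
        estimated_kcal = np.array([3.0, 4.2, 2.9, 4.7, 3.8])   # hypothetical armband model output
        print(mean_signed_percent_error(estimated_kcal, criterion_kcal))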

  9. A Brinkmanship Game Theory Model of Terrorism

    NASA Astrophysics Data System (ADS)

    Melese, Francois

    This study reveals conditions under which a world leader might credibly issue a brinkmanship threat of preemptive action to deter sovereign states or transnational terrorist organizations from acquiring weapons of mass destruction (WMD). The model consists of two players: the United Nations (UN) “Principal,” and a terrorist organization “Agent.” The challenge in issuing a brinkmanship threat is that it needs to be sufficiently unpleasant to deter terrorists from acquiring WMD, while not being so repugnant to those that must carry it out that they would refuse to do so. Two “credibility constraints” are derived. The first relates to the unknown terrorist type (Hard or Soft), and the second to acceptable risks (“blowback”) to the World community. Graphing the incentive-compatible Nash equilibrium solutions reveals when a brinkmanship threat is credible, and when it is not - either too weak to be effective, or unacceptably dangerous to the World community.

  10. A Graphical User Interface for Parameterizing Biochemical Models of Photosynthesis and Chlorophyll Fluorescence

    NASA Astrophysics Data System (ADS)

    Kornfeld, A.; Van der Tol, C.; Berry, J. A.

    2015-12-01

    Recent advances in optical remote sensing of photosynthesis offer great promise for estimating gross primary productivity (GPP) at leaf, canopy and even global scale. These methods -including solar-induced chlorophyll fluorescence (SIF) emission, fluorescence spectra, and hyperspectral features such as the red edge and the photochemical reflectance index (PRI) - can be used to greatly enhance the predictive power of global circulation models (GCMs) by providing better constraints on GPP. The way to use measured optical data to parameterize existing models such as SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes) is not trivial, however. We have therefore extended a biochemical model to include fluorescence and other parameters in a coupled treatment. To help parameterize the model, we then use nonlinear curve-fitting routines to determine the parameter set that enables model results to best fit leaf-level gas exchange and optical data measurements. To make the tool more accessible to all practitioners, we have further designed a graphical user interface (GUI) based front-end to allow researchers to analyze data with a minimum of effort while, at the same time, allowing them to change parameters interactively to visualize how variation in model parameters affect predicted outcomes such as photosynthetic rates, electron transport, and chlorophyll fluorescence. Here we discuss the tool and its effectiveness, using recently-gathered leaf-level data.
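
    The parameter-estimation step described above amounts to nonlinear least-squares fitting of leaf-level measurements; the sketch below fits a simple rectangular-hyperbola light response with scipy as a stand-in, whereas the actual tool fits a coupled photosynthesis-fluorescence model such as the one in SCOPE. The function form, data, and starting values are illustrative assumptions.

        import numpy as np
        from scipy.optimize import curve_fit

        def light_response(par, a_max, alpha, r_d):
            """Rectangular-hyperbola light response of net assimilation (umol m-2 s-1)."""
            return (alpha * par * a_max) / (alpha * par + a_max) - r_d

        par = np.array([0, 50, 100, 200, 400, 800, 1200, 1800], dtype=float)
        a_net = np.array([-1.1, 3.0, 6.2, 10.5, 14.8, 18.0, 19.2, 19.8])   # hypothetical measurements

        params, covariance = curve_fit(light_response, par, a_net, p0=[20.0, 0.05, 1.0])
        print(dict(zip(["a_max", "alpha", "r_d"], np.round(params, 3))))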

  11. A graphical user interface for numerical modeling of acclimation responses of vegetation to climate change

    NASA Astrophysics Data System (ADS)

    Le, Phong V. V.; Kumar, Praveen; Drewry, Darren T.; Quijano, Juan C.

    2012-12-01

    Ecophysiological models that vertically resolve vegetation canopy states are becoming a powerful tool for studying the exchange of mass, energy, and momentum between the land surface and the atmosphere. A mechanistic multilayer canopy-soil-root system model (MLCan) developed by Drewry et al. (2010a) has been used to capture the emergent vegetation responses to elevated atmospheric CO2 for both C3 and C4 plants under various climate conditions. However, processing input data and setting up such a model can be time-consuming and error-prone. In this paper, a graphical user interface that has been developed for MLCan is presented. The design of this interface aims to provide visualization capabilities and interactive support for processing input meteorological forcing data and vegetation parameter values to facilitate the use of this model. In addition, the interface also provides graphical tools for analyzing the forcing data and simulated numerical results. The model and its interface are both written in the MATLAB programming language. Finally, an application of this model package for capturing the ecohydrological responses of three bioenergy crops (maize, miscanthus, and switchgrass) to local environmental drivers at two different sites in the Midwestern United States is presented.

  12. Effect of Human Model Height and Sex on Induced Current Dosimetry in Household Induction Heater Users

    NASA Astrophysics Data System (ADS)

    Tarao, Hiroo; Hayashi, Noriyuki; Isaka, Katsuo

    Induced currents in the high-resolution, anatomical human models are numerically calculated by the impedance method. The human models are assumed to be exposed to highly inhomogeneous 20.9 kHz magnetic fields from a household induction heater (IH). In the case of the adult models, currents ranging from 5 to 19 mA/m2 are induced between the shoulder and the lower abdomen. Meanwhile, in the case of the child models, currents ranging from 5 to 21 mA/m2 are induced between the head and the abdomen. In particular, the induced currents near the brain tissue are almost the same as those near the abdomen. When the induced currents in the central nervous system tissues are considered, the induced currents in the child model are 2.1 to 6.9 times as large as those in the adult model under the same B-field exposure environment. These results suggest the importance of further investigation intended for a pregnant female who uses the IH as well as for a child (or other IH users of short standing height).

  13. TIME Impact - a new user-friendly tuberculosis (TB) model to inform TB policy decisions.

    PubMed

    Houben, R M G J; Lalli, M; Sumner, T; Hamilton, M; Pedrazzoli, D; Bonsu, F; Hippner, P; Pillay, Y; Kimerling, M; Ahmedov, S; Pretorius, C; White, R G

    2016-01-01

    Tuberculosis (TB) is the leading cause of death from infectious disease worldwide, predominantly affecting low- and middle-income countries (LMICs), where resources are limited. As such, countries need to be able to choose the most efficient interventions for their respective setting. Mathematical models can be valuable tools to inform rational policy decisions and improve resource allocation, but are often unavailable or inaccessible for LMICs, particularly in TB. We developed TIME Impact, a user-friendly TB model that enables local capacity building and strengthens country-specific policy discussions to inform and support funding applications at the (sub-)national level (e.g. Ministry of Finance) or to international donors (e.g. the Global Fund to Fight AIDS, Tuberculosis and Malaria). TIME Impact is an epidemiological transmission model nested in TIME, a set of TB modelling tools available for free download within the widely-used Spectrum software. The TIME Impact model reflects key aspects of the natural history of TB, with additional structure for HIV/ART, drug resistance, treatment history and age. TIME Impact enables national TB programmes (NTPs) and other TB policymakers to better understand their own TB epidemic, plan their response, apply for funding and evaluate the implementation of the response. The explicit aim of TIME Impact's user-friendly interface is to enable training of local and international TB experts towards independent use. During application of TIME Impact, close involvement of the NTPs and other local partners also builds critical understanding of the modelling methods, assumptions and limitations inherent to modelling. This is essential to generate broad country-level ownership of the modelling data inputs and results. In turn, it stimulates discussions and a review of the current evidence and assumptions, strengthening the decision-making process in general. TIME Impact has been effectively applied in a variety of settings. In South Africa, it

  14. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    ERIC Educational Resources Information Center

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  15. Integrating Developmental Theory and Methodology: Using Derivatives to Articulate Change Theories, Models, and Inferences

    ERIC Educational Resources Information Center

    Deboeck, Pascal R.; Nicholson, Jody; Kouros, Chrystyna; Little, Todd D.; Garber, Judy

    2015-01-01

    Matching theories about growth, development, and change to appropriate statistical models can present a challenge, which can result in misuse, misinterpretation, and underutilization of different analytical approaches. We discuss the use of "derivatives": the change of a construct with respect to the change in another construct.…

  16. Population changes: contemporary models and theories.

    PubMed

    Sauvy, A

    1981-01-01

    In many developing countries rapid population growth has promoted a renewed interest in the study of the effect of population growth on economic development. This research takes either the macroeconomic viewpoint, where the nation is the framework, or the microeconomic perspective, where the family is the framework. For expository purposes, the macroeconomic viewpoint is assumed, and an example of such an investment is presented. Attention is directed to the following: a simplified model--housing; the lessons learned from experience (primitive populations, Spain in the 17th and 18th centuries, comparing development in Spain and Italy, 19th century Western Europe, and underdeveloped countries); the positive factors of population growth; and the concept of the optimal rate of growth. Housing is the typical investment that an individual makes. Hence, the housing per person (roughly 1/3 of the necessary amount of housing per family) is taken as a unit, and the calculations are made using averages. The conclusion is that growth is expensive. A population decrease might be advantageous, for this decrease would enable the entire population to benefit from past capital accumulation. It is also believed, "a priori," that population growth is more expensive for a developed than for a developing country. This belief may be attributable to the fact that the capital per person tends to be high in the developed countries. Any further increase in the population requires additional capital investments, driving this ratio even higher. Yet, investment is not the only factor inhibiting economic development. The literature describes factors regarding population growth, yet this writer prefers to emphasize 2 other factors that have been the subject of less study: a growing population's ease of adaptation and the human factor--behavior. A growing population adapts better to new conditions than does a stationary or declining population, and contrary to "a priori" belief, a growing

  17. Summary of papers presented in the Theory and Modelling session

    NASA Astrophysics Data System (ADS)

    Lin-Liu, Y. R.; Westerhof, E.

    2012-09-01

    A total of 14 contributions were presented in the Theory and Modelling sessions at EC-17. One Theory and Modelling paper was included in the ITER ECRH and ECE sessions each. Three papers were in the area of nonlinear physics discussing parametric processes accompanying ECRH. Eight papers were based on the quasi-linear theory of wave heating and current drive. Three of these addressed the application of ECCD for NTM stabilization. Two papers considered scattering of EC waves by edge density fluctuations and related phenomena. In this summary, we briefly describe the highlights of these contributions. Finally, the three papers concerning modelling of various aspects of ECE are reported in the ECE session.

  18. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    PubMed

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739

  19. INTERLINE 5.0 -- An expanded railroad routing model: Program description, methodology, and revised user`s manual

    SciTech Connect

    Johnson, P.E.; Joy, D.S.; Clarke, D.B.; Jacobi, J.M.

    1993-03-01

    A rail routing model, INTERLINE, has been developed at the Oak Ridge National Laboratory to investigate potential routes for transporting radioactive materials. In Version 5.0, the INTERLINE routing algorithms have been enhanced to include the ability to predict alternative routes, barge routes, and population statistics for any route. The INTERLINE railroad network is essentially a computerized rail atlas describing the US railroad system. All rail lines, with the exception of industrial spurs, are included in the network. Inland waterways and deep water routes along with their interchange points with the US railroad system are also included. The network contains over 15,000 rail and barge segments (links) and over 13,000 stations, interchange points, ports, and other locations (nodes). The INTERLINE model has been converted to operate on an IBM-compatible personal computer. At least a 286 computer with a hard disk containing approximately 6 MB of free space is recommended. Enhanced program performance will be obtained by using a random-access memory drive on a 386 or 486 computer.
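
    Routing over a link-node network of this kind is typically a shortest-path search; the sketch below uses Dijkstra's algorithm on a hypothetical toy network as an illustration only. The abstract does not state INTERLINE's exact algorithm, and the real model also accounts for railroad ownership, interchanges, and barge links, which are omitted here.

        import heapq

        def shortest_route(network, origin, destination):
            """Dijkstra's algorithm; network[node] = {neighbor: distance_miles}."""
            queue = [(0.0, origin, [origin])]
            visited = set()
            while queue:
                dist, node, path = heapq.heappop(queue)
                if node == destination:
                    return dist, path
                if node in visited:
                    continue
                visited.add(node)
                for neighbor, miles in network.get(node, {}).items():
                    if neighbor not in visited:
                        heapq.heappush(queue, (dist + miles, neighbor, path + [neighbor]))
            return float("inf"), []

        rail = {   # hypothetical link mileages between yards
            "A": {"B": 120, "C": 300},
            "B": {"C": 150, "D": 400},
            "C": {"D": 180},
            "D": {},
        }
        print(shortest_route(rail, "A", "D"))   # -> (450.0, ['A', 'B', 'C', 'D'])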

  20. Theory of Time beyond the standard model

    SciTech Connect

    Poliakov, Eugene S.

    2008-05-29

    A frame of non-uniform time is discussed. A concept of 'flow of time' is presented. The principle of time relativity is set out in analogy with the Galilean principle of relativity. An equivalence principle is stated: the outcome of non-uniform time in an inertial frame of reference is equivalent to the outcome of a fictitious gravity force external to the frame of reference. Thus it is flow of time that causes gravity rather than mass. The latter is compared to experimental data achieving precision of up to 0.0003%. It is shown that the law of energy conservation is inapplicable to the frames of non-uniform time. A theoretical model of a physical entity (point mass, photon) travelling in the field of non-uniform time is considered. A generalized law that allows the flow of time to replace classical energy conservation is introduced on the basis of the experiment of Pound and Rebka. It is shown that a linear dependence of the flow of time on the spatial coordinate conforms to the inverse square law of universal gravitation and Keplerian mechanics. Momentum is shown to still be conserved.

  1. FINDING POTENTIALLY UNSAFE NUTRITIONAL SUPPLEMENTS FROM USER REVIEWS WITH TOPIC MODELING.

    PubMed

    Sullivan, Ryan; Sarker, Abeed; O'Connor, Karen; Goodin, Amanda; Karlsrud, Mark; Gonzalez, Graciela

    2016-01-01

    Although dietary supplements are widely used and generally are considered safe, some supplements have been identified as causative agents for adverse reactions, some of which may even be fatal. The Food and Drug Administration (FDA) is responsible for monitoring supplements and ensuring that supplements are safe. However, current surveillance protocols are not always effective. Leveraging user-generated textual data, in the form of Amazon.com reviews for nutritional supplements, we use natural language processing techniques to develop a system for the monitoring of dietary supplements. We use topic modeling techniques, specifically a variation of Latent Dirichlet Allocation (LDA), and background knowledge in the form of an adverse reaction dictionary to score products based on their potential danger to the public. Our approach generates topics that semantically capture adverse reactions from a document set consisting of reviews posted by users of specific products, and based on these topics, we propose a scoring mechanism to categorize products as "high potential danger", "average potential danger" and "low potential danger." We evaluate our system by comparing the system categorization with human annotators, and we find that our system agrees with the annotators 69.4% of the time. With these results, we demonstrate that our methods show promise and that our system represents a proof of concept as a viable low-cost, active approach for dietary supplement monitoring. PMID:26776215
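
    A minimal sketch of a pipeline in this spirit: fit LDA topics on product reviews, then flag products whose dominant topics overlap an adverse-reaction dictionary. The reviews, dictionary, and scoring threshold below are illustrative stand-ins, and the published system uses a variant of LDA and a richer scoring scheme.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        reviews = [
            "great energy boost and focus, would buy again",
            "gave me severe headache and nausea after two days",
            "felt heart palpitations and dizziness, stopped taking it",
            "no side effects, sleep improved a lot",
        ]
        adverse_terms = {"headache", "nausea", "palpitations", "dizziness", "rash"}

        vectorizer = CountVectorizer(stop_words="english")
        counts = vectorizer.fit_transform(reviews)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

        vocab = vectorizer.get_feature_names_out()
        for topic_idx, weights in enumerate(lda.components_):
            top_words = [vocab[i] for i in weights.argsort()[-8:]]        # highest-weight words
            danger = len(adverse_terms.intersection(top_words)) / len(top_words)
            print(f"topic {topic_idx}: adverse-term share {danger:.2f}, top words {top_words}")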

  2. Perception of Influencing Factors on Acceptance of Mobile Health Monitoring Service: A Comparison between Users and Non-users

    PubMed Central

    Lee, Jaebeom

    2013-01-01

    Objectives: To improve and promote mobile health monitoring services, this study investigated the perception of various factors influencing the acceptance of services between users and non-users. Methods: This study drew 9 variables from studies related to mobile health monitoring services and the unified theory of acceptance and use of technology model. A total of 219 samples were collected by a paper-based survey from users (n = 106) and non-users (n = 113). Analysis was carried out using a two-independent-samples t-test. Results: The findings indicate that users have a more positive perception of service benefits than non-users. Although there were differences between users and non-users, all respondents had a positive perception of the service benefits. After users used the service, they were less concerned about the risks involved with it. However, both users and non-users had a high negative perception of service risk. Users also had a more positive perception of intimacy and communication associated with the services than non-users. Both users and non-users had a high behavioral intention to use the services. Finally, this study observed that older subjects tended to recognize the higher value of the services. Conclusions: This study provides insights to improve and invigorate mobile health monitoring services. This study also offers insights into how to increase the number of users of mobile health monitoring services in South Korea. PMID:24175115
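
    A short sketch of the two-independent-samples comparison described in the Methods is shown below; the scores are random placeholders generated for illustration, not the study's data, and the item means and spreads are assumptions.

    ```python
    # Sketch of a two-independent-samples (Welch's) t-test comparing users and
    # non-users on one perception item. The scores below are random placeholders
    # matching only the reported group sizes (106 users, 113 non-users).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    users = rng.normal(loc=4.1, scale=0.8, size=106)       # placeholder Likert scores
    non_users = rng.normal(loc=3.6, scale=0.9, size=113)

    t, p = stats.ttest_ind(users, non_users, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.4f}")
    ```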

  3. Supersymmetry and String Theory: Beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Dine, Michael

    2007-01-01

    The past decade has witnessed dramatic developments in the field of theoretical physics. This book is a comprehensive introduction to these recent developments. It contains a review of the Standard Model, covering non-perturbative topics, and a discussion of grand unified theories and magnetic monopoles. It introduces the basics of supersymmetry and its phenomenology, and includes dynamics, dynamical supersymmetry breaking, and electric-magnetic duality. The book then covers general relativity and the big bang theory, and the basic issues in inflationary cosmologies before discussing the spectra of known string theories and the features of their interactions. The book also includes brief introductions to technicolor, large extra dimensions, and the Randall-Sundrum theory of warped spaces. This will be of great interest to graduates and researchers in the fields of particle theory, string theory, astrophysics and cosmology. The book contains several exercises and problems designed to give the reader the tools to confront the limitations of the Standard Model; password-protected solutions are available to lecturers at www.cambridge.org/9780521858410.

  4. Modeling pyramidal sensors in ray-tracing software by a suitable user-defined surface

    NASA Astrophysics Data System (ADS)

    Antichi, Jacopo; Munari, Matteo; Magrin, Demetrio; Riccardi, Armando

    2016-04-01

    Following the unprecedented results, in terms of performance, delivered by the first-light adaptive optics system at the Large Binocular Telescope, there has been widespread and increasing interest in the pyramid wavefront sensor (PWFS), which is the key component, together with the adaptive secondary mirror, of the adaptive optics (AO) module. Currently, there is no straightforward way to model a PWFS in standard sequential ray-tracing software. Common modeling strategies tend to be user-specific and, in general, are unsatisfactory for general applications. To address this problem, we have developed an approach to PWFS modeling based on a user-defined surface (UDS), whose properties reside in a specific code written in the C language, for the ray-tracing software ZEMAX™. With our approach, the pyramid optical component is implemented as a standard surface in ZEMAX™, exploiting its dynamic link library (DLL) conversion and thereby greatly simplifying ray tracing and analysis. We have utilized the pyramid UDS DLL surface, referred to as PAM2R ("pyramidal acronyms may be too risky"), to design the current PWFS-based AO system for the Giant Magellan Telescope, evaluating tolerances, with particular attention to the angular sensitivities, by means of sequential ray-tracing tools only, thus verifying PAM2R's reliability and robustness. This work indicates that PAM2R makes the design of a PWFS as simple as that of other standard optical components. This is particularly relevant with the advent of the extremely large telescope era, for which complexity is definitely one of the main challenges.
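
    The actual PAM2R component is a C dynamic-link library implementing ZEMAX's user-defined-surface interface and is not reproduced here. The snippet below only illustrates, under stated assumptions about the apex angle, the geometry such a surface encodes: the sag of an ideal four-faceted pyramid grows linearly with |x| + |y| at the facet slope.

    ```python
    # Sketch of the sag of an ideal four-faceted pyramid centred on the optical
    # axis. This is an illustration of the surface geometry only, not the PAM2R
    # DLL or the ZEMAX UDS interface; the apex angle is a placeholder.
    import numpy as np

    def pyramid_sag(x, y, apex_angle_deg=178.0):
        """Sag of a pyramid whose opposite facets meet at the given apex angle."""
        half_apex = np.radians(apex_angle_deg) / 2.0
        slope = 1.0 / np.tan(half_apex)       # shallow facets -> small beam deviation
        return slope * (np.abs(x) + np.abs(y))

    xx, yy = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
    print(np.round(pyramid_sag(xx, yy), 4))
    ```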

  5. Requirements Modeling with the Aspect-oriented User Requirements Notation (AoURN): A Case Study

    NASA Astrophysics Data System (ADS)

    Mussbacher, Gunter; Amyot, Daniel; Araújo, João; Moreira, Ana

    The User Requirements Notation (URN) is a recent ITU-T standard that supports requirements engineering activities. The Aspect-oriented URN (AoURN) adds aspect-oriented concepts to URN, creating a unified framework that allows for scenario-based, goal-oriented, and aspect-oriented modeling. AoURN is applied to the car crash crisis management system (CCCMS), modeling its functional and non-functional requirements (NFRs). AoURN generally models all use cases, NFRs, and stakeholders as individual concerns and provides general guidelines for concern identification. AoURN handles interactions between concerns, capturing their dependencies and conflicts as well as the resolutions. We present a qualitative comparison of aspect-oriented techniques for scenario-based and goal-oriented requirements engineering. An evaluation based on metrics adapted from the literature and a task-based evaluation suggest that AoURN models are more scalable than URN models and exhibit better modularity, reusability, and maintainability.

  6. RELMAP: a regional Lagrangian model of air pollution - user's guide. Final report

    SciTech Connect

    Eder, B.K.; Coventry, D.H.; Clark, T.L.; Bollinger, C.E.

    1986-03-01

    The regional Lagrangian Model of Air Pollution (RELMAP) is a mass-conserving, Lagrangian model that simulates ambient concentrations and wet and dry depositions of SO₂, SO₄²⁻, and fine and coarse particulate matter over the eastern United States and southeastern Canada (default domain). Discrete puffs of pollutants, which are released periodically over the model's domain, are transported by wind fields, and subjected to linear chemical transformation and wet and dry deposition processes. The model, which is generally run for one month, can operate in two different output modes. The first mode produces patterns of ambient concentration, and wet and dry deposition over the defined domain, and the second mode produces interregional exchange matrices over user-specified source/receptor regions. RELMAP was written in FORTRAN IV on the Sperry UNIVAC 1100/82, and consists of 19 preprocessor programs that prepare meteorological and emissions data for use in the main program, which uses 17 subroutines to produce the model simulations.
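
    As a rough illustration of the puff bookkeeping described above, the sketch below advects a single puff with the wind, converts SO₂ to SO₄ by a linear transformation, and depletes both species by first-order dry and wet deposition. All rate constants and the time step are placeholders, not RELMAP's values.

    ```python
    # Sketch of one time step for a single Lagrangian puff: advection, linear
    # SO2 -> SO4 transformation, and first-order dry/wet depletion. The rate
    # constants are placeholders chosen only for illustration.
    import math

    def step_puff(puff, wind_uv, dt_h,
                  k_chem=0.01, k_dry_so2=0.03, k_wet_so2=0.05,
                  k_dry_so4=0.005, k_wet_so4=0.08, raining=False):
        u, v = wind_uv                        # wind components, km/h
        puff["x"] += u * dt_h                 # advection
        puff["y"] += v * dt_h

        wet2 = k_wet_so2 if raining else 0.0
        wet4 = k_wet_so4 if raining else 0.0

        so2 = puff["so2"]
        converted = so2 * (1.0 - math.exp(-k_chem * dt_h))          # SO2 -> SO4
        puff["so2"] = (so2 - converted) * math.exp(-(k_dry_so2 + wet2) * dt_h)
        puff["so4"] = (puff["so4"] + converted) * math.exp(-(k_dry_so4 + wet4) * dt_h)
        return puff

    puff = {"x": 0.0, "y": 0.0, "so2": 100.0, "so4": 0.0}            # arbitrary units
    print(step_puff(puff, wind_uv=(20.0, 5.0), dt_h=1.0, raining=True))
    ```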

  7. Consistent constraints on the Standard Model Effective Field Theory

    NASA Astrophysics Data System (ADS)

    Berthier, Laure; Trott, Michael

    2016-02-01

    We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEP I and LEP II, as well as low energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut off scale is assumed to be large, Λ ≳ 3 TeV. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S, T analysis is modified by the theory errors we include as an illustrative example.

  8. Compartmental models: theory and practice using the SAAM II software system.

    PubMed

    Cobelli, C; Foster, D M

    1998-01-01

    Understanding in vivo the functioning of metabolic systems at the whole-body or regional level requires one to make some assumptions on how the system works and to describe them mathematically, that is, to postulate a model of the system. Models of systems can have different characteristics depending on the properties of the system and the database available for their study; they can be deterministic or stochastic, dynamic or static, with lumped or distributed parameters. Metabolic systems are dynamic systems and we focus here on the most widely used class of dynamic (differential equation) models: compartmental models. This is a class of models for which the governing law is conservation of mass. It is a very attractive class to users because it formalizes physical intuition in a simple and reasonable way. Compartmental models are lumped parameter models, in that the events in the system are described by a finite number of changing variables, and are thus described by ordinary differential equations. While stochastic compartment models can also be defined, we discuss here the deterministic versions--those that can work with exact relationships between model variables. These are the models most widely used in discussions of endocrinology and metabolism. In this chapter, we will discuss the theory of compartmental models, and then discuss how the SAAM II software system, a system designed specifically to aid in the development and testing of multicompartmental models, can be used. PMID:9781383
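
    To make the class of models concrete, here is a minimal two-compartment example of the kind described above (mass conservation with first-order exchange and loss), solved with SciPy rather than SAAM II; the rate constants and initial dose are placeholders.

    ```python
    # Minimal two-compartment model: dq1/dt = -(k01 + k21)q1 + k12*q2,
    # dq2/dt = k21*q1 - k12*q2, where k_ij denotes transfer from compartment j
    # into compartment i and k01 is loss from compartment 1 to the environment.
    # Rate constants and the initial dose are placeholders.
    from scipy.integrate import solve_ivp

    k01, k12, k21 = 0.1, 0.05, 0.08        # 1/min

    def dqdt(t, q):
        q1, q2 = q
        return [-(k01 + k21) * q1 + k12 * q2,   # compartment 1 (sampled pool)
                k21 * q1 - k12 * q2]            # compartment 2 (peripheral pool)

    sol = solve_ivp(dqdt, (0.0, 120.0), [100.0, 0.0])
    print(sol.y[:, -1])                    # compartment masses at t = 120 min
    ```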

  9. Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation.

    NASA Astrophysics Data System (ADS)

    Lawley, Russell; Barron, Mark; Lark, Murray

    2015-04-01

    At BGS, expert elicitation has been used to evaluate the uncertainty of surveyed boundaries in several common geological scenarios. As a result, a 'collective' understanding of the issues surrounding each scenario has emerged. The work has provoked wider debate in three key areas: a) what can we do to resolve those scenarios where a 'consensus' of understanding cannot be achieved; b) what does it mean for survey practices and the subsequent use of maps in 3D models; and c) how do we communicate the 'collective' understanding of geological mapping (with or without consensus for specific scenarios). Previous work elicited expert judgement for uncertainty in six contrasting mapping scenarios. In five cases it was possible to arrive at a consensus model; in a sixth case, experts with different experience (length of service, academic background) took very different views of the nature of the mapping problem. The scenario concerned identification of the boundary between two contrasting tills (one derived from Triassic source materials, being red in colour; the other, derived from Jurassic materials, being grey in colour). Initial debate during the elicitation identified that the colour contrast should provide some degree of confidence in locating the boundary via traditional auger-traverse survey methods. However, as the elicitation progressed, it became clear that the complexities of the relationship between the two tills were not uniformly understood across the experts, and the panel could not agree a consensus regarding the spatial uncertainty of the boundary. The elicitation process allowed a significant degree of structured knowledge exchange between experts of differing backgrounds and was successful in identifying a measure of uncertainty for what was considered a contentious scenario. However, the findings have significant implications for a boundary scenario that is widely mapped across the central regions of Great Britain. We will discuss our experience of the use of

  10. Theory, modeling and simulation of superconducting qubits

    SciTech Connect

    Berman, Gennady P; Kamenev, Dmitry I; Chumak, Alexander; Kinion, Carin; Tsifrinovich, Vladimir

    2011-01-13

    We analyze the dynamics of a qubit-resonator system coupled with a thermal bath and external electromagnetic fields. Using the evolution equations for the set of Heisenberg operators that describe the whole system, we derive an expression for the resonator field that includes the resonator-drive, resonator-bath, and resonator-qubit interactions. The renormalization of the resonator frequency, caused by the qubit-resonator interaction, is accounted for. Using the solutions for the resonator field, we derive the equation that describes the qubit dynamics. The dependence of the qubit evolution during the measurement time on the fidelity of a single-shot measurement is studied. The relation between the fidelity and measurement time is shown explicitly. We propose a novel adiabatic method for phase qubit measurement. The method utilizes a low-frequency, quasi-classical resonator inductively coupled to the qubit. The resonator modulates the qubit energy, and the back reaction of the qubit causes a shift in the phase of the resonator. The resonator phase shift can be used to determine the qubit state. We have simulated this measurement taking into account the energy levels outside the phase qubit manifold. We have shown that, for qubit frequencies in the range of 8-12 GHz, a resonator frequency of 500 MHz, and a measurement time of 100 ns, the phase difference between the two qubit states is greater than 0.2 rad. This phase difference exceeds the measurement uncertainty and can be detected using a classical phase-meter. A fidelity of 0.9999 can be achieved for a relaxation time of 0.5 ms. We also model and simulate a microstrip-SQUID amplifier with a frequency of about 500 MHz, which could be used to amplify the resonator oscillations in the phase qubit adiabatic measurement. The voltage gain and the amplifier noise temperature are calculated. We simulate the preparation of a generalized Bell state and compute the relaxation times required for achieving high

  11. Numerical tests of nucleation theories for the Ising models

    NASA Astrophysics Data System (ADS)

    Ryu, Seunghwa; Cai, Wei

    2010-07-01

    The classical nucleation theory (CNT) is tested systematically by computer simulations of the two-dimensional (2D) and three-dimensional (3D) Ising models with a Glauber-type spin flip dynamics. While previous studies suggested potential problems with CNT, our numerical results show that the fundamental assumption of CNT is correct. In particular, the Becker-Döring theory accurately predicts the nucleation rate if the correct droplet free energy function is provided as input. This validates the coarse graining of the system into a one dimensional Markov chain with the largest droplet size as the reaction coordinate. Furthermore, in the 2D Ising model, the droplet free energy predicted by CNT matches numerical results very well, after a logarithmic correction term from Langer’s field theory and a constant correction term are added. But significant discrepancies are found between the numerical results and existing theories on the magnitude of the logarithmic correction term in the 3D Ising model. Our analysis underscores the importance of correctly accounting for the temperature dependence of surface energy when comparing numerical results and nucleation theories.
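
    For readers unfamiliar with the CNT ingredients being tested, the sketch below evaluates a crude two-dimensional droplet free energy, ΔG(n) = -n·Δμ + 2√(πn)·σ, and the resulting barrier and Arrhenius-type factor exp(-ΔG*/kT). The bulk driving force Δμ and line tension σ are placeholders, and the logarithmic correction from Langer's field theory and the constant term discussed in the paper are omitted.

    ```python
    # Crude CNT sketch for a 2-D droplet of n spins: bulk gain -n*dmu plus a
    # perimeter cost 2*sqrt(pi*n)*sigma. dmu, sigma, and kT are placeholders;
    # the paper's logarithmic and constant corrections are not included.
    import numpy as np

    dmu, sigma, kT = 0.1, 0.7, 1.0

    n = np.arange(1, 2000)
    dG = -n * dmu + 2.0 * np.sqrt(np.pi * n) * sigma

    n_star = int(n[np.argmax(dG)])          # critical droplet size
    barrier = float(dG.max())
    print(f"n* = {n_star}, dG* = {barrier:.2f}, "
          f"exp(-dG*/kT) = {np.exp(-barrier / kT):.3e}")
    ```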

  12. User's Guide for the Agricultural Non-Point Source (AGNPS) Pollution Model Data Generator

    USGS Publications Warehouse

    Finn, Michael P.; Scheidt, Douglas J.; Jaromack, Gregory M.

    2003-01-01

    BACKGROUND: Throughout this user guide, we refer to datasets that we used in conjunction with the development of this software to support cartographic research and to produce the datasets needed to conduct that research. However, this software can be used with these datasets or with more 'generic' versions of data of the appropriate type. For example, throughout the guide, we refer to national land cover data (NLCD) and digital elevation model (DEM) data from the U.S. Geological Survey (USGS) at a 30-m resolution, but any digital terrain model or land cover data at any appropriate resolution will produce results. Another key point to keep in mind is to use a consistent data resolution for all the datasets in each model run. The U.S. Department of Agriculture (USDA) developed the Agricultural Nonpoint Source (AGNPS) pollution model of watershed hydrology in response to the complex problem of managing nonpoint sources of pollution. AGNPS simulates the behavior of runoff, sediment, and nutrient transport from watersheds that have agriculture as their prime use. The model operates on a cell basis and is a distributed-parameter, event-based model. The model requires 22 input parameters. Output parameters are grouped primarily by hydrology, sediment, and chemical output (Young and others, 1995). Elevation, land cover, and soil are the base data from which to extract the 22 input parameters required by AGNPS. For automatic parameter extraction, follow the general process described in this guide: extraction from the geospatial data through the AGNPS Data Generator to generate the input parameters required by the pollution model (Finn and others, 2002).

  13. Modelling strain localization in granular materials using micropolar theory: mathematical formulations

    NASA Astrophysics Data System (ADS)

    Alsaleh, Mustafa I.; Voyiadjis, George Z.; Alshibli, Khalid A.

    2006-12-01

    It has been known that classical continuum mechanics laws fail to describe strain localization in granular materials because of mathematical ill-posedness and mesh dependency. Therefore, a non-local theory with internal length scales is needed to overcome such problems. The micropolar and higher-order gradient theories can be considered good examples for characterizing strain localization in granular materials. The fact that internal length scales are needed requires micromechanical models or laws; however, classical constitutive models can be enhanced through the stress invariants to incorporate micropolar effects. In this paper, Lade's single hardening model is enhanced to account for the couple stress and Cosserat rotation, and the internal length scales are incorporated accordingly. The enhanced Lade's model and its material properties are discussed in detail; then the finite element formulations in the updated Lagrangian (UL) frame are used. The finite element formulations were implemented into a user element subroutine (UEL) for ABAQUS, and the solution method is discussed in the companion paper. The model was found to predict strain localization in granular materials with low dependency on the finite element mesh size. The shear band was found to reflect at a certain angle when it hit a rigid boundary. Applications of the model to plane strain specimens tested in the laboratory are discussed in the companion paper.

  14. User's guide for the stock-recruitment model validation program. Environmental Sciences Division Publication No. 1985

    SciTech Connect

    Christensen, S.W.; Kirk, B.L.; Goodyear, C.P.

    1982-06-01

    SRVAL is a FORTRAN IV computer code designed to aid in assessing the validity of curve-fits of the linearized Ricker stock-recruitment model, modified to incorporate multiple-age spawners and to include an environmental variable, to variously processed annual catch-per-unit effort statistics for a fish population. It is sometimes asserted that curve-fits of this kind can be used to determine the sensitivity of fish populations to such man-induced stresses as entrainment and impingement at power plants. The SRVAL code was developed to test such assertions. It was utilized in testimony written in connection with the Hudson River Power Case (US Environmental Protection Agency, Region II). This testimony was recently published as a NUREG report. Here, a user's guide for SRVAL is presented.
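
    For orientation, the sketch below fits the basic linearized Ricker relation ln(R/S) = a - bS by ordinary least squares, without the multiple-age-spawner or environmental-variable extensions that SRVAL examines; the stock and recruitment values are placeholders, not Hudson River data.

    ```python
    # Sketch of the basic linearized Ricker fit ln(R/S) = a - b*S. The stock (S)
    # and recruitment (R) values below are placeholders for illustration only.
    import numpy as np

    S = np.array([120.0, 200.0, 310.0, 450.0, 600.0, 780.0])   # spawning stock index
    R = np.array([300.0, 420.0, 500.0, 520.0, 480.0, 400.0])   # recruit index

    y = np.log(R / S)
    slope, a = np.polyfit(S, y, 1)          # slope = -b, intercept = a
    print(f"a = {a:.3f}, b = {-slope:.5f}   (R = S * exp(a - b*S))")
    ```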

  15. Fatigue strength reduction model: RANDOM3 and RANDOM4 user manual, appendix 2

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    The FORTRAN programs RANDOM3 and RANDOM4 are documented. They are based on fatigue strength reduction, using a probabilistic constitutive model. They predict the random lifetime of an engine component to reach a given fatigue strength. Included in this user manual are details regarding the theoretical backgrounds of RANDOM3 and RANDOM4. Appendix A gives information on the physical quantities, their symbols, FORTRAN names, and both SI and U.S. Customary units. Appendices B and C include photocopies of the actual computer printout corresponding to the sample problems. Appendices D and E detail the IMSL, Version 10(1), subroutines and functions called by RANDOM3 and RANDOM4 and the SAS/GRAPH(2) programs that can be used to plot both the probability density functions (p.d.f.) and the cumulative distribution functions (c.d.f.).

  16. User-support models at the Isaac Newton Group of Telescopes

    NASA Astrophysics Data System (ADS)

    Benn, Chris

    2012-09-01

    The user support model at the ING telescopes has evolved considerably over the last 20 years, mainly in response to improvements in the reliability and efficiency of the observing systems. Observers at the 4.2-m William Herschel Telescope (WHT) currently get first-night (afternoon + evening) support from staff support astronomers, and all-night support from telescope operators. As of 2010, the telescope operators also provide engineering support at night. Observers at the 2.5-m Isaac Newton Telescope (INT) get first-night support from student support astronomers, but no night-time operator/engineering support. Feedback from observers indicates a continuing high level of satisfaction with the support they receive.

  17. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    SciTech Connect

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  18. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks for a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or implementing a feedback DEDS controller (closed-loop control).
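
    A minimal sketch of such a local model is shown below: a set of states, an event alphabet, a partial transition function, an initial state, and a duration for each event. The submachine, its events, and the timings are hypothetical examples, not taken from the paper.

    ```python
    # Minimal DEDS "local model": states, event alphabet, partial transition
    # function, initial state, and per-event durations. The robot-arm example
    # and all timings are hypothetical.
    class LocalModel:
        def __init__(self, states, events, delta, initial, duration):
            self.states, self.events = set(states), set(events)
            self.delta = delta            # partial map: (state, event) -> next state
            self.state = initial
            self.duration = duration      # event -> time required
            self.clock = 0.0

        def fire(self, event):
            key = (self.state, event)
            if key not in self.delta:
                raise ValueError(f"event {event!r} not enabled in state {self.state!r}")
            self.state = self.delta[key]
            self.clock += self.duration[event]
            return self.state, self.clock

    arm = LocalModel(
        states={"idle", "moving", "holding"},
        events={"grasp", "move", "release"},
        delta={("idle", "grasp"): "holding",
               ("holding", "move"): "moving",
               ("moving", "release"): "idle"},
        initial="idle",
        duration={"grasp": 2.0, "move": 5.0, "release": 1.0},
    )
    print(arm.fire("grasp"), arm.fire("move"), arm.fire("release"))
    ```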

  19. Extending Theory for User-Centered Information Services: Diagnosing and Learning from Error in Complex Statistical Data.

    ERIC Educational Resources Information Center

    Robbin, Alice; Frost-Kumpf, Lee

    1997-01-01

    Extends a theoretical framework for designing effective information services by synthesizing and integrating theory and research derived from multiple approaches in the social and behavioral sciences. These frameworks are applied to develop general design strategies and principles for information systems and services that rely on complex…

  20. User's manual for the Sandia Waste-Isolation Flow and Transport model (SWIFT).

    SciTech Connect

    Reeves, Mark; Cranwell, Robert M.

    1981-11-01

    This report describes a three-dimensional finite-difference model (SWIFT) which is used to simulate flow and transport processes in geologic media. The model was developed for use by the Nuclear Regulatory Commission in the analysis of deep geologic nuclear waste-disposal facilities. This document, as indicated by the title, is a user's manual and is intended to facilitate the use of the SWIFT simulator. Mathematical equations, submodels, application notes, and a description of the program itself are given herein. In addition, a complete input data guide is given along with several appendices which are helpful in setting up a data-input deck. Computer code SWIFT (Sandia Waste Isolation, Flow and Transport Model) is a fully transient, three-dimensional model which solves the coupled equations for transport in geologic media. The processes considered are: (1) fluid flow; (2) heat transport; (3) dominant-species miscible displacement; and (4) trace-species miscible displacement. The first three processes are coupled via fluid density and viscosity. Together they provide the velocity field on which the fourth process depends.
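
    As a highly reduced illustration of the kind of equation SWIFT couples in three dimensions, the sketch below takes explicit finite-difference steps of one-dimensional advection-dispersion for a trace species; the grid, velocity, and dispersion coefficient are arbitrary, and none of SWIFT's coupling of flow, heat, and brine is represented.

    ```python
    # Explicit upwind/central finite-difference steps of 1-D advection-dispersion,
    # c_t + v*c_x = D*c_xx, with a constant-concentration inlet. Parameters are
    # arbitrary placeholders; SWIFT itself is a coupled, transient 3-D simulator.
    import numpy as np

    nx, dx, dt = 100, 1.0, 0.2
    v, D = 1.0, 0.5                          # velocity (m/d), dispersion (m^2/d)
    c = np.zeros(nx); c[0] = 1.0             # inlet boundary condition

    for _ in range(200):
        c_new = c.copy()
        c_new[1:-1] = (c[1:-1]
                       - v * dt / dx * (c[1:-1] - c[:-2])             # upwind advection
                       + D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2]))
        c_new[0], c_new[-1] = 1.0, c_new[-2]                          # boundaries
        c = c_new
    print(np.round(c[::10], 3))
    ```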

  1. Modeling the HIV/AIDS epidemic among injecting drug users and sex workers in Kunming, China.

    PubMed

    Bacaër, Nicolas; Abdurahman, Xamxinur; Ye, Jianli

    2006-04-01

    This paper presents a mathematical model of the HIV/AIDS epidemic in Kunming, the provincial capital of Yunnan, China. The population is divided into several groups, with individuals possibly changing group. Two transmission routes of HIV are considered: needle sharing between injecting drug users (IDUs) and commercial sex between female sex workers (FSWs) and clients. The model includes male IDUs who are also clients and female IDUs who are also FSWs. Groups are split in two--risky and safe--according to condom use and needle sharing. A system of partial differential equations is derived to describe the spread of the disease. For the simulation, parameters are chosen to fit as much as possible data publicly available for Kunming. Some mathematical properties of the model--in particular the epidemic threshold R0, which determines the goal of public health interventions--are also presented. Though the model couples two transmission routes of HIV, the approximation R0 ≈ max[R0(IDU), R0(sex)], with closed formulas for R0(IDU) and R0(sex), appears to be quite good. The critical levels of condom use and clean needle use necessary to stop both the sexual transmission and the transmission among IDUs can therefore be determined independently. PMID:16794944
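
    The sketch below only illustrates the composite-threshold idea, R0 ≈ max[R0(IDU), R0(sex)], using a generic first-order approximation for each route's reproduction number (per-act transmission probability × acts per partner × partners per year × years infectious). The closed formulas derived in the paper are not reproduced, and every parameter value is a placeholder, not a Kunming estimate.

    ```python
    # Illustration of R0 ~= max(R0_IDU, R0_sex). Each route's R0 is approximated
    # as a simple product, valid for small per-act probabilities; all values are
    # placeholders, not the paper's fitted parameters.
    def route_r0(p_per_act, acts_per_partner, partners_per_year, years_infectious):
        return p_per_act * acts_per_partner * partners_per_year * years_infectious

    r0_idu = route_r0(p_per_act=0.006, acts_per_partner=30,
                      partners_per_year=4, years_infectious=8)    # needle sharing
    r0_sex = route_r0(p_per_act=0.001, acts_per_partner=1,
                      partners_per_year=300, years_infectious=8)  # commercial sex

    print(f"R0_IDU = {r0_idu:.2f}, R0_sex = {r0_sex:.2f}, "
          f"R0 ~= {max(r0_idu, r0_sex):.2f}")
    ```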

  2. A user-friendly one-dimensional model for wet volcanic plumes

    USGS Publications Warehouse

    Mastin, Larry G.

    2007-01-01

    This paper presents a user-friendly graphically based numerical model of one-dimensional steady state homogeneous volcanic plumes that calculates and plots profiles of upward velocity, plume density, radius, temperature, and other parameters as a function of height. The model considers effects of water condensation and ice formation on plume dynamics as well as the effect of water added to the plume at the vent. Atmospheric conditions may be specified through input parameters of constant lapse rates and relative humidity, or by loading profiles of actual atmospheric soundings. To illustrate the utility of the model, we compare calculations with field-based estimates of plume height (∼9 km) and eruption rate (>∼4 × 10⁵ kg/s) during a brief tephra eruption at Mount St. Helens on 8 March 2005. Results show that the atmospheric conditions on that day boosted plume height by 1–3 km over that in a standard dry atmosphere. Although the eruption temperature was unknown, model calculations most closely match the observations for a temperature that is below magmatic but above 100°C.
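
    For readers unfamiliar with this class of model, the sketch below integrates the classic Morton-Taylor-Turner equations for a dry Boussinesq plume in a stratified atmosphere, written in flux form (Q ~ volume flux, M ~ momentum flux, F ~ buoyancy flux). It deliberately omits the water and ice thermodynamics that are the point of the paper; the entrainment coefficient, stratification, source fluxes, and step size are placeholders.

    ```python
    # Crude dry-plume rise estimate: dQ/dz = 2*alpha*sqrt(M), dM/dz = F*Q/M,
    # dF/dz = -N^2 * Q, integrated upward until the vertical momentum vanishes.
    # All values are placeholders; this is not the wet model described above.
    import math

    alpha, N = 0.09, 0.012            # entrainment coefficient, buoyancy frequency (1/s)
    dz = 10.0                         # vertical step (m)
    Q, M, F = 1.0e3, 1.0e4, 1.0e5     # arbitrary source fluxes
    z = 0.0
    while M > 0.0 and z < 40_000.0:
        dQ = 2.0 * alpha * math.sqrt(M)       # entrainment of ambient air
        dM = F * Q / M                        # buoyancy drives momentum
        dF = -N**2 * Q                        # stratification erodes buoyancy
        Q, M, F = Q + dQ * dz, M + dM * dz, F + dF * dz
        z += dz
    print(f"approximate rise height: {z / 1000.0:.1f} km")
    ```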

  3. Coarse-grained theory of a realistic tetrahedral liquid model

    NASA Astrophysics Data System (ADS)

    Procaccia, I.; Regev, I.

    2012-02-01

    Tetrahedral liquids such as water and silica melt show unusual thermodynamic behavior, such as a density maximum and an increase in specific heat when cooled to low temperatures. Previous work had shown that Monte Carlo and mean-field solutions of a lattice model can exhibit these anomalous properties with or without a phase transition, depending on the values of the different terms in the Hamiltonian. Here we use a somewhat different approach, in which we start from a very popular empirical model of tetrahedral liquids, the Stillinger-Weber model, and construct a coarse-grained theory that directly quantifies the local structure of the liquid as a function of volume and temperature. We compare the theory to molecular-dynamics simulations and show that the theory can rationalize the simulation results and the anomalous behavior.

  4. HIGHWAY 3. 1: An enhanced HIGHWAY routing model: Program description, methodology, and revised user's manual

    SciTech Connect

    Johnson, P.E.; Joy, D.S. ); Clarke, D.B.; Jacobi, J.M. . Transportation Center)

    1993-03-01

    The HIGHWAY program provides a flexible tool for predicting highway routes for transporting radioactive materials in the United States. The HIGHWAY data base is essentially a computerized road atlas that currently describes over 240,000 miles of highways. Complete descriptions of all Interstate System and most US highways (except those that parallel a nearby Interstate highway) are included in the data base. Many of the principal state highways and a number of local and county highways are also identified. The data base also includes locations of nuclear facilities and major airports. Several different types of routes may be calculated, depending on a set of user-supplied constraints. Routes are calculated by minimizing the total impedance between the origin and the destination. Basically, the impedance is defined as a function of distance and driving time along a particular highway segment. Several routing constraints can be imposed during the calculations. One of the special features of the HIGHWAY model is its ability to calculate routes that maximize use of Interstate System highways. This feature allows the user to predict routes for shipments of radioactive materials that conform to the US Department of Transportation routing regulations. Other features of the model include the ability to predict routes that bypass a specific state, city, town, or highway segment. Two special features have been incorporated in HIGHWAY, version 3.1. The first is the ability to automatically calculate alternative routes. Frequently, there are a number of routes between the source and destination that vary slightly in distance and estimated driving time. The HIGHWAY program offers a selection of different but nearly equal routes. The second special feature is the capability to calculate route-specific population density statistics. The population density distribution is calculated for each highway segment in the route and is reported on a state-by-state basis.
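
    The sketch below illustrates the impedance-minimizing idea described above on a tiny hypothetical network, using a standard Dijkstra shortest-path routine; the network, the weighting of distance against driving time, and the node names are placeholders, not the HIGHWAY database or its actual impedance function.

    ```python
    # Impedance-minimizing routing over a small hypothetical network with
    # networkx's Dijkstra implementation. The edge impedance here is a simple
    # weighted sum of distance and driving time; HIGHWAY's database and
    # impedance function are far more detailed.
    import networkx as nx

    edges = [  # (from, to, miles, hours)
        ("origin", "A", 120, 2.0), ("A", "destination", 150, 2.5),
        ("origin", "B", 100, 2.2), ("B", "destination", 200, 3.5),
        ("A", "B", 40, 0.8),
    ]
    G = nx.Graph()
    for u, v, miles, hours in edges:
        G.add_edge(u, v, impedance=0.5 * miles + 30.0 * hours)   # placeholder weighting

    route = nx.shortest_path(G, "origin", "destination", weight="impedance")
    cost = nx.shortest_path_length(G, "origin", "destination", weight="impedance")
    print(route, round(cost, 1))
    ```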

  5. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    PubMed Central

    Tsai, Chung-Hung

    2014-01-01

    Telehealth has become an increasingly applied solution for delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships hypothesized in the proposed model. The findings indicate that elderly residents generally reported positive perceptions toward the telehealth system. Social capital factors (social trust, institutional trust, and social participation) significantly and positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which in turn influence usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, the proposed model fit the sample data well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities. PMID:24810577

  6. Excellence in Physics Education Award: Modeling Theory for Physics Instruction

    NASA Astrophysics Data System (ADS)

    Hestenes, David

    2014-03-01

    All humans create mental models to plan and guide their interactions with the physical world. Science has greatly refined and extended this ability by creating and validating formal scientific models of physical things and processes. Research in physics education has found that mental models created from everyday experience are largely incompatible with scientific models. This suggests that the fundamental problem in learning and understanding science is coordinating mental models with scientific models. Modeling Theory has drawn on resources of cognitive science to work out extensive implications of this suggestion and guide development of an approach to science pedagogy and curriculum design called Modeling Instruction. Modeling Instruction has been widely applied to high school physics and, more recently, to chemistry and biology, with noteworthy results.

  7. PAL-DS MODEL: THE PAL MODEL INCLUDING DEPOSITION AND SEDIMENTATION. USER'S GUIDE

    EPA Science Inventory

    PAL is an acronym for an air quality model which applies a Gaussian plume diffusion algorithm to point, area, and line sources. The model is available from the U.S. Environmental Protection Agency and can be used for estimating hourly and short-term average concentrations of non-...
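
    As background, the sketch below evaluates the point-source Gaussian plume formula that underlies PAL-type models, C = Q/(2π·u·σy·σz)·exp(-y²/2σy²)·[reflected vertical terms]. The power-law dispersion coefficients are placeholders, not PAL's stability-class curves, and the deposition and sedimentation treatment of PAL-DS is not included.

    ```python
    # Point-source Gaussian plume concentration with ground reflection. The
    # sigma_y(x), sigma_z(x) power laws and all parameters are placeholders;
    # PAL-DS additionally handles area/line sources, deposition, and settling.
    import numpy as np

    def gaussian_plume(x, y, z, Q=1.0, u=5.0, H=50.0):
        """Concentration per unit emission rate at downwind x, crosswind y, height z (m)."""
        sy = 0.08 * x**0.9                   # placeholder sigma_y(x)
        sz = 0.06 * x**0.85                  # placeholder sigma_z(x)
        vert = (np.exp(-(z - H) ** 2 / (2 * sz**2))
                + np.exp(-(z + H) ** 2 / (2 * sz**2)))   # ground reflection
        return Q / (2 * np.pi * u * sy * sz) * np.exp(-y**2 / (2 * sy**2)) * vert

    print(f"{gaussian_plume(x=1000.0, y=0.0, z=0.0):.2e} (g/m^3 per g/s)")
    ```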

  8. REGIONAL OXIDANT MODEL (ROM) USER'S GUIDE, PART 3: THE CORE MODEL

    EPA Science Inventory

    The Regional Oxidant Model (ROM) determines hourly concentrations and fates of ozone and 34 other chemical species over a scale of 1000 km x 1000 km for ozone "episodes" of up to one month's duration. The model structure, based on phenomenological concepts, consists of 3 1/2 layers...

  9. F-theory duals of singular heterotic K3 models

    NASA Astrophysics Data System (ADS)

    Lüdeling, Christoph; Ruehle, Fabian

    2015-01-01

    We study F-theory duals of singular heterotic K3 models that correspond to Abelian toroidal orbifolds T^4/Z_N. While our focus is on the standard embedding, we also comment on models with Wilson lines and more general gauge embeddings. In the process of constructing the duals, we work out a Weierstrass description of the heterotic toroidal orbifold models, which exhibit singularities of Kodaira type I0*, IV*, III*, and II*. This construction unveils properties like the instanton number per fixed point and a correlation between the orbifold order and the multiplicities in the Dynkin diagram. The results from the Weierstrass description are then used to restrict the complex structure of the F-theory Calabi-Yau threefold such that the gauge group and the matter spectrum of the heterotic theories are reproduced. We also comment on previous approaches that have been employed to construct the duality and point out the differences and limitations in our case. Our results show explicitly how the various orbifold models are connected and described in F-theory.

  10. A Model of Statistics Performance Based on Achievement Goal Theory.

    ERIC Educational Resources Information Center

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…

  11. A Proposed Model of Jazz Theory Knowledge Acquisition

    ERIC Educational Resources Information Center

    Ciorba, Charles R.; Russell, Brian E.

    2014-01-01

    The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables yielded coefficients ranging from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…

  12. The Decision/Adoption Model as a Heuristic Program Theory.

    ERIC Educational Resources Information Center

    Tedrick, William E.

    The emerging concept of program theory and its function in program evaluation practice is the central focus of this paper. It appears that the traditional decision/adoption model, when considered in conjunction with the institutionalized beliefs, values, and methodological procedures of the state Cooperative Extension Services, meets the criteria…

  13. Medical Specialty Decision Model: Utilizing Social Cognitive Career Theory

    ERIC Educational Resources Information Center

    Gibson, Denise D.; Borges, Nicole J.

    2004-01-01

    Objectives: The purpose of this study was to develop a working model to explain medical specialty decision-making. Using Social Cognitive Career Theory, we examined personality, medical specialty preferences, job satisfaction, and expectations about specialty choice to create a conceptual framework to guide specialty choice decision-making.…

  14. Application of Health Promotion Theories and Models for Environmental Health

    ERIC Educational Resources Information Center

    Parker, Edith A.; Baldwin, Grant T.; Israel, Barbara; Salinas, Maria A.

    2004-01-01

    The field of environmental health promotion gained new prominence in recent years as awareness of physical environmental stressors and exposures increased in communities across the country and the world. Although many theories and conceptual models are used routinely to guide health promotion and health education interventions, they are rarely…

  15. Using SAS PROC MCMC for Item Response Theory Models

    ERIC Educational Resources Information Center

    Ames, Allison J.; Samonte, Kelli

    2015-01-01

    Interest in using Bayesian methods for estimating item response theory models has grown at a remarkable rate in recent years. This attentiveness to Bayesian estimation has also inspired a growth in available software such as WinBUGS, R packages, BMIRT, MPLUS, and SAS PROC MCMC. This article intends to provide an accessible overview of Bayesian…
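
    For context, the sketch below evaluates the two-parameter logistic (2PL) item response function, P(correct | θ) = 1 / (1 + exp(-a(θ - b))), one of the IRT models whose Bayesian estimation the module demonstrates with SAS PROC MCMC; the discrimination (a) and difficulty (b) values are placeholders.

    ```python
    # Two-parameter logistic (2PL) item response function evaluated on a grid of
    # latent abilities. The item parameters a and b are placeholders; estimating
    # them (e.g., with PROC MCMC or an R package) is the subject of the module.
    import numpy as np

    def p_correct(theta, a, b):
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    theta = np.linspace(-3, 3, 7)            # latent ability grid
    print(np.round(p_correct(theta, a=1.2, b=0.5), 3))
    ```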

  16. An NCME Instructional Module on Polytomous Item Response Theory Models

    ERIC Educational Resources Information Center

    Penfield, Randall David

    2014-01-01

    A polytomous item is one for which the responses are scored according to three or more categories. Given the increasing use of polytomous items in assessment practices, item response theory (IRT) models specialized for polytomous items are becoming increasingly common. The purpose of this ITEMS module is to provide an accessible overview of…

  17. Item Response Theory Models for Performance Decline during Testing

    ERIC Educational Resources Information Center

    Jin, Kuan-Yu; Wang, Wen-Chung

    2014-01-01

    Sometimes, test-takers may not be able to attempt all items to the best of their ability (with full effort) due to personal factors (e.g., low motivation) or testing conditions (e.g., time limit), resulting in poor performances on certain items, especially those located toward the end of a test. Standard item response theory (IRT) models fail to…

  18. Conceptualizations of Creativity: Comparing Theories and Models of Giftedness

    ERIC Educational Resources Information Center

    Miller, Angie L.

    2012-01-01

    This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…

  19. Multilevel Higher-Order Item Response Theory Models

    ERIC Educational Resources Information Center

    Huang, Hung-Yu; Wang, Wen-Chung

    2014-01-01

    In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…

  20. A Model to Demonstrate the Place Theory of Hearing

    ERIC Educational Resources Information Center

    Ganesh, Gnanasenthil; Srinivasan, Venkata Subramanian; Krishnamurthi, Sarayu

    2016-01-01

    In this brief article, the authors discuss Georg von Békésy's experiments showing the existence of traveling waves in the basilar membrane and that maximal displacement of the traveling wave was determined by the frequency of the sound. The place theory of hearing equates the basilar membrane to a frequency analyzer. The model described in this…