Science.gov

Sample records for scission-point model

  1. SPY: A new scission point model based on microscopic ingredients to predict fission fragments properties

    NASA Astrophysics Data System (ADS)

    Lemaître, J.-F.; Dubray, N.; Hilaire, S.; Panebianco, S.; Sida, J.-L.

    2013-12-01

    Our purpose is to determine fission fragment characteristics within the framework of a scission-point model named SPY, for Scission Point Yields. This approach can be considered a theoretical laboratory for studying the fission mechanism, since it gives access to the correlation between fragment properties and their nuclear structure, such as shell corrections, pairing, collective degrees of freedom, and odd-even effects. Which of these are dominant in the final state? What is the impact of the compound-nucleus structure? The SPY model consists of a statistical description of the fission process at the scission point, where the fragments are completely formed and well separated, with fixed properties. The most important feature of the model is that the nuclear structure of the fragments is derived from full quantum microscopic calculations. This approach allows the fission final state of extremely exotic nuclei, inaccessible to most available fission models, to be computed.

  2. New statistical scission-point model to predict fission fragment observables

    NASA Astrophysics Data System (ADS)

    Lemaître, Jean-François; Panebianco, Stefano; Sida, Jean-Luc; Hilaire, Stéphane; Heinrich, Sophie

    2015-09-01

    The development of high performance computing facilities makes possible a massive production of nuclear data in a full microscopic framework. Taking advantage of the individual potential calculations of more than 7000 nuclei, a new statistical scission-point model, called SPY, has been developed. It gives access to the absolute available energy at the scission point, which allows the use of a parameter-free microcanonical statistical description to calculate the distributions and the mean values of all fission observables. SPY uses the richness of microscopy in a rather simple theoretical framework, without any parameter except the scission-point definition, to draw clear answers based on perfect knowledge of the ingredients involved in the model, with very limited computing cost.

  3. SPY: a new scission-point model based on microscopic inputs to predict fission fragment properties

    NASA Astrophysics Data System (ADS)

    Panebianco, Stefano; Dubray, Noël; Goriely, Stéphane; Hilaire, Stéphane; Lemaître, Jean-François; Sida, Jean-Luc

    2014-04-01

    Despite the difficulty of describing the whole fission dynamics, the main fragment characteristics can be determined in a static approach based on a so-called scission-point model. Within this framework, a new Scission-Point model for the calculation of fission fragment Yields (SPY) has been developed. This model, initially based on the approach developed by Wilkins in the late seventies, consists of performing a static energy balance at scission, where the two fragments are supposed to be completely separated so that their macroscopic properties (mass and charge) can be considered as fixed. Given the knowledge of the system state density, averaged quantities such as mass and charge yields and mean kinetic and excitation energies can then be extracted in the framework of a microcanonical statistical description. The main advantage of the SPY model is the introduction of one of the most up-to-date microscopic descriptions of the nucleus for the individual energy of each fragment and, in the future, for their state densities. These quantities are obtained in the framework of HFB calculations using the Gogny nucleon-nucleon interaction, ensuring the overall coherence of the model. Starting from a description of the SPY model and its main features, a comparison between the SPY predictions and experimental data will be discussed for some specific cases, from light nuclei around mercury to major actinides. Moreover, extensive predictions over the whole chart of nuclides will be discussed, with particular attention to their implications for stellar nucleosynthesis. Finally, future developments, mainly concerning the introduction of microscopic state densities, will be briefly discussed.
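
    The energy-balance-plus-statistical-weight logic described above can be sketched in a few lines of Python. The snippet below is a schematic illustration rather than the SPY code: it stands in for the HFB microscopic energies with a simple liquid-drop formula, uses a point-charge Coulomb interaction at an assumed scission distance, and weights each charge split with Fermi-gas state densities at equal energy sharing (all of these ingredients, and the 240Pu example values, are assumptions made for the sketch).

      import math

      def liquid_drop_be(Z, A):
          """Semi-empirical binding energy (MeV); a stand-in for microscopic HFB energies."""
          aV, aS, aC, aA = 15.75, 17.8, 0.711, 23.7
          N = A - Z
          pairing = 11.18 / math.sqrt(A)
          delta = pairing if (Z % 2 == 0 and N % 2 == 0) else (-pairing if (Z % 2 and N % 2) else 0.0)
          return aV*A - aS*A**(2/3) - aC*Z*(Z - 1)/A**(1/3) - aA*(A - 2*Z)**2/A + delta

      def scission_charge_yields(Z, A, d_fm=18.0, e_star=10.0):
          """Relative charge yields from a microcanonical weight evaluated at the scission point."""
          e2 = 1.44                                   # e^2/(4 pi eps0) in MeV fm
          weights = {}
          for Z1 in range(20, Z - 19):                # scan charge splits Z1 = 20 ... Z - 20
              Z2 = Z - Z1
              A1 = round(A * Z1 / Z)                  # unchanged-charge-density assumption
              A2 = A - A1
              q_value = liquid_drop_be(Z1, A1) + liquid_drop_be(Z2, A2) - liquid_drop_be(Z, A)
              e_coulomb = e2 * Z1 * Z2 / d_fm         # fragment-fragment Coulomb energy at scission
              e_avail = q_value + e_star - e_coulomb  # energy available to excite the fragments
              if e_avail <= 0:
                  continue
              a1, a2 = A1 / 8.0, A2 / 8.0             # Fermi-gas level-density parameters
              weights[Z1] = math.exp(2*math.sqrt(a1*e_avail/2) + 2*math.sqrt(a2*e_avail/2))
          norm = sum(weights.values())
          return {z: w / norm for z, w in weights.items()}

      if __name__ == "__main__":
          for z, y in sorted(scission_charge_yields(94, 240).items()):   # 240Pu, illustrative only
              print(z, f"{100 * y:.2f} %")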

  4. Characterization of the scission point from fission-fragment velocities

    NASA Astrophysics Data System (ADS)

    Caamaño, M.; Farget, F.; Delaune, O.; Schmidt, K.-H.; Schmitt, C.; Audouin, L.; Bacri, C.-O.; Benlliure, J.; Casarejos, E.; Derkx, X.; Fernández-Domínguez, B.; Gaudefroy, L.; Golabek, C.; Jurado, B.; Lemasson, A.; Ramos, D.; Rodríguez-Tajes, C.; Roger, T.; Shrivastava, A.

    2015-09-01

    The isotopic yield distributions and kinematic properties of fragments produced in the transfer-induced fission of 240Pu and the fusion-induced fission of 250Cf, with 9 MeV and 45 MeV excitation energy, respectively, were measured in inverse kinematics with the spectrometer VAMOS. The kinematics of the identified fission fragments make it possible to derive properties of the scission configuration such as the distance between fragments, the total kinetic energy, the neutron multiplicity, the total excitation energy, and, for the first time, the proton- and neutron-number sharing during the emergence of the fragments. These properties of the scission point are studied as functions of the fragment atomic number. The correlations between these observables, gathered in one single experiment and for two different fissioning systems at different excitation energies, give valuable information for the understanding and modeling of the fission process.

  5. Nonuniform character of the population of spin projections K for a fissile nucleus at the scission point and anisotropies in the angular distributions of fragments originating from the induced fission of nuclei

    SciTech Connect

    Kadmensky, S. G.; Bunakov, V. E.; Kadmensky, S. S.

    2012-11-15

    It is shown that the emergence of anisotropies in the angular distributions of fragments originating from the spontaneous and induced fission of oriented actinide nuclei is possible only if nonuniformities in the population of the projections M (K) of the fissile-nucleus spin onto the z axis of the laboratory frame (the fissile-nucleus symmetry axis) appear in the vicinity of the scission point, and not in the vicinity of the outer saddle point of the deformation potential. The possibilities for creating the orientation of fissile nuclei for spontaneous and induced fission and the effect of these orientations on the anisotropies under analysis are considered. The role of the Coriolis interaction as a unique source of the mixing of different-K fissile-nucleus states at all stages of the fission process is studied with allowance for the dynamical enhancement of this interaction for excited thermalized states of the nucleus involved, which are characterized by a high energy density. It is shown that the absence of thermalization of excited states of the fissile nucleus, which appear because of the nonadiabaticity of its collective deformation motion in the vicinity of the scission point, is a condition for the conservation of the influence that transition fission states formed at the inner and outer fission barriers exert on the distribution of the spin projections K for low-energy spontaneous nuclear fission. It is confirmed that anisotropies observed in the angular distributions of fragments originating from the fission of nuclei induced by fast light particles (multiply charged ions) are due to the appearance of strongly excited equilibrium (nonequilibrium) states of the fissile nucleus in the vicinity of its scission point that have a Gibbs (non-Gibbs) distribution of projections K.

  6. Application of the dinuclear system model to fission process

    NASA Astrophysics Data System (ADS)

    Andreev, A. V.; Shneidman, T. M.; Ventura, A.

    2016-01-01

    A theoretical evaluation of the collective excitation spectra of a nucleus at large deformations is possible within the framework of the dinuclear system (DNS) model, which treats the wave function of the fissioning nucleus as a superposition of a mononucleus configuration and two-cluster configurations in a dynamical way, permitting the exchange of nucleons between the clusters. In this work, the method of calculating the potential energy and the collective spectrum of the fissioning nucleus at the scission point is presented. Combining the DNS model calculations with the statistical model of fission, we calculate the mass, total kinetic energy, and angular distributions of fission fragments for the neutron-induced fission of 239Pu.

  7. Fission yield calculation using toy model based on Monte Carlo simulation

    SciTech Connect

    Jubaidah; Kurniadi, Rizal

    2015-09-30

    The toy model is a new approximation for predicting fission yield distributions. It treats the nucleus as an elastic toy consisting of marbles, where the number of marbles represents the number of nucleons, A. This toy nucleus is able to imitate the properties of a real nucleus. In this research, the toy nucleons are only influenced by a central force. A heavy toy nucleus induced by a toy nucleon splits into two fragments; these two fission fragments constitute the fission yield. Energy entanglement is neglected. The fission process in the toy model is illustrated by two Gaussian curves intersecting each other. Five Gaussian parameters are used: the scission point of the two curves (R_c), the means of the left and right curves (μ_L, μ_R), and the deviations of the left and right curves (σ_L, σ_R). The fission yield distribution is analyzed with a Monte Carlo simulation. The results show that variation in σ or μ can significantly shift the average frequency of asymmetric fission yields and also changes the range of the fission yield probability distribution. In addition, variation in the iteration coefficient only changes the frequency of fission yields. The Monte Carlo simulation for fission yield calculation using the toy model successfully indicates the same tendency as experimental results, where the average light fission yield is in the range of 90
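
    As a rough sketch of the two-Gaussian picture described above (not the authors' code; the A, μ, σ and event-count values below are arbitrary placeholders), one can sample fragment masses from two intersecting Gaussians, enforce mass conservation, and histogram the resulting yields:

      import random
      from collections import Counter

      A = 236                            # placeholder mass number of the toy compound nucleus
      MU_L, MU_R = 96.0, 140.0           # means of the left and right Gaussian curves
      SIG_L, SIG_R = 6.0, 6.0            # deviations of the left and right curves
      N_EVENTS = 100_000                 # plays the role of the iteration coefficient

      def sample_fragment_mass():
          """Pick one of the two Gaussians at random and draw a fragment mass from it."""
          mu, sigma = (MU_L, SIG_L) if random.random() < 0.5 else (MU_R, SIG_R)
          return max(1, min(A - 1, round(random.gauss(mu, sigma))))

      counts = Counter()
      for _ in range(N_EVENTS):
          a1 = sample_fragment_mass()
          a2 = A - a1                    # the complementary fragment (mass conservation)
          counts[a1] += 1
          counts[a2] += 1

      # Yields in %, normalized to 200% per fission as is conventional for mass yields
      for mass in sorted(counts):
          print(mass, f"{200.0 * counts[mass] / (2 * N_EVENTS):.3f}")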

  8. Fission fragment mass distribution studies in 30Si +180Hf reaction

    NASA Astrophysics Data System (ADS)

    Shamlath, A.; Shareef, M.; Prasad, E.; Sugathan, P.; Thomas, R. G.; Jhingan, A.; Appannababu, S.; Nasirov, A. K.; Vinodkumar, A. M.; Varier, K. M.; Yadav, C.; Babu, B. R. S.; Nath, S.; Mohanto, G.; Mukul, Ish; Singh, D.; Kailas, S.

    2016-01-01

    Fission fragment mass-angle and mass-ratio distributions have been measured for the 30Si + 180Hf reaction in the beam energy range 128-148 MeV. A quasifission signature is observed in this reaction, which forms the compound system 210Rn. The results are compared with those for the very asymmetric reaction 16O + 194Pt, which forms the same compound nucleus. Calculations assuming saddle-point, scission-point, and DNS models have been performed to interpret the experimental results. The results strongly suggest an entrance-channel dependence of quasifission in heavy-ion collisions.

  9. Energy dependence of mass, charge, isotopic, and energy distributions in neutron-induced fission of 235U and 239Pu

    NASA Astrophysics Data System (ADS)

    Pasca, H.; Andreev, A. V.; Adamian, G. G.; Antonenko, N. V.; Kim, Y.

    2016-05-01

    The mass, charge, isotopic, and kinetic-energy distributions of fission fragments are studied within an improved scission-point statistical model for the reactions 235U+n and 239Pu+n at different energies of the incident neutron. The charge and mass distributions of the electromagnetic- and neutron-induced fission of 214,218Ra and 230,232,238U are also shown. The available experimental data are well reproduced, and the energy dependencies of the observable characteristics of fission are predicted for future experiments.

  10. Asymmetry of fission fragment mass distribution for Po and Ir isotopes

    NASA Astrophysics Data System (ADS)

    Andreev, A. V.; Adamian, G. G.; Antonenko, N. V.

    2016-03-01

    Using the improved scission-point model, mass distributions are calculated for the induced fission of several Po and Ir isotopes. The calculated mass distributions and mean total kinetic energies of the fission fragments are compared with the existing experimental data. The revealed coexistence of symmetric and asymmetric modes in the β-delayed fission of 194,196At is in agreement with the experimental observations. The change of the shape of the mass distribution of fissioning Ir isotopes with increasing mass number A, from asymmetric for 185Ir to symmetric for 193Ir, is found.

  11. Modeling modeling.

    PubMed Central

    Killeen, P R

    1999-01-01

    Models are tools; they need to fit both the hand and the task. Presence or absence of a feature such as a pacemaker or a cascade is not in itself good. Or bad. Criteria for model evaluation involve benefit-cost ratios, with the numerator a function of the range of phenomena explained, goodness of fit, consistency with other nearby models, and intangibles such as beauty. The denominator is a function of complexity, the number of phenomena that must be ignored, and the effort necessary to incorporate the model into one's parlance. Neither part of the ratio can yet be evaluated for MTS, whose authors provide some cogent challenges to SET. PMID:10220934

  12. Comparative study of the fragments' mass and energy characteristics in the spontaneous fission of 238Pu, 240Pu and 242Pu and in the thermal-neutron-induced fission of 239Pu

    NASA Astrophysics Data System (ADS)

    Schillebeeckx, P.; Wagemans, C.; Deruytter, A. J.; Barthélémy, R.

    1992-08-01

    The energy and mass distributions and their correlations have been studied for the spontaneous fission of 238,240,242Pu and for the thermal-neutron-induced fission of 239Pu. A comparison of 240Pu(s.f.) and 239Pu(nth,f) shows that the increase in excitation energy mainly results in an increase of the intrinsic excitation energy. A comparison of the results for 238Pu, 240Pu and 242Pu(s.f.) demonstrates the occurrence of different fission modes with varying relative probability. These results are discussed in terms of the scission-point model as well as in terms of the fission-channel model with random neck rupture.

  13. Study of Fission Barrier Heights of Uranium Isotopes by the Macroscopic-Microscopic Method

    NASA Astrophysics Data System (ADS)

    Zhong, Chun-Lai; Fan, Tie-Shuan

    2014-09-01

    Potential energy surfaces of uranium nuclei in the range of mass numbers 229 through 244 are investigated in the framework of the macroscopic-microscopic model and the heights of static fission barriers are obtained in terms of a double-humped structure. The macroscopic part of the nuclear energy is calculated according to the Lublin-Strasbourg drop (LSD) model. Shell and pairing corrections as the microscopic part are calculated with a folded-Yukawa single-particle potential. The calculation is carried out in a five-dimensional parameter space of the generalized Lawrence shapes. In order to extract saddle points on the potential energy surface, a new algorithm which can effectively find an optimal fission path leading from the ground state to the scission point is developed. The comparison of our results with available experimental data and others' theoretical results confirms the reliability of our calculations.
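
    The path-finding step mentioned above can be illustrated with a generic minimax-path search (an assumed stand-in, since the abstract does not specify the authors' algorithm): on a discretized potential-energy surface, the barrier along the optimal fission path is the smallest achievable value of the highest energy that any ground-state-to-scission path must cross, which a Dijkstra-like search finds directly.

      import heapq

      def minimax_barrier(pes, start, goal):
          """Lowest achievable 'highest energy crossed' between start and goal on a 2D grid
          of potential energies; this value is a proxy for the saddle-point height."""
          rows, cols = len(pes), len(pes[0])
          best = [[float("inf")] * cols for _ in range(rows)]
          r0, c0 = start
          best[r0][c0] = pes[r0][c0]
          heap = [(pes[r0][c0], r0, c0)]
          while heap:
              emax, r, c = heapq.heappop(heap)
              if (r, c) == goal:
                  return emax
              if emax > best[r][c]:
                  continue
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < rows and 0 <= nc < cols:
                      cand = max(emax, pes[nr][nc])      # path cost = highest energy met so far
                      if cand < best[nr][nc]:
                          best[nr][nc] = cand
                          heapq.heappush(heap, (cand, nr, nc))
          return float("inf")

      # Tiny synthetic surface: ground-state well (left), barrier ridge (middle), scission valley (right)
      pes = [
          [0.0, 2.0, 6.0, 3.0, 1.0],
          [0.5, 3.0, 5.0, 2.5, 0.5],
          [1.0, 4.0, 7.0, 4.0, 1.5],
      ]
      print("saddle energy along the optimal path:", minimax_barrier(pes, start=(0, 0), goal=(1, 4)))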

  14. Visualizing Nuclear Scission through a Multifield Extension of Topological Analysis.

    PubMed

    Duke, D; Carr, H; Knoll, A; Schunck, N; Nam, Hai Ah; Staszczak, A

    2012-12-01

    In nuclear science, density functional theory (DFT) is a powerful tool to model the complex interactions within the atomic nucleus, and is the primary theoretical approach used by physicists seeking a better understanding of fission. However, DFT simulations result in complex multivariate datasets in which it is difficult to locate the crucial `scission' point at which one nucleus fragments into two, and to identify the precursors to scission. The Joint Contour Net (JCN) has recently been proposed as a new data structure for the topological analysis of multivariate scalar fields, analogous to the contour tree for univariate fields. This paper reports the analysis of DFT simulations using the JCN, the first application of the JCN technique to real data. It makes three contributions to visualization: (i) a set of practical methods for visualizing the JCN, (ii) new insight into the detection of nuclear scission, and (iii) an analysis of aesthetic criteria to drive further work on representing the JCN. PMID:26357109
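
    A minimal illustration of the JCN's first step, joint quantization followed by fragment extraction, is sketched below (a toy version written for this listing, not the authors' implementation; the two synthetic fields merely imitate proton- and neutron-density-like inputs):

      import numpy as np
      from scipy import ndimage

      def joint_fragments(field_a, field_b, bins=4):
          """Quantize two scalar fields into slabs and label connected regions that share
          the same (slab_a, slab_b) pair -- a crude version of JCN fragment construction."""
          edges = lambda f: np.linspace(f.min(), f.max(), bins + 1)[1:-1]
          qa, qb = np.digitize(field_a, edges(field_a)), np.digitize(field_b, edges(field_b))
          labels = np.zeros(field_a.shape, dtype=int)
          next_label = 0
          for pair in {(a, b) for a, b in zip(qa.ravel(), qb.ravel())}:
              mask = (qa == pair[0]) & (qb == pair[1])
              comp, n = ndimage.label(mask)          # connected components within this slab pair
              labels[mask] = comp[mask] + next_label
              next_label += n
          return labels, next_label

      # Two synthetic "density" fields: a pair of blobs joined by a neck
      y, x = np.mgrid[0:64, 0:128]
      blob = lambda cx: np.exp(-(((x - cx) / 18.0) ** 2 + ((y - 32) / 14.0) ** 2))
      proton_like, neutron_like = blob(40) + blob(88), blob(44) + blob(84)
      labels, n_fragments = joint_fragments(proton_like, neutron_like, bins=5)
      print("number of joint-contour fragments:", n_fragments)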

  15. Asymmetrical fission and statistical emission of complex fragments from the highly excited 47V compound nucleus

    SciTech Connect

    Beck, C.; Djerroud, B.; Haas, F.; Freeman, R.M.; Hachem, A.; Heusch, B.; Morsad, A.; Vuillet-A-Crilles, M.; Youlal, M.; Abe, Y.; Dayras, R.; Wieleczko, J.P.; Legrain, R.; Pollaco, E.; Ray, A.; Shapira, D.; Campo, J.G.D.; Kim, H.J.; Cavallaro, S.; De Fillippo, E.; Lanzano, G.; Pagano, A.; Sperduto, M.L.; Matsuse, T.; Sanders, S.J.

    1991-12-31

    The properties of the fully damped (deep-inelastic and orbiting) and fusion (evaporation and fission) processes have been investigated in three entrance channels leading to the same 47V compound nucleus at high excitation energies. No entrance channel effect has been observed in either the evaporation residue or the fission-like yields, in contrast to the 28Si + 12C and 24Mg + 16O reactions in which the orbiting process still persists. The asymmetrical elemental distributions of the fully energy-relaxed fragments are well described by fusion-fission models based respectively on the scission-point and saddle-point pictures. Finally, a general discussion of the competition between orbiting and fusion-fission mechanisms in light heavy-ion reactions is presented in the framework of their calculated number of available open channels.

  16. Models, Fiction, and Fictional Models

    NASA Astrophysics Data System (ADS)

    Liu, Chuang

    2014-03-01

    The following sections are included:
    * Introduction
    * Why Most Models in Science Are Not Fictional
    * Typically Fictional Models in Science
    * Modeling the Unobservable
    * Fictional Models for the Unobservable?
    * References

  17. Mental Models, Conceptual Models, and Modelling.

    ERIC Educational Resources Information Center

    Greca, Ileana Maria; Moreira, Marco Antonio

    2000-01-01

    Reviews science education research into representations constructed by students in their interactions with the world, its phenomena, and artefacts. Features discussions of mental models, conceptual models, and the activity of modeling. (Contains 30 references.) (Author/WRM)

  18. MODEL DEVELOPMENT - DOSE MODELS

    EPA Science Inventory

    Model Development

    Humans are exposed to mixtures of chemicals from multiple pathways and routes. These exposures may result from a single event or may accumulate over time if multiple exposure events occur. The traditional approach of assessing risk from a single chemica...

  19. Promoting Models

    NASA Astrophysics Data System (ADS)

    Li, Qin; Zhao, Yongxin; Wu, Xiaofeng; Liu, Si

    There can be multitudinous models specifying aspects of the same system, each with a bias towards one aspect. These models often overlap in specific aspects even though they are expressed differently. A specification written in one model can be refined by introducing additional information from other models. The paper proposes the concept of promoting models, a methodology for obtaining refinements with support from cooperating models. It refines a primary model by integrating information from a secondary model. The promotion principle is not merely an academic point, but also a reliable and robust engineering technique which can be used to develop software and hardware systems. It can also check the consistency between two specifications from different models. A case of modeling a simple online shopping system with the cooperation of the guarded design model and the CSP model illustrates the practicability of the promotion principle.

  1. Ionospheric modeling

    NASA Astrophysics Data System (ADS)

    Dandekar, B. S.

    1982-01-01

    The purpose of this report is to familiarize users of ionospheric models with the options presently available for ionospheric prediction and specification. Two types of ionospheric models are available: numerical-phenomenological and theoretical models. Of the numerical type, the ITS-78, IONCAP, and Bent models are discussed. For the theoretical models, the main concern is the number of parameters included in the model. Nine available ionospheric models are summarized. The differences and limitations of these models are compared and tabulated. This information will help a user make a judicious selection of an ionospheric model to satisfy specific needs. The sources for obtaining the programs for these models are listed for ready reference.

  2. Models, Part IV: Inquiry Models.

    ERIC Educational Resources Information Center

    Callison, Daniel

    2002-01-01

    Discusses models for information skills that include inquiry-oriented activities. Highlights include WebQuest, which uses Internet resources supplemented with videoconferencing; Minnesota's Inquiry Process based on the Big Six model for information problem-solving; Indiana's Student Inquiry Model; constructivist learning models for inquiry; and…

  3. [Malaria modelling].

    PubMed

    Rogier, C; Sallet, G

    2004-01-01

    The purpose of this article is to describe various models used for the study of malaria. The type of model depends on the focus of the study. Until now, most models have been designed to study malaria transmission. In addition to giving a basic description of a few classic models, we show how simulation can be useful despite the inherent simplicity of the models. In our opinion it is urgent to develop models for use in public health based on recent advances in computing, automation, and mathematical techniques. PMID:15224566

  4. Supermatrix models

    SciTech Connect

    Yost, S.A.

    1991-05-01

    Random matrix models based on an integral over supermatrices are proposed as a natural extension of bosonic matrix models. The subtle nature of superspace integration allows these models to have very different properties from the analogous bosonic models. Two choices of integration slice are investigated. One leads to a perturbative structure which is reminiscent of, and perhaps identical to, the usual Hermitian matrix models. Another leads to an eigenvalue reduction which can be described by a two-component plasma in one dimension. A stationary point of the model is described.

  5. ENTRAINMENT MODELS

    EPA Science Inventory

    This presentation presented information on entrainment models. Entrainment models use entrainment hypotheses to express the continuity equation. The advantage is that plume boundaries are known. A major disadvantage is that the problems that can be solved are rather simple. The ...

  6. MODELS - 3

    EPA Science Inventory

    Models-3 is a third generation air quality modeling system that contains a variety of tools to perform research and analysis of critical environmental questions and problems. These tools provide regulatory analysts and scientists with quicker results, greater scientific accuracy ...

  7. Turbulence modeling

    NASA Technical Reports Server (NTRS)

    Rubesin, Morris W.

    1987-01-01

    Recent developments at several levels of statistical turbulence modeling applicable to aerodynamics are briefly surveyed. Emphasis is on examples of model improvements for transonic, two-dimensional flows. Experience with the development of these improved models is cited to suggest methods of accelerating the modeling process necessary to keep abreast of the rapid movement of computational fluid dynamics into the computation of complex three-dimensional flows.

  8. Phoenix model

    EPA Science Inventory

    Phoenix (formerly referred to as the Second Generation Model or SGM) is a global general equilibrium model designed to analyze energy-economy-climate related questions and policy implications in the medium- to long-term. This model disaggregates the global economy into 26 industr...

  9. Radiation Models

    ERIC Educational Resources Information Center

    James, W. G. G.

    1970-01-01

    Discusses the historical development of both the wave and the corpuscular photon model of light. Suggests that students should be informed that the two models are complementary and that each model successfully describes a wide range of radiation phenomena. Cites 19 references which might be of interest to physics teachers and students. (LC)

  10. Hydrological models are mediating models

    NASA Astrophysics Data System (ADS)

    Babel, L. V.; Karssenberg, D.

    2013-08-01

    Despite the increasing role of models in hydrological research and decision-making processes, only a few accounts of the nature and function of models exist in hydrology. Earlier considerations have traditionally been conducted while making a clear distinction between physically-based and conceptual models. A new philosophical account, primarily based on the fields of physics and economics, transcends classes of models and scientific disciplines by considering models as "mediators" between theory and observations. The core of this approach lies in identifying models as (1) being only partially dependent on theory and observations, (2) integrating non-deductive elements in their construction, and (3) carrying the role of instruments of scientific enquiry about both theory and the world. The applicability of this approach to hydrology is evaluated in the present article. Three widely used hydrological models, each showing a different degree of apparent physicality, are confronted with the main characteristics of the "mediating models" concept. We argue that irrespective of their kind, hydrological models depend on both theory and observations, rather than merely on one of these two domains. Their construction additionally involves a large number of miscellaneous, external ingredients, such as past experiences, model objectives, knowledge and preferences of the modeller, as well as hardware and software resources. We show that hydrological models convey the role of instruments in scientific practice by mediating between theory and the world. It results from these considerations that the traditional distinction between physically-based and conceptual models is too simplistic and refers at best to the stage at which theory and observations are steering model construction. The large variety of ingredients involved in model construction deserves closer attention, as it is rarely explicitly presented in the peer-reviewed literature. We believe that devoting more importance to identifying and communicating the many factors involved in model development might increase the transparency of model building.

  11. Model Experiments and Model Descriptions

    NASA Technical Reports Server (NTRS)

    Jackman, Charles H.; Ko, Malcolm K. W.; Weisenstein, Debra; Scott, Courtney J.; Shia, Run-Lie; Rodriguez, Jose; Sze, N. D.; Vohralik, Peter; Randeniya, Lakshman; Plumb, Ian

    1999-01-01

    The Second Workshop on Stratospheric Models and Measurements (M&M II) is the continuation of the effort previously started in the first Workshop (M&M I, Prather and Remsberg [1993]) held in 1992. As originally stated, the aim of M&M is to provide a foundation for establishing the credibility of stratospheric models used in environmental assessments of the ozone response to chlorofluorocarbons, aircraft emissions, and other climate-chemistry interactions. To accomplish this, a set of measurements of the present-day atmosphere was selected. The intent was that successful simulation of this set of measurements should become the prerequisite for the acceptance of these models as having reliable predictions of future ozone behavior. This section is divided into two parts: model experiments and model descriptions. In the model experiments, participants were given the charge to design a number of experiments that would use observations to test whether models are using the correct mechanisms to simulate the distributions of ozone and other trace gases in the atmosphere. The purpose is closely tied to the need to reduce the uncertainties in the model-predicted responses of stratospheric ozone to perturbations. The specifications for the experiments were sent out to the modeling community in June 1997. Twenty-eight modeling groups responded to the requests for input. The first part of this section discusses the different modeling groups, along with the experiments performed. The second part gives brief descriptions of each model as provided by the individual modeling groups.

  12. ICRF modelling

    SciTech Connect

    Phillips, C.K.

    1985-12-01

    This lecture provides a survey of the methods used to model fast magnetosonic wave coupling, propagation, and absorption in tokamaks. The validity and limitations of three distinct types of modelling codes are contrasted: discrete models which utilize ray tracing techniques, approximate continuous field models based on a parabolic approximation of the wave equation, and full field models derived using finite difference techniques. Inclusion of mode conversion effects in these models and modification of the minority distribution function will also be discussed. The lecture will conclude with a presentation of time-dependent global transport simulations of ICRF-heated tokamak discharges obtained in conjunction with the ICRF modelling codes. 52 refs., 15 figs.

  13. Ventilation Model

    SciTech Connect

    H. Yang

    1999-11-04

    The purpose of this analysis and model report (AMR) for the Ventilation Model is to analyze the effects of pre-closure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts and provide heat removal data to support EBS design. It will also provide input data (initial conditions, and time varying boundary conditions) for the EBS post-closure performance assessment and the EBS Water Distribution and Removal Process Model. The objective of the analysis is to develop, describe, and apply calculation methods and models that can be used to predict thermal conditions within emplacement drifts under forced ventilation during the pre-closure period. The scope of this analysis includes: (1) Provide a general description of effects and heat transfer process of emplacement drift ventilation. (2) Develop a modeling approach to simulate the impacts of pre-closure ventilation on the thermal conditions in emplacement drifts. (3) Identify and document inputs to be used for modeling emplacement ventilation. (4) Perform calculations of temperatures and heat removal in the emplacement drift. (5) Address general considerations of the effect of water/moisture removal by ventilation on the repository thermal conditions. The numerical modeling in this document will be limited to heat-only modeling and calculations. Only a preliminary assessment of the heat/moisture ventilation effects and modeling method will be performed in this revision. Modeling of moisture effects on heat removal and emplacement drift temperature may be performed in the future.

  14. Climate Models

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.

    2012-01-01

    Climate models is a very broad topic, so a single volume can only offer a small sampling of relevant research activities. This volume of 14 chapters includes descriptions of a variety of modeling studies for a variety of geographic regions by an international roster of authors. The climate research community generally uses the rubric climate models to refer to organized sets of computer instructions that produce simulations of climate evolution. The code is based on physical relationships that describe the shared variability of meteorological parameters such as temperature, humidity, precipitation rate, circulation, radiation fluxes, etc. Three-dimensional climate models are integrated over time in order to compute the temporal and spatial variations of these parameters. Model domains can be global or regional and the horizontal and vertical resolutions of the computational grid vary from model to model. Considering the entire climate system requires accounting for interactions between solar insolation, atmospheric, oceanic and continental processes, the latter including land hydrology and vegetation. Model simulations may concentrate on one or more of these components, but the most sophisticated models will estimate the mutual interactions of all of these environments. Advances in computer technology have prompted investments in more complex model configurations that consider more phenomena interactions than were possible with yesterday's computers. However, not every attempt to add to the computational layers is rewarded by better model performance. Extensive research is required to test and document any advantages gained by greater sophistication in model formulation. One purpose for publishing climate model research results is to present purported advances for evaluation by the scientific community.

  15. Phenomenological models

    SciTech Connect

    Braby, L.A.

    1990-09-01

    The biological effects of ionizing radiation exposure are the result of a complex sequence of physical, chemical, biochemical, and physiological interactions. One way to begin a search for an understanding of health effects of radiation is through the development of phenomenological models of the response. Many models have been presented and tested in the slowly evolving process of characterizing cellular response. A range of models covering different endpoints and phenomena has developed in parallel. Many of these models employ similar assumptions about some underlying processes while differing about the nature of others. An attempt is made to organize many of the models into groups with similar features and to compare the consequences of those features with the actual experimental observations. It is assumed that by showing that some assumptions are inconsistent with experimental observations, the job of devising and testing mechanistic models can be simplified. 43 refs., 13 figs.

  16. Calorimetry modeling

    SciTech Connect

    Robinson, C.E.

    1990-01-01

    A heat-flow calorimeter has been modeled on a Compaq PC, using the Algor Heat Transfer Modeling and Analysis Program, Algor Interactive Systems, Inc., Pittsburgh, PA. Employed in this application of the Algor finite element analysis program are two-dimensional axisymmetric thermal conductivity elements. The development of a computer calorimeter modeling program allows for the testing of new materials and techniques without actual fabrication of the calorimeter. 2 figs.

  17. Cloud Modeling

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Moncrieff, Mitchell; Einaud, Franco (Technical Monitor)

    2001-01-01

    Numerical cloud models have been developed and applied extensively to study cloud-scale and mesoscale processes during the past four decades. The distinctive aspect of these cloud models is their ability to treat explicitly (or resolve) cloud-scale dynamics. This requires the cloud models to be formulated from the non-hydrostatic equations of motion that explicitly include the vertical acceleration terms since the vertical and horizontal scales of convection are similar. Such models are also necessary in order to allow gravity waves, such as those triggered by clouds, to be resolved explicitly. In contrast, the hydrostatic approximation, usually applied in global or regional models, does not allow such waves to be resolved explicitly. In addition, the availability of exponentially increasing computer capabilities has resulted in time integrations increasing from hours to days, domain grid boxes (points) increasing from less than 2,000 to more than 2,500,000 grid points with 500 to 1000 m resolution, and 3-D models becoming increasingly prevalent. The cloud resolving model is now at a stage where it can provide reasonably accurate statistical information of the sub-grid, cloud-resolving processes poorly parameterized in climate models and numerical prediction models.
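
    The distinction drawn above between non-hydrostatic and hydrostatic formulations comes down to whether the vertical acceleration is retained in the vertical momentum equation; in standard textbook notation (shown here only as a reminder, not quoted from the chapter):

      \frac{Dw}{Dt} = -\frac{1}{\rho}\,\frac{\partial p}{\partial z} - g
          \qquad \text{(non-hydrostatic: vertical acceleration retained)}

      0 = -\frac{1}{\rho}\,\frac{\partial p}{\partial z} - g
          \;\Longleftrightarrow\;
          \frac{\partial p}{\partial z} = -\rho g
          \qquad \text{(hydrostatic approximation)}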

  18. Ventilation Model

    SciTech Connect

    V. Chipman

    2002-10-05

    The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, or the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the ''Multiscale Thermohydrologic Model'' (BSC 2001), use the wall heat fractions as outputted from the Ventilation Model to initialize their post-closure analyses. The Ventilation Model report was initially developed to analyze the effects of preclosure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts, and to provide heat removal data to support EBS design. Revision 00 of the Ventilation Model included documentation of the modeling results from the ANSYS-based heat transfer model. The purposes of Revision 01 of the Ventilation Model are: (1) To validate the conceptual model for preclosure ventilation of emplacement drifts and verify its numerical application in accordance with new procedural requirements as outlined in AP-SIII-10Q, Models (Section 7.0). (2) To satisfy technical issues posed in KTI agreement RDTME 3.14 (Reamer and Williams 2001a). Specifically to demonstrate, with respect to the ANSYS ventilation model, the adequacy of the discretization (Section 6.2.3.1), and the downstream applicability of the model results (i.e. wall heat fractions) to initialize post-closure thermal models (Section 6.6). (3) To satisfy the remainder of KTI agreement TEF 2.07 (Reamer and Williams 2001b). Specifically to provide the results of post-test ANSYS modeling of the Atlas Facility forced convection tests (Section 7.1.2). This portion of the model report also serves as a validation exercise per AP-SIII.10Q, Models, for the ANSYS ventilation model. (4) To further satisfy KTI agreements RDTME 3.01 and 3.14 (Reamer and Williams 2001a) by providing the source documentation referred to in the KTI Letter Report, ''Effect of Forced Ventilation on Thermal-Hydrologic Conditions in the Engineered Barrier System and Near Field Environment'' (Williams 2002). Specifically to provide the results of the MULTIFLUX model which simulates the coupled processes of heat and mass transfer in and around waste emplacement drifts during periods of forced ventilation. This portion of the model report is presented as an Alternative Conceptual Model with a numerical application, and also provides corroborative results used for model validation purposes (Section 6.3 and 6.4).
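
    The central bookkeeping quantity described above, the wall heat fraction, follows from a simple energy balance. A minimal sketch is given below; the numerical values are placeholders invented for illustration, not design values from the report:

      def ventilation_split(q_decay_w_per_m, drift_length_m, m_dot_kg_s, cp_j_kg_k, t_in_c, t_out_c):
          """Split decay heat into the fraction removed by the ventilation air and the
          remaining wall heat fraction conducted into the surrounding rock."""
          q_vent = m_dot_kg_s * cp_j_kg_k * (t_out_c - t_in_c)   # W carried away by the air stream
          q_decay = q_decay_w_per_m * drift_length_m             # W generated by radionuclide decay
          removed = min(1.0, q_vent / q_decay)
          return removed, 1.0 - removed                          # (ventilation fraction, wall heat fraction)

      # Placeholder example: 1.45 kW/m of decay heat, ~18 kg/s of air heated by 30 C, 600 m drift
      vent, wall = ventilation_split(q_decay_w_per_m=1450.0, drift_length_m=600.0,
                                     m_dot_kg_s=18.0, cp_j_kg_k=1006.0, t_in_c=25.0, t_out_c=55.0)
      print(f"heat removed by ventilation: {vent:.2f}, wall heat fraction: {wall:.2f}")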

  19. Model Selection for Geostatistical Models

    SciTech Connect

    Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the inclusion or exclusion of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.
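
    The comparison described above, AIC computed with and without spatial correlation, can be sketched on a small simulated dataset (a generic illustration, not the authors' procedure; the exponential covariance with a fixed, assumed-known range parameter is a simplification):

      import numpy as np
      from scipy.stats import multivariate_normal
      from scipy.spatial.distance import cdist

      def fit_gls(y, X, corr):
          """Generalized least squares under a fixed correlation matrix; returns (beta, log-likelihood)."""
          corr_inv = np.linalg.inv(corr)
          beta = np.linalg.solve(X.T @ corr_inv @ X, X.T @ corr_inv @ y)
          resid = y - X @ beta
          sigma2 = float(resid @ corr_inv @ resid) / len(y)       # profile MLE of the variance
          return beta, multivariate_normal(mean=X @ beta, cov=sigma2 * corr).logpdf(y)

      def aic(loglik, k):
          return 2 * k - 2 * loglik

      rng = np.random.default_rng(0)
      n = 80
      coords = rng.uniform(0, 10, size=(n, 2))                    # sampling locations
      X = np.column_stack([np.ones(n), rng.normal(size=n)])       # intercept + one covariate
      spatial_corr = np.exp(-cdist(coords, coords) / 2.0)         # exponential correlation, range 2.0
      y = X @ np.array([1.0, 0.5]) + rng.multivariate_normal(np.zeros(n), spatial_corr)

      _, ll_indep = fit_gls(y, X, np.eye(n))                      # spatial correlation ignored
      _, ll_spatial = fit_gls(y, X, spatial_corr)                 # correlation included (range assumed known)
      # k = 2 regression coefficients + 1 variance (an estimated range would add one more)
      print("AIC ignoring spatial correlation:", round(aic(ll_indep, 3), 1))
      print("AIC with exponential correlation:", round(aic(ll_spatial, 3), 1))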

  20. Modeling Sunspots

    ERIC Educational Resources Information Center

    Oh, Phil Seok; Oh, Sung Jin

    2013-01-01

    Modeling in science has been studied by education researchers for decades and is now being applied broadly in school. It is among the scientific practices featured in the "Next Generation Science Standards" ("NGSS") (Achieve Inc. 2013). This article describes modeling activities in an extracurricular science club in a high school.

  1. Turbulence modeling

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.

    1995-01-01

    The objective of this work is to develop, verify, and incorporate the baseline two-equation turbulence models which account for the effects of compressibility into the three-dimensional Reynolds averaged Navier-Stokes (RANS) code and to provide documented descriptions of the models and their numerical procedures so that they can be implemented into 3-D CFD codes for engineering applications.

  2. Phonological Models.

    ERIC Educational Resources Information Center

    Ballard, W.L.

    1968-01-01

    The article discusses models of synchronic and diachronic phonology and suggests changes in them. The basic generative model of phonology is outlined with the author's reinterpretations. The systematic phonemic level is questioned in terms of its unreality with respect to linguistic performance and its lack of validity with respect to historical…

  3. Qualitative modeling.

    PubMed

    Forbus, Kenneth D

    2011-07-01

    Qualitative modeling concerns the representations and reasoning that people use to understand continuous aspects of the world. Qualitative models formalize everyday notions of causality and provide accounts of how to ground symbolic, relational representations in perceptual processes. This article surveys the basic ideas of qualitative modeling and their applications from a cognitive science perspective. It describes the basic principles of qualitative modeling, and a variety of qualitative representations that have been developed for quantities and for relationships between them, providing a kind of qualitative mathematics. Three ontological frameworks for organizing modeling knowledge (processes, components, and field) are summarized, along with research on automatically assembling models for particular tasks from such knowledge. Qualitative simulation and how it carves up time into meaningful units is discussed. We discuss several accounts of causal reasoning about dynamical systems, based on different choices of qualitative mathematics and ontology. Qualitative spatial reasoning is explored, both in terms of relational systems and visual reasoning. Applications of qualitative models of particular interest to cognitive scientists are described, including how they have been used to capture the expertise of scientists and engineers and how they have been used in education. Open questions and frontiers are also discussed, focusing on relationships between ideas developed in the qualitative modeling community and other areas of cognitive science. WIREs Cogn Sci 2011 2 374-391 DOI: 10.1002/wcs.115 For further resources related to this article, please visit the WIREs website. PMID:26302198
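
    A tiny example of the "qualitative mathematics" mentioned above, written as a generic sign algebra rather than any particular system from the article: quantities carry only the signs '+', '0', '-', or '?' (unknown), and arithmetic propagates those signs.

      def q_add(a, b):
          """Qualitative addition: the sign of a sum is ambiguous when the signs oppose."""
          if a == '0':
              return b
          if b == '0':
              return a
          return a if a == b else '?'

      def q_mul(a, b):
          """Qualitative multiplication follows the usual sign rules."""
          if '0' in (a, b):
              return '0'
          if '?' in (a, b):
              return '?'
          return '+' if a == b else '-'

      # Net flow into a tank = inflow ('+') plus outflow contribution ('-'):
      print(q_add('+', '-'))   # '?'  -- the level could rise, fall, or stay constant
      print(q_mul('-', '-'))   # '+'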

  4. Modeling Sunspots

    ERIC Educational Resources Information Center

    Oh, Phil Seok; Oh, Sung Jin

    2013-01-01

    Modeling in science has been studied by education researchers for decades and is now being applied broadly in school. It is among the scientific practices featured in the "Next Generation Science Standards" ("NGSS") (Achieve Inc. 2013). This article describes modeling activities in an extracurricular science club in a high…

  5. Budget Model.

    ERIC Educational Resources Information Center

    Washington State Board for Community Coll. Education, Olympia.

    Computerized formula-driven budget models are used by the Washington community college system to define resource needs for legislative budget requests and to distribute legislative appropriations among 22 community college districts. This manual outlines the sources of information needed to operate the model and illustrates the principles on which…

  6. Dispersion Modeling.

    ERIC Educational Resources Information Center

    Budiansky, Stephen

    1980-01-01

    This article discusses the need for more accurate and complete input data and field verification of the various models of air pollutant dispersion. Consideration should be given to changing the form of air quality standards based on enhanced dispersion modeling techniques. (Author/RE)

  7. Protein structure modeling with MODELLER.

    PubMed

    Webb, Benjamin; Sali, Andrej

    2014-01-01

    Genome sequencing projects have resulted in a rapid increase in the number of known protein sequences. In contrast, only about one-hundredth of these sequences have been characterized at atomic resolution using experimental structure determination methods. Computational protein structure modeling techniques have the potential to bridge this sequence-structure gap. In this chapter, we present an example that illustrates the use of MODELLER to construct a comparative model for a protein with unknown structure. Automation of a similar protocol has resulted in models of useful accuracy for domains in more than half of all known protein sequences. PMID:24573470
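
    For readers who want to see the shape of a minimal MODELLER run, the classic "automodel" recipe from the program's basic tutorial looks roughly like the sketch below. The alignment file, template code, and target name are placeholders, and class capitalization can differ slightly between MODELLER releases, so treat this as an outline rather than a copy-paste script.

      # Sketch of a basic comparative-modeling run with MODELLER (requires a licensed
      # MODELLER installation; 'alignment.ali', '1abcA' and 'target' are placeholders).
      from modeller import *
      from modeller.automodel import automodel

      log.verbose()                                         # verbose log output
      env = environ()                                       # MODELLER environment
      env.io.atom_files_directory = ['.', './templates']    # where template PDB files live

      a = automodel(env,
                    alnfile='alignment.ali',                # target-template alignment (PIR format)
                    knowns='1abcA',                         # known template structure(s)
                    sequence='target')                      # target sequence code in the alignment
      a.starting_model = 1
      a.ending_model = 5                                    # build five models, then pick the best by score
      a.make()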

  8. Protein structure modeling with MODELLER.

    PubMed

    Eswar, Narayanan; Eramian, David; Webb, Ben; Shen, Min-Yi; Sali, Andrej

    2008-01-01

    Genome sequencing projects have resulted in a rapid increase in the number of known protein sequences. In contrast, only about one-hundredth of these sequences have been characterized using experimental structure determination methods. Computational protein structure modeling techniques have the potential to bridge this sequence-structure gap. This chapter presents an example that illustrates the use of MODELLER to construct a comparative model for a protein with unknown structure. Automation of similar protocols has resulted in models of useful accuracy for domains in more than half of all known protein sequences. PMID:18542861

  9. OSPREY Model

    SciTech Connect

    Veronica J. Rutledge

    2013-01-01

    The absence of industrial scale nuclear fuel reprocessing in the U.S. has precluded the necessary driver for developing the advanced simulation capability now prevalent in so many other countries. Thus, it is essential to model complex series of unit operations to simulate, understand, and predict inherent transient behavior and feedback loops. A capability of accurately simulating the dynamic behavior of advanced fuel cycle separation processes will provide substantial cost savings and many technical benefits. The specific fuel cycle separation process discussed in this report is the off-gas treatment system. The off-gas separation consists of a series of scrubbers and adsorption beds to capture constituents of interest. Dynamic models are being developed to simulate each unit operation involved so each unit operation can be used as a stand-alone model and in series with multiple others. Currently, an adsorption model has been developed within the Multi-physics Object Oriented Simulation Environment (MOOSE) developed at the Idaho National Laboratory (INL). Off-gas Separation and REcoverY (OSPREY) models the adsorption of off-gas constituents for dispersed plug flow in a packed bed under non-isothermal and non-isobaric conditions. Inputs to the model include gas, sorbent, and column properties, equilibrium and kinetic data, and inlet conditions. The simulation outputs component concentrations along the column length as a function of time, from which breakthrough data is obtained. The breakthrough data can be used to determine bed capacity, which in turn can be used to size columns. It also outputs temperature along the column length as a function of time and pressure drop along the column length. Experimental data and parameters were input into the adsorption model to develop models specific for krypton adsorption. The same can be done for iodine, xenon, and tritium. The model will be validated with experimental breakthrough curves. Customers will be given access to OSPREY to use and evaluate the model.
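
    To make the breakthrough-curve idea concrete, here is a toy one-dimensional, isothermal plug-flow adsorption column with a linear-driving-force uptake term, integrated explicitly (a generic sketch, not the OSPREY/MOOSE implementation; every parameter value is invented for illustration):

      import numpy as np

      def breakthrough(n_cells=100, length=0.5, velocity=0.02, dt=0.05, t_end=4000.0,
                       c_in=1.0, k_ldf=0.05, k_henry=50.0, bed_porosity=0.4):
          """Toy 1D plug-flow adsorption column (isothermal, isobaric).
          Returns times and the outlet concentration normalized to the feed."""
          dz = length / n_cells
          c = np.zeros(n_cells)                  # gas-phase concentration per cell
          q = np.zeros(n_cells)                  # adsorbed-phase loading per cell
          phase_ratio = (1.0 - bed_porosity) / bed_porosity
          times, outlet = [], []
          for step in range(int(t_end / dt)):
              c_up = np.concatenate(([c_in], c[:-1]))
              adv = velocity * (c_up - c) / dz       # upwind advection
              dq = k_ldf * (k_henry * c - q)         # linear driving force toward Henry-law equilibrium
              c += dt * (adv - phase_ratio * dq)
              q += dt * dq
              if step % 200 == 0:
                  times.append(step * dt)
                  outlet.append(c[-1] / c_in)
          return np.array(times), np.array(outlet)

      t, c_out = breakthrough()
      if c_out.max() >= 0.05:
          print(f"breakthrough (outlet at 5% of feed) after ~{t[np.argmax(c_out >= 0.05)]:.0f} s")
      else:
          print("breakthrough not reached within the simulated time")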

  10. Phenomenological models.

    PubMed

    Braby, L A

    1991-01-01

    The biological effects of ionizing radiation exposure are the result of a complex sequence of physical, chemical, biochemical, and physiological interactions which are modified by characteristics of the radiation, the timing of its administration, the chemical and physical environment, and the nature of the biological system. However, it is generally agreed that the health effects in animals originate from changes in individual cells, or possibly small groups of cells, and that these cellular changes are initiated by ionizations and excitations produced by the passage of charged particles through the cells. One way to begin a search for an understanding of health effects of radiation is through the development of phenomenological models of the response. Many models have been presented and tested in the slowly evolving process of characterizing cellular response. Different phenomena (LET dependence, dose rate effect, oxygen effect, etc.) and different endpoints (cell survival, aberration formation, transformation, etc.) have been observed, and no single model has been developed to cover all of them. Instead, a range of models covering different endpoints and phenomena has developed in parallel. Many of these models employ similar assumptions about some underlying processes while differing about the nature of others. An attempt is made to organize many of the models into groups with similar features and to compare the consequences of those features with the actual experimental observations. It is assumed that by showing that some assumptions are inconsistent with experimental observations, the job of devising and testing mechanistic models can be simplified. PMID:1811477

  11. Energy Models

    EPA Science Inventory

    Energy models characterize the energy system, its evolution, and its interactions with the broader economy. The energy system consists of primary resources, including both fossil fuels and renewables; power plants, refineries, and other technologies to process and convert these r...

  12. Programming models

    SciTech Connect

    Daniel, David J; Mc Pherson, Allen; Thorp, John R; Barrett, Richard; Clay, Robert; De Supinski, Bronis; Dube, Evi; Heroux, Mike; Janssen, Curtis; Langer, Steve; Laros, Jim

    2011-01-14

    A programming model is a set of software technologies that support the expression of algorithms and provide applications with an abstract representation of the capabilities of the underlying hardware architecture. The primary goals are productivity, portability and performance.

  13. Anchor Modeling

    NASA Astrophysics Data System (ADS)

    Regardt, Olle; Rönnbäck, Lars; Bergholtz, Maria; Johannesson, Paul; Wohed, Petia

    Maintaining and evolving data warehouses is a complex, error prone, and time consuming activity. The main reason for this state of affairs is that the environment of a data warehouse is in constant change, while the warehouse itself needs to provide a stable and consistent interface to information spanning extended periods of time. In this paper, we propose a modeling technique for data warehousing, called anchor modeling, that offers non-destructive extensibility mechanisms, thereby enabling robust and flexible management of changes in source systems. A key benefit of anchor modeling is that changes in a data warehouse environment only require extensions, not modifications, to the data warehouse. This ensures that existing data warehouse applications will remain unaffected by the evolution of the data warehouse, i.e. existing views and functions will not have to be modified as a result of changes in the warehouse model.

  14. Micromolecular modeling

    NASA Technical Reports Server (NTRS)

    Guillet, J. E.

    1984-01-01

    A reaction-kinetics-based model of the photodegradation process, which measures all important rate constants, and a computerized model capable of predicting the photodegradation rate and failure modes over a 30-year period, were developed. It is shown that the computerized photodegradation model for polyethylene correctly predicts the failure of ELVAX 15 and cross-linked ELVAX 150 on outdoor exposure. It is indicated that cross-linking ethylene vinyl acetate (EVA) does not significantly change its degradation rate. It is shown that the effect of the stabilizer package is approximately equivalent on both polymers. The computerized model indicates that peroxide decomposers and UV absorbers are the most effective stabilizers. It is found that a combination of UV absorbers and a hindered amine light stabilizer (HALS) is the most effective stabilizer system.

  15. Do stroke models model stroke?

    PubMed Central

    Mergenthaler, Philipp; Meisel, Andreas

    2012-01-01

    Stroke is one of the leading causes of death worldwide and the biggest reason for long-term disability. Basic research has formed the modern understanding of stroke pathophysiology, and has revealed important molecular, cellular and systemic mechanisms. However, despite decades of research, most translational stroke trials that aim to introduce basic research findings into clinical treatment strategies – most notably in the field of neuroprotection – have failed. Among other obstacles, poor methodological and statistical standards, negative publication bias, and incomplete preclinical testing have been proposed as ‘translational roadblocks’. In this article, we introduce the models commonly used in preclinical stroke research, discuss some of the causes of failed translational success and review potential remedies. We further introduce the concept of modeling ‘care’ of stroke patients, because current preclinical research models the disorder but does not model care or state-of-the-art clinical testing. Stringent statistical methods and controlled preclinical trials have been suggested to counteract weaknesses in preclinical research. We conclude that preclinical stroke research requires (1) appropriate modeling of the disorder, (2) appropriate modeling of the care of stroke patients and (3) an approach to preclinical testing that is similar to clinical testing, including Phase 3 randomized controlled preclinical trials as necessary additional steps before new therapies enter clinical testing. PMID:23115201

  16. Reflectance Modeling

    NASA Technical Reports Server (NTRS)

    Smith, J. A.; Cooper, K.; Randolph, M.

    1984-01-01

    A classical description of the one-dimensional radiative transfer treatment of vegetation canopies was completed and the results were tested against measured prairie (blue grama) and agricultural canopies (soybean). Phase functions are calculated in terms of directly measurable biophysical characteristics of the canopy medium. While the phase functions tend to exhibit backscattering anisotropy, their exact behavior is somewhat more complex and wavelength dependent. A Monte Carlo model was developed that treats soil surfaces with large periodic variations in three dimensions. A photon ray-tracing technique is used. Currently, the rough soil surface is described by analytic functions, and the appropriate geometric calculations are performed. A bidirectional reflectance distribution function is calculated and, hence, available for other atmospheric or canopy reflectance models as a lower boundary condition. This technique is used together with an adding model to calculate several cases where Lambertian leaves possessing anisotropic leaf angle distributions yield non-Lambertian reflectance; similar behavior is exhibited for simulated soil surfaces.
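
    As a toy counterpart to the Monte Carlo approach described above, the sketch below estimates the directional-hemispherical reflectance of a flat Lambertian surface by sampling random outgoing directions; the actual model adds the periodic three-dimensional soil geometry and canopy layers, which are omitted here, and the albedo is an assumed value.

      # Monte Carlo check that a Lambertian BRDF f = albedo/pi integrates to the
      # albedo over the outgoing hemisphere (directional-hemispherical reflectance).
      import math
      import random

      ALBEDO = 0.3  # assumed soil albedo for the sketch

      def lambertian_brdf(cos_theta_out):
          """A Lambertian BRDF is constant: albedo / pi, independent of direction."""
          return ALBEDO / math.pi

      def dhr_monte_carlo(n_samples=200_000):
          """Estimate the integral of f * cos(theta) over the hemisphere."""
          pdf = 1.0 / (2.0 * math.pi)        # uniform solid-angle sampling
          total = 0.0
          for _ in range(n_samples):
              cos_theta = random.random()    # cos(theta) is uniform for this sampling
              total += lambertian_brdf(cos_theta) * cos_theta / pdf
          return total / n_samples

      print(f"estimated DHR = {dhr_monte_carlo():.3f} (expected {ALBEDO})")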

  17. Cavitation modelling

    NASA Astrophysics Data System (ADS)

    Kueny, J. L.

    Improvements in cavitation modeling over the past years are reviewed. They are closely linked to the development of computers and, above all, to the development of rocket engines fed with cryogenic liquid propellants. Cavitation is the evaporation of a liquid caused by lowering its pressure. It occurs at locations where the pressure falls below a critical value, roughly corresponding to the vapor pressure. In these areas of lowest pressure, vaporous cavities appear that attach to walls or float within the liquid. New models are available for the calculation of three-dimensional partial cavitation in pumps under steady-state assumptions. Two-dimensional unsteady cavitating flow models are under development. Promising results are obtained for the prediction of thermodynamic effects on cavitation by using a code coupling the external liquid flow and the internal vapor flow.
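
    The pressure criterion mentioned above is usually expressed through the cavitation number; the short sketch below evaluates it for assumed cold-water conditions (the numerical values are illustrative and not taken from the review).

      # Cavitation number sigma = (p_ref - p_vapor) / (0.5 * rho * U**2): cavitation
      # is expected where the local pressure coefficient drops below roughly -sigma.
      def cavitation_number(p_ref, p_vapor, rho, velocity):
          """Dimensionless margin between the reference pressure and the vapor pressure."""
          return (p_ref - p_vapor) / (0.5 * rho * velocity ** 2)

      # Illustrative cold-water values (Pa, Pa, kg/m^3, m/s); not from the paper.
      sigma = cavitation_number(p_ref=101_325.0, p_vapor=2_340.0, rho=998.0, velocity=15.0)
      print(f"cavitation number sigma = {sigma:.2f}")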

  18. Modeling reality

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    Although powerful computers have allowed complex physical and manmade hardware systems to be modeled successfully, we have encountered persistent problems with the reliability of computer models for systems involving human learning, human action, and human organizations. This is not a misfortune; unlike physical and manmade systems, human systems do not operate under a fixed set of laws. The rules governing the actions allowable in the system can be changed without warning at any moment, and can evolve over time. That the governing laws are inherently unpredictable raises serious questions about the reliability of models when applied to human situations. In these domains, computers are better used, not for prediction and planning, but for aiding humans. Examples are systems that help humans speculate about possible futures, systems that offer advice about possible actions in a domain, systems that gather information from networks, and systems that track and support workflows in organizations.

  19. Supernova models

    SciTech Connect

    Woosley, S.E.; Weaver, T.A.

    1980-01-01

    Recent progress in understanding the observed properties of Type I supernovae as a consequence of the thermonuclear detonation of white dwarf stars and the ensuing decay of the ⁵⁶Ni produced therein is reviewed. Within the context of this model for Type I explosions and the 1978 model for Type II explosions, the expected nucleosynthesis and gamma-line spectra from both kinds of supernovae are presented. Finally, a qualitatively new approach to the problem of massive star death and Type II supernovae based upon a combination of rotation and thermonuclear burning is discussed.

  20. Molecular Modeling

    NASA Astrophysics Data System (ADS)

    Holmes, Jon L.

    1999-06-01

    Molecular modeling has trickled down from the realm of pharmaceutical and research laboratories into the realm of undergraduate chemistry instruction. It has opened avenues for the visualization of chemical concepts that previously were difficult or impossible to convey. I am sure that many of you have developed exercises using the various molecular modeling tools. It is the desire of this Journal to become an avenue for you to share these exercises among your colleagues. It is to this end that Ron Starkey has agreed to edit such a column and to publish not only the description of such exercises, but also the software documents they use. The WWW is the obvious medium to distribute this combination and so accepted submissions will appear online as a feature of JCE Internet. (Typical molecular modeling exercise: finding conformation energies.)

    Molecular Modeling Exercises and Experiments is the latest feature column of JCE Internet, joining Conceptual Questions and Challenge Problems, Hal's Picks, and Mathcad in the Chemistry Curriculum. JCE Internet continues to seek submissions in these areas of interest and submissions of general interest. If you have developed materials and would like to submit them, please see our Guide to Submissions for more information. The Chemical Education Resource Shelf, Equipment Buyers Guide, and WWW Site Review would also like to hear about chemistry textbooks and software, equipment, and WWW sites, respectively. Please consult JCE Internet Features to learn more about these resources at JCE Online.

    Email Announcements: Would you like to be informed by email when the latest issue of the Journal is available online? When a new JCE Software title is shipping? When a new JCE Internet article has been published or is available for Open Review? When your subscription is about to expire? A new feature of JCE Online makes this possible. Visit our Guestbook to learn how. When you submit the form on this page, which includes your email address, you may choose to receive an email notice about a Journal event that interests you. Currently such events include availability of the latest issue of the Journal at JCE Online, expiration of your Journal subscription, shipment of a new JCE Software issue, publication of a new JCE Internet article or its availability for Open Review, and other announcements from the Journal. You may choose any number of these options independently (JCE Online Guestbook).

    Your Privacy: JCE Online promises to you that we will not use the information that you provide in our Guestbook for anything other than our own internal information. We will not provide this information to third parties. We will use the information you provide only in our effort to help make the JCE serve you better. You only need to provide your email address to take advantage of this service; the other information you provide is optional.

    Molecular Modeling Exercises and Experiments: Mission Statement. We are seeking in this JCE Internet feature column to publish molecular modeling exercises and experiments that have been used successfully in undergraduate instruction. The exercises will be published here on JCE Internet. An abstract of published submissions will appear in print in the Journal of Chemical Education. Acceptable exercises could be used in either a chemistry laboratory or a chemistry computer laboratory. The exercise could cover any area of chemistry, but should be limited to undergraduate instructional applications.
    We envision that most of the exercises/experiments will utilize one of the popular instructional molecular modeling software programs (e.g., HyperChem, Spartan, CAChe, PC Model). Exercises that are specific to a particular modeling program are acceptable, but those usable with any modeling program are preferred. Ideally the exercises/experiments will be of the type where the "correct" answer is not obvious, so that the student must discover the solution or provide an explanation. The goal of the exercises should not be specifically to learn molecular modeling, but to use modeling to learn chemistry. Of course, some concepts of modeling have to be addressed in order for the student to effectively utilize molecular modeling (e.g., the distinction between a local and a global energy minimum conformation). We are looking for exercises that go beyond those already published by the molecular modeling software distributors. Each exercise should have a specific goal or objective. Fairly detailed procedures for the exercise should be included. All submissions should indicate the molecular modeling software system (name, version, computer platform and operating system) utilized for the exercise and the chemistry course(s) in which the exercise has been used. Ideally procedures and instructions should not be specific to one particular modeling software system and/or computer platform, but should be general so that they could apply to more than one system. Submissions will be peer reviewed and should be in three parts:

    a. A brief abstract
    b. The instructions and procedure to be used by the student
    c. Instructor notes that discuss the objective of the exercise, the results, the selection of the computational method(s), and potential pitfalls and problems.
    Specific guidelines for submission of exercises will be available at the JCE Internet ModelExer site. Feature Editor: Ronald Starkey, Department of Chemistry, University of Wisconsin-Green Bay, Green Bay, WI 54311-7001 Phone: 920/465-2264, or 920/465-2371 Email: starkeyr@uwgb.edu

  1. Atmospheric Modeling

    EPA Science Inventory

    Although air quality models have been applied historically to address issues specific to ambient air quality standards (i.e., one criteria pollutant at a time) or welfare (e.g., acid deposition or visibility impairment), they are inherently multipollutant based. Therefore, in pri...

  2. Ensemble Models

    EPA Science Inventory

    Ensemble forecasting has been used for operational numerical weather prediction in the United States and Europe since the early 1990s. An ensemble of weather or climate forecasts is used to characterize the two main sources of uncertainty in computer models of physical systems: ...
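
    A minimal sketch of how an ensemble is reduced to a forecast and an uncertainty estimate (the ensemble mean and spread); the toy forecast model and the perturbation scheme below stand in for the initial-condition and model-physics perturbations used operationally.

      # Toy ensemble: perturb the initial state, run a simple surrogate "model",
      # and summarize the forecasts by their mean and spread (standard deviation).
      import random
      import statistics

      def toy_model(state, steps=10):
          """Stand-in forecast model: a damped random walk, not a real NWP model."""
          for _ in range(steps):
              state = 0.9 * state + random.gauss(0.0, 0.1)
          return state

      def run_ensemble(initial_state=1.0, n_members=50, ic_sigma=0.05):
          forecasts = []
          for _ in range(n_members):
              perturbed = initial_state + random.gauss(0.0, ic_sigma)  # initial-condition uncertainty
              forecasts.append(toy_model(perturbed))                   # model uncertainty via noise
          return forecasts

      members = run_ensemble()
      print(f"ensemble mean   = {statistics.mean(members):.3f}")
      print(f"ensemble spread = {statistics.stdev(members):.3f}")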

  3. Modeling Muscles

    ERIC Educational Resources Information Center

    Goodwyn, Lauren; Salm, Sarah

    2007-01-01

    Teaching the anatomy of the muscle system to high school students can be challenging. Students often learn about muscle anatomy by memorizing information from textbooks or by observing plastic, inflexible models. Although these mediums help students learn about muscle placement, the mediums do not facilitate understanding regarding integration of…

  5. Entrepreneurship Models.

    ERIC Educational Resources Information Center

    Finger Lakes Regional Education Center for Economic Development, Mount Morris, NY.

    This guide describes seven model programs that were developed by the Finger Lakes Regional Center for Economic Development (New York) to meet the training needs of female and minority entrepreneurs to help their businesses survive and grow and to assist disabled and dislocated workers and youth in beginning small businesses. The first three models…

  6. Modeling Lessons

    ERIC Educational Resources Information Center

    Casey, Katherine

    2011-01-01

    As teachers learn new pedagogical strategies, they crave explicit demonstrations that show them how the new strategies will work with their students in their classrooms. Successful instructional coaches, therefore, understand the importance of modeling lessons to help teachers develop a vision of effective instruction. The author, an experienced…

  7. Modeling Convection

    ERIC Educational Resources Information Center

    Ebert, James R.; Elliott, Nancy A.; Hurteau, Laura; Schulz, Amanda

    2004-01-01

    Students must understand the fundamental process of convection before they can grasp a wide variety of Earth processes, many of which may seem abstract because of the scales on which they operate. Presentation of a very visual, concrete model prior to instruction on these topics may facilitate students' understanding of processes that are largely

  9. Criticality Model

    SciTech Connect

    A. Alsaed

    2004-09-14

    The ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003) presents the methodology for evaluating potential criticality situations in the monitored geologic repository. As stated in the referenced Topical Report, the detailed methodology for performing the disposal criticality analyses will be documented in model reports. Many of the models developed in support of the Topical Report differ from the definition of models as given in the Office of Civilian Radioactive Waste Management procedure AP-SIII.10Q, ''Models'', in that they are procedural, rather than mathematical. These model reports document the detailed methodology necessary to implement the approach presented in the Disposal Criticality Analysis Methodology Topical Report and provide calculations utilizing the methodology. Thus, the governing procedure for this type of report is AP-3.12Q, ''Design Calculations and Analyses''. The ''Criticality Model'' is of this latter type, providing a process evaluating the criticality potential of in-package and external configurations. The purpose of this analysis is to layout the process for calculating the criticality potential for various in-package and external configurations and to calculate lower-bound tolerance limit (LBTL) values and determine range of applicability (ROA) parameters. The LBTL calculations and the ROA determinations are performed using selected benchmark experiments that are applicable to various waste forms and various in-package and external configurations. The waste forms considered in this calculation are pressurized water reactor (PWR), boiling water reactor (BWR), Fast Flux Test Facility (FFTF), Training Research Isotope General Atomic (TRIGA), Enrico Fermi, Shippingport pressurized water reactor, Shippingport light water breeder reactor (LWBR), N-Reactor, Melt and Dilute, and Fort Saint Vrain Reactor spent nuclear fuel (SNF). The scope of this analysis is to document the criticality computational method. The criticality computational method will be used for evaluating the criticality potential of configurations of fissionable materials (in-package and external to the waste package) within the repository at Yucca Mountain, Nevada for all waste packages/waste forms. The criticality computational method is also applicable to preclosure configurations. The criticality computational method is a component of the methodology presented in ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003). How the criticality computational method fits in the overall disposal criticality analysis methodology is illustrated in Figure 1 (YMP 2003, Figure 3). This calculation will not provide direct input to the total system performance assessment for license application. It is to be used as necessary to determine the criticality potential of configuration classes as determined by the configuration probability analysis of the configuration generator model (BSC 2003a).
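
    The statistical treatment is defined in the report itself, but as a generic illustration, a one-sided lower-bound tolerance limit on a set of benchmark keff results can be computed from a normal-distribution tolerance factor as sketched below; the coverage, confidence, and sample values are assumptions for the sketch, not values from the calculation.

      # Generic one-sided lower-bound tolerance limit (LBTL) for normally
      # distributed benchmark k_eff results: LBTL = mean - k * std, with the
      # exact tolerance factor k from the noncentral t distribution.
      import numpy as np
      from scipy.stats import nct, norm

      def lower_tolerance_limit(samples, coverage=0.95, confidence=0.95):
          """Lower limit bounding `coverage` of the population with `confidence`."""
          x = np.asarray(samples, dtype=float)
          n = x.size
          delta = norm.ppf(coverage) * np.sqrt(n)             # noncentrality parameter
          k = nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)
          return x.mean() - k * x.std(ddof=1)

      # Illustrative benchmark k_eff values (not from the report).
      keff = [0.9991, 1.0003, 0.9978, 1.0012, 0.9969, 0.9995, 1.0001, 0.9984]
      print(f"95%/95% lower tolerance limit = {lower_tolerance_limit(keff):.4f}")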

  10. Models, Part V: Composition Models.

    ERIC Educational Resources Information Center

    Callison, Daniel

    2003-01-01

    Describes four models: The Authoring Cycle, a whole language approach that reflects the inquiry process; I-Search, an approach to research that uses the power of student interests; Cultural Celebration, using local heritage topics; and Science Lab Report, for the composition of a lab report. (LRW)

  11. Fibre Models

    NASA Astrophysics Data System (ADS)

    Herrmann, H. J.; Kun, F.

    2007-12-01

    Fibre models have been introduced as simple models to describe failure. They are based on the probability distribution of broken fibres. The load redistribution after a fibre yields can be global or local, and the first case can often be solved analytically. We will present an interpolation between the local and the global case and apply it to experimental situations like the compression of granular packings. Introducing viscoelastic fibres makes it possible to describe the creep of wood. It is even possible to deal analytically with a gradual degradation of fibres and to consider damage as well as healing. In this way Basquin's law of fatigue can be reproduced and new universalities concerning the histograms of bursts and waiting times can be uncovered.
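
    As a concrete counterpart to the global (equal-load-sharing) case mentioned above, a minimal fibre-bundle simulation: fibres with random strength thresholds fail under a load that is shared uniformly by the surviving fibres. The threshold distribution and bundle size are arbitrary choices for the sketch.

      # Equal-load-sharing (global) fibre bundle: the weakest fibres fail first and
      # their load is shared uniformly by the survivors; the bundle strength is the
      # largest total load any intermediate configuration can carry.
      import random

      def bundle_strength(n_fibres=100_000, seed=1):
          random.seed(seed)
          thresholds = sorted(random.random() for _ in range(n_fibres))  # uniform thresholds
          best = 0.0
          for k, threshold in enumerate(thresholds):
              # With the k weakest fibres broken, n - k fibres carry the load, and the
              # configuration holds as long as the load per fibre stays below `threshold`.
              best = max(best, threshold * (n_fibres - k))
          return best / n_fibres  # critical stress per fibre

      # For uniform thresholds on (0, 1) the analytic critical stress is 1/4.
      print(f"simulated critical stress = {bundle_strength():.3f} (theory: 0.250)")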

  12. Model checking

    NASA Technical Reports Server (NTRS)

    Dill, David L.

    1995-01-01

    Automatic formal verification methods for finite-state systems, also known as model checking, successfully reduce labor costs since they are mostly automatic. Model checkers explicitly or implicitly enumerate the reachable state space of a system, whose behavior is described implicitly, perhaps by a program or a collection of finite automata. Simple properties, such as mutual exclusion or absence of deadlock, can be checked by inspecting individual states. More complex properties, such as lack of starvation, require search for cycles in the state graph with particular properties. Specifications to be checked may consist of built-in properties, such as deadlock or 'unspecified receptions' of messages; another program or implicit description, to be compared with a simulation, bisimulation, or language inclusion relation; or an assertion in one of several temporal logics. Finite-state verification tools are beginning to have a significant impact in commercial designs. There are many success stories of verification tools finding bugs in protocols or hardware controllers. In some cases, these tools have been incorporated into design methodology. Research in finite-state verification has been advancing rapidly, and is showing no signs of slowing down. Recent results include probabilistic algorithms for verification, exploitation of symmetry and independent events, and the use of symbolic representations for Boolean functions and systems of linear inequalities. One of the most exciting areas for further research is the combination of model checking with theorem-proving methods.
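
    A minimal explicit-state sketch in the spirit described above: a breadth-first enumeration of the reachable states of a toy two-process lock protocol, checking the mutual-exclusion property in every state. The protocol and state encoding are invented for the illustration.

      # Tiny explicit-state model checker: breadth-first enumeration of the
      # reachable states of a two-process system with a shared lock, checking
      # mutual exclusion (never both processes in the critical section).
      from collections import deque

      # A state is (pc0, pc1, lock) with pc in {"idle", "wait", "crit"}, lock in {0, 1}.
      INITIAL = ("idle", "idle", 0)

      def successors(state):
          pc0, pc1, lock = state
          pcs = [pc0, pc1]
          for i in range(2):
              if pcs[i] == "idle":                    # request the lock
                  yield _with(pcs, i, "wait", lock)
              elif pcs[i] == "wait" and lock == 0:    # acquire atomically if free
                  yield _with(pcs, i, "crit", 1)
              elif pcs[i] == "crit":                  # release
                  yield _with(pcs, i, "idle", 0)

      def _with(pcs, i, new_pc, lock):
          updated = list(pcs)
          updated[i] = new_pc
          return (updated[0], updated[1], lock)

      def check_mutual_exclusion():
          seen, frontier = {INITIAL}, deque([INITIAL])
          while frontier:
              state = frontier.popleft()
              if state[0] == "crit" and state[1] == "crit":
                  return False, state                 # counterexample state
              for nxt in successors(state):
                  if nxt not in seen:
                      seen.add(nxt)
                      frontier.append(nxt)
          return True, len(seen)

      ok, info = check_mutual_exclusion()
      if ok:
          print(f"mutual exclusion holds ({info} reachable states explored)")
      else:
          print(f"violation found in state {info}")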

  13. Modeling biomembranes.

    SciTech Connect

    Plimpton, Steven James; Heffernan, Julieanne; Sasaki, Darryl Yoshio; Frischknecht, Amalie Lucile; Stevens, Mark Jackson; Frink, Laura J. Douglas

    2005-11-01

    Understanding the properties and behavior of biomembranes is fundamental to many biological processes and technologies. Microdomains in biomembranes or ''lipid rafts'' are now known to be an integral part of cell signaling, vesicle formation, fusion processes, protein trafficking, and viral and toxin infection processes. Understanding how microdomains form, how they depend on membrane constituents, and how they act not only has biological implications, but also will impact Sandia's effort in development of membranes that structurally adapt to their environment in a controlled manner. To provide such understanding, we created physically-based models of biomembranes. Molecular dynamics (MD) simulations and classical density functional theory (DFT) calculations using these models were applied to phenomena such as microdomain formation, membrane fusion, pattern formation, and protein insertion. Because lipid dynamics and self-organization in membranes occur on length and time scales beyond atomistic MD, we used coarse-grained models of double tail lipid molecules that spontaneously self-assemble into bilayers. DFT provided equilibrium information on membrane structure. Experimental work was performed to further help elucidate the fundamental membrane organization principles.

  14. Biomimetic modelling.

    PubMed Central

    Vincent, Julian F V

    2003-01-01

    Biomimetics is seen as a path from biology to engineering. The only path from engineering to biology in current use is the application of engineering concepts and models to biological systems. However, there is another pathway: the verification of biological mechanisms by manufacture, leading to an iterative process between biology and engineering in which the new understanding that the engineering implementation of a biological system can bring is fed back into biology, allowing a more complete and certain understanding and the possibility of further revelations for application in engineering. This is a pathway as yet unformalized, and one that offers the possibility that engineers can also be scientists. PMID:14561351

  15. New Fission Fragment Distributions and r-Process Origin of the Rare-Earth Elements

    NASA Astrophysics Data System (ADS)

    Goriely, S.; Sida, J.-L.; Lemaître, J.-F.; Panebianco, S.; Dubray, N.; Hilaire, S.; Bauswein, A.; Janka, H.-T.

    2013-12-01

    Neutron star (NS) merger ejecta offer a viable site for the production of heavy r-process elements with nuclear mass numbers A≳140. The crucial role of fission recycling is responsible for the robustness of this site against many astrophysical uncertainties, but calculations sensitively depend on nuclear physics. In particular, the fission fragment yields determine the creation of 110≲A≲170 nuclei. Here, we apply a new scission-point model, called SPY, to derive the fission fragment distribution (FFD) of all relevant neutron-rich, fissioning nuclei. The model predicts a doubly asymmetric FFD in the abundant A≃278 mass region that is responsible for the final recycling of the fissioning material. Using ejecta conditions based on relativistic NS merger calculations, we show that this specific FFD leads to a production of the A≃165 rare-earth peak that is nicely compatible with the abundance patterns in the Sun and metal-poor stars. This new finding further strengthens the case of NS mergers as possible dominant origin of r nuclei with A≳140.

  16. New fission fragment distributions and r-process origin of the rare-earth elements.

    PubMed

    Goriely, S; Sida, J-L; Lemaître, J-F; Panebianco, S; Dubray, N; Hilaire, S; Bauswein, A; Janka, H-T

    2013-12-13

    Neutron star (NS) merger ejecta offer a viable site for the production of heavy r-process elements with nuclear mass numbers A≳140. The crucial role of fission recycling is responsible for the robustness of this site against many astrophysical uncertainties, but calculations sensitively depend on nuclear physics. In particular, the fission fragment yields determine the creation of 110≲A≲170 nuclei. Here, we apply a new scission-point model, called SPY, to derive the fission fragment distribution (FFD) of all relevant neutron-rich, fissioning nuclei. The model predicts a doubly asymmetric FFD in the abundant A≃278 mass region that is responsible for the final recycling of the fissioning material. Using ejecta conditions based on relativistic NS merger calculations, we show that this specific FFD leads to a production of the A≃165 rare-earth peak that is nicely compatible with the abundance patterns in the Sun and metal-poor stars. This new finding further strengthens the case of NS mergers as possible dominant origin of r nuclei with A≳140. PMID:24483647

  17. Modeling uncertainty: quicksand for water temperature modeling

    USGS Publications Warehouse

    Bartholow, John M.

    2003-01-01

    Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper will address important components of uncertainty in modeling water temperatures, and discuss several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, is meant to supplement the presentation given at this conference.

  18. Pre-Modeling Ensures Accurate Solid Models

    ERIC Educational Resources Information Center

    Gow, George

    2010-01-01

    Successful solid modeling requires a well-organized design tree. The design tree is a list of all the object's features and the sequential order in which they are modeled. The solid-modeling process is faster and less prone to modeling errors when the design tree is a simple and geometrically logical definition of the modeled object. Few high…

  20. CISNET lung models: Comparison of model assumptions and model structures

    PubMed Central

    McMahon, Pamela M.; Hazelton, William; Kimmel, Marek; Clarke, Lauren

    2012-01-01

    Sophisticated modeling techniques can be powerful tools to help us understand the effects of cancer control interventions on population trends in cancer incidence and mortality. Readers of journal articles are however rarely supplied with modeling details. Six modeling groups collaborated as part of the National Cancer Institute’s Cancer Intervention and Surveillance Modeling Network (CISNET) to investigate the contribution of US tobacco control efforts towards reducing lung cancer deaths over the period 1975 to 2000. The models included in this monograph were developed independently and use distinct, complementary approaches towards modeling the natural history of lung cancer. The models used the same data for inputs and agreed on the design of the analysis and the outcome measures. This article highlights aspects of the models that are most relevant to similarities of or differences between the results. Structured comparisons can increase the transparency of these complex models. PMID:22882887

  1. Fission of actinides through quasimolecular shapes

    NASA Astrophysics Data System (ADS)

    Royer, Guy; Zhang, Hongfei; Eudes, Philippe; Moustabchir, Rachid; Moreau, Damien; Jaffré, Muriel; Morabit, Youssef; Particelli, Benjamin

    2013-12-01

    The potential energy of heavy nuclei has been calculated in the quasimolecular shape path from a generalized liquid drop model including the proximity energy, the charge and mass asymmetries and the microscopic corrections. The potential barriers are multiple-humped. The second maximum is the saddle point. It corresponds to the transition from compact one-body shapes with a deep neck to two touching ellipsoids. The scission point lies at the end of an energy plateau well below the saddle point, where the effects of the nuclear attractive forces between the two separated fragments vanish. The energy on this plateau is the sum of the kinetic and excitation energies of the fragments. The shell and pairing corrections play an essential role in selecting the most probable fission path. The potential barrier heights agree with the experimental data and the theoretical half-lives follow the trend of the experimental values. A third peak and a shallow third minimum appear in asymmetric decay paths when one fragment is close to a double magic quasi-spherical nucleus, while the smaller one changes from oblate to prolate shapes.

  2. Fission barriers and half-lives of actinides in the quasimolecular shape valley

    NASA Astrophysics Data System (ADS)

    Royer, G.; Jaffré, M.; Moreau, D.

    2012-10-01

    The energy of actinide nuclei in the fusionlike deformation valley has been determined from a liquid-drop model, taking into account the proximity energy, the mass and charge asymmetries, and the shell and pairing energies. Double-humped potential barriers appear. The saddle point corresponds to the second maximum and to the transition from compact one-body shapes with a deep neck to two touching ellipsoids. The scission point, where the effects of the nuclear attractive forces between the fragments vanish, lies at the end of an energy plateau below the saddle point and corresponds to two well-separated fragments. The kinetic and excitation energies of the fragments come from the energy on this plateau. The shell and pairing effects play a main role in deciding the most probable decay path. The heights of the potential barriers roughly agree with the experimental data and the calculated half-lives follow the trend of the experimental values. A shallow third minimum and a third peak appear in specific asymmetric exit channels where one fragment is close to a double magic quasispherical nucleus, while the other one evolves from oblate to prolate shapes.

  3. Angular Distributions of Fragments Originating from the Spontaneous Fission of Oriented Nuclei and Problem of the Conservation of the Spin Projection onto the Symmetry Axis of a Fissile Nucleus

    SciTech Connect

    Kadmensky, S.G.; Rodionova, L.V.

    2005-09-01

    The concept of transition fission states, which was successfully used to describe the angular distributions of fragments for the spontaneous and low-energy induced fission of axisymmetric nuclei, proves to be correct if the spin projection onto the symmetry axis of a fissile nucleus is an integral of the motion for the external region from the descent of the fissile nucleus from the external fission barrier to the scission point. Upon heating a fissile nucleus in this region to temperatures of T ≈ 1 MeV (this is predicted by many theoretical models of the fission process), the Coriolis interaction uniformly mixes the possible projections of the fissile-nucleus spin for the case of low spin values, thereby leading to the loss of memory about transition fission states in the asymptotic region where the angular distributions of fragments are formed. Within quantum-mechanical fission theory, which takes into account deviations from A. Bohr's formula, the angular distributions of fragments are calculated for spontaneously fissile nuclei aligned by an external magnetic field at ultralow temperatures, and it is shown that an analysis of experimental angular distributions of fragments would make it possible to solve the problem of spin-projection conservation for fissile nuclei in the external region.

  4. Building Mental Models by Dissecting Physical Models

    ERIC Educational Resources Information Center

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions require greater supervision to…

  5. I&C Modeling in SPAR Models

    SciTech Connect

    John A. Schroeder

    2012-06-01

    The Standardized Plant Analysis Risk (SPAR) models for the U.S. commercial nuclear power plants currently have very limited instrumentation and control (I&C) modeling [1]. Most of the I&C components in the operating plant SPAR models are related to the reactor protection system. This was identified as a finding during the industry peer review of SPAR models. While the Emergency Safeguard Features (ESF) actuation and control system was incorporated into the Peach Bottom Unit 2 SPAR model in a recent effort [2], various approaches to expend resources for detailed I&C modeling in other SPAR models are investigated.

  6. Comparative protein structure modeling using MODELLER.

    PubMed

    Eswar, Narayanan; Webb, Ben; Marti-Renom, Marc A; Madhusudhan, M S; Eramian, David; Shen, Min-Yi; Pieper, Ursula; Sali, Andrej

    2007-11-01

    Functional characterization of a protein sequence is a common goal in biology, and is usually facilitated by having an accurate three-dimensional (3-D) structure of the studied protein. In the absence of an experimentally determined structure, comparative or homology modeling can sometimes provide a useful 3-D model for a protein that is related to at least one known protein structure. Comparative modeling predicts the 3-D structure of a given protein sequence (target) based primarily on its alignment to one or more proteins of known structure (templates). The prediction process consists of fold assignment, target-template alignment, model building, and model evaluation. This unit describes how to calculate comparative models using the program MODELLER and discusses all four steps of comparative modeling, frequently observed errors, and some applications. Modeling lactate dehydrogenase from Trichomonas vaginalis (TvLDH) is described as an example. The download and installation of the MODELLER software is also described. PMID:18429317
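
    For readers who want to see what the unit's workflow looks like in code, a minimal comparative-modeling script in the style of the standard MODELLER automodel examples is sketched below. The alignment file and template/target names follow the TvLDH example but are placeholders here, and the module and class names shown are the lower-case forms used by older MODELLER releases (newer versions capitalize them).

      # Minimal MODELLER-style comparative modeling run (placeholder file names);
      # fold assignment and target-template alignment are assumed to be done already.
      from modeller import *              # environ, log, ...
      from modeller.automodel import *    # the automodel class

      log.verbose()
      env = environ()
      env.io.atom_files_directory = ['.', '../atom_files']   # where template PDB files live

      a = automodel(env,
                    alnfile='TvLDH-1bdmA.ali',   # target-template alignment (PIR format)
                    knowns='1bdmA',              # template of known structure
                    sequence='TvLDH')            # target sequence to be modeled
      a.starting_model = 1
      a.ending_model = 5                         # build five candidate models
      a.make()                                   # model building; evaluate the outputs afterwards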

  7. Comparative Protein Structure Modeling Using MODELLER.

    PubMed

    Webb, Benjamin; Sali, Andrej

    2014-01-01

    Functional characterization of a protein sequence is one of the most frequent problems in biology. This task is usually facilitated by accurate three-dimensional (3-D) structure of the studied protein. In the absence of an experimentally determined structure, comparative or homology modeling can sometimes provide a useful 3-D model for a protein that is related to at least one known protein structure. Comparative modeling predicts the 3-D structure of a given protein sequence (target) based primarily on its alignment to one or more proteins of known structure (templates). The prediction process consists of fold assignment, target-template alignment, model building, and model evaluation. This unit describes how to calculate comparative models using the program MODELLER and discusses all four steps of comparative modeling, frequently observed errors, and some applications. Modeling lactate dehydrogenase from Trichomonas vaginalis (TvLDH) is described as an example. The download and installation of the MODELLER software is also described. PMID:25199792

  8. Comparative protein structure modeling using Modeller.

    PubMed

    Eswar, Narayanan; Webb, Ben; Marti-Renom, Marc A; Madhusudhan, M S; Eramian, David; Shen, Min-Yi; Pieper, Ursula; Sali, Andrej

    2006-10-01

    Functional characterization of a protein sequence is one of the most frequent problems in biology. This task is usually facilitated by accurate three-dimensional (3-D) structure of the studied protein. In the absence of an experimentally determined structure, comparative or homology modeling can sometimes provide a useful 3-D model for a protein that is related to at least one known protein structure. Comparative modeling predicts the 3-D structure of a given protein sequence (target) based primarily on its alignment to one or more proteins of known structure (templates). The prediction process consists of fold assignment, target-template alignment, model building, and model evaluation. This unit describes how to calculate comparative models using the program MODELLER and discusses all four steps of comparative modeling, frequently observed errors, and some applications. Modeling lactate dehydrogenase from Trichomonas vaginalis (TvLDH) is described as an example. The download and installation of the MODELLER software is also described. PMID:18428767

  9. Geoscientific Model Development - a journal about models, for modellers

    NASA Astrophysics Data System (ADS)

    Lunt, Daniel; Annan, James; Hargreaves, Julia; Rutt, Ian; Sander, Rolf

    2010-05-01

    The journal Geoscientific Model Development arose from the observation that despite modelling being central to climate/earth system science, the models themselves are not generally subject to the same level of scrutiny and peer review as the results they generate. Model descriptions are generally (with some exceptions) difficult to publish independent from scientific results, and so are necessarily space-limited when they do appear. Consequently, it is not uncommon that the description of a given model is spread across several papers, and crucial aspects of the formulation may not be published at all. Issues of reproducibility, platform-dependence, version proliferation and the various fudges and corrections often needed in modelling, are rarely addressed in the literature. GMD aims to change this by providing a place to publish detailed, peer-reviewed descriptions of numerical models, including verification and validation. Model developers can publish an initial description of a numbered version of their model, and address subsequent changes with a sequence of update papers. Thus, a body of citable literature can be developed which provides an authoritative reference for a given version of the model, greatly improving traceability and giving confidence in the provenance of the code. An additional benefit is that the citations generated will at last recognise the important contribution which model developers make to science. The publication process is typical for an open access EGU journal: papers are initially published in an on-line discussion journal (Geoscientific Model Development Discussions), for a period of eight weeks. Anonymous reviews are solicited as normal, but are also published in the discussion journal. Anyone else may contribute to the discussions, if they wish. After the discussion period, the revision/review process operates as normal, until the paper is finally accepted or rejected by the handling topical editor. In this paper we describe the journal, and present statistics of submissions, papers accepted etc. since its first issue in 2008. For more details, see http://www.geoscientific-model-development.net

  10. Model selection for logistic regression models

    NASA Astrophysics Data System (ADS)

    Duller, Christine

    2012-09-01

    Model selection for logistic regression models decides which of some given potential regressors have an effect and hence should be included in the final model. The second interesting question is whether a certain factor is heterogeneous among some subsets, i.e., whether the model should include a random intercept or not. In this paper these questions will be answered with classical as well as with Bayesian methods. The application shows some results of recent research projects in medicine and business administration.
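
    A small illustration of the classical side of such a comparison: fitting nested candidate logistic regressions on simulated data and ranking them by AIC with statsmodels. The data and candidate set are invented for the sketch; the Bayesian and random-intercept variants discussed in the paper are not covered here.

      # Classical model selection for logistic regression: fit nested candidate
      # models on simulated data and compare them by AIC (lower is better).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 500
      x1 = rng.normal(size=n)
      x2 = rng.normal(size=n)                       # irrelevant regressor by construction
      p = 1.0 / (1.0 + np.exp(-(-0.3 + 1.2 * x1)))  # true model uses x1 only
      y = rng.binomial(1, p)

      candidates = {
          "intercept only": np.column_stack([np.ones(n)]),
          "x1":             np.column_stack([np.ones(n), x1]),
          "x1 + x2":        np.column_stack([np.ones(n), x1, x2]),
      }

      for name, X in candidates.items():
          fit = sm.Logit(y, X).fit(disp=False)
          print(f"{name:15s} AIC = {fit.aic:8.2f}")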

  11. Hydrodynamic model for electrical arc modelling

    SciTech Connect

    Chevrier, P.; Barrault, M.; Fievet, C.

    1996-10-01

    A hydrodynamic model for electrical arc modelling is presented. The model, which takes into account Joule heating, radiation, Laplace forces, arc-wall interactions and real gas effect, has been validated through comparisons with measurements. Industrial applications have already been computed and concern high, medium and low voltage circuit breakers.

  12. Modeling transient rootzone salinity (SWS Model)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The combination of water quality criteria for irrigation, water and ion processes in soils, and plant and soil response is sufficiently complex that adequate analysis requires computer models. Models for management are also needed, but these models must consider that the input requirements must be reasona...

  13. "Bohr's Atomic Model."

    ERIC Educational Resources Information Center

    Willden, Jeff

    2001-01-01

    "Bohr's Atomic Model" is a small interactive multimedia program that introduces the viewer to a simplified model of the atom. This interactive simulation lets students build an atom using an atomic construction set. The underlying design methodology for "Bohr's Atomic Model" is model-centered instruction, which means the central model of the…

  14. Multilevel Model Prediction

    ERIC Educational Resources Information Center

    Frees, Edward W.; Kim, Jee-Seon

    2006-01-01

    Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…

  15. Building mental models by dissecting physical models.

    PubMed

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions, require greater supervision to ensure focused learning; models that are too constrained require less supervision, but can be constructed mechanically, with little to no conceptual engagement. We propose "model-dissection" as an alternative to "model-building," whereby instructors could make efficient use of supervisory resources, while simultaneously promoting focused learning. We report empirical results from a study conducted with biology undergraduate students, where we demonstrate that asking them to "dissect" out specific conceptual structures from an already built 3D physical model leads to a greater improvement in performance than asking them to build the 3D model from simpler components. Using questionnaires to measure understanding both before and after model-based interventions for two cohorts of students, we find that both the "builders" and the "dissectors" improve in the post-test, but it is the latter group who show statistically significant improvement. These results, in addition to the intrinsic time-efficiency of "model dissection," suggest that it could be a valuable pedagogical tool. © 2015 by The International Union of Biochemistry and Molecular Biology, 44:7-11, 2016. PMID:26712513

  16. Geologic Framework Model Analysis Model Report

    SciTech Connect

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the repository design. These downstream models include the hydrologic flow models and the radionuclide transport models. All the models and the repository design, in turn, will be incorporated into the Total System Performance Assessment (TSPA) of the potential radioactive waste repository block and vicinity to determine the suitability of Yucca Mountain as a host for the repository. The interrelationship of the three components of the ISM and their interface with downstream uses are illustrated in Figure 2.

  17. Educating with Aircraft Models

    ERIC Educational Resources Information Center

    Steele, Hobie

    1976-01-01

    Described is utilization of aircraft models, model aircraft clubs, and model aircraft magazines to promote student interest in aerospace education. The addresses for clubs and magazines are included. (SL)

  18. Modeling of geothermal systems

    SciTech Connect

    Bodvarsson, G.S.; Pruess, K.; Lippmann, M.J.

    1985-03-01

    During the last decade the use of numerical modeling for geothermal resource evaluation has grown significantly, and new modeling approaches have been developed. In this paper we present a summary of the present status in numerical modeling of geothermal systems, emphasizing recent developments. Different modeling approaches are described and their applicability discussed. The various modeling tasks, including natural-state, exploitation, injection, multi-component and subsidence modeling, are illustrated with geothermal field examples. 99 refs., 14 figs.

  19. Orbital Debris Modeling

    NASA Technical Reports Server (NTRS)

    Liou, J. C.

    2012-01-01

    Presentation outline: (1) The NASA Orbital Debris (OD) Engineering Model -- A mathematical model capable of predicting OD impact risks for the ISS and other critical space assets (2) The NASA OD Evolutionary Model -- A physical model capable of predicting future debris environment based on user-specified scenarios (3) The NASA Standard Satellite Breakup Model -- A model describing the outcome of a satellite breakup (explosion or collision)

  20. Continuous system modeling

    NASA Technical Reports Server (NTRS)

    Cellier, Francois E.

    1991-01-01

    A comprehensive and systematic introduction is presented for the concepts associated with 'modeling', involving the transition from a physical system down to an abstract description of that system in the form of a set of differential and/or difference equations, and basing its treatment of modeling on the mathematics of dynamical systems. Attention is given to the principles of passive electrical circuit modeling, planar mechanical systems modeling, hierarchical modular modeling of continuous systems, and bond-graph modeling. Also discussed are modeling in equilibrium thermodynamics, population dynamics, and system dynamics, inductive reasoning, artificial neural networks, and automated model synthesis.
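
    In the spirit of the passive-circuit examples mentioned above, a minimal sketch that goes from a physical description (an RC low-pass filter driven by a step input) to the differential equation dVc/dt = (Vin - Vc)/(RC) and integrates it numerically; the component values are arbitrary.

      # Continuous-system model of an RC low-pass filter with a step input:
      #   dVc/dt = (Vin - Vc) / (R * C), integrated with SciPy's ODE solver.
      import numpy as np
      from scipy.integrate import solve_ivp

      R, C = 1.0e3, 1.0e-6          # 1 kOhm, 1 uF (arbitrary component values)
      V_IN = 5.0                    # step input voltage

      def rc_rhs(t, y):
          vc = y[0]
          return [(V_IN - vc) / (R * C)]

      sol = solve_ivp(rc_rhs, (0.0, 5.0e-3), y0=[0.0], max_step=1.0e-5)

      tau = R * C
      vc_at_tau = np.interp(tau, sol.t, sol.y[0])
      print(f"time constant tau = {tau * 1e3:.2f} ms")
      print(f"Vc(tau) = {vc_at_tau:.2f} V (theory: {V_IN * (1 - np.exp(-1)):.2f} V)")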

  1. Scaled models, scaled frequencies, and model fitting

    NASA Astrophysics Data System (ADS)

    Roxburgh, Ian W.

    2015-12-01

    I show that given a model star of mass M, radius R, and density profile ρ(x) [x = r/R], there exists a two-parameter family of models with masses M_k, radii R_k, density profiles ρ_k(x) = λρ(x), and frequencies ν_k,nℓ = λ^(1/2) ν_nℓ, where λ and R_k/R are scaling factors. These models have different internal structures, but all have the same value of the separation ratios calculated at given radial orders n, and all exactly satisfy a frequency matching algorithm with an offset function determined as part of the fitting procedure. But they do not satisfy ratio matching at given frequencies, nor phase shift matching. This illustrates that erroneous results may be obtained when model fitting with ratios at given n values or frequency matching. I give examples from scaled models and from non-scaled evolutionary models.
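
    A short numerical check of the invariance claimed above: scaling every frequency by λ^(1/2) leaves the separation ratios r02(n) = (ν_n,0 - ν_n-1,2) / (ν_n,1 - ν_n-1,1), computed at fixed radial order, unchanged. The frequencies below are synthetic placeholders rather than model frequencies.

      # Check that scaling all frequencies by sqrt(lambda) leaves the separation
      # ratios r02(n) = (nu[n,0] - nu[n-1,2]) / (nu[n,1] - nu[n-1,1]) unchanged.
      import numpy as np

      rng = np.random.default_rng(2)
      n_orders = 12
      dnu = 135.0  # synthetic large separation, in microhertz
      # Synthetic "model" frequencies for degrees l = 0, 1, 2 (placeholders).
      nu = {l: 1000.0 + dnu * (np.arange(n_orders) + l / 2.0)
               + rng.normal(0.0, 0.3, n_orders)
            for l in (0, 1, 2)}

      def r02(freqs):
          small = freqs[0][1:] - freqs[2][:-1]    # d02(n) small separations
          large = freqs[1][1:] - freqs[1][:-1]    # l = 1 large separations
          return small / large

      lam = 1.7                                    # density scaling factor
      scaled = {l: np.sqrt(lam) * f for l, f in nu.items()}

      print(np.allclose(r02(nu), r02(scaled)))     # True: ratios are scale invariant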

  2. ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT

    SciTech Connect

    Clinton Lum

    2002-02-04

    The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. This work was conducted in accordance with the following planning documents: WA-0344, ''3-D Rock Properties Modeling for FY 1998'' (SNL 1997, WA-0358), ''3-D Rock Properties Modeling for FY 1999'' (SNL 1999), and the technical development plan, Rock Properties Model Version 3.1, (CRWMS M&O 1999c). The Interim Change Notice (ICNs), ICN 02 and ICN 03, of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, ''Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01'' (CRWMS M&O 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The work scope for this activity consists of the following: (1) Conversion of the input data (laboratory measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) Re-sampling and merging of data sets; (3) Development of geostatistical simulations of porosity; (4) Generation of derivative property models via linear coregionalization with porosity; (5) Post-processing of the simulated models to impart desired secondary geologic attributes and to create summary and uncertainty models; and (6) Conversion of the models into real-world coordinates. The conversion to real world coordinates is performed as part of the integration of the RPM into the Integrated Site Model (ISM) 3.1; this activity is not part of the current analysis. The ISM provides a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site and consists of three components: (1) Geologic Framework Model (GFM); (2) RPM, which is the subject of this AMR; and (3) Mineralogic Model. The interrelationship of the three components of the ISM and their interface with downstream uses are illustrated in Figure 1. Figure 2 shows the geographic boundaries of the RPM and other component models of the ISM.

  3. Model Reduction of Viscoelastic Finite Element Models

    NASA Astrophysics Data System (ADS)

    Park, C. H.; Inman, D. J.; Lam, M. J.

    1999-01-01

    This paper examines a method of adding viscoelastic properties to finite element models by using additional co-ordinates to account for the frequency dependence usually associated with such damping materials. Several such methods exist and all suffer from an increase in the order of the final finite element model, which is undesirable in many applications. Here we propose to combine one of these methods, the GHM (Golla-Hughes-McTavish) method, with model reduction techniques to remove the objection of increased model order. The result of combining several methods is an ability to add the effects of viscoelastic components to finite element or other analytical models without increasing the order of the system. The procedure is illustrated by a numerical example. The method proposed here results in a viscoelastic finite element model of a structure without increasing the order of the original model.

  4. To model or not to model?

    PubMed

    Fletcher, Daniel A

    2011-04-01

    In theory, the combination of mathematical modeling with experimental studies can be a powerful and compelling approach to understanding cell biology. In practice, choosing appropriate problems, identifying willing and able collaborators, and publishing the resulting research can be remarkably challenging. To provide perspective on the question of whether and when to combine modeling and experiments, a panel of experts at the 2010 ASCB Annual Meeting shared their personal experiences and advice on how to use modeling effectively. PMID:21454831

  5. Biosphere Model Report

    SciTech Connect

    M. A. Wasiolek

    2003-10-27

    The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).

  6. Biosphere Model Report

    SciTech Connect

    D. W. Wu

    2003-07-16

    The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).

  7. Qualitative Student Models.

    ERIC Educational Resources Information Center

    Clancey, William J.

    The concept of a qualitative model is used as the focus of this review of qualitative student models in order to compare alternative computational models and to contrast domain requirements. The report is divided into eight sections: (1) Origins and Goals (adaptive instruction, qualitative models of processes, components of an artificial

  8. Multimodeling and Model Abstraction

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The multiplicity of models of the same process or phenomenon is commonplace in environmental modeling. The last 10 years have brought marked interest in making use of a variety of conceptual approaches instead of attempting to find the best model or using a single preferred model. Two systematic approa...

  9. AIDS Epidemiological models

    NASA Astrophysics Data System (ADS)

    Rahmani, Fouad Lazhar

    2010-11-01

    The aim of this paper is to present mathematical modelling of the spread of infection in the context of the transmission of the human immunodeficiency virus (HIV) and the acquired immune deficiency syndrome (AIDS). These models are based in part on the models suggested in the field of AIDS mathematical modelling, as reported by Isham [6].

  10. Generative Models of Disfluency

    ERIC Educational Resources Information Center

    Miller, Timothy A.

    2010-01-01

    This thesis describes a generative model for representing disfluent phenomena in human speech. This model makes use of observed syntactic structure present in disfluent speech, and uses a right-corner transform on syntax trees to model this structure in a very natural way. Specifically, the phenomenon of speech repair is modeled by explicitly…

  11. Biomass Scenario Model

    SciTech Connect

    2015-09-01

    The Biomass Scenario Model (BSM) is a unique, carefully validated, state-of-the-art dynamic model of the domestic biofuels supply chain which explicitly focuses on policy issues, their feasibility, and potential side effects. It integrates resource availability, physical/technological/economic constraints, behavior, and policy. The model uses a system dynamics simulation (not optimization) to model dynamic interactions across the supply chain.

  12. The Instrumental Model

    ERIC Educational Resources Information Center

    Yeates, Devin Rodney

    2011-01-01

    The goal of this dissertation is to enable better predictive models by engaging raw experimental data through the Instrumental Model. The Instrumental Model captures the protocols and procedures of experimental data analysis. The approach is formalized by encoding the Instrumental Model in an XML record. Decoupling the raw experimental data from…

  13. Qualitative Student Models.

    ERIC Educational Resources Information Center

    Clancey, William J.

    The concept of a qualitative model is used as the focus of this review of qualitative student models in order to compare alternative computational models and to contrast domain requirements. The report is divided into eight sections: (1) Origins and Goals (adaptive instruction, qualitative models of processes, components of an artificial…

  14. Efficient polarimetric BRDF model.

    PubMed

    Renhorn, Ingmar G E; Hallberg, Tomas; Boreman, Glenn D

    2015-11-30

    The purpose of the present manuscript is to present a polarimetric bidirectional reflectance distribution function (BRDF) model suitable for hyperspectral and polarimetric signature modelling. The model is based on a further development of a previously published four-parameter model that has been generalized in order to account for different types of surface structures (generalized Gaussian distribution). A generalization of the Lambertian diffuse model is presented. The pBRDF functions are normalized using numerical integration. Using directional-hemispherical reflectance (DHR) measurements, three of the four basic parameters can be determined for any wavelength. This considerably simplifies the development of multispectral polarimetric BRDF applications. The scattering parameter has to be determined from at least one BRDF measurement. The model deals with linearly polarized radiation; as with, e.g., the facet model, depolarization is not included. The model is very general and can inherently model extreme surfaces such as mirrors and Lambertian surfaces. The complex mixture of sources is described by the sum of two basic models, a generalized Gaussian/Fresnel model and a generalized Lambertian model. Although the physics-inspired model has some ad hoc features, its predictive power is impressive over a wide range of angles and scattering magnitudes. The model has been applied successfully to painted surfaces, both dull and glossy, and to metallic bead-blasted surfaces. The simple and efficient model should be attractive for polarimetric simulations and polarimetric remote sensing. PMID:26698753

  15. Introduction to Adjoint Models

    NASA Technical Reports Server (NTRS)

    Errico, Ronald M.

    2015-01-01

    In this lecture, some fundamentals of adjoint models will be described. This includes a basic derivation of tangent linear and corresponding adjoint models from a parent nonlinear model, the interpretation of adjoint-derived sensitivity fields, a description of methods of automatic differentiation, and the use of adjoint models to solve various optimization problems, including singular vectors. Concluding remarks will attempt to correct common misconceptions about adjoint models and their utilization.

  16. Stable models of superacceleration

    SciTech Connect

    Kaplinghat, Manoj; Rajaraman, Arvind

    2007-05-15

    We discuss an instability in a large class of models where dark energy is coupled to matter. In these models the mass of the scalar field is much larger than the expansion rate of the Universe. We find models in which this instability is absent, and show that these models generically predict an apparent equation of state for dark energy smaller than -1, i.e., superacceleration. These models have no acausal behavior or ghosts.

  17. WASP TRANSPORT MODELING AND WASP ECOLOGICAL MODELING

    EPA Science Inventory

    A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interfa...

  18. Model Shrinkage for Discriminative Language Models

    NASA Astrophysics Data System (ADS)

    Oba, Takanobu; Hori, Takaaki; Nakamura, Atsushi; Ito, Akinori

    This paper describes a technique for overcoming the model shrinkage problem in automatic speech recognition (ASR), which allows application developers and users to control the model size with less degradation of accuracy. Recently, models for ASR systems tend to be large, and this can constitute a bottleneck for developers and users without special knowledge of ASR with respect to introducing the ASR function. Specifically, discriminative language models (DLMs) are usually designed in a high-dimensional parameter space, although DLMs have gained increasing attention as an approach for improving recognition accuracy. Our proposed method can be applied to linear models, including DLMs, in which the score of an input sample is given by the inner product of its features and the model parameters. It shrinks models with a simple computation based on easily obtained statistics, namely the square sums of the feature values appearing in a data set. Our experimental results show that the proposed method can shrink a DLM with little degradation in accuracy and performs properly whether or not the data used for obtaining the statistics are the same as the data used for training the model.
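
    As a rough, hedged illustration of the pruning idea summarized above (not the authors' actual algorithm), the sketch below zeroes out the parameters of a linear model whose features have the smallest square sums over a data set. All names, shapes, and the keep_ratio threshold are hypothetical.

      # Hypothetical sketch: prune a linear model using square sums of feature values.
      # Not the paper's algorithm; thresholds and shapes are illustrative only.
      import numpy as np

      def shrink_linear_model(weights, feature_matrix, keep_ratio=0.5):
          """Keep only the weights whose features have the largest square sums."""
          square_sums = np.sum(feature_matrix ** 2, axis=0)   # per-feature statistic
          n_keep = max(1, int(keep_ratio * weights.size))
          keep = np.argsort(square_sums)[-n_keep:]            # indices of retained features
          shrunk = np.zeros_like(weights)
          shrunk[keep] = weights[keep]                        # zero out the rest
          return shrunk

      rng = np.random.default_rng(0)
      X = rng.normal(size=(1000, 20))      # feature values observed in a data set
      w = rng.normal(size=20)              # parameters of a trained linear model
      w_small = shrink_linear_model(w, X, keep_ratio=0.25)
      print("nonzero parameters:", np.count_nonzero(w_small))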

  19. Causal Models in Educational Research: Recursive Models.

    ERIC Educational Resources Information Center

    Anderson, James G.; Evans, Francis B.

    Causal modelling procedures developed in economics and biology provide social scientists with powerful methodological tools that permit them to bridge the gap between theory and research. In this paper one type of causal modelling technique involving a structural set of equations that are recursive in form has been used to reanalyze the data from…

  20. ADAPT model: Model use, calibration and validation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper presents an overview of the Agricultural Drainage and Pesticide Transport (ADAPT) model and a case study to illustrate the calibration and validation steps for predicting subsurface tile drainage and nitrate-N losses from an agricultural system. The ADAPT model is a daily time step field ...

  1. Geochemistry Model Validation Report: External Accumulation Model

    SciTech Connect

    K. Zarrabi

    2001-09-27

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package. Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation, density of accumulation, and the geometry of the accumulation zone. The density of accumulation and the geometry of the accumulation zone are calculated using a characterization of the fracture system based on field measurements made in the proposed repository (BSC 2001k). The model predicts that accumulation would spread out in a conical accumulation volume. The accumulation volume is represented with layers as shown in Figure 1. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance.

  2. Trapped Radiation Model Uncertainties: Model-Data and Model-Model Comparisons

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    The standard AP8 and AE8 models for predicting trapped proton and electron environments have been compared with several sets of flight data to evaluate model uncertainties. Model comparisons are made with flux and dose measurements made on various U.S. low-Earth orbit satellites (APEX, CRRES, DMSP, LDEF, NOAA) and Space Shuttle flights, on Russian satellites (Photon-8, Cosmos-1887, Cosmos-2044), and on the Russian Mir space station. This report gives the details of the model-data comparisons -- summary results in terms of empirical model uncertainty factors that can be applied for spacecraft design applications are given in a companion report. The results of model-model comparisons are also presented from standard AP8 and AE8 model predictions compared with the European Space Agency versions of AP8 and AE8 and with Russian trapped radiation models.

  3. Trapped Radiation Model Uncertainties: Model-Data and Model-Model Comparisons

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    The standard AP8 and AE8 models for predicting trapped proton and electron environments have been compared with several sets of flight data to evaluate model uncertainties. Model comparisons are made with flux and dose measurements made on various U.S. low-Earth orbit satellites (APEX, CRRES, DMSP, LDEF, NOAA) and Space Shuttle flights, on Russian satellites (Photon-8, Cosmos-1887, Cosmos-2044), and on the Russian Mir Space Station. This report gives the details of the model-data comparisons; summary results in terms of empirical model uncertainty factors that can be applied for spacecraft design applications are given in a companion report. The results of model-model comparisons are also presented from standard AP8 and AE8 model predictions compared with the European Space Agency versions of AP8 and AE8 and with Russian trapped radiation models.

  4. Model Validation Status Review

    SciTech Connect

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information, which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and engineered barriers, plus the TSPA model itself. Description of the model areas is provided in Section 3, and the documents reviewed are described in Section 4. The responsible manager for the Model Validation Status Review was the Chief Science Officer (CSO) for Bechtel-SAIC Co. (BSC). The team lead was assigned by the CSO. A total of 32 technical specialists were engaged to evaluate model validation status in the 21 model areas. The technical specialists were generally independent of the work reviewed, meeting technical qualifications as discussed in Section 5.

  5. Modeling volatility using state space models.

    PubMed

    Timmer, J; Weigend, A S

    1997-08-01

    In time series problems, noise can be divided into two categories: dynamic noise which drives the process, and observational noise which is added in the measurement process, but does not influence future values of the system. In this framework, we show that empirical volatilities (the squared relative returns of prices) exhibit a significant amount of observational noise. To model and predict their time evolution adequately, we estimate state space models that explicitly include observational noise. We obtain relaxation times for shocks in the logarithm of volatility ranging from three weeks (for foreign exchange) to three to five months (for stock indices). In most cases, a two-dimensional hidden state is required to yield residuals that are consistent with white noise. We compare these results with ordinary autoregressive models (without a hidden state) and find that autoregressive models underestimate the relaxation times by about two orders of magnitude since they do not distinguish between observational and dynamic noise. This new interpretation of the dynamics of volatility in terms of relaxators in a state space model carries over to stochastic volatility models and to GARCH models, and is useful for several problems in finance, including risk management and the pricing of derivative securities. Data sets used: Olsen & Associates high frequency DEM/USD foreign exchange rates (8 years). Nikkei 225 index (40 years). Dow Jones Industrial Average (25 years). PMID:9730016
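
    As a minimal, hedged sketch of the general idea above (a hidden state observed with additive noise), the code below simulates an AR(1) log-volatility state plus observational noise and separates them with a scalar Kalman filter. It is not the authors' estimation procedure; the parameters phi, q, and r are illustrative and assumed known.

      # Minimal sketch: hidden AR(1) log-volatility observed with additive noise,
      # filtered with a scalar Kalman filter. Parameters are illustrative only.
      import numpy as np

      rng = np.random.default_rng(1)
      T, phi, q, r = 2000, 0.98, 0.02, 0.5   # length, AR coefficient, dynamic/observational variances

      # Simulate the hidden state x (log volatility) and the noisy observation y.
      x = np.zeros(T)
      for t in range(1, T):
          x[t] = phi * x[t - 1] + np.sqrt(q) * rng.normal()
      y = x + np.sqrt(r) * rng.normal(size=T)

      # Scalar Kalman filter, assuming the true parameters are known.
      m, P = 0.0, 1.0
      est = np.zeros(T)
      for t in range(T):
          m_pred, P_pred = phi * m, phi ** 2 * P + q              # predict
          K = P_pred / (P_pred + r)                               # Kalman gain
          m, P = m_pred + K * (y[t] - m_pred), (1 - K) * P_pred   # update
          est[t] = m

      print("RMSE of raw observations:", np.sqrt(np.mean((y - x) ** 2)))
      print("RMSE of filtered state  :", np.sqrt(np.mean((est - x) ** 2)))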

  6. Holographic twin Higgs model.

    PubMed

    Geller, Michael; Telem, Ofri

    2015-05-15

    We present the first realization of a "twin Higgs" model as a holographic composite Higgs model. Uniquely among composite Higgs models, the Higgs potential is protected by a new standard model (SM) singlet elementary "mirror" sector at the sigma model scale f and not by the composite states at m_{KK}, naturally allowing for m_{KK} beyond the LHC reach. As a result, naturalness in our model cannot be constrained by the LHC, but may be probed by precision Higgs measurements at future lepton colliders, and by direct searches for Kaluza-Klein excitations at a 100 TeV collider. PMID:26024160

  7. Energy-consumption modelling

    SciTech Connect

    Reiter, E.R.

    1980-01-01

    A highly sophisticated and accurate approach is described to compute on an hourly or daily basis the energy consumption for space heating by individual buildings, urban sectors, and whole cities. The need for models, and specifically for weather-sensitive models, composite models, and space-heating models, is discussed. Development of the Colorado State University Model, based on heat-transfer equations and on a heuristic, adaptive, self-organizing computation learning approach, is described. Results of modeling energy consumption by the cities of Minneapolis and Cheyenne are given. Some data on energy consumption in individual buildings are included.

  8. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
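
    The record above describes aggregating stored low-level reliability models according to a system architecture description. The toy sketch below illustrates that general idea with simple series/parallel aggregation of component reliabilities; it is purely illustrative and not the patented generator, and the architecture encoding and component names are invented.

      # Toy sketch of aggregating low-level reliability models from an architecture
      # description. Purely illustrative; not the patented method.
      def system_reliability(arch, component_reliability):
          """arch is either a component name (leaf) or ('series'|'parallel', [children])."""
          if isinstance(arch, str):
              return component_reliability[arch]
          kind, children = arch
          r = [system_reliability(c, component_reliability) for c in children]
          if kind == 'series':                      # all parts must work
              out = 1.0
              for ri in r:
                  out *= ri
              return out
          if kind == 'parallel':                    # at least one part must work
              out = 1.0
              for ri in r:
                  out *= (1.0 - ri)
              return 1.0 - out
          raise ValueError(kind)

      # Hypothetical architecture: a sensor in series with two redundant processors.
      arch = ('series', ['sensor', ('parallel', ['cpu_a', 'cpu_b'])])
      print(system_reliability(arch, {'sensor': 0.99, 'cpu_a': 0.95, 'cpu_b': 0.95}))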

  9. Modeling the transition region

    NASA Technical Reports Server (NTRS)

    Singer, Bart A.

    1993-01-01

    The current status of transition-region models is reviewed in this report. To understand modeling problems, various flow features that influence the transition process are discussed first. Then an overview of the different approaches to transition-region modeling is given. This is followed by a detailed discussion of turbulence models and the specific modifications that are needed to predict flows undergoing laminar-turbulent transition. Methods for determining the usefulness of the models are presented, and an outlook for the future of transition-region modeling is suggested.

  10. Modeling worldwide highway networks

    NASA Astrophysics Data System (ADS)

    Villas Boas, Paulino R.; Rodrigues, Francisco A.; da F. Costa, Luciano

    2009-12-01

    This Letter addresses the problem of modeling the highway systems of different countries by using complex networks formalism. More specifically, we compare two traditional geographical models with a modified geometrical network model where paths, rather than edges, are incorporated at each step between the origin and the destination vertices. Optimal configurations of parameters are obtained for each model and used for the comparison. The highway networks of Australia, Brazil, India, and Romania are considered and shown to be properly modeled by the modified geographical model.

  11. Modeling Guru: Knowledge Base for NASA Modelers

    NASA Astrophysics Data System (ADS)

    Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.

    2009-05-01

    Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread, and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" comprises documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the same document, even allowing the author to select those who may edit and approve the document. To maintain knowledge integrity, all documents are moderated before they are visible to the public. Modeling Guru, running on Clearspace by Jive Software, has been an active resource to the NASA modeling and HEC communities for more than a year and currently has more than 100 active users. SIVO will soon install live instant messaging support, as well as a user-customizable homepage with social-networking features. In addition, SIVO plans to implement a large dataset/file storage capability so that users can quickly and easily exchange datasets and files with one another. Continued active community participation combined with periodic software updates and improved features will ensure that Modeling Guru remains a vibrant, effective, easy-to-use tool for the NASA scientific community.

  12. A future of the model organism model.

    PubMed

    Rine, Jasper

    2014-03-01

    Changes in technology are fundamentally reframing our concept of what constitutes a model organism. Nevertheless, research advances in the more traditional model organisms have enabled fresh and exciting opportunities for young scientists to establish new careers and offer the hope of comprehensive understanding of fundamental processes in life. New advances in translational research can be expected to heighten the importance of basic research in model organisms and expand opportunities. However, researchers must take special care and implement new resources to enable the newest members of the community to engage fully with the remarkable legacy of information in these fields. PMID:24577733

  13. Biosphere Model Report

    SciTech Connect

    D.W. Wu; A.J. Smith

    2004-11-08

    The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), TSPA-LA. The ERMYN provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs) (Section 6.2), the reference biosphere (Section 6.1.1), the human receptor (Section 6.1.2), and approximations (Sections 6.3.1.4 and 6.3.2.4); (3) Building a mathematical model using the biosphere conceptual model (Section 6.3) and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); (8) Validating the ERMYN by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).

  14. Develop a Model Component

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler S.

    2013-01-01

    During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented into hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits, and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library component; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is no empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, they are combined into one integrated model. This integrated model is then tested itself, through a test script and autotest, so that it can be concluded that all models work conjointly for a single purpose. The component I was assigned, specifically, was a fluid component, a discrete pressure switch. The switch takes a fluid pressure input, and if the pressure is greater than a designated cutoff pressure, the switch stops fluid flow.

  15. Aerosol Modeling for the Global Model Initiative

    NASA Technical Reports Server (NTRS)

    Weisenstein, Debra K.; Ko, Malcolm K. W.

    2001-01-01

    The goal of this project is to develop an aerosol module to be used within the framework of the Global Modeling Initiative (GMI). The model development work will be performed jointly by the University of Michigan and AER, using existing aerosol models at the two institutions as starting points. The GMI aerosol model will be tested, evaluated against observations, and then applied to assessment of the effects of aircraft sulfur emissions as needed by the NASA Subsonic Assessment in 2001. The work includes the following tasks: 1. Implementation of the sulfur cycle within GMI, including sources, sinks, and aqueous conversion of sulfur. Aerosol modules will be added as they are developed and the GMI schedule permits. 2. Addition of aerosol types other than sulfate particles, including dust, soot, organic carbon, and black carbon. 3. Development of new and more efficient parameterizations for treating sulfate aerosol nucleation, condensation, and coagulation among different particle sizes and types.

  16. Nonlinear Modeling by Assembling Piecewise Linear Models

    NASA Technical Reports Server (NTRS)

    Yao, Weigang; Liou, Meng-Sing

    2013-01-01

    To preserve nonlinearity of a full order system over a parameter range of interest, we propose a simple modeling approach by assembling a set of piecewise local solutions, including the first-order Taylor series terms expanded about some sampling states. The work by Rewienski and White inspired our use of piecewise linear local solutions. The assembly of these local approximations is accomplished by assigning nonlinear weights, through radial basis functions in this study. The efficacy of the proposed procedure is validated for a two-dimensional airfoil moving at different Mach numbers and pitching motions, under which the flow exhibits prominent nonlinear behaviors. All results confirm that our nonlinear model is accurate and stable for predicting not only aerodynamic forces but also detailed flowfields. Moreover, the model is robust and accurate for inputs considerably different from the base trajectory in form and magnitude. This modeling preserves nonlinearity of the problems considered in a rather simple and accurate manner.
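
    The sketch below illustrates, on a one-dimensional toy function rather than the aerodynamic problem above, the general idea of blending first-order Taylor (piecewise linear) local models with Gaussian radial-basis-function weights. The choice of toy function, number of sampling states, and RBF width are all assumptions for illustration.

      # Sketch: assemble piecewise linear (first-order Taylor) local models with
      # Gaussian radial-basis-function weights, on a 1-D toy function.
      import numpy as np

      f = np.sin                                   # toy stand-in for the full order system
      df = np.cos                                  # its derivative at the sampling states
      centers = np.linspace(0.0, 2 * np.pi, 8)     # sampling states (assumed)
      width = 0.8                                  # RBF width (assumed)

      def surrogate(x):
          x = np.atleast_1d(x)
          # Local linear models expanded about each sampling state.
          local = f(centers)[None, :] + df(centers)[None, :] * (x[:, None] - centers[None, :])
          # Nonlinear blending weights from Gaussian RBFs, normalized to sum to one.
          w = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
          w /= w.sum(axis=1, keepdims=True)
          return (w * local).sum(axis=1)

      xs = np.linspace(0, 2 * np.pi, 200)
      print("max abs error of the assembled model:", np.max(np.abs(surrogate(xs) - f(xs))))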

  17. Aggregation in ecosystem models and model stability

    NASA Astrophysics Data System (ADS)

    Giricheva, Evgeniya

    2015-05-01

    Using a multimodal approach to research ecosystems improves the use of the available information about the object under study. This study presents several models of the Bering Sea ecosystem. The ecosystem is first considered as a closed object, that is, the influence of the environment is not taken into account. We then add links with the external medium to the models. The models differ in terms of the degree and method of grouping components. Our method is based on the differences in habitat and food source of groups, which allows us to determine the grouping of species with a greater effect on system dynamics. In particular, we determine whether benthic fish aggregation or pelagic fish aggregation can change the consumption structure of some groups of species, and consequently, the behavior of the entire model system.

  18. Solid Waste Projection Model: Model user's guide

    SciTech Connect

    Stiles, D.L.; Crow, V.L.

    1990-08-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford company (WHC) specifically to address solid waste management issues at the Hanford Central Waste Complex (HCWC). This document, one of six documents supporting the SWPM system, contains a description of the system and instructions for preparing to use SWPM and operating Version 1 of the model. 4 figs., 1 tab.

  19. Protein solubility modeling

    NASA Technical Reports Server (NTRS)

    Agena, S. M.; Pusey, M. L.; Bogle, I. D.

    1999-01-01

    A thermodynamic framework (UNIQUAC model with temperature dependent parameters) is applied to model the salt-induced protein crystallization equilibrium, i.e., protein solubility. The framework introduces a term for the solubility product describing protein transfer between the liquid and solid phase and a term for the solution behavior describing deviation from ideal solution. Protein solubility is modeled as a function of salt concentration and temperature for a four-component system consisting of a protein, pseudo solvent (water and buffer), cation, and anion (salt). Two different systems, lysozyme with sodium chloride and concanavalin A with ammonium sulfate, are investigated. Comparison of the modeled and experimental protein solubility data results in an average root mean square deviation of 5.8%, demonstrating that the model closely follows the experimental behavior. Model calculations and model parameters are reviewed to examine the model and protein crystallization process. Copyright 1999 John Wiley & Sons, Inc.

  20. Modeling DNA Replication.

    ERIC Educational Resources Information Center

    Bennett, Joan

    1998-01-01

    Recommends the use of a model of DNA made out of Velcro to help students visualize the steps of DNA replication. Includes a materials list, construction directions, and details of the demonstration using the model parts. (DDR)

  1. Bounding species distribution models

    USGS Publications Warehouse

    Stohlgren, T.J.; Jarnevich, C.S.; Esaias, W.E.; Morisette, J.T.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used. © 2011 Current Zoology.
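
    A minimal sketch of the "clamping" idea described above: new predictor values are bounded to the minimum and maximum seen in the training data before any model is applied. The predictor names and data are hypothetical, and no CART or Maxent specifics are implied.

      # Sketch: clamp model extrapolations to the environmental bounds of the
      # training data. Predictors and values are hypothetical.
      import numpy as np

      def clamp_to_training_bounds(X_new, X_train):
          """Bound each predictor column of X_new to the min/max seen in X_train."""
          lo = X_train.min(axis=0)
          hi = X_train.max(axis=0)
          return np.clip(X_new, lo, hi)

      rng = np.random.default_rng(2)
      X_train = rng.uniform([0, 5], [40, 30], size=(500, 2))   # e.g. temperature, precipitation
      X_map = rng.uniform([-10, 0], [55, 40], size=(4, 2))     # new locations, partly out of range
      print(clamp_to_training_bounds(X_map, X_train))          # feed the clamped values to the model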

  2. Bounding Species Distribution Models

    NASA Technical Reports Server (NTRS)

    Stohlgren, Thomas J.; Jarnevich, Cahterine S.; Morisette, Jeffrey T.; Esaias, Wayne E.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].

  3. Consistent model driven architecture

    NASA Astrophysics Data System (ADS)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way that restricts human interaction to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  4. The Cap Pele Model.

    ERIC Educational Resources Information Center

    Pruneau, Diane; Chouinard, Omer; Arsenault, Charline

    1998-01-01

    Reports on a model of environmental education that aims to encourage greater attachment to the bioregion of Arcadia. The model results from cooperation within a village community and addresses the environmental education of people of all ages. (DDR)

  5. PERSISTENCE IN MODEL ECOSYSTEMS

    EPA Science Inventory

    Mathematical models aid in understanding environmental systems and in developing testable hypotheses relevant to the fate and ecological effects of toxic substances in such systems. Within the framework of microcosm or laboratory ecosystem modeling, some differential equation mod...

  6. SEDIMENT GEOCHEMICAL MODEL

    EPA Science Inventory

    Until recently, sediment geochemical models (diagenetic models) have been only able to explain sedimentary flux and concentration profiles for a few simplified geochemical cycles (e.g., nitrogen, carbon and sulfur). However with advances in numerical methods, increased accuracy ...

  7. Modeling Infectious Diseases

    MedlinePlus

    ... more confidence in the predictions. Who builds the models? Since May 2004, a network of researchers has ... National Academy of Sciences What diseases does MIDAS model? The MIDAS network focuses on infectious disease outbreaks, ...

  8. System Advisor Model

    Energy Science and Technology Software Center (ESTSC)

    2010-03-01

    The System Advisor Model (SAM) is a performance and economic model designed to facilitate decision making for people involved in the renewable energy industry, ranging from project managers and engineers to incentive program designers, technology developers, and researchers.

  9. TMDL RUSLE MODEL

    EPA Science Inventory

    We developed a simplified spreadsheet modeling approach for characterizing and prioritizing sources of sediment loadings from watersheds in the United States. A simplified modeling approach was developed to evaluate sediment loadings from watersheds and selected land segments. ...

  10. Modeling EERE deployment programs

    SciTech Connect

    Cort, K. A.; Hostick, D. J.; Belzer, D. B.; Livingston, O. V.

    2007-11-01

    The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.

  11. Communication system modeling

    NASA Technical Reports Server (NTRS)

    Holland, L. D.; Walsh, J. R., Jr.; Wetherington, R. D.

    1971-01-01

    This report presents the results of work on communications systems modeling and covers three different areas of modeling. The first of these deals with the modeling of signals in communication systems in the frequency domain and the calculation of spectra for various modulations. These techniques are applied in determining the frequency spectra produced by a unified carrier system, the down-link portion of the Command and Communications System (CCS). The second modeling area covers the modeling of portions of a communication system on a block basis. A detailed analysis and modeling effort based on control theory is presented along with its application to modeling of the automatic frequency control system of an FM transmitter. A third topic discussed is a method for approximate modeling of stiff systems using state variable techniques.
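
    As a small, hedged illustration of the first modeling area mentioned above (calculating frequency spectra for various modulations), the sketch below computes the spectrum of a single-tone frequency-modulated carrier with an FFT. The sample rate, carrier, tone, and modulation index are arbitrary and unrelated to the CCS down-link.

      # Illustrative sketch: spectrum of a frequency-modulated carrier via FFT.
      # All parameters are arbitrary examples.
      import numpy as np

      fs = 1.0e5                                # sample rate (Hz)
      t = np.arange(0, 0.1, 1 / fs)             # 0.1 s of signal
      fc, fm, beta = 10e3, 1e3, 2.0             # carrier, modulating tone, modulation index
      signal = np.cos(2 * np.pi * fc * t + beta * np.sin(2 * np.pi * fm * t))

      spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
      freqs = np.fft.rfftfreq(len(signal), 1 / fs)
      peaks = freqs[np.argsort(spectrum)[-7:]]  # strongest lines: carrier +/- multiples of fm
      print(np.sort(peaks))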

  12. METEOROLOGICAL AND TRANSPORT MODELING

    EPA Science Inventory

    Advanced air quality simulation models, such as CMAQ, as well as other transport and dispersion models, require accurate and detailed meteorology fields. These meteorology fields include primary 3-dimensional dynamical and thermodynamical variables (e.g., winds, temperature, mo...

  13. Of Molecules and Models.

    ERIC Educational Resources Information Center

    Brinner, Bonnie

    1992-01-01

    Presents an activity in which models help students visualize both the DNA process and transcription. After constructing DNA, RNA messenger, and RNA transfer molecules; students model cells, protein synthesis, codons, and RNA movement. (MDH)

  14. Mass modeling for bars

    NASA Technical Reports Server (NTRS)

    Butler, Thomas G.

    1987-01-01

    Methods of modeling mass for bars are surveyed. A method for extending John Archer's concept of consistent mass beyond just translational inertia effects is included. Recommendations are given for various types of modeling situations.

  15. Evaluation of comparative protein modeling by MODELLER.

    PubMed

    Sali, A; Potterton, L; Yuan, F; van Vlijmen, H; Karplus, M

    1995-11-01

    We evaluate 3D models of human nucleoside diphosphate kinase, mouse cellular retinoic acid binding protein I, and human eosinophil neurotoxin that were calculated by MODELLER, a program for comparative protein modeling by satisfaction of spatial restraints. The models have good stereochemistry and are at least as similar to the crystallographic structures as the closest template structures. The largest errors occur in the regions that were not aligned correctly or where the template structures are not similar to the correct structure. These regions correspond predominantly to exposed loops, insertions of any length, and non-conserved side chains. When a template structure with more than 40% sequence identity to the target protein is available, the model is likely to have about 90% of the mainchain atoms modeled with an rms deviation from the X-ray structure of approximately 1 Å, in large part because the templates are likely to be that similar to the X-ray structure of the target. This rms deviation is comparable to the overall differences between refined NMR and X-ray crystallography structures of the same protein. PMID:8710825
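
    For readers unfamiliar with the rms deviation quoted above, the sketch below computes an RMSD between two sets of atom coordinates that are assumed to be already superposed; the coordinates are randomly generated for illustration and do not come from the evaluated models.

      # Minimal sketch: root-mean-square deviation between two pre-superposed sets
      # of mainchain atom coordinates (N x 3 arrays). Superposition is assumed done.
      import numpy as np

      def rmsd(coords_model, coords_xray):
          diff = coords_model - coords_xray
          return np.sqrt((diff ** 2).sum(axis=1).mean())

      # Hypothetical coordinates for illustration only.
      rng = np.random.default_rng(3)
      xray = rng.normal(size=(300, 3))
      model = xray + rng.normal(scale=0.6, size=xray.shape)   # ~1 angstrom-scale perturbation
      print(round(rmsd(model, xray), 2), "(illustrative RMSD)")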

  16. Hierarchical Bass model

    NASA Astrophysics Data System (ADS)

    Tashiro, Tohru

    2014-03-01

    We propose a new model of product diffusion that includes a memory of how many adopters or advertisements a non-adopter has met, where (non-)adopters mean people (not) possessing the product. This effect is lacking in the Bass model. As an application, we use the model to fit iPod sales data and obtain better agreement than with the Bass model.
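
    For reference, the sketch below integrates only the classic Bass diffusion equation dN/dt = (p + q N/M)(M - N), which the record above extends with a memory effect; that extension is not implemented here, and the values of p, q, and M are illustrative.

      # Classic Bass diffusion model integrated with Euler steps.
      # The hierarchical memory extension described above is not implemented.
      import numpy as np

      p, q, M = 0.03, 0.38, 1.0e6     # innovation rate, imitation rate, market size (assumed)
      dt, T = 0.01, 15.0              # time step and horizon in years
      steps = int(T / dt)
      N = np.zeros(steps + 1)         # cumulative adopters
      for k in range(steps):
          N[k + 1] = N[k] + dt * (p + q * N[k] / M) * (M - N[k])

      peak_year = dt * np.argmax(np.diff(N))   # time of maximum adoption rate
      print("adoption peaks near year", round(peak_year, 1))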

  17. Soil moisture modeling review

    NASA Technical Reports Server (NTRS)

    Hildreth, W. W.

    1978-01-01

    A determination of the state of the art in soil moisture transport modeling based on physical or physiological principles was made. It was found that soil moisture models based on physical principles have been under development for more than 10 years. However, these models were shown to represent infiltration and redistribution of soil moisture quite well. Evapotranspiration has not been as adequately incorporated into the models.

  18. Modeling of spacecraft charging

    NASA Technical Reports Server (NTRS)

    Whipple, E. C., Jr.

    1977-01-01

    Three types of modeling of spacecraft charging are discussed: statistical models, parametric models, and physical models. Local time dependence of circuit upset for DoD and communication satellites, and electron current to a sphere with an assumed Debye potential distribution are presented. Four regions were involved in spacecraft charging: (1) undisturbed plasma, (2) plasma sheath region, (3) spacecraft surface, and (4) spacecraft equivalent circuit.

  19. Wonderland climate model

    NASA Astrophysics Data System (ADS)

    Hansen, J.; Ruedy, R.; Lacis, A.; Russell, G.; Sato, M.; Lerner, J.; Rind, D.; Stone, P.

    1997-03-01

    We obtain a highly efficient global climate model by defining a sector version (120° of longitude) of the coarse resolution Goddard Institute for Space Studies model II. The geography of Wonderland is chosen such that the amount of land as a function of latitude is the same as on Earth. We show that the zonal mean climate of the Wonderland model is very similar to that of the parent model II.

  20. Future of groundwater modeling

    USGS Publications Warehouse

    Langevin, Christian D.; Panday, Sorab

    2012-01-01

    With an increasing need to better manage water resources, the future of groundwater modeling is bright and exciting. However, while the past can be described and the present is known, the future of groundwater modeling, just like a groundwater model result, is highly uncertain and any prediction is probably not going to be entirely representative. Thus we acknowledge this as we present our vision of where groundwater modeling may be headed.

  1. Avionics Architecture Modelling Language

    NASA Astrophysics Data System (ADS)

    Alana, Elena; Naranjo, Hector; Valencia, Raul; Medina, Alberto; Honvault, Christophe; Rugina, Ana; Panunzia, Marco; Dellandrea, Brice; Garcia, Gerald

    2014-08-01

    This paper presents the ESA AAML (Avionics Architecture Modelling Language) study, which aimed at advancing the avionics engineering practices towards a model-based approach by (i) identifying and prioritising the avionics-relevant analyses, (ii) specifying the modelling language features necessary to support the identified analyses, and (iii) recommending/prototyping software tooling to demonstrate the automation of the selected analyses based on a modelling language and compliant with the defined specification.

  2. Modelling of cells bioenergetics.

    PubMed

    Kasperski, Andrzej

    2008-09-01

    This paper presents an integrated model describing the control of Saccharomyces cerevisiae yeast cell bioenergetics. This model describes the oxidative and respirofermentative metabolism. The model assumes that the mitochondria of the Saccharomyces cerevisiae cells are charged with NADH during the tricarboxylic acid cycle, and NADH is discharged from mitochondria later in the electron transport system. Selected effects observed in Saccharomyces cerevisiae eucaryotic cells, including the Pasteur and Crabtree effects, are also modeled. PMID:18379882

  3. Mathematical circulatory system model

    NASA Technical Reports Server (NTRS)

    Lakin, William D. (Inventor); Stevens, Scott A. (Inventor)

    2010-01-01

    A system and method of modeling a circulatory system including a regulatory mechanism parameter. In one embodiment, a regulatory mechanism parameter in a lumped parameter model is represented as a logistic function. In another embodiment, the circulatory system model includes a compliant vessel, the model having a parameter representing a change in pressure due to contraction of smooth muscles of a wall of the vessel.
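
    As a generic illustration of representing a regulatory-mechanism parameter as a logistic function, as the record above describes, the sketch below varies a regulation parameter smoothly with pressure. All parameter names and values are hypothetical and not taken from the patent.

      # Generic sketch: a regulation parameter represented as a logistic function of
      # pressure. All values are hypothetical.
      import numpy as np

      def logistic_regulation(pressure, minimum=0.2, maximum=1.0, midpoint=90.0, steepness=0.1):
          """Smoothly vary a regulation parameter between minimum and maximum with pressure (mmHg)."""
          return minimum + (maximum - minimum) / (1.0 + np.exp(-steepness * (pressure - midpoint)))

      for p_mmHg in (60, 90, 120):
          print(p_mmHg, "mmHg ->", round(logistic_regulation(p_mmHg), 3))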

  4. Modeling Complex Calorimeters

    NASA Technical Reports Server (NTRS)

    Figueroa-Feliciano, Enectali

    2004-01-01

    We have developed a software suite that models complex calorimeters in the time and frequency domain. These models can reproduce all measurements that we currently do in a lab setting, like IV curves, impedance measurements, noise measurements, and pulse generation. Since all these measurements are modeled from one set of parameters, we can fully describe a detector and characterize its behavior. This leads to a model that can be used effectively for engineering and design of detectors for particular applications.

  5. Models, Norms and Sharing.

    ERIC Educational Resources Information Center

    Harris, Mary B.

    To investigate the effect of modeling on altruism, 156 third and fifth grade children were exposed to a model who either shared with them, gave to a charity, or refused to share. The test apparatus, identified as a game, consisted of a box with signal lights and a chute through which marbles were dispensed. Subjects and the model played the game…

  6. Boys Town Education Model.

    ERIC Educational Resources Information Center

    Wells, Patricia L.

    This monograph describes the curriculum and teaching methods used to teach socialization skills at the Boys Town (Nebraska) special residential school for boys with behavioral disorders as well as replications of the Boys Town model in other locations. The model takes the basic techniques of the school's Family/Home model and applies them to…

  7. Modern Media Education Models

    ERIC Educational Resources Information Center

    Fedorov, Alexander

    2011-01-01

    The author supposed that media education models can be divided into the following groups: (1) educational-information models (the study of the theory, history, language of media culture, etc.), based on the cultural, aesthetic, semiotic, socio-cultural theories of media education; (2) educational-ethical models (the study of moral, religions,…

  8. Impact-GMI Model

    Energy Science and Technology Software Center (ESTSC)

    2007-03-22

    IMPACT-GMI is an atmospheric chemical transport model designed to run on massively parallel computers. It is designed to model trace pollutants in the atmosphere. It includes models for emission, chemistry and deposition of pollutants. It can be used to assess air quality and its impact on future climate change.

  9. Rock Properties Model

    SciTech Connect

    C. Lum

    2004-09-16

    The purpose of this model report is to document the Rock Properties Model version 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties model provides mean matrix and lithophysae porosity, and the cross-correlated mean bulk density as direct input to the "Saturated Zone Flow and Transport Model Abstraction", MDL-NBS-HS-000021, REV 02 (BSC 2004 [DIRS 170042]). The constraints, caveats, and limitations associated with this model are discussed in Sections 6.6 and 8.2. Model validation accomplished by corroboration with data not cited as direct input is discussed in Section 7. The revision of this model report was performed as part of activities being conducted under the "Technical Work Plan for: The Integrated Site Model, Revision 05" (BSC 2004 [DIRS 169635]). The purpose of this revision is to bring the report up to current procedural requirements and address the Regulatory Integration Team evaluation comments. The work plan describes the scope, objectives, tasks, methodology, and procedures for this process.

  10. Models in Science Education.

    ERIC Educational Resources Information Center

    Marx, George; Toth, Esther

    1981-01-01

    Models are used in everyday life and in school to predict and understand reality. In science classes, models should not be taught as absolute truths but used instead to promote reasoning. Exercises, strategies, and games are suggested to help students better learn about and use models. (DC)

  11. REGULATORY AIR QUALITY MODELS

    EPA Science Inventory

    Appendix W to 40CFR Part 51 (Guideline on Air Quality Models) specifies the models to be used for purposes of permitting, PSD, and SIPs. Through a formal regulatory process this modeling guidance is periodically updated to reflect current science. In the most recent action, thr...

  12. Modeling and Simulation.

    ERIC Educational Resources Information Center

    Root, Gus

    Any effort to change a complex social system should begin by constructing models and simulations of the problem and its social context. These will increase the likelihood of long-term success, decrease the chance of unexpected negative side effects, and help conserve scarce national resources. Models may be of three types: (1) prose models--which…

  13. Biophysical and spectral modeling

    NASA Technical Reports Server (NTRS)

    Goel, N. S. (Principal Investigator)

    1982-01-01

    Activities and results of a project to develop strategies for modeling vegetative canopy reflectance are reported. Specific tasks included the inversion of canopy reflectance models to estimate agronomic variables (particularly leaf area index) from in-situ reflectance measurements, and a study of possible uses of ecological models in analyzing temporal profiles of greenness.

  14. Modeling Natural Selection

    ERIC Educational Resources Information Center

    Bogiages, Christopher A.; Lotter, Christine

    2011-01-01

    In their research, scientists generate, test, and modify scientific models. These models can be shared with others and demonstrate a scientist's understanding of how the natural world works. Similarly, students can generate and modify models to gain a better understanding of the content, process, and nature of science (Kenyon, Schwarz, and Hug…

  15. A Model Performance

    ERIC Educational Resources Information Center

    Thornton, Bradley D.; Smalley, Robert A.

    2008-01-01

    Building information modeling (BIM) uses three-dimensional modeling concepts, information technology and interoperable software to design, construct and operate a facility. However, BIM can be more than a tool for virtual modeling--it can provide schools with a 3-D walkthrough of a project while it still is on the electronic drawing board. BIM can

  16. Surface complexation modeling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Adsorption-desorption reactions are important processes that affect the transport of contaminants in the environment. Surface complexation models are chemical models that can account for the effects of variable chemical conditions, such as pH, on adsorption reactions. These models define specific ...

  17. General Graded Response Model.

    ERIC Educational Resources Information Center

    Samejima, Fumiko

    This paper describes the graded response model. The graded response model represents a family of mathematical models that deal with ordered polytomous categories, such as: (1) letter grading; (2) an attitude survey with "strongly disagree, disagree, agree, and strongly agree" choices; (3) partial credit given in accord with an individual's degree
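
    In its usual logistic form (the notation below is a standard presentation of this family, not wording from the abstract), an item with ordered categories k = 1, ..., m is characterized by boundary curves and category response probabilities

        \[ P_k^{*}(\theta) = \frac{1}{1 + e^{-a(\theta - b_k)}}, \qquad
           P_k(\theta) = P_k^{*}(\theta) - P_{k+1}^{*}(\theta), \]

    with the conventions \( P_1^{*}(\theta) = 1 \) and \( P_{m+1}^{*}(\theta) = 0 \), where \( a \) is the item discrimination and the \( b_k \) are the ordered category boundary parameters.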

  18. Volition Support Design Model

    ERIC Educational Resources Information Center

    Kim, ChanMin

    2013-01-01

    The purpose of this paper is to introduce a design model for supporting student volition. First, the construct of volition is explained and the importance of volition is further described in the context of goal attainment. Next, the theoretical basis of the model is described. Last, implications of the model are discussed for the design of…

  19. Modelling a Suspension Bridge.

    ERIC Educational Resources Information Center

    Rawlins, Phil

    1991-01-01

    The quadratic function can be modeled in real life by a suspension bridge that supports a uniform weight. This activity uses concrete models and computer generated graphs to discover the mathematical model of the shape of the main cable of a suspension bridge. (MDH)
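
    The mathematical model the activity aims at is the textbook idealization of a cable carrying a uniform load per unit horizontal distance (the symbols below are assumptions used for illustration): with load density \( w \) and constant horizontal tension \( H \), equilibrium gives \( H\,y''(x) = w \), so the main cable hangs in the parabola

        \[ y(x) = \frac{w}{2H}\,x^{2} + y_0 , \]

    a quadratic function whose coefficient can be estimated by fitting measured cable heights against horizontal position.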

  20. SECOND GENERATION MODEL

    EPA Science Inventory

    One of the environmental and economic models that the U.S. EPA uses to assess climate change policies is the Second Generation Model (SGM). SGM is a 13 region, 24 sector computable general equilibrium (CGE) model of the world that can be used to estimate the domestic and intern...

  2. Developing Structural Equation Models.

    ERIC Educational Resources Information Center

    Schumacker, Randall E.

    1993-01-01

    Structural equation models merge multiple regression, path analysis, and factor analysis techniques into a single data analytic framework. Measurement models are developed to define latent variables, and structural equations are then established among the latent variables. Explains the development of these models. (KS)
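
    In the common LISREL-style notation (assumed here as an illustration of the description above, not taken from the digest itself), the measurement models and the structural equations read

        \[ x = \Lambda_x \xi + \delta, \qquad y = \Lambda_y \eta + \varepsilon, \qquad
           \eta = B\eta + \Gamma\xi + \zeta, \]

    where \( \xi \) and \( \eta \) are the exogenous and endogenous latent variables defined by the measurement models (loadings \( \Lambda_x \), \( \Lambda_y \)), and \( B \) and \( \Gamma \) hold the structural coefficients estimated among the latent variables.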

  4. The Synergistic Evaluation Model.

    ERIC Educational Resources Information Center

    Hunter, Michael G.; Schooley, Daniel E.

    This paper presented the concepts underlying an evaluation model which has been developed and used for several years. The model emphasizes the synergism necessary for optimal evaluation strategies. The model is divided into four domains: (1) policy, (2) program development, (3) instruction, and (4) feedback. It is synergistic in that (1) its…

  5. Models for Products

    ERIC Educational Resources Information Center

    Speiser, Bob; Walter, Chuck

    2011-01-01

    This paper explores how models can support productive thinking. For us a model is a "thing", a tool to help make sense of something. We restrict attention to specific models for whole-number multiplication, hence the wording of the title. They support evolving thinking in large measure through the ways their users redesign them. They assume new…

  6. Progress in mix modeling

    SciTech Connect

    Harrison, A.K.

    1997-03-14

    We have identified the Cranfill multifluid turbulence model (Cranfill, 1992) as a starting point for development of subgrid models of instability, turbulent and mixing processes. We have differenced the closed system of equations in conservation form, and coded them in the object-oriented hydrodynamics code FLAG, which is to be used as a testbed for such models.

  7. Model Breaking Points Conceptualized

    ERIC Educational Resources Information Center

    Vig, Rozy; Murray, Eileen; Star, Jon R.

    2014-01-01

    Current curriculum initiatives (e.g., National Governors Association Center for Best Practices and Council of Chief State School Officers 2010) advocate that models be used in the mathematics classroom. However, despite their apparent promise, there comes a point when models break, a point in the mathematical problem space where the model cannot,…

  8. JB6 Mouse Model

    Cancer.gov

    The mouse Balb/C JB6 model (1) is the only well-characterized model of genetic variants for a neoplastic transformation response to tumor promoters. These cells are not differentially sensitive to tumor promoter-induced mitogenesis or diff…

  9. A Model Chemistry Class.

    ERIC Educational Resources Information Center

    Summerlin, Lee; Borgford, Christie

    1989-01-01

    Described is an activity which uses a 96-well reaction plate and soda straws to construct a model of the periodic table of the elements. The model illustrates the ionization energies of the various elements. Construction of the model and related concepts are discussed. (CW)

  10. A GENERAL CROP MODEL

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Agricultural and ecosystem simulation models valuable for technology transfer require a realistic, process-oriented plant model that can be easily applied to different crops, grasses, and woody species. The objective of this chapter was to describe a general plant model that can be easily applied i...

  11. Modeling and Remodeling Writing

    ERIC Educational Resources Information Center

    Hayes, John R.

    2012-01-01

    In Section 1 of this article, the author discusses the succession of models of adult writing that he and his colleagues have proposed from 1980 to the present. He notes the most important changes that differentiate earlier and later models and discusses reasons for the changes. In Section 2, he describes his recent efforts to model young…

  12. Models and Indicators.

    ERIC Educational Resources Information Center

    Land, Kenneth C.

    2001-01-01

    Examines the definition, construction, and interpretation of social indicators. Shows how standard classes of formalisms used to construct models in contemporary sociology are derived from the general theory of models. Reviews recent model building and evaluation related to active life expectancy among the elderly, fertility rates, and indicators…

  13. Generalized Latent Trait Models.

    ERIC Educational Resources Information Center

    Moustaki, Irini; Knott, Martin

    2000-01-01

    Discusses a general model framework within which manifest variables with different distributions in the exponential family can be analyzed with a latent trait model. Presents a unified maximum likelihood method for estimating the parameters of the generalized latent trait model and discusses the scoring of individuals on the latent dimensions.…

  15. Model Rockets and Microchips.

    ERIC Educational Resources Information Center

    Fitzsimmons, Charles P.

    1986-01-01

    Points out the instructional applications and program possibilities of a unit on model rocketry. Describes the ways that microcomputers can assist in model rocket design and in problem calculations. Provides a descriptive listing of model rocket software for the Apple II microcomputer. (ML)

  16. Groves model accuracy study

    NASA Astrophysics Data System (ADS)

    Peterson, Matthew C.

    1991-08-01

    The United States Air Force Environmental Technical Applications Center (USAFETAC) was tasked to review the scientific literature for studies of the Groves Neutral Density Climatology Model and compare the Groves Model with others in the 30-60 km range. The tasking included a request to investigate the merits of comparing accuracy of the Groves Model to rocketsonde data. USAFETAC analysts found the Groves Model to be state of the art for middle-atmospheric climatological models. In reviewing previous comparisons with other models and with space shuttle-derived atmospheric densities, good density vs altitude agreement was found in almost all cases. A simple technique involving comparison of the model with range reference atmospheres was found to be the most economical way to compare the Groves Model with rocketsonde data; an example of this type is provided. The Groves 85 Model is used routinely in USAFETAC's Improved Point Analysis Model (IPAM). To create this model, Dr. Gerald Vann Groves produced tabulations of atmospheric density based on data derived from satellite observations and modified by rocketsonde observations. Neutral Density as presented here refers to the monthly mean density in 10-degree latitude bands as a function of altitude. The Groves 85 Model zonal mean density tabulations are given in their entirety.

  17. General modeling methods

    NASA Astrophysics Data System (ADS)

    Pinson, Larry D.

    The benefits of structural dynamics modeling methods in aerospace structures are reviewed. Four major issues in structural dynamics modeling are discussed which encompass most of its subdisciplines: reduced order modeling, constraints in problems with large motions, computational strategies, and fundamental methods. Directions for future research in these areas are addressed.

  18. Crushed Salt Constitutive Model

    SciTech Connect

    Callahan, G.D.

    1999-02-01

    The constitutive model used to describe the deformation of crushed salt is presented in this report. Two mechanisms -- dislocation creep and grain boundary diffusional pressure solution -- are combined to form the basis for the constitutive model governing the deformation of crushed salt. The constitutive model is generalized to represent three-dimensional states of stress. Upon complete consolidation, the crushed-salt model reproduces the Multimechanism Deformation (M-D) model typically used for the Waste Isolation Pilot Plant (WIPP) host geological formation salt. New shear consolidation tests are combined with an existing database that includes hydrostatic consolidation and shear consolidation tests conducted on WIPP and southeastern New Mexico salt. Nonlinear least-squares model fitting to the database produced two sets of material parameter values for the model -- one for the shear consolidation tests and one for a combination of the shear and hydrostatic consolidation tests. Using the parameter values determined from the fitted database, the constitutive model is validated against constant strain-rate tests. Shaft seal problems are analyzed to demonstrate model-predicted consolidation of the shaft seal crushed-salt component. Based on the fitting statistics, the ability of the model to predict the test data, and the ability of the model to predict load paths and test data outside of the fitted database, the model appears to capture the creep consolidation behavior of crushed salt reasonably well.

  19. Modeling Climate Dynamically

    ERIC Educational Resources Information Center

    Walsh, Jim; McGehee, Richard

    2013-01-01

    A dynamical systems approach to energy balance models of climate is presented, focusing on low order, or conceptual, models. Included are global average and latitude-dependent, surface temperature models. The development and analysis of the differential equations and corresponding bifurcation diagrams provides a host of appropriate material for…
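
    A minimal sketch of the kind of low-order, global-average energy balance model discussed in the article; the functional form is a standard Budyko-type balance and the parameter values are illustrative assumptions, not numbers from the paper:

        import numpy as np
        from scipy.optimize import brentq

        # Global-average energy balance: C dT/dt = Q*(1 - albedo(T)) - (A + B*T)
        Q = 343.0           # mean incoming solar radiation, W/m^2 (illustrative)
        A, B = 202.0, 1.9   # outgoing longwave parameters (illustrative)

        def albedo(T):
            # smooth transition between ice-covered (0.62) and ice-free (0.32) albedo
            return 0.32 + (0.62 - 0.32) * 0.5 * (1.0 - np.tanh(T / 5.0))

        def dTdt(T):
            return Q * (1.0 - albedo(T)) - (A + B * T)

        # equilibria are the sign changes of dT/dt on a temperature grid
        Ts = np.linspace(-60.0, 60.0, 1201)
        f = dTdt(Ts)
        equilibria = [brentq(dTdt, Ts[i], Ts[i + 1])
                      for i in range(len(Ts) - 1) if f[i] * f[i + 1] < 0]
        print(equilibria)   # a cold, an intermediate (unstable), and a warm state

    Sweeping Q and tracking how these equilibria appear and disappear is the kind of analysis that produces the bifurcation diagrams mentioned above.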

  20. Retrofitted supersymmetric models

    NASA Astrophysics Data System (ADS)

    Bose, Manatosh

    This thesis explores several models of metastable dynamic supersymmetry breaking (MDSB) and a supersymmetric model of hybrid inflation. All of these models possess discrete R-symmetries. We focus in particular on retrofitted models of supersymmetry breaking. We first construct retrofitted models of gravity mediation. In these models we explore the genericity of the so-called "split supersymmetry." We show that with the simplest models, where the goldstino multiplet is neutral under the discrete R-symmetry, a split spectrum is not generic. However, if the goldstino superfield is charged under some symmetry other than the R-symmetry, then a split spectrum is achievable but not generic. We also present a gravity-mediated model where the fine-tuning of the Z-boson mass is dictated by a discrete choice rather than a continuous tuning. Then we construct retrofitted models of gauge-mediated SUSY breaking. We show that, in these models, if the approximate R-symmetry of the theory is spontaneously broken, the messenger scale is fixed; if explicitly broken by retrofitted couplings, a very small dimensionless number is required; if supergravity corrections are responsible for the symmetry breaking, at least two moderately small couplings are required, and there is a large range of possible messenger scales. Finally we switch our attention to small-field hybrid inflation. We construct a model that yields a spectral index ns = 0.96. Here, we also briefly discuss the possibility of relating the scale of inflation with the dynamics responsible for supersymmetry breaking.

  2. Global Timber Model (GTM)

    EPA Science Inventory

    GTM is an economic model capable of examining global forestry land-use, management, and trade responses to policies. In responding to a policy, the model captures afforestation, forest management, and avoided deforestation behavior. The model estimates harvests in industrial fore...

  3. Modeling EERE Deployment Programs

    SciTech Connect

    Cort, K. A.; Hostick, D. J.; Belzer, D. B.; Livingston, O. V.

    2007-11-01

    This report compiles information and conclusions gathered as part of the “Modeling EERE Deployment Programs” project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge in which future research is needed.

  4. Molecular Models in Biology

    ERIC Educational Resources Information Center

    Goodman, Richard E.

    1970-01-01

    Describes types of molecular models (ball-and-stick, framework, and space-filling) and evaluates commercially available kits. Gives instructions for constructive models from polystyrene balls and pipe-cleaners. Models are useful for class demonstrations although not sufficiently accurate for research use. Illustrations show biologically important…

  5. Modeling agriculture in the Community Land Model

    NASA Astrophysics Data System (ADS)

    Drewniak, B.; Song, J.; Prell, J.; Kotamarthi, V. R.; Jacob, R.

    2013-04-01

    The potential impact of climate change on agriculture is uncertain. In addition, agriculture could influence above- and below-ground carbon storage. Development of models that represent agriculture is necessary to address these impacts. We have developed an approach to integrate agriculture representations for three crop types - maize, soybean, and spring wheat - into the coupled carbon-nitrogen version of the Community Land Model (CLM), to help address these questions. Here we present the new model, CLM-Crop, validated against observations from two AmeriFlux sites in the United States, planted with maize and soybean. Seasonal carbon fluxes compared well with field measurements for soybean, but not as well for maize. CLM-Crop yields were comparable with observations in countries such as the United States, Argentina, and China, although the generality of the crop model and its lack of technology and irrigation made direct comparison difficult. CLM-Crop was compared against the standard CLM3.5, which simulates crops as grass. The comparison showed improvement in gross primary productivity in regions where crops are the dominant vegetation cover. Crop yields and productivity were negatively correlated with temperature and positively correlated with precipitation, in agreement with other modeling studies. In case studies with the new crop model looking at impacts of residue management and planting date on crop yield, we found that increased residue returned to the litter pool increased crop yield, while reduced residue returns resulted in yield decreases. Using climate controls to signal planting date caused different responses in different crops. Maize and soybean had opposite reactions: when low temperature threshold resulted in early planting, maize responded with a loss of yield, but soybean yields increased. Our improvements in CLM demonstrate a new capability in the model - simulating agriculture in a realistic way, complete with fertilizer and residue management practices. Results are encouraging, with improved representation of human influences on the land surface and the potentially resulting climate impacts.

  6. CRAC2 model description

    SciTech Connect

    Ritchie, L.T.; Alpert, D.J.; Burke, R.P.; Johnson, J.D.; Ostmeyer, R.M.; Aldrich, D.C.; Blond, R.M.

    1984-03-01

    The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions.

  7. Laser range camera modeling

    NASA Astrophysics Data System (ADS)

    Storjohann, Kai

    1990-04-01

    An imaging model is described that was derived for use with a laser range camera (LRC) developed by the Advanced Intelligent Machines Division of Odetics. However, this model could be applied to any comparable imaging system. Both the derivation of the model and the determination of the LRC's intrinsic parameters are explained. For the purpose of evaluating the LRC's external orientation, a transformation of the LRC's imaging model into a standard camera's (SC) pinhole model is derived. By virtue of this transformation, the LRC's external orientation can be found by applying any SC calibration technique.
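
    The target of such a transformation is the standard pinhole projection; in the usual notation (assumed here, not specific to the Odetics camera),

        \[ u = f_x \frac{X}{Z} + c_x, \qquad v = f_y \frac{Y}{Z} + c_y, \]

    where \( (X, Y, Z) \) is a point in camera coordinates, \( (f_x, f_y) \) are the focal lengths in pixels, and \( (c_x, c_y) \) is the principal point. Once the range data are recast in this form, any standard camera calibration technique can recover the exterior orientation, which is the point of the transformation described above.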

  8. Mathematical models for rabies.

    PubMed

    Panjeti, Vijay G; Real, Leslie A

    2011-01-01

    Rabies virus and its associated host-pathogen population dynamics have proven a remarkable model system for developing mathematical models of infectious disease emergence and spread. Beginning with simple susceptible-infectious-removed (SIR) compartment models of fox rabies emergence and spread across Western Europe, mathematical models have now been developed to incorporate dynamics across heterogeneous landscapes, host demographic variation, and environmental stochasticity. Model structures range from systems of ordinary differential equations (ODEs) to stochastic agent-based computational simulations. We have reviewed the variety of mathematical approaches now available for analyzing dynamics in different host populations, most notably rabies virus spread in raccoon hosts. PMID:21601056
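
    A minimal sketch of the simple SIR compartment model that this line of work starts from; the rate constants are placeholders for illustration, not values from the review:

        import numpy as np
        from scipy.integrate import solve_ivp

        beta = 5.0e-4   # transmission rate (placeholder)
        alpha = 0.3     # rabies-induced mortality rate (placeholder)
        mu = 0.05       # background mortality rate (placeholder)

        def sir(t, y):
            S, I, R = y                       # susceptible, infectious, removed
            dS = -beta * S * I - mu * S
            dI = beta * S * I - (alpha + mu) * I
            dR = alpha * I                    # removal by rabies mortality
            return [dS, dI, dR]

        sol = solve_ivp(sir, (0.0, 200.0), [1000.0, 1.0, 0.0])
        print(sol.y[:, -1])                   # final compartment sizes

    The landscape-heterogeneity and stochastic agent-based extensions mentioned above replace these deterministic rates with spatially varying or random analogues.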

  9. Pediatric Computational Models

    NASA Astrophysics Data System (ADS)

    Soni, Bharat K.; Kim, Jong-Eun; Ito, Yasushi; Wagner, Christina D.; Yang, King-Hay

    A computational model is a computer program that attempts to simulate a behavior of a complex system by solving mathematical equations associated with principles and laws of physics. Computational models can be used to predict the body's response to injury-producing conditions that cannot be simulated experimentally or measured in surrogate/animal experiments. Computational modeling also provides means by which valid experimental animal and cadaveric data can be extrapolated to a living person. Widely used computational models for injury biomechanics include multibody dynamics and finite element (FE) models. Both multibody and FE methods have been used extensively to study adult impact biomechanics in the past couple of decades.

  10. Complex matrix model duality

    SciTech Connect

    Brown, T. W.

    2011-04-15

    The same complex matrix model calculates both tachyon scattering for the c=1 noncritical string at the self-dual radius and certain correlation functions of operators which preserve half the supersymmetry in N=4 super-Yang-Mills theory. It is dual to another complex matrix model where the couplings of the first model are encoded in the Kontsevich-like variables of the second. The duality between the theories is mirrored by the duality of their Feynman diagrams. Analogously to the Hermitian Kontsevich-Penner model, the correlation functions of the second model can be written as sums over discrete points in subspaces of the moduli space of punctured Riemann surfaces.

  11. Adaptive background model

    NASA Astrophysics Data System (ADS)

    Lu, Xiaochun; Xiao, Yijun; Chai, Zhi; Wang, Bangping

    2007-11-01

    An adaptive background model aimed at outdoor vehicle detection is presented in this paper. The model is an improvement of PICA (pixel intensity classification algorithm): it classifies pixels into K distributions by color similarity, and then the hypothesis that the background pixel color appears in the image sequence with high frequency is used to evaluate the distributions and determine which one represents the current background color. As experiments show, the model presented in this paper is robust, adaptive, and flexible, and can deal with situations such as camera motion and lighting changes.
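
    A minimal sketch of the idea described above, i.e. keeping a few color clusters per pixel and treating the most frequently observed cluster as the background; the data structures and thresholds are assumptions for illustration, not the PICA algorithm itself:

        import numpy as np

        K = 3        # clusters kept per pixel (assumption)
        TOL = 20.0   # color distance below which a frame matches a cluster (assumption)

        def estimate_background(frames):
            """frames: array (N, H, W, 3); returns a per-pixel background estimate (H, W, 3)."""
            n, h, w, _ = frames.shape
            centers = np.zeros((h, w, K, 3))   # cluster mean colors
            counts = np.zeros((h, w, K))       # observation count per cluster
            rows, cols = np.arange(h)[:, None], np.arange(w)[None, :]
            for f in frames.astype(float):
                dist = np.linalg.norm(centers - f[:, :, None, :], axis=-1)      # (H, W, K)
                best = dist.argmin(axis=-1)
                matched = np.take_along_axis(dist, best[..., None], -1)[..., 0] < TOL
                weakest = counts.argmin(axis=-1)
                idx = np.where(matched, best, weakest)   # matched cluster, or replace weakest
                counts[rows, cols, idx] = np.where(matched, counts[rows, cols, idx] + 1, 1)
                old = centers[rows, cols, idx]
                new = old + (f - old) / counts[rows, cols, idx][..., None]       # running mean
                centers[rows, cols, idx] = np.where(matched[..., None], new, f)
            # background color = center of the most frequently observed cluster
            top = counts.argmax(-1)[..., None, None]
            return np.take_along_axis(centers, top, 2)[:, :, 0, :]

    A pixel in a new frame is then labeled foreground (e.g. a vehicle) when its color is far from this per-pixel background estimate.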

  12. UZ Colloid Transport Model

    SciTech Connect

    M. McGraw

    2000-04-13

    The UZ Colloid Transport model development plan states that the objective of this Analysis/Model Report (AMR) is to document the development of a model for simulating unsaturated colloid transport. This objective includes the following: (1) use a process-level model to evaluate the potential mechanisms for colloid transport at Yucca Mountain; (2) provide ranges of parameters for significant colloid transport processes to Performance Assessment (PA) for the unsaturated zone (UZ); and (3) provide a basis for development of an abstracted model for use in PA calculations.

  13. The FREZCHEM Model

    NASA Astrophysics Data System (ADS)

    Marion, Giles M.; Kargel, Jeffrey S.

    Implementation of the Pitzer approach is through the FREZCHEM (FREEZING CHEMISTRY) model, which is at the core of this work. This model was originally designed to simulate salt chemistries and freezing processes at low temperatures (-54 to 25°C) and 1 atm pressure. Over the years, this model has been broadened to include more chemistries (from 16 to 58 solid phases), a broader temperature range for some chemistries (to 113°C), and incorporation of a pressure dependence (1 to 1000 bars) into the model. Implementation, parameterization, validation, and limitations of the FREZCHEM model are extensively discussed in Chapter 3.

  14. Models of Goldstone gauginos

    NASA Astrophysics Data System (ADS)

    Alves, Daniele S. M.; Galloway, Jamison; McCullough, Matthew; Weiner, Neal

    2016-04-01

    Models with Dirac gauginos are appealing scenarios for physics beyond the Standard Model. They have smaller radiative corrections to scalar soft masses, a suppression of certain supersymmetry (SUSY) production processes at the LHC, and ameliorated flavor constraints. Unfortunately, they are generically plagued by tachyons charged under the Standard Model, and attempts to eliminate such states typically spoil the positive features. The recently proposed "Goldstone gaugino" mechanism provides a simple realization of Dirac gauginos that is automatically free of dangerous tachyonic states. We provide details on this mechanism and explore models for its origin. In particular, we find SUSY QCD models that realize this idea simply and discuss scenarios for unification.

  15. Models of Abnormal Scarring

    PubMed Central

    Seo, Bommie F.; Lee, Jun Yong; Jung, Sung-No

    2013-01-01

    Keloids and hypertrophic scars are thick, raised dermal scars, caused by derailing of the normal scarring process. Extensive research on such abnormal scarring has been done; however, these being refractory disorders specific to humans, it has been difficult to establish a universal animal model. A wide variety of animal models have been used. These include the athymic mouse, rats, rabbits, and pigs. Although these models have provided valuable insight into abnormal scarring, there is currently still no ideal model. This paper reviews the models that have been developed. PMID:24078916

  16. Surrogate waveform models

    NASA Astrophysics Data System (ADS)

    Blackman, Jonathan; Field, Scott; Galley, Chad; Scheel, Mark; Szilagyi, Bela; Tiglio, Manuel

    2015-04-01

    With the advanced detector era just around the corner, there is a strong need for fast and accurate models of gravitational waveforms from compact binary coalescence. Fast surrogate models can be built out of an accurate but slow waveform model with minimal to no loss in accuracy, but may require a large number of evaluations of the underlying model. This may be prohibitively expensive if the underlying is extremely slow, for example if we wish to build a surrogate for numerical relativity. We examine alternate choices to building surrogate models which allow for a more sparse set of input waveforms. Research supported in part by NSERC.
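
    A minimal sketch of one common way to build such a surrogate: extract a linear basis from a set of training waveforms and fit the basis coefficients cheaply across the parameter space. The toy waveform family and fitting choices below are illustrative assumptions, not the models discussed in the abstract:

        import numpy as np

        # Toy training set: waveforms h(t; q) for a 1-D parameter q, standing in
        # for an accurate but expensive underlying waveform model.
        t = np.linspace(0.0, 10.0, 500)
        q_train = np.linspace(1.0, 2.0, 20)
        H = np.array([np.sin(q * t) * np.exp(-0.1 * q * t) for q in q_train])

        # 1) Reduced basis from an SVD, truncated at a target relative accuracy.
        U, s, Vt = np.linalg.svd(H, full_matrices=False)
        keep = np.cumsum(s**2) / np.sum(s**2) < 1.0 - 1e-10
        basis = Vt[: keep.sum() + 1]            # orthonormal basis vectors

        # 2) Project training waveforms and fit each coefficient as a polynomial in q.
        coeffs = H @ basis.T
        fits = [np.polyfit(q_train, coeffs[:, j], deg=8) for j in range(basis.shape[0])]

        def surrogate(q):
            """Fast approximate waveform at parameter value q."""
            c = np.array([np.polyval(p, q) for p in fits])
            return c @ basis

        q_test = 1.37
        exact = np.sin(q_test * t) * np.exp(-0.1 * q_test * t)
        print(np.max(np.abs(surrogate(q_test) - exact)))   # surrogate error

    Allowing a sparser set of input waveforms, as proposed above, amounts to choosing the training parameters and the coefficient fit more carefully than this brute-force version does.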

  17. TEAMS Model Analyzer

    NASA Technical Reports Server (NTRS)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time spent in the manual process that each TEAMS modeler must perform in the preparation of reporting for model reviews, a new tool has been developed as an aid to models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively model in a hierarchical tree outline view that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model, and generate an input/output report pertaining to all of the components. Rules can be automatically validated against the model, with a report generated containing resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  18. Modelling Farm Animal Welfare

    PubMed Central

    Collins, Lisa M.; Part, Chérie E.

    2013-01-01

    Simple Summary In this review paper we discuss the different modeling techniques that have been used in animal welfare research to date. We look at what questions they have been used to answer, the advantages and pitfalls of the methods, and how future research can best use these approaches to answer some of the most important upcoming questions in farm animal welfare. Abstract The use of models in the life sciences has greatly expanded in scope and advanced in technique in recent decades. However, the range, type and complexity of models used in farm animal welfare is comparatively poor, despite the great scope for use of modeling in this field of research. In this paper, we review the different modeling approaches used in farm animal welfare science to date, discussing the types of questions they have been used to answer, the merits and problems associated with the method, and possible future applications of each technique. We find that the most frequently published types of model used in farm animal welfare are conceptual and assessment models; two types of model that are frequently (though not exclusively) based on expert opinion. Simulation, optimization, scenario, and systems modeling approaches are rarer in animal welfare, despite being commonly used in other related fields. Finally, common issues such as a lack of quantitative data to parameterize models, and model selection and validation are discussed throughout the review, with possible solutions and alternative approaches suggested. PMID:26487411

  19. A model of strength

    USGS Publications Warehouse

    Johnson, Douglas H.; Cook, R.D.

    2013-01-01

    In her AAAS News & Notes piece "Can the Southwest manage its thirst?" (26 July, p. 362), K. Wren quotes Ajay Kalra, who advocates a particular method for predicting Colorado River streamflow "because it eschews complex physical climate models for a statistical data-driven modeling approach." A preference for data-driven models may be appropriate in this individual situation, but it is not so in general. Data-driven models often come with a warning against extrapolating beyond the range of the data used to develop the models. When the future is like the past, data-driven models can work well for prediction, but it is easy to over-model local or transient phenomena, often leading to predictive inaccuracy (1). Mechanistic models are built on established knowledge of the process that connects the response variables with the predictors, using information obtained outside of an extant data set. One may shy away from a mechanistic approach when the underlying process is judged to be too complicated, but good predictive models can be constructed with statistical components that account for ingredients missing in the mechanistic analysis. Models with sound mechanistic components are more generally applicable and robust than data-driven models.

  20. Coalbed methane modeling analysis

    SciTech Connect

    Covatch, G.L.; Layne, A.W.; Salamy, S.P.

    1985-12-01

    Systems analyses of the Department of Energy's (DOE) Coalbed Methane Project (CMP) were performed at the Morgantown Energy Technology Center (METC). In the analyses, both reservoir and stimulation models were evaluated using data from US Steel's Oak Grove Coal Degasification Field. In the first part of the study, two reservoir models designed for predicting methane and water production from coalbeds, WELL2D and ARRAY, were evaluated. WELL2D is a two-dimensional, single-well, radial flow model; ARRAY is a two-dimensional, multiwell production model. In the evaluation, the models were used to history match the actual production of the individual wells. The resultant information was then factored into a full-field simulation of the Oak Grove Field. This report summarizes the technical approaches used in the two models and their installation on the DOE/METC computer system, and gives the results from their evaluation. In the second part of the study, three stimulation models were evaluated to determine their applicability to the CMP. The stimulation models, OSUFRAC (generalized hydraulic fracture), ORUFRAC1 (stress contrast hydraulic fracture model), and TUFRAC (hydraulic fracture proppant placement model), were designed for hydraulic fracturing of homogeneous reservoirs. A summary of the technical approach used in each model and the results of the analyses are presented. 11 refs., 27 figs., 12 tabs.

  1. Animal models of atherosclerosis

    PubMed Central

    Kapourchali, Fatemeh Ramezani; Surendiran, Gangadaran; Chen, Li; Uitz, Elisabeth; Bahadori, Babak; Moghadasian, Mohammed H

    2014-01-01

    In this mini-review several commonly used animal models of atherosclerosis have been discussed. Among them, emphasis has been placed on mice, rabbits, pigs and non-human primates. Although these animal models have played a significant role in our understanding of the induction of atherosclerotic lesions, we still lack a reliable animal model for regression of the disease. Researchers have reported several genetically modified and transgenic animal models that replicate human atherosclerosis; however, each of the current animal models has some limitations. Among these animal models, the apolipoprotein (apo) E-knockout (KO) mice have been used extensively because they develop spontaneous atherosclerosis. Furthermore, depending on experimental design, atherosclerotic lesions developed in this model may resemble humans' stable and unstable atherosclerotic lesions. This mouse model of hypercholesterolemia and atherosclerosis has also been used to investigate the impact of oxidative stress and inflammation on atherogenesis. Low density lipoprotein (LDL)-r-KO mice are a model of human familial hypercholesterolemia. However, unlike apo E-KO mice, the LDL-r-KO mice do not develop spontaneous atherosclerosis. Both apo E-KO and LDL-r-KO mice have been employed to generate other relevant mouse models of cardiovascular disease through breeding strategies. In addition to mice, rabbits have been used extensively, particularly to understand the mechanisms of cholesterol-induced atherosclerosis. The present review paper details the characteristics of animal models that are used in atherosclerosis research. PMID:24868511

  2. Animal models of atherosclerosis.

    PubMed

    Kapourchali, Fatemeh Ramezani; Surendiran, Gangadaran; Chen, Li; Uitz, Elisabeth; Bahadori, Babak; Moghadasian, Mohammed H

    2014-05-16

    In this mini-review several commonly used animal models of atherosclerosis have been discussed. Among them, emphasis has been placed on mice, rabbits, pigs and non-human primates. Although these animal models have played a significant role in our understanding of the induction of atherosclerotic lesions, we still lack a reliable animal model for regression of the disease. Researchers have reported several genetically modified and transgenic animal models that replicate human atherosclerosis; however, each of the current animal models has some limitations. Among these animal models, the apolipoprotein (apo) E-knockout (KO) mice have been used extensively because they develop spontaneous atherosclerosis. Furthermore, depending on experimental design, atherosclerotic lesions developed in this model may resemble humans' stable and unstable atherosclerotic lesions. This mouse model of hypercholesterolemia and atherosclerosis has also been used to investigate the impact of oxidative stress and inflammation on atherogenesis. Low density lipoprotein (LDL)-r-KO mice are a model of human familial hypercholesterolemia. However, unlike apo E-KO mice, the LDL-r-KO mice do not develop spontaneous atherosclerosis. Both apo E-KO and LDL-r-KO mice have been employed to generate other relevant mouse models of cardiovascular disease through breeding strategies. In addition to mice, rabbits have been used extensively, particularly to understand the mechanisms of cholesterol-induced atherosclerosis. The present review paper details the characteristics of animal models that are used in atherosclerosis research. PMID:24868511

  3. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses an homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  4. Modeling the dentate gyrus.

    PubMed

    Morgan, Robert J; Santhakumar, Vijayalakshmi; Soltesz, Ivan

    2007-01-01

    Computational modeling has become an increasingly useful tool for studying complex neuronal circuits such as the dentate gyrus. In order to effectively apply computational techniques and theories to answer pressing biological questions, however, it is necessary to develop detailed, data-driven models. Development of such models is a complicated process, akin to putting together a jigsaw puzzle with the pieces being such things as cell types, cell numbers, and specific connectivity. This chapter provides a walkthrough for the development of a very large-scale, biophysically realistic model of the dentate gyrus. Subsequently, it demonstrates the utility of a modeling approach in asking and answering questions about both healthy and pathological states involving the modeled brain region. Finally, this chapter discusses some predictions that come directly from the model that can be tested in future experimental approaches. PMID:17765743

  5. Titan atmospheric models intercomparison

    NASA Astrophysics Data System (ADS)

    Pernot, P.

    2008-09-01

    Several groups around the world have independently developed models of the photochemistry of Titan. The Cassini mission reveals daily that the chemical complexity is beyond our expectations (e.g., the observation of heavy positive and negative ions), and the models are updated accordingly. At this stage, there is no consensus on the various input parameters, and it becomes increasingly difficult to compare outputs from different models. An ISSI team of experts on those models will be gathered shortly to carry out an intercomparison, i.e., to assess how the models behave given identical sets of inputs (collectively defined). Expected discrepancies will have to be elucidated and reduced. This intercomparison will also be an occasion to estimate explicitly the importance of various physical-chemical processes on model predictions versus observations. More robust and validated models are expected from this study for the interpretation of Titan-related data.

  6. Multiscale Modeling: A Review

    NASA Astrophysics Data System (ADS)

    Horstemeyer, M. F.

    This review of multiscale modeling covers a brief history of various multiscale methodologies related to solid materials and the associated experimental influences, the various influence of multiscale modeling on different disciplines, and some examples of multiscale modeling in the design of structural components. Although computational multiscale modeling methodologies have been developed in the late twentieth century, the fundamental notions of multiscale modeling have been around since da Vinci studied different sizes of ropes. The recent rapid growth in multiscale modeling is the result of the confluence of parallel computing power, experimental capabilities to characterize structure-property relations down to the atomic level, and theories that admit multiple length scales. The ubiquitous research that focus on multiscale modeling has broached different disciplines (solid mechanics, fluid mechanics, materials science, physics, mathematics, biological, and chemistry), different regions of the world (most continents), and different length scales (from atoms to autos).

  7. [Health care financing models].

    PubMed

    Giermaziak, Wojciech; Kamiński, Jarosław

    2012-12-01

    The functioning of health care depends mainly on the level and method of financing, and countries choose between different models. Bismarck's model is financed mainly by contributions that are obligatory for employers and employees, with decentralized management and contracting of services. Beveridge's model is financed mostly from government taxes; it allows patients to contribute to the cost of benefits and permits participation by the private sector. The residual model is based on optional, private health insurance, supplemented only by a National Health Service. Siemaszko's model assumes that benefits are financed from the state budget, with permanent state control and equal access to all benefits for citizens. The choice of a specific financing model has consequences for all of the system's participants. The purpose of this article is to introduce the subject of health care financing based on the literature and the authors' own thoughts. PMID:23437697

  8. Fuel delivery system model

    SciTech Connect

    Ricci, G.; Verma, A.

    1996-09-01

    A fuel delivery system hydraulic model has been developed by coupling a distributed hydraulic network model with lumped models for the various components of the fuel system, such as the injectors, regulators, and accumulators. The resulting governing equations are linearized around the nominal system pressure and integrated using a fourth-order Runge-Kutta algorithm with a variable time-stepping scheme. The model assumes isothermal behavior, negligible frictional losses, and single-phase flow. The goal of the model is to study small-signal perturbations around the operating system pressure. Typical outputs from exercising the model are presented. The model can be used to study fuel pressure and velocity transients throughout the system and to design the various fuel system components in a system context.
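
    A minimal sketch of the integration scheme described: a classical fourth-order Runge-Kutta step wrapped in a simple step-doubling control for variable time steps. The two-state linearized test system is an illustrative assumption, not the actual fuel system model:

        import numpy as np

        def rk4_step(f, t, y, h):
            """One classical fourth-order Runge-Kutta step."""
            k1 = f(t, y)
            k2 = f(t + 0.5 * h, y + 0.5 * h * k1)
            k3 = f(t + 0.5 * h, y + 0.5 * h * k2)
            k4 = f(t + h, y + h * k3)
            return y + (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

        def integrate(f, t0, y0, t_end, h=1e-4, tol=1e-8):
            """RK4 with step doubling: compare one full step with two half steps."""
            t, y = t0, np.asarray(y0, dtype=float)
            while t < t_end:
                h = min(h, t_end - t)
                y_full = rk4_step(f, t, y, h)
                y_half = rk4_step(f, t + 0.5 * h, rk4_step(f, t, y, 0.5 * h), 0.5 * h)
                if np.max(np.abs(y_full - y_half)) < tol:
                    t, y = t + h, y_half      # accept the more accurate result
                    h *= 1.5                  # and try a larger step next time
                else:
                    h *= 0.5                  # reject and retry with a smaller step
            return t, y

        # Illustrative linearized two-state perturbation model (pressure, flow)
        A = np.array([[0.0, 1.0], [-4.0e4, -20.0]])
        print(integrate(lambda t, y: A @ y, 0.0, [1.0, 0.0], 0.1))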

  9. Toward Scientific Numerical Modeling

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2007-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and verifying that numerical models are translated into code correctly, however, are necessary first steps toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. To address these two shortcomings, two proposals are offered: (1) an unobtrusive mechanism to document input parameter uncertainties in situ and (2) an adaptation of the Scientific Method to numerical model development and deployment. Because these two steps require changes in the computational simulation community to bear fruit, they are presented in terms of the Beckhard-Harris-Gleicher change model.
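
    One way to read the first proposal is that every model input should carry its stated uncertainty and provenance at the point where it is defined; a minimal sketch of that idea in code (the class and the example fields are illustrative assumptions, not the mechanism from the paper):

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class InputParameter:
            """A model input documented in situ with its uncertainty and provenance."""
            name: str
            value: float
            uncertainty: float   # e.g. one standard deviation, stated explicitly
            units: str
            source: str          # where the value and its uncertainty came from

        # Declared next to their use, so the uncertainties travel with the model.
        wall_temperature = InputParameter("wall_temperature", 300.0, 5.0, "K",
                                          "facility report (illustrative)")
        catalytic_efficiency = InputParameter("catalytic_efficiency", 0.05, 0.02, "-",
                                              "literature range (illustrative)")

    With inputs recorded this way, downstream sensitivity or uncertainty propagation has the information it needs without a separate, easily outdated document.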

  10. Multiscale Modeling of Recrystallization

    SciTech Connect

    Godfrey, A.W.; Holm, E.A.; Hughes, D.A.; Lesar, R.; Miodownik, M.A.

    1998-12-07

    We propose a multi length scale approach to modeling recrystallization which links a dislocation model, a cell growth model and a macroscopic model. Although this methodology and linking framework will be applied to recrystallization, it is also applicable to other types of phase transformations in bulk and layered materials. Critical processes such as the dislocation structure evolution, nucleation, the evolution of crystal orientations into a preferred texture, and grain size evolution all operate at different length scales. In this paper we focus on incorporating experimental measurements of dislocation substructures, misorientation measurements of dislocation boundaries, and dislocation simulations into a mesoscopic model of cell growth. In particular, we show how feeding information from the dislocation model into the cell growth model can create realistic initial microstructure.

  11. Ventilation Model Report

    SciTech Connect

    V. Chipman; J. Case

    2002-12-20

    The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, or the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the ''Multiscale Thermohydrologic Model'' (BSC 2001), use the wall heat fractions as output from the Ventilation Model to initialize their post-closure analyses. The Ventilation Model report was initially developed to analyze the effects of preclosure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts, and to provide heat removal data to support EBS design. Revision 00 of the Ventilation Model included documentation of the modeling results from the ANSYS-based heat transfer model. Revision 01 ICN 01 included the results of the unqualified software code MULTIFLUX to assess the influence of moisture on the ventilation efficiency. The purposes of Revision 02 of the Ventilation Model are: (1) To validate the conceptual model for preclosure ventilation of emplacement drifts and verify its numerical application in accordance with new procedural requirements as outlined in AP-SIII-10Q, Models (Section 7.0). (2) To satisfy technical issues posed in KTI agreement RDTME 3.14 (Reamer and Williams 2001a), specifically to demonstrate, with respect to the ANSYS ventilation model, the adequacy of the discretization (Section 6.2.3.1), and the downstream applicability of the model results (i.e. wall heat fractions) to initialize post-closure thermal models (Section 6.6). (3) To satisfy the remainder of KTI agreement TEF 2.07 (Reamer and Williams 2001b), specifically to provide the results of post-test ANSYS modeling of the Atlas Facility forced convection tests (Section 7.1.2). This portion of the model report also serves as a validation exercise per AP-SIII.10Q, Models, for the ANSYS ventilation model. (4) To assess the impacts of moisture on the ventilation efficiency.
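
    In symbols (the notation is assumed here for clarity), the quantities described above are

        \[ \eta(t, x) = \frac{\dot{Q}_{\mathrm{vent}}(t, x)}{\dot{Q}_{\mathrm{decay}}(t, x)}, \qquad
           f_{\mathrm{wall}}(t, x) = 1 - \eta(t, x), \]

    where \( \dot{Q}_{\mathrm{vent}} \) is the heat carried away by the ventilation air, \( \dot{Q}_{\mathrm{decay}} \) is the heat produced by radionuclide decay, and the wall heat fraction \( f_{\mathrm{wall}} \) is the remaining fraction conducted into the surrounding rock mass and passed to the downstream thermal models.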

  12. Modeling agriculture in the Community Land Model

    NASA Astrophysics Data System (ADS)

    Drewniak, B.; Song, J.; Prell, J.; Kotamarthi, V. R.; Jacob, R.

    2012-12-01

    The potential impact of climate change on agriculture is uncertain. In addition, agriculture could influence above- and below-ground carbon storage. Development of models that represent agriculture is necessary to address these impacts. We have developed an approach to integrate agriculture representations for three crop types - maize, soybean, and spring wheat - into the coupled carbon-nitrogen version of the Community Land Model (CLM), to help address these questions. Here we present the new model, CLM-Crop, validated against observations from two AmeriFlux sites in the United States, planted with maize and soybean. Seasonal carbon fluxes compared well with field measurements. CLM-Crop yields were comparable with observations in some regions, although the generality of the crop model and its lack of technology and irrigation made direct comparison difficult. CLM-Crop was compared against the standard CLM3.5, which simulates crops as grass. The comparison showed improvement in gross primary productivity in regions where crops are the dominant vegetation cover. Crop yields and productivity were negatively correlated with temperature and positively correlated with precipitation. In case studies with the new crop model looking at impacts of residue management and planting date on crop yield, we found that increased residue returned to the litter pool increased crop yield, while reduced residue returns resulted in yield decreases. Using climate controls to signal planting date caused different responses in different crops. Maize and soybean had opposite reactions: when low temperature threshold resulted in early planting, maize responded with a loss of yield, but soybean yields increased. Our improvements in CLM demonstrate a new capability in the model - simulating agriculture in a realistic way, complete with fertilizer and residue management practices. Results are encouraging, with improved representation of human influences on the land surface and the potentially resulting climate impacts.

  13. Integrated modeling, data transfers, and physical models

    NASA Astrophysics Data System (ADS)

    Brookshire, D. S.; Chermak, J. M.

    2003-04-01

    Difficulties in developing precise economic policy models for water reallocation and re-regulation in various regional and transboundary settings have been exacerbated not only by climate issues but also by institutional changes reflected in the promulgation of environmental laws, changing regional populations, and an increased focus on water quality standards. As the complexity of the water issues has increased, model development at a micro-policy level is necessary to capture difficult institutional nuances and represent the differing national, regional and stakeholders' viewpoints. More often than not, adequate "local" or specific micro-data are not available in all settings for modeling and policy decisions. Economic policy analysis increasingly deals with this problem through data transfers (transferring results from one study area to another), and significant progress has been made in understanding the issue of the dimensionality of data transfers. This paper explores the conceptual and empirical dimensions of data transfers in the context of integrated modeling when the transfers are not only from the behavioral, but also from the hard sciences. We begin by exploring the domain of transfer issues associated with policy analyses that directly consider uncertainty in both the behavioral and physical science settings. We then, through a stylized, hybrid, economic-engineering model of water supply and demand in the Middle Rio Grande Valley of New Mexico (USA), analyze the impacts of: (1) the relative uncertainty of data transfer methods, (2) the uncertainty of climate data, and (3) the uncertainty of population growth. These efforts are motivated by the need to address the relative importance of more accurate data, both from the physical sciences and from demography and economics, for policy analyses. We evaluate the impacts by empirically addressing (within the Middle Rio Grande model): (1) How much does the surrounding uncertainty of the benefit transfer, climate information, and other forecast information impact policy decisions in reallocation issues? and (2) Where should research efforts be focused in order to improve analyses on which policy decisions are based?

  14. Phyloclimatic modeling: combining phylogenetics and bioclimatic modeling.

    PubMed

    Yesson, C; Culham, A

    2006-10-01

    We investigate the impact of past climates on plant diversification by tracking the "footprint" of climate change on a phylogenetic tree. Diversity within the cosmopolitan carnivorous plant genus Drosera (Droseraceae) is focused within Mediterranean climate regions. We explore whether this diversity is temporally linked to Mediterranean-type climatic shifts of the mid-Miocene and whether climate preferences are conservative over phylogenetic timescales. Phyloclimatic modeling combines environmental niche (bioclimatic) modeling with phylogenetics in order to study evolutionary patterns in relation to climate change. We present the largest and most complete such example to date using Drosera. The bioclimatic models of extant species demonstrate clear phylogenetic patterns; this is particularly evident for the tuberous sundews from southwestern Australia (subgenus Ergaleium). We employ a method for establishing confidence intervals of node ages on a phylogeny using replicates from a Bayesian phylogenetic analysis. This chronogram shows that many clades, including subgenus Ergaleium and section Bryastrum, diversified during the establishment of the Mediterranean-type climate. Ancestral reconstructions of bioclimatic models demonstrate a pattern of preference for this climate type within these groups. Ancestral bioclimatic models are projected into palaeo-climate reconstructions for the time periods indicated by the chronogram. We present two such examples that each generate plausible estimates of ancestral lineage distribution, which are similar to their current distributions. This is the first study to attempt bioclimatic projections on evolutionary time scales. The sundews appear to have diversified in response to local climate development. Some groups are specialized for Mediterranean climates, others show wide-ranging generalism. This demonstrates that Phyloclimatic modeling could be repeated for other plant groups and is fundamental to the understanding of evolutionary responses to climate change. PMID:17060200

  15. Constitutive models in LAME.

    SciTech Connect

    Hammerand, Daniel Carl; Scherzinger, William Mark

    2007-09-01

    The Library of Advanced Materials for Engineering (LAME) provides a common repository for constitutive models that can be used in computational solid mechanics codes. A number of models including both hypoelastic (rate) and hyperelastic (total strain) constitutive forms have been implemented in LAME. The structure and testing of LAME is described in Scherzinger and Hammerand ([3] and [4]). The purpose of the present report is to describe the material models which have already been implemented into LAME. The descriptions are designed to give useful information to both analysts and code developers. Thus far, 33 non-ITAR/non-CRADA protected material models have been incorporated. These include everything from the simple isotropic linear elastic models to a number of elastic-plastic models for metals to models for honeycomb, foams, potting epoxies and rubber. A complete description of each model is outside the scope of the current report. Rather, the aim here is to delineate the properties, state variables, functions, and methods for each model. However, a brief description of some of the constitutive details is provided for a number of the material models. Where appropriate, the SAND reports available for each model have been cited. Many models have state variable aliases for some or all of their state variables. These alias names can be used for outputting desired quantities. The state variable aliases available for results output have been listed in this report. However, not all models use these aliases. For those models, no state variable names are listed. Nevertheless, the number of state variables employed by each model is always given. Currently, there are four possible functions for a material model. This report lists which of these four methods are employed in each material model. As far as analysts are concerned, this information is included only for the awareness purposes. The analyst can take confidence in the fact that model has been properly implemented and the methods necessary for achieving accurate and efficient solutions have been incorporated. The most important method is the getStress function where the actual material model evaluation takes place. Obviously, all material models incorporate this function. The initialize function is included in most material models. The initialize function is called once at the beginning of an analysis and its primary purpose is to initialize the material state variables associated with the model. Many times, there is some information which can be set once per load step. For instance, we may have temperature dependent material properties in an analysis where temperature is prescribed. Instead of setting those parameters at each iteration in a time step, it is much more efficient to set them once per time step at the beginning of the step. These types of load step initializations are performed in the loadStepInit method. The final function used by many models is the pcElasticModuli method which changes the moduli that are to be used by the elastic preconditioner in Adagio. The moduli for the elastic preconditioner are set during the initialization of Adagio. Sometimes, better convergence can be achieved by changing these moduli for the elastic preconditioner. For instance, it typically helps to modify the preconditioner when the material model has temperature dependent moduli. For many material models, it is not necessary to change the values of the moduli that are set initially in the code. 
Hence, those models do not have pcElasticModuli functions. All four of these methods receive information from the matParams structure as described by Scherzinger and Hammerand.
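
    A minimal sketch of this four-method interface is given below, written in Python purely for illustration; the class name, argument lists, and the simple linear-elastic update are assumptions and do not reproduce the actual LAME (C++) interface described in the SAND reports.

      # Illustrative sketch only: not the real LAME API.
      class IsotropicLinearElasticSketch:
          num_state_vars = 0                       # this simple model needs none

          def __init__(self, youngs_modulus, poissons_ratio):
              E, nu = youngs_modulus, poissons_ratio
              self.mu = E / (2.0 * (1.0 + nu))                      # shear modulus
              self.lam = E * nu / ((1.0 + nu) * (1.0 - 2.0 * nu))   # Lame constant

          def initialize(self, state):
              # Called once at the start of the analysis to set state variables.
              state.clear()

          def loadStepInit(self, temperature=None):
              # Called once per load step, e.g. to evaluate temperature-dependent
              # properties instead of recomputing them at every iteration.
              pass

          def getStress(self, stress_old, strain_inc):
              # The constitutive evaluation itself (Voigt notation, engineering
              # shear strains): sigma_new = sigma_old + lam*tr(d_eps)*I + 2*mu*d_eps.
              tr = strain_inc[0] + strain_inc[1] + strain_inc[2]
              new = [s + self.mu * (2.0 if i < 3 else 1.0) * e
                     for i, (s, e) in enumerate(zip(stress_old, strain_inc))]
              for i in range(3):
                  new[i] += self.lam * tr
              return new

          def pcElasticModuli(self):
              # Moduli handed to the elastic preconditioner; constant here, so a
              # model like this would normally omit the method.
              return self.lam, self.mu

      model = IsotropicLinearElasticSketch(200.0e9, 0.3)
      print(model.getStress([0.0] * 6, [1.0e-4, 0.0, 0.0, 0.0, 0.0, 0.0]))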

  16. Modeling and Prediction Overview

    SciTech Connect

    Ermak, D L

    2002-10-18

    Effective preparation for and response to the release of toxic materials into the atmosphere hinges on accurate predictions of the dispersion pathway, concentration, and ultimate fate of the chemical or biological agent. Of particular interest is the threat to civilian populations within major urban areas, which are likely targets for potential attacks. The goals of the CBNP Modeling and Prediction area are: (1) Development of a suite of validated, multi-scale, atmospheric transport and fate modeling capabilities for chemical and biological agent releases within the complex urban environment; (2) Integration of these models and related user tools into operational emergency response systems. Existing transport and fate models are being adapted to treat the complex atmospheric flows within and around structures (e.g., buildings, subway systems, urban areas) and over terrain. Relevant source terms and the chemical and physical behavior of gas- and particle-phase species (e.g., losses due to deposition, bio-agent viability, degradation) are also being developed and incorporated into the models. Model validation is performed using both laboratory and field data. CBNP is producing and testing a suite of models with differing levels of complexity and fidelity to address the full range of user needs and applications. Lumped-parameter transport models are being developed for subway systems and building interiors, supplemented by the use of computational fluid dynamics (CFD) models to describe the circulation within large, open spaces such as auditoriums. Both sophisticated CFD transport models and simpler fast-response models are under development to treat the complex flow around individual structures and arrays of buildings. Urban parameterizations are being incorporated into regional-scale weather forecast, meteorological data assimilation, and dispersion models for problems involving larger-scale urban and suburban areas. Source term and dose response models are being developed for use in the transport models. ''Rules of thumb'' provide guidance to emergency responders in situations when immediate response is necessary and model simulations are not available. These modeling capabilities and tools are being integrated into operational systems for planning and training, real time emergency response, and post-event consequence analysis. CBNP interior modeling tools are directed in large part toward implementation into the PROTECT system for CB defense of interior infrastructure facilities. CBNP's exterior modeling tools for treating CB releases within the urban environment are integrated into the existing DOE National Atmospheric Release Advisory Center (NARAC), which provides real-time atmospheric hazard assessments. Internet and Web based software tools provide authorized users with secure remote access to the operational NARAC system. NARAC plume dispersion and health-risk predictions, as well as recommended actions, aid emergency managers and first responders in coordinating multi-agency responses.

  17. The Earth System Model

    NASA Technical Reports Server (NTRS)

    Schoeberl, Mark; Rood, Richard B.; Hildebrand, Peter; Raymond, Carol

    2003-01-01

    The Earth System Model is the natural evolution of current climate models and will be the ultimate embodiment of our geophysical understanding of the planet. These models are constructed from components - atmosphere, ocean, ice, land, chemistry, solid earth, etc. - merged together through a coupling program that is responsible for the exchange of data among the components. Climate models and future earth system models will have standardized modules, and these standards are now being developed by the ESMF project funded by NASA. The Earth System Model will have a variety of uses beyond climate prediction. The model can be used to build climate data records, making it the core of an assimilation system, and it can be used in OSSE experiments to evaluate proposed observing systems. The computing and storage requirements for the ESM appear to be daunting. However, the Japanese ES theoretical computing capability is already within 20% of the minimum requirements needed for some 2010 climate model applications. Thus it seems very possible that a focused effort to build an Earth System Model will achieve success.
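
    To make the coupling idea concrete, the toy sketch below exchanges two fields between an invented atmosphere component and an invented ocean component through a simple driver loop; none of the names or numbers correspond to ESMF or to any real model.

      # Toy coupler: two invented components exchange fields each step.
      class Atmosphere:
          def __init__(self):
              self.surface_flux = 0.0
          def step(self, sst):
              # Crude relaxation of the surface heat flux toward the ocean state.
              self.surface_flux = 0.1 * (sst - 288.0)
              return self.surface_flux

      class Ocean:
          def __init__(self):
              self.sst = 290.0
          def step(self, surface_flux):
              self.sst -= 0.5 * surface_flux
              return self.sst

      def couple(n_steps=10):
          atm, ocn = Atmosphere(), Ocean()
          sst = ocn.sst
          for _ in range(n_steps):
              flux = atm.step(sst)      # coupler hands the SST to the atmosphere
              sst = ocn.step(flux)      # and the resulting flux to the ocean
          return sst

      print(couple())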

  18. Quantitative Rheological Model Selection

    NASA Astrophysics Data System (ADS)

    Freund, Jonathan; Ewoldt, Randy

    2014-11-01

    The more parameters in a rheological model, the better it will reproduce available data, though this does not mean that it is necessarily a better-justified model. Good fits are only part of model selection. We employ a Bayesian inference approach that quantifies model suitability by balancing closeness to data against both the number of model parameters and their a priori uncertainty. The penalty depends upon the prior-to-calibration expectation of the viable range of values that model parameters might take, which we discuss as an essential aspect of the selection criterion. Models that are physically grounded are usually accompanied by tighter physical constraints on their respective parameters. The analysis reflects a basic principle: models grounded in physics can be expected to enjoy greater generality and perform better away from where they are calibrated. In contrast, purely empirical models can provide comparable fits, but the model selection framework penalizes their a priori uncertainty. We demonstrate the approach by selecting the best-justified number of modes in a Multi-mode Maxwell description of PVA-Borax. We also quantify the merits of the Maxwell model relative to power-law fits and purely empirical fits for PVA-Borax, a viscoelastic liquid, and gluten.
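
    The balance described above can be illustrated with a crude Laplace-style evidence estimate in which each parameter is penalized by how much of its prior range the data actually allow; this is only a schematic of the idea, not the authors' inference code, and all names and numbers are invented.

      import numpy as np

      def log_evidence_estimate(residuals, noise_sigma, prior_widths, posterior_widths):
          # Best-fit Gaussian log-likelihood ...
          n = len(residuals)
          log_like = (-0.5 * np.sum((np.asarray(residuals) / noise_sigma) ** 2)
                      - n * np.log(noise_sigma * np.sqrt(2.0 * np.pi)))
          # ... minus an Occam penalty: one log(prior/posterior width) per parameter,
          # so vague priors on many parameters cost more than tight, physical ones.
          occam = np.sum(np.log(np.asarray(prior_widths) / np.asarray(posterior_widths)))
          return log_like - occam

      # Hypothetical comparison: a 2-parameter physical model vs a 6-parameter empirical fit.
      physical = log_evidence_estimate([0.10, -0.08, 0.12], 0.1, [1.0, 1.0], [0.2, 0.3])
      empirical = log_evidence_estimate([0.05, -0.04, 0.06], 0.1, [100.0] * 6, [0.2] * 6)
      print(physical, empirical)   # the better-fitting empirical model scores far lower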

  19. Geochemical modeling: a review

    SciTech Connect

    Jenne, E.A.

    1981-06-01

    Two general families of geochemical models presently exist. The ion speciation-solubility group of geochemical models contains submodels, first to calculate a distribution of aqueous species and second to test the hypothesis that the water is near equilibrium with particular solid phases. These models may or may not calculate the adsorption of dissolved constituents and simulate the dissolution and precipitation (mass transfer) of solid phases. Another family of geochemical models, the reaction path models, simulates the stepwise precipitation of solid phases as a result of reacting specified amounts of water and rock. Reaction path models first perform an aqueous speciation of the dissolved constituents of the water, test solubility hypotheses, and then perform the reaction path modeling. Certain improvements in the present versions of these models would enhance their value and usefulness to applications in nuclear-waste isolation, etc. Mass-transfer calculations of limited extent are certainly within the capabilities of state-of-the-art models. However, the reaction path models require an expansion of their thermodynamic data bases and systematic validation before they are generally accepted.
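
    The solubility test performed by these speciation models is usually expressed as a saturation index, SI = log10(IAP/Ksp), with SI near zero indicating near-equilibrium with the solid phase; the sketch below uses invented activities and an invented solubility product purely as an illustration.

      import math

      def saturation_index(ion_activity_product, k_sp):
          # SI > 0: supersaturated, SI < 0: undersaturated, SI ~ 0: equilibrium.
          return math.log10(ion_activity_product / k_sp)

      # Hypothetical salt AB dissolving as A+ + B-, with assumed ion activities:
      activity_A, activity_B = 1.2e-4, 3.0e-5
      iap = activity_A * activity_B                  # ion activity product
      print(saturation_index(iap, k_sp=1.0e-9))      # about 0.56: mildly supersaturated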

  20. Differential Topic Models.

    PubMed

    Chen, Changyou; Buntine, Wray; Ding, Nan; Xie, Lexing; Du, Lan

    2015-02-01

    In applications we may want to compare different document collections: they could have shared content but also aspects that are unique to particular collections. This task has been called comparative text mining or cross-collection modeling. We present a differential topic model for this application that models both topic differences and similarities. For this we use hierarchical Bayesian nonparametric models. Moreover, we found it was important to properly model power-law phenomena in topic-word distributions, and thus we used the full Pitman-Yor process rather than just a Dirichlet process. Furthermore, we propose the transformed Pitman-Yor process (TPYP) to incorporate prior knowledge, such as vocabulary variations in different collections, into the model. To deal with the non-conjugacy between the model prior and the likelihood in the TPYP, we propose an efficient sampling algorithm using a data augmentation technique based on the multinomial theorem. Experimental results show the model discovers interesting aspects of different collections. We also show the proposed MCMC-based algorithm achieves a dramatically reduced test perplexity compared to some existing topic models. Finally, we show our model outperforms the state of the art for document classification/ideology prediction on a number of text collections. PMID:26353238
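
    The power-law behaviour that motivates the Pitman-Yor prior can be seen directly in its Chinese-restaurant construction, sketched below with discount d and concentration theta; this generic sampler is only an illustration of the prior, not the TPYP data-augmentation algorithm of the paper.

      import random

      def pitman_yor_table_counts(n, d=0.5, theta=1.0, seed=0):
          # Seat n customers by the Pitman-Yor Chinese restaurant process and return
          # the table occupancies; with d > 0 the counts follow a power law, unlike
          # the Dirichlet-process case d = 0.
          rng = random.Random(seed)
          tables = []                                    # customers per table
          for _ in range(n):
              weights = [c - d for c in tables] + [theta + d * len(tables)]
              r = rng.uniform(0.0, sum(weights))
              acc = 0.0
              for k, w in enumerate(weights):
                  acc += w
                  if r <= acc:
                      break
              if k == len(tables):
                  tables.append(1)                       # open a new table
              else:
                  tables[k] += 1
          return sorted(tables, reverse=True)

      print(pitman_yor_table_counts(1000)[:10])          # a few large, many small tables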

  1. Turbulence modeling and experiments

    NASA Technical Reports Server (NTRS)

    Shabbir, Aamir

    1992-01-01

    The best way of verifying turbulence models is to do a direct comparison between the various terms and their models. The success of this approach depends upon the availability of the data for the exact correlations (both experimental and DNS). The other approach involves numerically solving the differential equations and then comparing the results with the data. The results of such a computation will depend upon the accuracy of all the modeled terms and constants. Because of this it is sometimes difficult to find the cause of a poor performance by a model. However, such a calculation is still meaningful in other ways as it shows how a complete Reynolds stress model performs. Thirteen homogeneous flows are numerically computed using the second order closure models. We concentrate only on those models which use a linear (or quasi-linear) model for the rapid term. This, therefore, includes the Launder, Reece and Rodi (LRR) model; the isotropization of production (IP) model; and the Speziale, Sarkar, and Gatski (SSG) model. Which of the three models performs better is examined, along with their weaknesses, if any. The other work reported deals with the experimental balances of the second moment equations for a buoyant plume. Despite the tremendous amount of activity toward the second order closure modeling of turbulence, very little experimental information is available about the budgets of the second moment equations. Part of the problem stems from our inability to measure the pressure correlations. However, if everything else appearing in these equations is known from the experiment, pressure correlations can be obtained as the closing terms. This is the closest we can come to obtaining these terms from experiment, and despite the measurement errors which might be present in such balances, the resulting information will be extremely useful for turbulence modelers. The purpose of this part of the work was to provide such balances of the Reynolds stress and heat flux equations for the buoyant plume.

  2. Contrasting Disciplinary Models in Education.

    ERIC Educational Resources Information Center

    Morris, Robert C.

    1996-01-01

    Discusses advantages and disadvantages of eight discipline approaches: the Neo-Skinnerian reinforcement model, Redl and Wattenberg's group dynamics model, Kounin's lesson-management model, Ginott's communication model, Dreikurs' student choice model, Canter's Assertive Discipline model, Jones's classroom-management model, and Glasser's…

  4. Modeling Imports in a Keynesian Expenditure Model

    ERIC Educational Resources Information Center

    Findlay, David W.

    2010-01-01

    The author discusses several issues that instructors of introductory macroeconomics courses should consider when introducing imports in the Keynesian expenditure model. The analysis suggests that the specification of the import function should partially, if not completely, be the result of a simple discussion about the spending and import…

  5. Modelling of biofilm reactors

    SciTech Connect

    Rodrigues, A.; Grasmick, A.; Elmaleh, S.

    1982-10-01

    Comprehensive models of biofilm reactors are developed. Model I assumes a zero-order reaction of a limiting substrate and diffusional mass transport through the biofilm; in the diffusion-controlled regime the model is fully characterized by one parameter, alpha. From this model the conversion of substrate, or reactor efficiency, can be calculated for continuously stirred tank reactors (CSTRs) and plug flow reactors, respectively, as E_A = sqrt(alpha(alpha + 2)) - alpha and E_p = sqrt(2 alpha) - alpha/2. The model is validated against different experimental systems. Model II includes liquid film mass transfer resistance. The conversion gap between plug flow reactors and CSTRs is always lower than 25% and, as a first approximation, the biofilm reactor design does not then require accurate residence time distribution measurements. (23 refs.)
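
    Read this way, the diffusion-limited conversions depend only on the single parameter alpha; the short sketch below simply evaluates the two expressions quoted above (clipping at complete conversion), as an illustration rather than a design tool.

      import math

      def conversion_cstr(alpha):
          # E_A = sqrt(alpha*(alpha + 2)) - alpha   (diffusion-controlled CSTR)
          return min(1.0, math.sqrt(alpha * (alpha + 2.0)) - alpha)

      def conversion_plug_flow(alpha):
          # E_p = sqrt(2*alpha) - alpha/2           (diffusion-controlled plug flow)
          return min(1.0, math.sqrt(2.0 * alpha) - 0.5 * alpha)

      for alpha in (0.1, 0.5, 1.0, 2.0):
          print(alpha, round(conversion_cstr(alpha), 3), round(conversion_plug_flow(alpha), 3))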

  6. Outside users payload model

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The outside users payload model which is a continuation of documents and replaces and supersedes the July 1984 edition is presented. The time period covered by this model is 1985 through 2000. The following sections are included: (1) definition of the scope of the model; (2) discussion of the methodology used; (3) overview of total demand; (4) summary of the estimated market segmentation by launch vehicle; (5) summary of the estimated market segmentation by user type; (6) details of the STS market forecast; (7) summary of transponder trends; (8) model overview by mission category; and (9) detailed mission models. All known non-NASA, non-DOD reimbursable payloads forecast to be flown by non-Soviet-block countries are included in this model with the exception of Spacelab payloads and small self contained payloads. Certain DOD-sponsored or cosponsored payloads are included if they are reimbursable launches.

  7. Cloud Model Bat Algorithm

    PubMed Central

    Zhou, Yongquan; Xie, Jian; Li, Liangliang; Ma, Mingzhi

    2014-01-01

    The bat algorithm (BA) is a novel stochastic global optimization algorithm. The cloud model is an effective tool for transforming between qualitative concepts and their quantitative representation. Based on the bat echolocation mechanism and on the cloud model's strengths in representing uncertain knowledge, a new cloud model bat algorithm (CBA) is proposed. This paper focuses on remodeling the echolocation model based on the living and preying characteristics of bats, using the transformation theory of the cloud model to depict the qualitative concept: “bats approach their prey.” Furthermore, a Lévy flight mode and a population information communication mechanism of bats are introduced to balance exploration and exploitation. The simulation results show that the cloud model bat algorithm performs well on function optimization. PMID:24967425
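
    For readers unfamiliar with the underlying bat algorithm, the sketch below implements one greatly simplified form of the standard BA frequency/velocity update (the cloud-model and Lévy-flight refinements of CBA are omitted); all parameter values are arbitrary.

      import random

      def bat_algorithm(objective, dim=2, n_bats=20, n_iter=200, f_min=0.0,
                        f_max=2.0, loudness=0.5, pulse_rate=0.5,
                        lower=-5.0, upper=5.0, seed=1):
          rng = random.Random(seed)
          clip = lambda v: min(upper, max(lower, v))
          pos = [[rng.uniform(lower, upper) for _ in range(dim)] for _ in range(n_bats)]
          vel = [[0.0] * dim for _ in range(n_bats)]
          best = min(pos, key=objective)[:]
          for _ in range(n_iter):
              for i in range(n_bats):
                  freq = f_min + (f_max - f_min) * rng.random()      # echolocation frequency
                  for d in range(dim):
                      vel[i][d] += (pos[i][d] - best[d]) * freq
                  cand = [clip(pos[i][d] + vel[i][d]) for d in range(dim)]
                  if rng.random() > pulse_rate:                      # local walk around the best bat
                      cand = [clip(best[d] + 0.01 * rng.gauss(0.0, 1.0)) for d in range(dim)]
                  if objective(cand) <= objective(pos[i]) and rng.random() < loudness:
                      pos[i] = cand
                  if objective(cand) <= objective(best):
                      best = cand[:]
          return best

      print(bat_algorithm(lambda x: sum(v * v for v in x)))   # should approach the origin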

  8. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
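
    The Monte Carlo combination of the three life stages can be pictured with the schematic below, in which nucleation and small-crack lives are drawn from assumed lognormal distributions and added to a Paris-law long-crack life; every distribution, constant, and crack size here is invented for illustration.

      import math
      import random

      def paris_life(C=1.0e-11, m=3.0, stress_range=150.0, a_i=5.0e-4, a_f=5.0e-3):
          # Cycles to grow a crack from a_i to a_f with da/dN = C*(dK)^m and
          # dK = dS*sqrt(pi*a); closed-form integral, valid for m != 2.
          k = C * (stress_range * math.sqrt(math.pi)) ** m * (1.0 - 0.5 * m)
          return (a_f ** (1.0 - 0.5 * m) - a_i ** (1.0 - 0.5 * m)) / k

      def sample_total_life(rng):
          n_nucleation = rng.lognormvariate(10.0, 0.6)    # assumed scatter
          n_small_crack = rng.lognormvariate(9.0, 0.5)    # assumed scatter
          return n_nucleation + n_small_crack + paris_life()

      rng = random.Random(0)
      lives = sorted(sample_total_life(rng) for _ in range(10000))
      print("median total life (cycles):", round(lives[len(lives) // 2]))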

  9. Alternative tsunami models

    NASA Astrophysics Data System (ADS)

    Tan, A.; Lyatskaya, I.

    2009-01-01

    The interesting papers by Margaritondo (2005 Eur. J. Phys. 26 401) and by Helene and Yamashita (2006 Eur. J. Phys. 27 855) analysed the great Indian Ocean tsunami of 2004 using a simple one-dimensional canal wave model, which was appropriate for undergraduate students in physics and related fields of discipline. In this paper, two additional, easily understandable models, suitable for the same level of readership, are proposed: first, a two-dimensional model in flat space, and second, the same on a spherical surface. The models are used to study the tsunami produced by the central Kuril earthquake of November 2006. It is shown that the two alternative models, especially the latter one, give better representations of the wave amplitude, particularly at far-flung locations. The latter model further demonstrates the enhancing effect on the amplitude due to the curvature of the Earth for far-reaching tsunami propagation.
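
    The curvature effect mentioned above can be seen by comparing the geometric-spreading factors of the two models: on a flat surface the wavefront circumference grows like the distance r, while on a sphere it grows like R sin(r/R) and shrinks again beyond a quarter of the circumference. The sketch below applies only this energy-conservation scaling (ignoring dispersion, bathymetry, and dissipation) with an arbitrary reference amplitude.

      import math

      R_EARTH = 6371.0                       # km

      def amplitude_flat(r, a_ref=1.0, r_ref=100.0):
          # 2-D flat space: wavefront length ~ r, so amplitude ~ 1/sqrt(r).
          return a_ref * math.sqrt(r_ref / r)

      def amplitude_sphere(r, a_ref=1.0, r_ref=100.0):
          # Spherical surface: wavefront length ~ R*sin(r/R).
          return a_ref * math.sqrt(math.sin(r_ref / R_EARTH) / math.sin(r / R_EARTH))

      for r in (1000.0, 5000.0, 10000.0):    # epicentral distances in km
          print(r, round(amplitude_flat(r), 4), round(amplitude_sphere(r), 4))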

  10. Models for poloidal divertors

    SciTech Connect

    Post, D.E.; Heifetz, D.; Petravic, M.

    1982-07-01

    Recent progress in models for poloidal divertors has both helped to explain current divertor experiments and contributed significantly to design efforts for future large tokamak (INTOR, etc.) divertor systems. These models range in sophistication from zero-dimensional treatments and dimensional analysis to two-dimensional models for plasma and neutral particle transport which include a wide variety of atomic and molecular processes as well as detailed treatments of the plasma-wall interaction. This paper presents a brief review of some of these models, describing the physics and approximations involved in each model. We discuss the wide variety of physics necessary for a comprehensive description of poloidal divertors. To illustrate the progress in models for poloidal divertors, we discuss some of our recent work as typical examples of the kinds of calculations being done.

  11. Extended frequency turbofan model

    NASA Technical Reports Server (NTRS)

    Mason, J. R.; Park, J. W.; Jaekel, R. F.

    1980-01-01

    The fan model was developed using two dimensional modeling techniques to add dynamic radial coupling between the core stream and the bypass stream of the fan. When incorporated into a complete TF-30 engine simulation, the fan model greatly improved compression system frequency response to planar inlet pressure disturbances up to 100 Hz. The improved simulation also matched engine stability limits at 15 Hz, whereas the one dimensional fan model required twice the inlet pressure amplitude to stall the simulation. With verification of the two dimensional fan model, this program formulated a high frequency F-100(3) engine simulation using row by row compression system characteristics. In addition to the F-100(3) remote splitter fan, the program modified the model fan characteristics to simulate a proximate splitter version of the F-100(3) engine.

  12. Energy balance climate models

    NASA Technical Reports Server (NTRS)

    North, G. R.; Cahalan, R. F.; Coakley, J. A., Jr.

    1981-01-01

    An introductory survey of the global energy balance climate models is presented with an emphasis on analytical results. A sequence of increasingly complicated models involving ice cap and radiative feedback processes is solved, and the solutions and parameter sensitivities are studied. The model parameterizations are examined critically in light of many current uncertainties. A simple seasonal model is used to study the effects of changes in orbital elements on the temperature field. A linear stability theorem and a complete nonlinear stability analysis for the models are developed. Analytical solutions are also obtained for the linearized models driven by stochastic forcing elements. In this context the relation between natural fluctuation statistics and climate sensitivity is stressed.
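
    The simplest member of this hierarchy is a zero-dimensional balance between absorbed solar radiation and a linearized outgoing infrared flux A + B*T; the sketch below adds a step ice-albedo feedback and solves the balance by fixed-point iteration. The Budyko-type coefficients and the albedo values are commonly quoted numbers used here only for illustration.

      def albedo(T):
          # Step ice-albedo feedback: bright planet when frozen (illustrative values).
          return 0.62 if T < -10.0 else 0.30

      def equilibrium_temperature(solar=1361.0, A=203.3, B=2.09, T0=15.0):
          # Solve S/4 * (1 - albedo(T)) = A + B*T (T in deg C) by fixed-point iteration.
          T = T0
          for _ in range(100):
              T = (0.25 * solar * (1.0 - albedo(T)) - A) / B
          return T

      print(equilibrium_temperature())             # warm branch, roughly 17 deg C
      print(equilibrium_temperature(T0=-40.0))     # ice-covered branch, well below freezing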

  13. Load Model Data Tool

    Energy Science and Technology Software Center (ESTSC)

    2013-04-30

    The LMDT software automates the process of the load composite model data preparation in the format supported by the major power system software vendors (GE and Siemens). Proper representation of the load composite model in power system dynamic analysis is very important. Software tools for power system simulation like GE PSLF and Siemens PSSE already include algorithms for the load composite modeling. However, these tools require that the input information on composite load be provided in custom formats. Preparation of this data is time consuming and requires multiple manual operations. The LMDT software makes it possible to automate this process. The software is designed to generate composite load model data. It uses the default load composition data, motor information, and bus information as an input. The software processes the input information and produces the load composition model. The generated model can be stored in the .dyd format supported by the GE PSLF package or the .dyr format supported by the Siemens PSSE package.

  14. HOMER® Micropower Optimization Model

    SciTech Connect

    Lilienthal, P.

    2005-01-01

    NREL has developed the HOMER micropower optimization model. The model can analyze all of the available small power technologies individually and in hybrid configurations to identify least-cost solutions to energy requirements. This capability is valuable to a diverse set of energy professionals and applications. NREL has actively supported its growing user base and developed training programs around the model. These activities are helping to grow the global market for solar technologies.

  15. Liftoff Model for MELCOR.

    SciTech Connect

    Young, Michael F.

    2015-07-01

    Aerosol particles that deposit on surfaces may be subsequently resuspended by air flowing over the surface. A review of models for this liftoff process is presented and compared to available data. Based on this review, a model that agrees with existing data and is readily computed is presented for incorporation into a system level code such as MELCOR.

  16. F-14 modeling study

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Baron, S.

    1984-01-01

    Preliminary results in the application of a closed loop pilot/simulator model to the analysis of some simulator fidelity issues are discussed in the context of an air to air target tracking task. The closed loop model is described briefly. Then, problem simplifications that are employed to reduce computational costs are discussed. Finally, model results showing sensitivity of performance to various assumptions concerning the simulator and/or the pilot are presented.

  17. The Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric; Minard, Charles; Saile, Lynn; Freiere deCarvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Iyengar, Sriram; Johnson-Throop, Kathy; Baumann, David

    2010-01-01

    The goals of the Integrated Medical Model (IMM) are to develop an integrated, quantified, evidence-based decision support tool useful to crew health and mission planners and to help align science, technology, and operational activities intended to optimize crew health, safety, and mission success. Presentation slides address scope and approach, beneficiaries of IMM capabilities, history, risk components, conceptual models, development steps, and the evidence base. Space adaptation syndrome is used to demonstrate the model's capabilities.

  18. Rat Endovascular Perforation Model

    PubMed Central

    Sehba, Fatima A.

    2014-01-01

    Experimental animal models of aneurysmal subarachnoid hemorrhage (SAH) have provided a wealth of information on the mechanisms of brain injury. The Rat endovascular perforation model (EVP) replicates the early pathophysiology of SAH and hence is frequently used to study early brain injury following SAH. This paper presents a brief review of historical development of the EVP model, details the technique used to create SAH and considerations necessary to overcome technical challenges. PMID:25213427

  19. Invariant turbulence models

    NASA Astrophysics Data System (ADS)

    Bihlo, Alexander; Dos Santos Cardoso-Bihlo, Elsa Maria; Nave, Jean-Christophe; Popovych, Roman

    2012-11-01

    Various subgrid-scale closure models break the invariance of the Euler or Navier-Stokes equations and thus violate the geometric structure of these equations. A method is shown which allows one to systematically derive invariant turbulence models starting from non-invariant turbulence models and thus to correct artificial symmetry-breaking. The method is illustrated by finding invariant hyperdiffusion schemes to be applied in the two-dimensional turbulence problem.

  20. Conceptual IT model

    NASA Astrophysics Data System (ADS)

    Arnaoudova, Kristina; Stanchev, Peter

    2015-11-01

    Business processes are a key asset for every organization. The design of business process models is a foremost concern among an organization's functions. Business processes and their proper management depend strongly on the performance of software applications and technology solutions. The paper attempts to define a new conceptual model of an IT service provider; it can be viewed as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.

  1. Solid model design simplification

    SciTech Connect

    Ames, A.L.; Rivera, J.J.; Webb, A.J.; Hensinger, D.M.

    1997-12-01

    This paper documents an investigation of approaches to improving the quality of Pro/Engineer-created solid model data for use by downstream applications. The investigation identified a number of sources of problems caused by deficiencies in Pro/Engineer's geometric engine, and developed prototype software capable of detecting many of these problems and guiding users towards simplified, useable models. The prototype software was tested using Sandia production solid models, and provided significant leverage in attacking the simplification problem.

  2. Atmospheric prediction model survey

    NASA Technical Reports Server (NTRS)

    Wellck, R. E.

    1976-01-01

    As part of the SEASAT Satellite program of NASA, a survey of representative primitive equation atmospheric prediction models that exist in the world today was written for the Jet Propulsion Laboratory. Seventeen models developed by eleven different operational and research centers throughout the world are included in the survey. The surveys are tutorial in nature describing the features of the various models in a systematic manner.

  3. AREST model description

    SciTech Connect

    Engel, D.W.; McGrail, B.P.

    1993-11-01

    The Office of Civilian Radioactive Waste Management and the Power Reactor and Nuclear Fuel Development Corporation of Japan (PNC) have supported the development of the Analytical Repository Source-Term (AREST) code at Pacific Northwest Laboratory. AREST is a computer model developed to evaluate radionuclide release from an underground geologic repository. The AREST code can be used to calculate/estimate the amount and rate of each radionuclide that is released from the engineered barrier system (EBS) of the repository. The EBS is the man-made or disrupted area of the repository. AREST was designed as a system-level model to simulate the behavior of the total repository by combining process-level models for the release from an individual waste package or container. AREST contains primarily analytical models for calculating the release/transport of radionuclides to the host rock that surrounds each waste package. Analytical models were used because of the small computational overhead that allows all the input parameters to be derived from a statistical distribution. Recently, a one-dimensional numerical model was also incorporated into AREST, to allow for more detailed modeling of the transport process with arbitrary length decay chains. The next step in modeling the EBS is to develop a model that couples the probabilistic capabilities of AREST with a more detailed process model. This model will need to look at the reactive coupling of the processes that are involved with the release process. Such coupling would include: (1) the dissolution of the waste form, (2) the geochemical modeling of the groundwater, (3) the corrosion of the container overpacking, and (4) the backfill material, just to name a few. Several of these coupled processes are already incorporated in the current version of AREST.

  4. Geometry of Winter model

    NASA Astrophysics Data System (ADS)

    Aglietti, U. G.; Santini, P. M.

    2015-06-01

    By constructing the Riemann surface controlling the resonance structure of Winter model, we determine the limitations of perturbation theory. We then derive explicit non-perturbative results for various observables in the weak-coupling regime, in which the model has an infinite tower of long-lived resonant states. The problem of constructing proper initial wavefunctions coupled to single excitations of the model is also treated within perturbative and non-perturbative methods.

  5. Global Atmospheric Aerosol Modeling

    NASA Astrophysics Data System (ADS)

    Hendricks, Johannes; Righi, Mattia; Aquila, Valentina

    Global aerosol models are used to study the distribution and properties of atmospheric aerosol particles as well as their effects on clouds, atmospheric chemistry, radiation, and climate. The present article provides an overview of the basic concepts of global atmospheric aerosol modeling and shows some examples from a global aerosol simulation. Particular emphasis is placed on the simulation of aerosol particles and their effects within global climate models.

  6. Triangular lattice exciton model.

    PubMed

    Gunlycke, Daniel; Tseng, Frank

    2016-03-16

    We present a minimalistic equilateral triangular lattice model showing explicitly that the two-dimensional hydrogen model for excitons breaks down for excitons in semiconducting monolayer transition-metal dichalcogenides due to lattice effects and that these excitons are neither Wannier nor Frenkel excitons but rather span an intermediate regime. The model is formulated on sparse form in direct space, allowing it to be solved with great computational efficiency. PMID:26947357

  7. Global Atmospheric Aerosol Modeling

    NASA Technical Reports Server (NTRS)

    Hendricks, Johannes; Aquila, Valentina; Righi, Mattia

    2012-01-01

    Global aerosol models are used to study the distribution and properties of atmospheric aerosol particles as well as their effects on clouds, atmospheric chemistry, radiation, and climate. The present article provides an overview of the basic concepts of global atmospheric aerosol modeling and shows some examples from a global aerosol simulation. Particular emphasis is placed on the simulation of aerosol particles and their effects within global climate models.

  8. Wind power prediction models

    NASA Technical Reports Server (NTRS)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.

  9. Dataset Modelability by QSAR

    PubMed Central

    Golbraikh, Alexander; Muratov, Eugene; Fourches, Denis; Tropsha, Alexander

    2014-01-01

    We introduce a simple MODelability Index (MODI) that estimates the feasibility of obtaining predictive QSAR models (Correct Classification Rate above 0.7) for a binary dataset of bioactive compounds. MODI is defined as an activity class-weighted ratio of the number of the nearest neighbor pairs of compounds with the same activity class versus the total number of pairs. The MODI values were calculated for more than 100 datasets and the threshold of 0.65 was found to separate non-modelable from the modelable datasets. PMID:24251851
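
    A compact sketch of the index as defined above (the class-weighted fraction of compounds whose nearest neighbour shares their activity class) is given below; the brute-force search and Euclidean descriptor distance are assumptions made purely for illustration.

      import math

      def modi(descriptors, labels):
          # MODelability Index: per activity class, the fraction of its compounds
          # whose single nearest neighbour has the same class, averaged over classes.
          def nearest(i):
              best, best_d = None, float("inf")
              for j, x in enumerate(descriptors):
                  if j == i:
                      continue
                  d = math.dist(descriptors[i], x)
                  if d < best_d:
                      best, best_d = j, d
              return best

          score = 0.0
          classes = sorted(set(labels))
          for c in classes:
              members = [i for i, y in enumerate(labels) if y == c]
              score += sum(labels[nearest(i)] == c for i in members) / len(members)
          return score / len(classes)

      X = [(0.0, 0.1), (0.1, 0.0), (1.0, 1.0), (1.1, 0.9), (0.9, 1.1)]
      y = [0, 0, 1, 1, 1]
      print(modi(X, y))      # 1.0 for this cleanly separated toy set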

  10. Multifamily Envelope Leakage Model

    SciTech Connect

    Faakye, O.; Griffiths, D.

    2015-05-01

    The objective of the 2013 research project was to develop the model for predicting fully guarded test results (FGT), using unguarded test data and specific building features of apartment units. The model developed has a coefficient of determination R2 value of 0.53 with a root mean square error (RMSE) of 0.13. Both statistical metrics indicate that the model is relatively strong. When tested against data that was not included in the development of the model, prediction accuracy was within 19%, which is reasonable given that seasonal differences in blower door measurements can vary by as much as 25%.

  11. The Generalized SLW Model

    NASA Astrophysics Data System (ADS)

    Solovjov, Vladimir P.; Andre, Frédéric; Lemonnier, Denis; Webb, Brent W.

    2016-01-01

    The Generalized SLW Method is presented, formulating the SLW method with the help of both the ALBDF and the Inverse ALBDF. The result is two equivalent symmetric models: the SLW Model and the Inverse SLW Model. The advantage of the unified dual formulation and of application of the ALBDF and the Inverse ALBDF is in more efficient implementation of the model and the elimination of the solution of the implicit equations for the absorption cross-sections in the construction of the spectral model in the case of nonisothermal media. The generalized approach explores all possibilities of the SLW method under both direct and inverse formulations including its limiting cases: the minimal one clear gas-one gray gas SLW-1 model, and the case when the number of gray gases approaches infinity termed the Exact SLW model. The present work outlines the steps in a unified construction of the generalized SLW model in isothermal and non-isothermal media, and compares different forms of the modelled radiative quantities in plane parallel media: directional total radiative flux, total emissivity, Planck mean and Rosseland mean absorption coefficients.

  12. Lightning return stroke models

    NASA Technical Reports Server (NTRS)

    Lin, Y. T.; Uman, M. A.; Standler, R. B.

    1980-01-01

    We test the two most commonly used lightning return stroke models, Bruce-Golde and transmission line, against subsequent stroke electric and magnetic field wave forms measured simultaneously at near and distant stations and show that these models are inadequate to describe the experimental data. We then propose a new return stroke model that is physically plausible and that yields good approximations to the measured two-station fields. Using the new model, we derive return stroke charge and current statistics for about 100 subsequent strokes.

  13. Solution Crystallization Modeling Tools

    SciTech Connect

    2001-09-01

    Software Tools Will Optimize Crystallization Processes. Crystallization processes could be more effective, economical, and energy efficient if accurate software design and process modeling tools were available.

  14. Modelling the magnetic dipole

    NASA Astrophysics Data System (ADS)

    Seleznyova, Kira; Strugatsky, Mark; Kliava, Janis

    2016-03-01

    Three different models of a magnetic dipole, viz., a uniformly magnetised sphere, a circular current loop and a pair of fictitious magnetic charges, have been systematically analysed within the formalism based on the vector potential of the magnetic field. The expressions of the potentials and magnetic fields produced by each dipole model have been obtained. A computer code has been put forward in order to visualise magnetic field lines for different dipole models. It has been shown that the magnetic field outside the uniformly magnetised sphere coincides with that of a point dipole. The other two models give considerably different results at distances small or intermediate in comparison with the dipole size.
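
    The statement about the current-loop model can be checked on the symmetry axis, where the loop field has a simple closed form and tends to the point-dipole field only at distances large compared with the loop radius; the short comparison below (SI units, arbitrary loop parameters) is intended only as an illustration of that limit.

      import math

      MU0 = 4.0e-7 * math.pi

      def b_axis_loop(z, radius=0.05, current=1.0):
          # On-axis field of a circular current loop (Biot-Savart result).
          return MU0 * current * radius ** 2 / (2.0 * (radius ** 2 + z ** 2) ** 1.5)

      def b_axis_point_dipole(z, radius=0.05, current=1.0):
          # Point dipole with the same moment m = I * pi * a^2.
          m = current * math.pi * radius ** 2
          return MU0 * m / (2.0 * math.pi * z ** 3)

      for z in (0.05, 0.1, 0.5, 1.0):     # distance along the axis in metres
          print(z, b_axis_loop(z), b_axis_point_dipole(z))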

  15. Outdoor ground impedance models.

    PubMed

    Attenborough, Keith; Bashir, Imran; Taherzadeh, Shahram

    2011-05-01

    Many models for the acoustical properties of rigid-porous media require knowledge of parameter values that are not available for outdoor ground surfaces. The relationship used between tortuosity and porosity for stacked spheres results in five characteristic impedance models that require not more than two adjustable parameters. These models and hard-backed-layer versions are considered further through numerical fitting of 42 short range level difference spectra measured over various ground surfaces. For all but eight sites, slit-pore, phenomenological and variable porosity models yield lower fitting errors than those given by the widely used one-parameter semi-empirical model. Data for 12 of 26 grassland sites and for three beech wood sites are fitted better by hard-backed-layer models. Parameter values obtained by fitting slit-pore and phenomenological models to data for relatively low flow resistivity grounds, such as forest floors, porous asphalt, and gravel, are consistent with values that have been obtained non-acoustically. Three impedance models yield reasonable fits to a narrow band excess attenuation spectrum measured at short range over railway ballast but, if extended reaction is taken into account, the hard-backed-layer version of the slit-pore model gives the most reasonable parameter values. PMID:21568385

  16. Particle bed reactor modeling

    NASA Technical Reports Server (NTRS)

    Sapyta, Joe; Reid, Hank; Walton, Lew

    1993-01-01

    The topics are presented in viewgraph form and include the following: particle bed reactor (PBR) core cross section; PBR bleed cycle; fuel and moderator flow paths; PBR modeling requirements; characteristics of PBR and nuclear thermal propulsion (NTP) modeling; challenges for PBR and NTP modeling; thermal hydraulic computer codes; capabilities for PBR/reactor application; thermal/hydraulic codes; limitations; physical correlations; comparison of predicted friction factor and experimental data; frit pressure drop testing; cold frit mask factor; decay heat flow rate; startup transient simulation; and philosophy of systems modeling.

  17. Visualizing Risk Prediction Models

    PubMed Central

    Van Belle, Vanya; Van Calster, Ben

    2015-01-01

    Objective: Risk prediction models can assist clinicians in making decisions. To boost the uptake of these models in clinical practice, it is important that end-users understand how the model works and can efficiently communicate its results. We introduce novel methods for interpretable model visualization. Methods: The proposed visualization techniques are applied to two prediction models from the Framingham Heart Study for the prediction of intermittent claudication and stroke after atrial fibrillation. We represent models using color bars, and visualize the risk estimation process for a specific patient using patient-specific contribution charts. Results: The color-based model representations provide users with an attractive tool to instantly gauge the relative importance of the predictors. The patient-specific representations allow users to understand the relative contribution of each predictor to the patient’s estimated risk, potentially providing insightful information on which to base further patient management. Extensions towards non-linear models and interactions are illustrated on an artificial dataset. Conclusion: The proposed methods summarize risk prediction models and risk predictions for specific patients in an alternative way. These representations may facilitate communication between clinicians and patients. PMID:26176945

  18. Rethinking Context Models

    NASA Astrophysics Data System (ADS)

    Pérez, Emiliano; Fortier, Andrés; Rossi, Gustavo; Gordillo, Silvia

    Since the first context-aware applications were designed, context modelling has played a central role. During the last decade many different approaches were proposed to model context, ranging from ad-hoc models to extensions to relational databases or ontologies. In this paper we propose to take a step back and analyse those approaches using the seminal views presented by Paul Dourish in his work (What we talk about when we talk about context). Based on that analysis we propose a set of guidelines that any context model should follow.

  19. Models of Reality.

    SciTech Connect

    Brown-VanHoozer, S. A.

    1999-06-02

    Conscious awareness of our environment is based on a feedback loop comprised of sensory input transmitted to the central nervous system leading to construction of our ''model of the world,'' (Lewis et al, 1982). We then assimilate the neurological model at the unconscious level into information we can later consciously consider useful in identifying belief systems and behaviors for designing diverse systems. Thus, we can avoid potential problems based on our open-to-error perceived reality of the world. By understanding how our model of reality is organized, we allow ourselves to transcend content and develop insight into how effective choices and belief systems are generated through sensory derived processes. These are the processes which provide the designer the ability to meta model (build a model of a model) the user; consequently, matching the mental model of the user with that of the designer's and, coincidentally, forming rapport between the two participants. The information shared between the participants is neither assumed nor generalized, it is closer to equivocal; thus minimizing error through a sharing of each other's model of reality. How to identify individual mental mechanisms or processes, how to organize the individual strategies of these mechanisms into useful patterns, and to formulate these into models for success and knowledge based outcomes is the subject of the discussion that follows.

  20. Computer Modeling and Simulation

    SciTech Connect

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target should be blamed in the case of the model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling and validation, which shares the common epistemology with experimentation, to simulation. To explain reasons of their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  1. The LISA Integrated Model

    NASA Technical Reports Server (NTRS)

    Merkowitz, Stephen M.

    2002-01-01

    The Laser Interferometer Space Antenna (LISA) space mission has unique needs that argue for an aggressive modeling effort. These models ultimately need to forecast and interrelate the behavior of the science input, structure, optics, control systems, and many other factors that affect the performance of the flight hardware. In addition, many components of these integrated models will also be used separately for the evaluation and investigation of design choices, technology development and integration and test. This article presents an overview of the LISA integrated modeling effort.

  2. Selected System Models

    NASA Astrophysics Data System (ADS)

    Schmidt-Eisenlohr, F.; Puñal, O.; Klagges, K.; Kirsche, M.

    Apart from the general issue of modeling the channel, the PHY and the MAC of wireless networks, there are specific modeling assumptions that are considered for different systems. In this chapter we consider three specific wireless standards and highlight modeling options for them. These are IEEE 802.11 (as an example for wireless local area networks), IEEE 802.16 (as an example for wireless metropolitan networks) and IEEE 802.15 (as an example for body area networks). The section on each of these three systems also discusses, at the end, a set of model implementations that are available today.

  3. Model Error Budgets

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    2008-01-01

    An error budget is a commonly used tool in design of complex aerospace systems. It represents system performance requirements in terms of allowable errors and flows these down through a hierarchical structure to lower assemblies and components. The requirements may simply be 'allocated' based upon heuristics or experience, or they may be designed through use of physics-based models. This paper presents a basis for developing an error budget for models of the system, as opposed to the system itself. The need for model error budgets arises when system models are a principal design agent, as is increasingly common for poorly testable high performance space systems.

  4. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  5. Groundwater Model Validation

    SciTech Connect

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of validation data to constrain model input parameters is shown for the second case study using a Bayesian approach known as Markov Chain Monte Carlo. The approach shows a great potential to be helpful in the validation process and in incorporating prior knowledge with new field data to derive posterior distributions for both model input and output.

  6. Why business models matter.

    PubMed

    Magretta, Joan

    2002-05-01

    "Business model" was one of the great buzz-words of the Internet boom. A company didn't need a strategy, a special competence, or even any customers--all it needed was a Web-based business model that promised wild profits in some distant, ill-defined future. Many people--investors, entrepreneurs, and executives alike--fell for the fantasy and got burned. And as the inevitable counterreaction played out, the concept of the business model fell out of fashion nearly as quickly as the .com appendage itself. That's a shame. As Joan Magretta explains, a good business model remains essential to every successful organization, whether it's a new venture or an established player. To help managers apply the concept successfully, she defines what a business model is and how it complements a smart competitive strategy. Business models are, at heart, stories that explain how enterprises work. Like a good story, a robust business model contains precisely delineated characters, plausible motivations, and a plot that turns on an insight about value. It answers certain questions: Who is the customer? How do we make money? What underlying economic logic explains how we can deliver value to customers at an appropriate cost? Every viable organization is built on a sound business model, but a business model isn't a strategy, even though many people use the terms interchangeably. Business models describe, as a system, how the pieces of a business fit together. But they don't factor in one critical dimension of performance: competition. That's the job of strategy. Illustrated with examples from companies like American Express, EuroDisney, WalMart, and Dell Computer, this article clarifies the concepts of business models and strategy, which are fundamental to every company's performance. PMID:12024761

  7. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor. Collectively, the potential human receptor and exposure pathways form the biosphere model. More detailed technical information and data about potential human receptor groups and the characteristics of exposure pathways have been developed in a series of AMRs and Calculation Reports.

  9. Bayesian Data-Model Fit Assessment for Structural Equation Modeling

    ERIC Educational Resources Information Center

    Levy, Roy

    2011-01-01

    Bayesian approaches to modeling are receiving an increasing amount of attention in the areas of model construction and estimation in factor analysis, structural equation modeling (SEM), and related latent variable models. However, model diagnostics and model criticism remain relatively understudied aspects of Bayesian SEM. This article describes…

  10. Spiral model pilot project information model

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The objective was an evaluation of the Spiral Model (SM) development approach to allow NASA Marshall to develop an experience base of that software management methodology. A discussion is presented of the Information Model (IM) that was used as part of the SM methodology. A key concept of the SM is the establishment of an IM to be used by management to track the progress of a project. The IM is the set of metrics that is to be measured and reported throughout the life of the project. These metrics measure both the product and the process to ensure the quality of the final delivery item and to ensure the project met programmatic guidelines. The beauty of the SM, along with the IM, is the ability to measure not only the correctness of the specification and implementation of the requirements but to also obtain a measure of customer satisfaction.

  11. Preliminary semiempirical transport models

    SciTech Connect

    Singer, C.E.

    1983-11-01

    A class of semiempirical transport models is proposed for testing against confinement data from tokamaks and for use in operations planning and machine design. A reference model is proposed to be compatible with published confinement data. Theoretical considerations are used to express the anomalous transport coefficients in terms of appropriate dimensionless parameters.

  12. Prewhirl Jet Model

    NASA Technical Reports Server (NTRS)

    Meng, S. Y.; Jensen, M.; Jackson, E. D.

    1985-01-01

    Simple accurate model of centrifugal or rocket engine pumps provides information necessary to design inducer backflow deflector, backflow eliminator and prewhirl jet in jet mixing zones. Jet design based on this model shows improvement in inducer suction performance and reduced cavitation damage.

  13. Computational Modeling of Tires

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Tanner, John A. (Compiler)

    1995-01-01

    This document contains presentations and discussions from the joint UVA/NASA Workshop on Computational Modeling of Tires. The workshop attendees represented NASA, the Army and Air Force, tire companies, commercial software developers, and academia. The workshop objectives were to assess the state of technology in the computational modeling of tires and to provide guidelines for future research.

  14. Structural Equation Model Trees

    ERIC Educational Resources Information Center

    Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman

    2013-01-01

    In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree

  15. Math, Science, and Models

    ERIC Educational Resources Information Center

    Weinburgh, Molly; Silva, Cecilia

    2011-01-01

    For the past five summers, the authors have taught summer school to recent immigrants and refugees. Their experiences with these fourth-grade English language learners (ELL) have taught them the value of using models to build scientific and mathematical concepts. In this article, they describe the use of different forms of 2- and 3-D models to

  16. Modelling University Governance

    ERIC Educational Resources Information Center

    Trakman, Leon

    2008-01-01

    Twentieth century governance models used in public universities are subject to increasing doubt across the English-speaking world. Governments question if public universities are being efficiently governed; if their boards of trustees are adequately fulfilling their trust obligations towards multiple stakeholders; and if collegial models of…

  17. A night sky model.

    NASA Astrophysics Data System (ADS)

    Erpylev, N. P.; Smirnov, M. A.; Bagrov, A. V.

    A night sky model is proposed. It includes different components of light pollution, such as solar twilight, scattered moonlight, zodiacal light, the Milky Way, airglow, and artificial light pollution. The model is designed for calculating the efficiency of astronomical installations.

  18. Flowfield modeling and diagnostics

    SciTech Connect

    Gupta, A.K.; Lilley, D.G.

    1985-01-01

    This textbook is devoted solely to flowfield modeling and diagnostics; their practical use, recent and current research, and projected developments and trends. It provides an account of the use of a broad range of techniques in industrial and research practice, both with and without combustion. Application ideas are complemented by details about experimental and modeling techniques.

  19. HYBRID RECEPTOR MODELING

    EPA Science Inventory

    A hybrid receptor model is a specified mathematical procedure which uses not only the ambient species concentration measurements that form the input data for a pure receptor model, but in addition source emission rates or atmospheric dispersion or transformation information chara...
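
    As a rough illustration of the receptor-model step such hybrid approaches build on, the sketch below apportions ambient species concentrations to sources by non-negative least squares (chemical-mass-balance style). The species, source profiles, and measured values are invented for illustration and are not taken from the record above.

    ```python
    # Minimal sketch of a receptor-model step: apportion ambient species
    # concentrations to sources by non-negative least squares (CMB-style).
    # The profiles and measurements below are invented for illustration only.
    import numpy as np
    from scipy.optimize import nnls

    # Columns = hypothetical sources, rows = measured species (mass fractions).
    profiles = np.array([
        [0.30, 0.05],   # sulfate fraction in source 1, source 2
        [0.10, 0.40],   # organic carbon
        [0.02, 0.20],   # elemental carbon
    ])

    ambient = np.array([3.2, 4.1, 1.6])  # measured concentrations (ug/m3)

    contributions, residual = nnls(profiles, ambient)
    print("estimated source contributions (ug/m3):", contributions)
    print("residual norm:", residual)
    ```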

  20. Modelling extended chromospheres

    NASA Technical Reports Server (NTRS)

    Linsky, J. L.

    1986-01-01

    Attention is given to the concept that the warm, partially ionized plasma (presently called chromosphere) associated with such stars as Alpha Boo and Rho Per extends outwards at least several photospheric radii. Calculations are presented for the Mg II K line in light of two input model atmospheres. Specific predictions are deduced from the results obtained by each of the two models.

  1. SUSY GUT Model Building

    SciTech Connect

    Raby, Stuart

    2008-11-23

    In this talk I discuss the evolution of SUSY GUT model building as I see it. Starting with 4-dimensional model building, I then consider orbifold GUTs in 5 dimensions and finally orbifold GUTs embedded into the E8×E8 heterotic string.

  2. NEP systems model

    NASA Technical Reports Server (NTRS)

    Gilland, Jim; George, Jeffrey A.

    1993-01-01

    Various aspects of nuclear electric propulsion (NEP) systems analysis and modeling are discussed. The following specific topics are covered: (1) systems analysis challenges; (2) goals for NEP systems analysis; (3) the Nuclear Propulsion Office approach; and (4) NEP subsystem model development. The discussion is presented in vugraph form.

  3. Modelling with Magnets.

    ERIC Educational Resources Information Center

    Gabel, Dorothy; And Others

    1992-01-01

    Chemistry can be described on three levels: sensory, molecular, and symbolic. Proposes a particle approach to teaching chemistry that uses magnets to help students construct molecular models and solve particle problems. Includes examples of Johnstone's model of chemistry phenomena, a problem worksheet, and a student concept mastery sheet. (MDH)

  4. PHOTOCHEMICAL BOX MODEL (PBM)

    EPA Science Inventory

    This magnetic tape contains the FORTRAN source code, sample input data, and sample output data for the Photochemical Box Model (PBM). The PBM is a simple stationary single-cell model with a variable height lid designed to provide volume-integrated hour averages of O3 and other ph...

  5. SOSS ICN Model Validation

    NASA Technical Reports Server (NTRS)

    Zhu, Zhifan

    2016-01-01

    Under the NASA-KAIA-KARI ATM research collaboration agreement, SOSS ICN Model has been developed for Incheon International Airport. This presentation describes the model validation work in the project. The presentation will show the results and analysis of the validation.

  6. MODELING PIGEONPEA PHENOLOGY

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Pigeonpea (Cajanus cajan (L.) Millsp.) is a widely grown legume in tropical and subtropical areas. A crop simulation model that can assist in farmer decision-making was developed. The phenological module is one of the major elements of the crop model because accurate prediction of the timing of gr...

  7. MAVRIC-I model

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Model for Aeroelastic Validation Research Involving Computation (MAVRIC-I) model mounted in the Transonic Dynamics Tunnel. Group photo, including (from right to left): John Edwards, Z. Martinovic (Lockheed), Vic Spain (Lockheed), Robert Bartels, Don Keller, and Dave Schuster. Photographed in building 648.

  8. Fictional models in science

    NASA Astrophysics Data System (ADS)

    Morrison, Margaret

    2014-02-01

    When James Clerk Maxwell set out his famous equations 150 years ago, his model of electromagnetism included a piece of pure fiction: an invisible, all-pervasive "aether" made up of elastic vortices separated by electric charges. Margaret Morrison explores how this and other "fictional" models shape science.

  9. ATMOSPHERIC MODEL DEVELOPMENT

    EPA Science Inventory

    This task provides credible state of the art air quality models and guidance for use in implementation of National Ambient Air Quality Standards for ozone and PM. This research effort is to develop and improve air quality models, such as the Community Multiscale Air Quality (CMA...

  10. Modeling prosody: Different approaches

    NASA Astrophysics Data System (ADS)

    Carmichael, Lesley M.

    2002-11-01

    Prosody pervades all aspects of a speech signal, both in terms of raw acoustic outcomes and linguistically meaningful units, from the phoneme to the discourse unit. It is carried in the suprasegmental features of fundamental frequency, loudness, and duration. Several models have been developed to account for the way prosody organizes speech, and they vary widely in terms of their theoretical assumptions, organizational primitives, actual procedures of application to speech, and intended use (e.g., to generate speech from text vs. to model the prosodic phonology of a language). In many cases, these models overtly contradict one another with regard to their fundamental premises or their identification of the perceptible objects of linguistic prosody. These competing models are directly compared. Each model is applied to the same speech samples. This parallel analysis allows for a critical inspection of each model and its efficacy in assessing the suprasegmental behavior of the speech. The analyses illustrate how different approaches are better equipped to account for different aspects of prosody. Viewing the models and their successes from an objective perspective allows for creative possibilities in terms of combining strengths from models which might otherwise be considered fundamentally incompatible.

  11. Unitary Response Regression Models

    ERIC Educational Resources Information Center

    Lipovetsky, S.

    2007-01-01

    The dependent variable in a regular linear regression is a numerical variable, and in a logistic regression it is a binary or categorical variable. In these models the dependent variable has varying values. However, there are problems yielding an identity output of a constant value which can also be modelled in a linear or logistic regression with…

  12. Legal Policy Optimizing Models

    ERIC Educational Resources Information Center

    Nagel, Stuart; Neef, Marian

    1977-01-01

    The use of mathematical models originally developed by economists and operations researchers is described for legal process research. Situations involving plea bargaining, arraignment, and civil liberties illustrate the applicability of decision theory, inventory modeling, and linear programming in operations research. (LBH)

  13. Modelling Rating Scales.

    ERIC Educational Resources Information Center

    Linacre, John M.

    Determination of the intentions of the test developer is fundamental to the choice of the analytical model for a rating scale. For confirmatory analysis, the developer's intentions inform the choice of the general form of the model, representing the manner in which the respondent interacts with the scale; these intentions also inform the choice of

  14. Dynamic Eye Model.

    ERIC Educational Resources Information Center

    Journal of Science and Mathematics Education in Southeast Asia, 1981

    1981-01-01

    Instructions (with diagrams and parts list) are provided for constructing an eye model with a pliable lens made from a plastic bottle which can vary its convexity to accommodate changing positions of an object being viewed. Also discusses concepts which the model can assist in developing. (Author/SK)

  15. HYBRID RECEPTOR MODELS

    EPA Science Inventory

    A hybrid receptor model is a specified mathematical procedure which uses not only the ambient species concentration measurements that form the input data for a pure receptor model, but in addition source emission rates or atmospheric dispersion or transformation information chara...

  16. Dynamic accelerator modeling

    SciTech Connect

    Nishimura, Hiroshi

    1993-05-01

    Object-Oriented Programming has been used extensively to model the LBL Advanced Light Source 1.5 GeV electron storage ring. This paper is on the present status of the class library construction with emphasis on dynamic modeling.

  17. Animal models for osteoporosis

    NASA Technical Reports Server (NTRS)

    Turner, R. T.; Maran, A.; Lotinun, S.; Hefferan, T.; Evans, G. L.; Zhang, M.; Sibonga, J. D.

    2001-01-01

    Animal models will continue to be important tools in the quest to understand the contribution of specific genes to establishment of peak bone mass and optimal bone architecture, as well as the genetic basis for a predisposition toward accelerated bone loss in the presence of co-morbidity factors such as estrogen deficiency. Existing animal models will continue to be useful for modeling changes in bone metabolism and architecture induced by well-defined local and systemic factors. However, there is a critical unfulfilled need to develop and validate better animal models to allow fruitful investigation of the interaction of the multitude of factors which precipitate senile osteoporosis. Well characterized and validated animal models that can be recommended for investigation of the etiology, prevention and treatment of several forms of osteoporosis have been listed in Table 1. Also listed are models which are provisionally recommended. These latter models have potential but are inadequately characterized, deviate significantly from the human response, require careful choice of strain or age, or are not practical for most investigators to adopt. It cannot be stressed strongly enough that the enormous potential of laboratory animals as models for osteoporosis can only be realized if great care is taken in the choice of an appropriate species, age, experimental design, and measurements. Poor choices will result in misinterpretation of results which ultimately can bring harm to patients who suffer from osteoporosis by delaying advancement of knowledge.

  18. MODELING WATER QUALITY

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Water quality models are based on some representation of hydrology and may include movement of surface water, ground water, and mixing of water in lakes and water bodies. Water quality models simulate some combination of sediment, nutrients, heavy metals, xenobiotics, and aquatic biology. Althoug...

  19. Modelling University Governance

    ERIC Educational Resources Information Center

    Trakman, Leon

    2008-01-01

    Twentieth century governance models used in public universities are subject to increasing doubt across the English-speaking world. Governments question if public universities are being efficiently governed; if their boards of trustees are adequately fulfilling their trust obligations towards multiple stakeholders; and if collegial models of

  20. THE AQUATOX MODEL

    EPA Science Inventory

    This lecture will present AQUATOX, an aquatic ecosystem simulation model developed by Dr. Dick Park and supported by the U.S. EPA. The AQUATOX model predicts the fate of various pollutants, such as nutrients and organic chemicals, and their effects on the ecosystem, including fi...

  1. Computer Model Documentation Guide.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    These guidelines for communicating effectively the details of computer model design and operation to persons with varying interests in a model recommend the development of four different types of manuals to meet the needs of managers, users, analysts and programmers. The guidelines for preparing a management summary manual suggest a broad spectrum…

  2. Using Models Effectively

    ERIC Educational Resources Information Center

    Eichinger, John

    2005-01-01

    Models are crucial to science teaching and learning, yet they can create unforeseen and overlooked challenges for students and teachers. For example, consider the time-tested clay volcano that relies on a vinegar and-baking-soda mixture for its "eruption." Based on a classroom demonstration of that geologic model, elementary students may interpret

  3. Earth and ocean modeling

    NASA Technical Reports Server (NTRS)

    Knezovich, F. M.

    1976-01-01

    A modular structured system of computer programs is presented utilizing earth and ocean dynamical data keyed to finitely defined parameters. The model is an assemblage of mathematical algorithms with an inherent capability of maturation with progressive improvements in observational data frequencies, accuracies and scopes. The EOM in its present state is a first-order approach to a geophysical model of the earth's dynamics.

  4. AGRICULTURAL SIMULATION MODEL (AGSIM)

    EPA Science Inventory

    AGSIM is a large-scale econometric simulation model of regional crop and national livestock production in the United States. The model was initially developed to analyze the aggregate economic impacts of a wide variety of issues facing agriculture, such as technological change, pest...

  5. Foundations of Biomolecular Modeling

    PubMed Central

    Jorgensen, William L.

    2014-01-01

    The 2013 Nobel Prize in Chemistry has been awarded to Martin Karplus, Michael Levitt, and Arieh Warshel for “Development of Multiscale Models for Complex Chemical Systems”. The honored work from the 1970s has provided a foundation for the widespread activities today in modeling organic and biomolecular systems. PMID:24315087

  6. STREAM WATER QUALITY MODEL

    EPA Science Inventory

    QUAL2K (or Q2K) is a river and stream water quality model that is intended to represent a modernized version of the QUAL2E (or Q2E) model (Brown and Barnwell 1987). Q2K is similar to Q2E in the following respects:

    • One dimensional. The channel is well-mixed vertically a...

    • Modeling Water Filtration

      ERIC Educational Resources Information Center

      Parks, Melissa

      2014-01-01

      Model-eliciting activities (MEAs) are not new to those in engineering or mathematics, but they were new to Melissa Parks. Model-eliciting activities are simulated real-world problems that integrate engineering, mathematical, and scientific thinking as students find solutions for specific scenarios. During this process, students generate solutions…

    • Model-Based Reasoning

      ERIC Educational Resources Information Center

      Ifenthaler, Dirk; Seel, Norbert M.

      2013-01-01

      In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

    • MODELING THE AMES TEST

      EPA Science Inventory

      Despite the value and widespread use of the Ames test, little attention has been focused on standardizing quantitative methods of analyzing these data. In this paper, a realistic and statistically tractable model is developed for the evaluation of Ames-type data. The model assume...

    • Automated Student Model Improvement

      ERIC Educational Resources Information Center

      Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

      2012-01-01

      Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

    • Updating Situation Models

      ERIC Educational Resources Information Center

      Zwaan, Rolf A.; Madden, Carol J.

      2004-01-01

      The authors examined how situation models are updated during text comprehension. If comprehenders keep track of the evolving situation, they should update their models such that the most current information, the here and now, is more available than outdated information. Contrary to this updating hypothesis, E. J. O'Brien, M. L. Rizzella, J. E.…

    • Composite Load Model Evaluation

      SciTech Connect

      Lu, Ning; Qiao, Hong

      2007-09-30

    The WECC load modeling task force has dedicated its effort in the past few years to develop a composite load model that can represent behaviors of different end-user components. The modeling structure of the composite load model is recommended by the WECC load modeling task force. GE Energy has implemented this composite load model with a new function CMPLDW in its power system simulation software package, PSLF. For the last several years, Bonneville Power Administration (BPA) has taken the lead and collaborated with GE Energy to develop the new composite load model. Pacific Northwest National Laboratory (PNNL) and BPA joined forces and conducted the evaluation of the CMPLDW, testing its parameter settings to make sure that: • the model initializes properly, • all the parameter settings are functioning, and • the simulation results are as expected. The PNNL effort focused on testing the CMPLDW in a 4-bus system. Exhaustive testing of each parameter setting has been performed to guarantee that each setting works. This report is a summary of the PNNL testing results and conclusions.

    • The EMEFS model evaluation

      SciTech Connect

      Barchet, W.R. ); Dennis, R.L. ); Seilkop, S.K. ); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. ); Byun, D.; McHenry, J.N.

      1991-12-01

      The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.

    • Dual-Schemata Model

      NASA Astrophysics Data System (ADS)

      Taniguchi, Tadahiro; Sawaragi, Tetsuo

      In this paper, a new machine-learning method, called the Dual-Schemata model, is presented. The Dual-Schemata model is a self-organizational machine-learning method for an autonomous robot interacting with an unknown dynamical environment. It is based on Piaget's schema model, a classical psychological model that explains memory and cognitive development in human beings. Our Dual-Schemata model is developed as a computational model of Piaget's schema model, focusing especially on the sensori-motor developmental period. This developmental process is characterized by a pair of mutually interacting dynamics: one formed by assimilation and accommodation, and the other formed by equilibration and differentiation. Through these dynamics, the schema system enables an agent to act well in the real world. The schema's differentiation process corresponds to a symbol-formation process occurring within an autonomous agent when it interacts with an unknown, dynamically changing environment. Experimental results obtained from an autonomous facial robot in which our model is embedded are presented; the robot becomes able to chase a ball moving in various ways without any rewards or teaching signals from outside. Moreover, the emergence of concepts about the target movements within the robot is shown and discussed in terms of fuzzy logic on set-subset inclusion relationships.

    • Evaluating Causal Models.

      ERIC Educational Resources Information Center

      Watt, James H., Jr.

      Pointing out that linear causal models can organize the interrelationships of a large number of variables, this paper contends that such models are particularly useful to mass communication research, which must by necessity deal with complex systems of variables. The paper first outlines briefly the philosophical requirements for establishing a…

    • Postinstability models in elasticity

      NASA Technical Reports Server (NTRS)

      Zak, M.

      1984-01-01

      It is demonstrated that the instability caused by the failure of hyperbolicity in elasticity and associated with the problem of unpredictability in classical mechanics expresses the incompleteness of the original model of an elastic medium. The instability as well as the ill-posedness of the Cauchy problem are eliminated by reformulating the original model.

    • Review of Model Specifications.

      ERIC Educational Resources Information Center

      Prather, James E.

      A salary prediction model for college faculty that is used at Georgia State University was reviewed and tested using multiple regression analysis. Various model specifications, incorporating academic rank, academic discipline, and academic experience, including professional and personal background characteristics, are reviewed. Academic rank is an…

    • The Rasch Testlet Model

      ERIC Educational Resources Information Center

      Wang, Wen-Chung; Wilson, Mark

      2005-01-01

      The Rasch testlet model for both dichotomous and polytomous items in testlet-based tests is proposed. It can be viewed as a special case of the multidimensional random coefficients multinomial logit model (MRCMLM). Therefore, the estimation procedures for the MRCMLM can be directly applied. Simulations were conducted to examine parameter recovery…

    • Systematic Eclectic Models.

      ERIC Educational Resources Information Center

      Mahalik, James R.

      1990-01-01

      Presents and evaluates four systematic eclectic models of psychotherapy: Beutler's eclectic psychotherapy; Howard, Nance, and Myers' adaptive counseling and therapy; Lazarus' multimodal therapy; and Prochaska and DiClemente's transtheoretical approach. Examines support for these models and makes conceptual and empirical recommendations.

    • VENTURI SCRUBBER PERFORMANCE MODEL

      EPA Science Inventory

      The paper presents a new model for predicting the particle collection performance of venturi scrubbers. It assumes that particles are collected by atomized liquid only in the throat section. The particle collection mechanism is inertial impaction, and the model uses a single drop...

  1. Modeling Antibody Diversity.

    ERIC Educational Resources Information Center

    Baker, William P.; Moore, Cathy Ronstadt

    1998-01-01

    Understanding antibody structure and function is difficult for many students. The rearrangement of constant and variable regions during antibody differentiation can be effectively simulated using a paper model. Describes a hands-on laboratory exercise which allows students to model antibody diversity using readily available resources. (PVD)

  2. Multilevel Mixture Factor Models

    ERIC Educational Resources Information Center

    Varriale, Roberta; Vermunt, Jeroen K.

    2012-01-01

    Factor analysis is a statistical method for describing the associations among sets of observed variables in terms of a small number of underlying continuous latent variables. Various authors have proposed multilevel extensions of the factor model for the analysis of data sets with a hierarchical structure. These Multilevel Factor Models (MFMs)…

  3. Model State Efforts.

    ERIC Educational Resources Information Center

    Morgan, Gwen

    Models of state involvement in training child care providers are briefly discussed and the employers' role in training is explored. Six criteria for states that are taken as models are identified, and four are described. Various state activities are described for each criterion. It is noted that little is known about employer and other private…

  4. Structural Equation Model Trees

    ERIC Educational Resources Information Center

    Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman

    2013-01-01

    In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree…

  5. Validation of mesoscale models

    NASA Technical Reports Server (NTRS)

    Kuo, Bill; Warner, Tom; Benjamin, Stan; Koch, Steve; Staniforth, Andrew

    1993-01-01

    The topics discussed include the following: verification of cloud prediction from the PSU/NCAR mesoscale model; results from MAPS/NGM verification comparisons and MAPS observation sensitivity tests to ACARS and profiler data; systematic errors and mesoscale verification for a mesoscale model; and the COMPARE Project and the CME.

  6. Using Models Effectively

    ERIC Educational Resources Information Center

    Eichinger, John

    2005-01-01

    Models are crucial to science teaching and learning, yet they can create unforeseen and overlooked challenges for students and teachers. For example, consider the time-tested clay volcano that relies on a vinegar and-baking-soda mixture for its "eruption." Based on a classroom demonstration of that geologic model, elementary students may interpret…

  7. Video Self-Modeling

    ERIC Educational Resources Information Center

    Buggey, Tom; Ogle, Lindsey

    2012-01-01

    Video self-modeling (VSM) first appeared on the psychology and education stage in the early 1970s. The practical applications of VSM were limited by lack of access to tools for editing video, which is necessary for almost all self-modeling videos. Thus, VSM remained in the research domain until the advent of camcorders and VCR/DVD players and,…

  8. Models and Metaphors

    ERIC Educational Resources Information Center

    Ivie, Stanley D.

    2007-01-01

    Humanity delights in spinning conceptual models of the world. These models, in turn, mirror their respective root metaphors. Three root metaphors--spiritual, organic, and mechanical--have dominated western thought. The spiritual metaphor runs from Plato, through Hegel, and connects with Montessori. The organic metaphor extends from Aristotle,…

  9. MULTIMEDIA EXPOSURE MODELING

    EPA Science Inventory

    This task addresses a number of issues that arise in multimedia modeling with an emphasis on interactions among the atmosphere and multiple other environmental media. Approaches for working with multiple types of models and the data sets are being developed. Proper software tool...

  10. Pathological Gambling: Psychiatric Models

    ERIC Educational Resources Information Center

    Westphal, James R.

    2008-01-01

    Three psychiatric conceptual models: addictive, obsessive-compulsive spectrum and mood spectrum disorder have been proposed for pathological gambling. The objectives of this paper are to (1) evaluate the evidence base from the most recent reviews of each model, (2) update the evidence through 2007 and (3) summarize the status of the evidence for…

  11. Modeling HIV Cure

    NASA Astrophysics Data System (ADS)

    Perelson, Alan; Conway, Jessica; Cao, Youfang

    A large effort is being made to find a means to cure HIV infection. I will present a dynamical model of post-treatment control (PTC) or "functional cure" of HIV infection. Some patients treated with suppressive antiviral therapy have been taken off of therapy and then spontaneously control HIV infection such that the amount of virus in the circulation is maintained undetectable by clinical assays for years. The model explains PTC occurring in some patients by having a parameter regime in which the model exhibits bistability, with both a low and high steady state viral load being stable. The model makes a number of predictions about how to attain the low PTC steady state. Bistability in this model depends upon the immune response becoming exhausted when overstimulated. I will also present a generalization of the model in which immunotherapy can be used to reverse immune exhaustion and compare model predictions with experiments in SIV infected macaques given immunotherapy and then taken off of antiretroviral therapy. Lastly, if time permits, I will discuss one of the hurdles to true HIV eradication, latently infected cells, and present clinical trial data and a new model addressing pharmacological means of flushing out the latent reservoir. Supported by NIH Grants AI028433 and OD011095.

  12. Modeling Water Filtration

    ERIC Educational Resources Information Center

    Parks, Melissa

    2014-01-01

    Model-eliciting activities (MEAs) are not new to those in engineering or mathematics, but they were new to Melissa Parks. Model-eliciting activities are simulated real-world problems that integrate engineering, mathematical, and scientific thinking as students find solutions for specific scenarios. During this process, students generate solutions

  13. Diabetic atherosclerosis mouse models.

    PubMed

    Wu, Kenneth K; Huan, Youming

    2007-04-01

    Coronary heart disease (CHD) due to atherosclerosis is the leading cause of death in the USA, and accelerated CHD has emerged as a leading cause of morbidity and mortality in diabetic patients in the USA and worldwide. This has highlighted the importance and urgency of studying the mechanism of diabetic atherosclerosis and exploring therapeutic options. Due to its unique advantages over other animal models, the mouse is the most used model for studying the mechanism of diabetes-accelerated atherosclerosis and exploring effective therapeutic approaches. In the past decade, several diabetic atherosclerosis mouse models have been established. Currently, however, there is no ideal animal model for diabetic atherosclerosis. To determine the characteristics of the models that more closely resemble human diabetic atherosclerosis disease, this review focuses on the common diabetic atherosclerosis mouse models with respect to the following issues: (1) whether the mice retain diabetic condition; (2) whether the diabetes accelerates atherosclerosis or increases atherogenic inflammation; (3) whether these factors respond to medical interventions. The discussion is aimed at identifying different diabetic mouse models and their features, in order to heighten awareness of the appropriate models that may provide useful tools for studying the mechanism of diabetes-accelerated atherosclerosis and evaluating therapeutic options. PMID:16979174

  14. Modeling for Insights

    SciTech Connect

    Jacob J. Jacobson; Gretchen Matthern

    2007-04-01

    System Dynamics is a computer-aided approach to evaluating the interrelationships of different components and activities within complex systems. Recently, System Dynamics models have been developed in areas such as policy design, biological and medical modeling, energy and environmental analysis, and various other areas in the natural and social sciences. The real power of System Dynamics modeling is gaining insights into total system behavior as time and system parameters are adjusted and the effects are visualized in real time. System Dynamics models allow decision makers and stakeholders to explore the long-term behavior and performance of complex systems, especially in the context of dynamic processes and changing scenarios, without having to wait decades to obtain field data or risk failure if a poor management or design approach is used. The Idaho National Laboratory has recently been developing a System Dynamics model of the US Nuclear Fuel Cycle. The model is intended to be used to identify and understand interactions throughout the entire nuclear fuel cycle and suggest sustainable development strategies. This paper describes the basic framework of the current model and presents examples of useful insights gained from the model thus far with respect to sustainable development of nuclear power.
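
    To make the stock-and-flow idea concrete, here is a minimal generic System Dynamics sketch, not the INL fuel-cycle model: a single spent-fuel inventory stock driven by an assumed discharge rate and reprocessing capacity, integrated with Euler steps. The rates and capacities are illustrative assumptions.

    ```python
    # Generic stock-and-flow sketch in the System Dynamics style (not the INL
    # fuel-cycle model): a single "spent fuel inventory" stock fed by a
    # discharge inflow and drained by a reprocessing outflow, Euler-integrated.
    def simulate(years=50, dt=0.25, discharge_rate=2000.0, reprocess_capacity=1500.0):
        t, stock = 0.0, 0.0          # tonnes of spent fuel in storage
        history = []
        while t < years:
            inflow = discharge_rate                       # tonnes/year discharged
            outflow = min(reprocess_capacity, stock / dt) # cannot remove more than exists
            stock += (inflow - outflow) * dt
            history.append((round(t, 2), round(stock, 1)))
            t += dt
        return history

    if __name__ == "__main__":
        for t, s in simulate()[::20]:
            print(f"year {t:5.1f}: inventory {s:8.1f} t")
    ```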

  15. Models in Biology.

    ERIC Educational Resources Information Center

    Flannery, Maura C.

    1997-01-01

    Addresses the most popular models currently being chosen for biological research and the reasons behind those choices. Among the current favorites are zebra fish, fruit flies, mice, monkeys, and yeast. Concludes with a brief examination of the ethical issues involved, and why some animals may need to be replaced in research with model systems.…

  16. Stereolithography models. Final report

    SciTech Connect

    Smith, R.E.

    1995-03-01

    This report describes the first stereolithographic models made, which proved out a new release of ProEngineer software (Parametric Technologies, or PTC) and 3D Systems (Valencia, California) software for the SLA 250 machine. They are a model of benzene and the α-carbon backbone of the variable region of an antibody.

  17. Mathematical models of hysteresis

    SciTech Connect

    1998-08-01

    The ongoing research has largely been focused on the development of mathematical models of hysteretic nonlinearities with nonlocal memories. The distinct feature of these nonlinearities is that their current states depend on past histories of input variations. It turns out that memories of hysteretic nonlinearities are quite selective. Indeed, experiments show that only some past input extrema (not the entire input variations) leave their marks upon future states of hysteretic nonlinearities. Thus special mathematical tools are needed in order to describe nonlocal selective memories of hysteretic nonlinearities. The origin of such tools can be traced back to the landmark paper of Preisach. Their research has been primarily concerned with Preisach-type models of hysteresis. All these models have a common generic feature; they are constructed as superpositions of the simplest hysteretic nonlinearities: rectangular loops. During the past four years, the study has been by and large centered around the following topics: (1) further development of scalar and vector Preisach-type models of hysteresis; (2) experimental testing of Preisach-type models of hysteresis; (3) development of new models for viscosity (aftereffect) in hysteretic systems; (4) development of mathematical models for superconducting hysteresis in the case of gradual resistive transitions; (5) software implementation of Preisach-type models of hysteresis; and (6) development of new ideas which have emerged in the course of the research work. The author briefly describes the main scientific results obtained in the areas outlined above.
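
    As a hedged illustration of the Preisach construction mentioned above (a superposition of rectangular-loop relay operators whose memory of past input extrema is selective), the sketch below implements a discrete scalar Preisach model; the thresholds and weights are arbitrary and purely illustrative.

    ```python
    # Minimal discrete scalar Preisach sketch: the output is a weighted sum of
    # rectangular-loop relay operators, each defined by switch-up/switch-down
    # thresholds (alpha >= beta). Thresholds and weights are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    alpha = rng.uniform(-1.0, 1.0, 200)
    beta = rng.uniform(-1.0, 1.0, 200)
    alpha, beta = np.maximum(alpha, beta), np.minimum(alpha, beta)
    weights = np.full(alpha.size, 1.0 / alpha.size)
    state = -np.ones(alpha.size)            # every relay starts in the "down" state

    def step(u):
        """Update all relays for input u and return the summed output."""
        state[u >= alpha] = 1.0             # switch up when input exceeds alpha
        state[u <= beta] = -1.0             # switch down when input drops below beta
        return float(weights @ state)

    # Drive the model with a decaying oscillation: past extrema leave their mark.
    inputs = [np.sin(x) * np.exp(-0.05 * x) for x in np.linspace(0, 20, 200)]
    outputs = [step(u) for u in inputs]
    print(f"final input {inputs[-1]:+.3f} -> output {outputs[-1]:+.3f}")
    ```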

  18. Erosion by Wind: Modeling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models of wind erosion are used to investigate fundamental processes and guide resource management. Many models are similar in that - temporal variables control soil wind erodibility; erosion begins when friction velocity exceeds a threshold; and transport capacity for saltation/creep is proportion...

  19. Modeling Carbon Exchange

    NASA Technical Reports Server (NTRS)

    Sellers, Piers

    2012-01-01

    Model results will be reviewed to assess different methods for bounding the terrestrial role in the global carbon cycle. It is proposed that a series of climate model runs could be scoped that would tighten the limits on the "missing sink" of terrestrial carbon and could also direct future satellite image analyses to search for its geographical location and understand its seasonal dynamics.

  20. Canister Model, Systems Analysis

    Energy Science and Technology Software Center (ESTSC)

    1993-09-29

    This package provides a computer simulation of a systems model for packaging nuclear waste and spent nuclear fuel in canisters. The canister model calculates overall programmatic cost, number of canisters, and fuel and waste inventories for the Idaho Chemical Processing Plant (other initial conditions can be entered).

  1. Modeling and Interrogative Strategies.

    ERIC Educational Resources Information Center

    Denney, Douglas R.

    Three studies to determine the effects of adult models on interrogative strategies of children (ages 6-11) are reviewed. Two issues are analyzed: (1) the comparative effectiveness of various types of modeling procedures for changing rule-governed behaviors, and (2) the interaction between observational learning and the developmental level of the…

  2. Solar Atmosphere Models

    NASA Astrophysics Data System (ADS)

    Rutten, R. J.

    2002-12-01

    This contribution honoring Kees de Jager's 80th birthday is a review of "one-dimensional" solar atmosphere modeling that followed on the initial "Utrecht Reference Photosphere" of Heintze, Hubenet & de Jager (1964). My starting point is the Bilderberg conference, convened by de Jager in 1967 at the time when NLTE radiative transfer theory became mature. The resulting Bilderberg model was quickly superseded by the HSRA and later by the VAL-FAL sequence of increasingly sophisticated NLTE continuum-fitting models from Harvard. They became the "standard models" of solar atmosphere physics, but Holweger's relatively simple LTE line-fitting model still persists as a favorite of solar abundance determiners. After a brief model inventory I discuss subsequent work on the major modeling issues (coherency, NLTE, dynamics) listed as to-do items by de Jager in 1968. The present conclusion is that one-dimensional modeling recovers Schwarzschild's (1906) finding that the lower solar atmosphere is grosso modo in radiative equilibrium. This is a boon for applications regarding the solar atmosphere as one-dimensional stellar example - but the real sun, including all the intricate phenomena that now constitute the mainstay of solar physics, is vastly more interesting.

  3. Mars Interior Models

    NASA Astrophysics Data System (ADS)

    Spohn, T.

    Recent adjustments to interior structure models of Mars (Sohl et al., 2005) have used improved values of the polar moment of inertia factor. These models were correctly calculated from the mean moment of inertia factor rather than the polar moment of inertia factor; the two are linked through the gravitational oblateness J2. The new models suggest larger cores by tens of kilometers and mantle densities smaller by several tens of kilograms per cubic meter in comparison with previous models. The larger cores make a present-day perovskite layer at the base of the mantle even less likely than previously thought, but such a layer may still have been present in the early evolution when mantle temperatures were higher. The absence of a perovskite layer makes it more difficult to accept models in which Tharsis is presently supported by a mantle super plume. The super plume model is also difficult to reconcile with thermal evolution models of the mantle, which suggest a decrease of the core heat flow over time. While these models would allow Tharsis to be formed by a super plume, they would suggest that the super plume disappeared some time in the Hesperian. Post-Noachian volcanism may be fed by an asthenosphere caused by the thermal blanketing of the thick Tharsis crust (Schuhmacher and Breuer, 2006). Schuhmacher, S. and D. Breuer, 2006, JGR 111, doi:10.1029/2005JE002429. Sohl, F., G. Schubert, and T. Spohn, 2005, JGR 110, doi:10.1029/2005JE002520.
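
    The link between the mean and polar moment-of-inertia factors through J2 follows from I_mean = (A + B + C)/3 and J2 = [C - (A + B)/2]/(M R^2), which give C/(M R^2) = I_mean/(M R^2) + (2/3) J2. The sketch below evaluates this relation with approximate Mars values chosen for illustration only.

    ```python
    # Relation between the mean and polar moment-of-inertia factors via the
    # gravitational oblateness J2 (values below are approximate, illustrative):
    #   J2 = [C - (A + B)/2] / (M R^2)   and   I_mean = (A + B + C) / 3
    #   =>  C/(M R^2) = I_mean/(M R^2) + (2/3) * J2
    J2_mars = 1.9566e-3          # approximate degree-2 zonal harmonic of Mars
    mean_moi_factor = 0.3635     # approximate mean moment-of-inertia factor

    polar_moi_factor = mean_moi_factor + (2.0 / 3.0) * J2_mars
    print(f"polar moment-of-inertia factor C/MR^2 ~ {polar_moi_factor:.4f}")
    ```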

  4. Turbulence Modeling: A NASA Perspective

    NASA Technical Reports Server (NTRS)

    Gatski, T. B.

    2001-01-01

    This paper presents turbulence modeling from NASA's perspective. The topics include: 1) Hierarchy of Solution Methods; 2) Turbulence Modeling Focus; 3) Linear Eddy Viscosity Models; and 4) Nonlinear Eddy Viscosity Algebraic Stress Models.

  5. Australia's Next Top Fraction Model

    ERIC Educational Resources Information Center

    Gould, Peter

    2013-01-01

    Peter Gould suggests Australia's next top fraction model should be a linear model rather than an area model. He provides a convincing argument and gives examples of ways to introduce a linear model in primary classrooms.

  6. XAFS Model Compound Library

    DOE Data Explorer

    Newville, Matthew

    The XAFS Model Compound Library contains XAFS data on model compounds. The term "model" compounds refers to compounds of homogeneous and well-known crystallographic or molecular structure. Each data file in this library has an associated atoms.inp file that can be converted to a feff.inp file using the program ATOMS. (See the related Searchable Atoms.inp Archive at http://cars9.uchicago.edu/~newville/adb/) This Library exists because XAFS data on model compounds is useful for several reasons, including comparing to unknown data for "fingerprinting" and testing calculations and analysis methods. The collection here is currently limited, but is growing. The focus to date has been on inorganic compounds and minerals of interest to the geochemical community. [Copied, with editing, from http://cars9.uchicago.edu/~newville/ModelLib/]

  7. Impedance modelling of pipes

    NASA Astrophysics Data System (ADS)

    Creasy, M. Austin

    2016-03-01

    Impedance models of pipes can be used to estimate resonant frequencies of standing waves and model acoustic pressure of closed and open ended pipes. Modelling a pipe with impedance methods allows additional variations to the pipe to be included in the overall model as a system. Therefore an actuator can be attached and used to drive the system and the impedance model is able to include the dynamics of the actuator. Exciting the pipe system with a chirp signal allows resonant frequencies to be measured in both the time and frequency domain. The measurements in the time domain are beneficial for introducing undergraduates to resonances without needing an understanding of fast Fourier transforms. This paper also discusses resonant frequencies in open ended pipes and how numerous texts incorrectly approximate the resonant frequencies for this specific pipe system.
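
    For reference, the ideal textbook resonance formulas (no end corrections, and not the impedance model itself) are n c/(2L) for an open-open pipe and (2n - 1) c/(4L) for a closed-open pipe. A minimal sketch with an assumed pipe length:

    ```python
    # Ideal standing-wave resonances of a pipe of length L (no end corrections,
    # not the impedance model itself): open-open uses n*c/(2L); closed-open
    # supports only the odd harmonics (2n-1)*c/(4L).
    def open_open_resonances(length_m, n_modes=4, c=343.0):
        return [n * c / (2.0 * length_m) for n in range(1, n_modes + 1)]

    def closed_open_resonances(length_m, n_modes=4, c=343.0):
        return [(2 * n - 1) * c / (4.0 * length_m) for n in range(1, n_modes + 1)]

    if __name__ == "__main__":
        L = 0.5  # pipe length in metres (illustrative)
        print("open-open   (Hz):", [round(f, 1) for f in open_open_resonances(L)])
        print("closed-open (Hz):", [round(f, 1) for f in closed_open_resonances(L)])
    ```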

  8. Proton channel models

    PubMed Central

    Pupo, Amaury; Baez-Nieto, David; Martínez, Agustín; Latorre, Ramón; González, Carlos

    2014-01-01

    Voltage-gated proton channels are integral membrane proteins with the capacity to permeate elementary particles in a voltage and pH dependent manner. These proteins have been found in several species and are involved in various physiological processes. Although their primary topology is known, lack of details regarding their structures in the open conformation has limited analyses toward a deeper understanding of the molecular determinants of their function and regulation. Consequently, the function-structure relationships have been inferred based on homology models. In the present work, we review the existing proton channel models, their assumptions, predictions and the experimental facts that support them. Modeling proton channels is not a trivial task due to the lack of a close homolog template. Hence, there are important differences between published models. This work attempts to critically review existing proton channel models toward the aim of contributing to a better understanding of the structural features of these proteins. PMID:24755912

  9. Direct insolation models

    SciTech Connect

    Bird, R.; Hulstrom, R.L.

    1980-01-01

    Several recently published models of the direct component of the broadband insolation are compared for clear sky conditions. The comparison includes seven simple models and one rigorous model that is used as a basis for determining accuracy. Where possible, the comparison is made between the results of each model for each atmospheric constituent (H2O, CO2, O3, O2, aerosol and molecular scattering) separately as well as for the combined effect of all of the constituents. Two optimum simple models of varying degrees of complexity are developed as a result of this comparison. The study indicates: aerosols dominate the attenuation of the direct beam for reasonable atmospheric conditions; molecular scattering is next in importance; water vapor is an important absorber; and carbon dioxide and oxygen are relatively unimportant as attenuators of the broadband solar energy.
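
    As a hedged sketch of the kind of clear-sky direct-beam attenuation being compared (not any of the specific published models), the example below applies a Beer-Lambert transmittance with separate, purely illustrative component optical depths.

    ```python
    # Beer-Lambert style sketch of clear-sky direct-beam irradiance: the broadband
    # beam is attenuated by component optical depths scaled by air mass. This is
    # not any of the published models compared above; the depths are illustrative.
    import math

    def direct_beam(zenith_deg, solar_constant=1361.0,
                    tau_rayleigh=0.10, tau_aerosol=0.15,
                    tau_water=0.08, tau_ozone=0.02):
        air_mass = 1.0 / math.cos(math.radians(zenith_deg))   # plane-parallel approx.
        total_tau = tau_rayleigh + tau_aerosol + tau_water + tau_ozone
        return solar_constant * math.exp(-total_tau * air_mass)

    for z in (0, 30, 60):
        print(f"zenith {z:2d} deg -> direct beam ~ {direct_beam(z):7.1f} W/m^2")
    ```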

  10. Linear models: permutation methods

    USGS Publications Warehouse

    Cade, B.S.

    2005-01-01

    Permutation tests (see Permutation Based Inference) for the linear model have applications in behavioral studies when traditional parametric assumptions about the error term in a linear model are not tenable. Improved validity of Type I error rates can be achieved with properly constructed permutation tests. Perhaps more importantly, increased statistical power, improved robustness to effects of outliers, and detection of alternative distributional differences can be achieved by coupling permutation inference with alternative linear model estimators. For example, it is well known that estimates of the mean in a linear model are extremely sensitive to even a single outlying value of the dependent variable compared to estimates of the median [7, 19]. Traditionally, linear modeling focused on estimating changes in the center of distributions (means or medians). However, quantile regression allows distributional changes to be estimated in all or any selected part of a distribution of responses, providing a more complete statistical picture that has relevance to many biological questions [6]...
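
    A minimal sketch of the basic permutation idea, using ordinary least squares on synthetic data rather than the alternative estimators discussed above: shuffle the response, refit, and compare the observed slope with the permutation distribution.

    ```python
    # Minimal permutation test for a simple linear-regression slope: shuffle the
    # response, refit, and see how often the permuted slope is as extreme as the
    # observed one. Generic illustration with synthetic data.
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(size=60)
    y = 0.4 * x + rng.normal(size=60)          # synthetic data with a real effect

    def slope(x, y):
        return np.polyfit(x, y, 1)[0]

    observed = slope(x, y)
    perm_slopes = np.array([slope(x, rng.permutation(y)) for _ in range(2000)])
    p_value = np.mean(np.abs(perm_slopes) >= abs(observed))
    print(f"observed slope {observed:.3f}, permutation p-value {p_value:.4f}")
    ```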

  11. Criticality Model Report

    SciTech Connect

    J.M. Scaglione

    2003-03-12

    The purpose of the "Criticality Model Report" is to validate the MCNP (CRWMS M&O 1998h) code's ability to accurately predict the effective neutron multiplication factor (k_eff) for a range of conditions spanned by various critical configurations representative of the potential configurations commercial reactor assemblies stored in a waste package may take. Results of this work are an indication of the accuracy of MCNP for calculating eigenvalues, which will be used as input for criticality analyses for spent nuclear fuel (SNF) storage at the proposed Monitored Geologic Repository. The scope of this report is to document the development and validation of the criticality model. The scope of the criticality model is only applicable to commercial pressurized water reactor fuel. Valid ranges are established as part of the validation of the criticality model. This model activity follows the description in BSC (2002a).

  12. Integrated Environmental Control Model

    Energy Science and Technology Software Center (ESTSC)

    1999-09-03

    IECM is a powerful multimedia engineering software program for simulating an integrated coal-fired power plant. It provides a capability to model various conventional and advanced processes for controlling air pollutant emissions from coal-fired power plants before, during, or after combustion. The principal purpose of the model is to calculate the performance, emissions, and cost of power plant configurations employing alternative environmental control methods. The model consists of various control technology modules, which may be integrated into a complete utility plant in any desired combination. In contrast to conventional deterministic models, the IECM offers the unique capability to assign probabilistic values to all model input parameters, and to obtain probabilistic outputs in the form of cumulative distribution functions indicating the likelihood of different costs and performance results. A Graphical User Interface (GUI) facilitates the configuration of the technologies, entry of data, and retrieval of results.
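
    A minimal sketch of the probabilistic-input idea described above: sample uncertain inputs, propagate them through a made-up cost relation (not the IECM's), and summarize the resulting cost distribution. The distributions, cost formula, and parameter values are all illustrative assumptions.

    ```python
    # Minimal Monte Carlo sketch of probabilistic inputs producing a cost CDF,
    # in the spirit of the capability described above (the cost relation and
    # distributions here are invented for illustration).
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000
    capital_cost = rng.normal(1200.0, 150.0, n)     # $/kW, uncertain
    capacity_factor = rng.uniform(0.60, 0.85, n)    # fraction of the year at full load
    fuel_cost = rng.normal(25.0, 5.0, n)            # $/MWh, uncertain

    # Very rough levelized cost per MWh: annualized capital spread over output, plus fuel.
    annualized_capital = capital_cost * 0.10 * 1000.0   # $/kW-yr -> $/MW-yr
    mwh_per_mw_year = capacity_factor * 8760.0
    levelized = annualized_capital / mwh_per_mw_year + fuel_cost

    for q in (0.05, 0.50, 0.95):
        print(f"{int(q*100):2d}th percentile cost: {np.quantile(levelized, q):6.1f} $/MWh")
    ```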

  13. Beyond the Standard Model

    SciTech Connect

    Peskin, M.E.

    1997-05-01

    These lectures constitute a short course in "Beyond the Standard Model" for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e+e- colliders.

  14. Fuzzy object modeling

    NASA Astrophysics Data System (ADS)

    Udupa, Jayaram K.; Odhner, Dewey; Falcao, Alexandre X.; Ciesielski, Krzysztof C.; Miranda, Paulo A. V.; Vaideeswaran, Pavithra; Mishra, Shipra; Grevera, George J.; Saboury, Babak; Torigian, Drew A.

    2011-03-01

    To make Quantitative Radiology (QR) a reality in routine clinical practice, computerized automatic anatomy recognition (AAR) becomes essential. As part of this larger goal, we present in this paper a novel fuzzy strategy for building bodywide group-wise anatomic models. They have the potential to handle uncertainties and variability in anatomy naturally and to be integrated with the fuzzy connectedness framework for image segmentation. Our approach is to build a family of models, called the Virtual Quantitative Human, representing normal adult subjects at a chosen resolution of the population variables (gender, age). Models are represented hierarchically, the descendants representing organs contained in parent organs. Based on an index of fuzziness of the models, 32 thorax data sets, and 10 organs defined in them, we found that the hierarchical approach to modeling can effectively handle the non-linear relationships in position, scale, and orientation that exist among organs in different patients.

  15. Stratiform chromite deposit model

    USGS Publications Warehouse

    Schulte, Ruth F.; Taylor, Ryan D.; Piatak, Nadine M.; Seal, Robert R., II

    2010-01-01

    Stratiform chromite deposits are of great economic importance, yet their origin and evolution remain highly debated. Layered igneous intrusions such as the Bushveld, Great Dyke, Kemi, and Stillwater Complexes, provide opportunities for studying magmatic differentiation processes and assimilation within the crust, as well as related ore-deposit formation. Chromite-rich seams within layered intrusions host the majority of the world's chromium reserves and may contain significant platinum-group-element (PGE) mineralization. This model of stratiform chromite deposits is part of an effort by the U.S. Geological Survey's Mineral Resources Program to update existing models and develop new descriptive mineral deposit models to supplement previously published models for use in mineral-resource and mineral-environmental assessments. The model focuses on features that may be common to all stratiform chromite deposits as a way to gain insight into the processes that gave rise to their emplacement and to the significant economic resources contained in them.

  16. A model of plausibility.

    PubMed

    Connell, Louise; Keane, Mark T

    2006-01-01

    Plausibility has been implicated as playing a critical role in many cognitive phenomena from comprehension to problem solving. Yet, across cognitive science, plausibility is usually treated as an operationalized variable or metric rather than being explained or studied in itself. This article describes a new cognitive model of plausibility, the Plausibility Analysis Model (PAM), which is aimed at modeling human plausibility judgment. This model uses commonsense knowledge of concept-coherence to determine the degree of plausibility of a target scenario. In essence, a highly plausible scenario is one that fits prior knowledge well: with many different sources of corroboration, without complexity of explanation, and with minimal conjecture. A detailed simulation of empirical plausibility findings is reported, which shows a close correspondence between the model and human judgments. In addition, a sensitivity analysis demonstrates that PAM is robust in its operations. PMID:21702810

  17. A Preliminary Jupiter Model

    NASA Astrophysics Data System (ADS)

    Hubbard, W. B.; Militzer, B.

    2016-03-01

    In anticipation of new observational results for Jupiter's axial moment of inertia and gravitational zonal harmonic coefficients from the forthcoming Juno orbiter, we present a number of preliminary Jupiter interior models. We combine results from ab initio computer simulations of hydrogen-helium mixtures, including immiscibility calculations, with a new nonperturbative calculation of Jupiter's zonal harmonic coefficients, to derive a self-consistent model for the planet's external gravity and moment of inertia. We assume helium rain modified the interior temperature and composition profiles. Our calculation predicts zonal harmonic values to which measurements can be compared. Although some models fit the observed (pre-Juno) second- and fourth-order zonal harmonics to within their error bars, our preferred reference model predicts a fourth-order zonal harmonic whose absolute value lies above the pre-Juno error bars. This model has a dense core of about 12 Earth masses and a hydrogen-helium-rich envelope with approximately three times solar metallicity.

  18. Integrated Workforce Modeling System

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.

    2000-01-01

    There are several computer-based systems, currently in various phases of development at KSC, which encompass some component, aspect, or function of workforce modeling. These systems may offer redundant capabilities and/or incompatible interfaces. A systems approach to workforce modeling is necessary in order to identify and better address user requirements. This research has consisted of two primary tasks. Task 1 provided an assessment of existing and proposed KSC workforce modeling systems for their functionality and applicability to the workforce planning function. Task 2 resulted in the development of a proof-of-concept design for a systems approach to workforce modeling. The model incorporates critical aspects of workforce planning, including hires, attrition, and employee development.

  19. Modeling glacial climates

    NASA Technical Reports Server (NTRS)

    North, G. R.; Crowley, T. J.

    1984-01-01

    Mathematical climate modelling has matured as a discipline to the point that it is useful in paleoclimatology. As an example a new two dimensional energy balance model is described and applied to several problems of current interest. The model includes the seasonal cycle and the detailed land-sea geographical distribution. By examining the changes in the seasonal cycle when external perturbations are forced upon the climate system it is possible to construct hypotheses about the origin of midlatitude ice sheets and polar ice caps. In particular the model predicts a rather sudden potential for glaciation over large areas when the Earth's orbital elements are only slightly altered. Similarly, the drift of continents or the change of atmospheric carbon dioxide over geological time induces radical changes in continental ice cover. With the advance of computer technology and improved understanding of the individual components of the climate system, these ideas will be tested in far more realistic models in the near future.

  20. Strength Modeling Report

    NASA Technical Reports Server (NTRS)

    Badler, N. I.; Lee, P.; Wong, S.

    1985-01-01

    Strength modeling is a complex and multi-dimensional issue. There are numerous parameters to the problem of characterizing human strength, most notably: (1) position and orientation of body joints; (2) isometric versus dynamic strength; (3) effector force versus joint torque; (4) instantaneous versus steady force; (5) active force versus reactive force; (6) presence or absence of gravity; (7) body somatotype and composition; (8) body (segment) masses; (9) muscle group involvement; (10) muscle size; (11) fatigue; and (12) practice (training) or familiarity. In surveying the available literature on strength measurement and modeling, an attempt was made to examine as many of these parameters as possible. The conclusions reached so far point toward the feasibility of implementing computationally reasonable human strength models. The assessment of the accuracy of any model against a specific individual, however, will probably not be possible on any realistic scale. Taken statistically, strength modeling may be an effective tool for general questions of task feasibility and strength requirements.

  1. Varicella infection modeling.

    SciTech Connect

    Jones, Katherine A.; Finley, Patrick D.; Moore, Thomas W.; Nozick, Linda Karen; Martin, Nathaniel; Bandlow, Alisa; Detry, Richard Joseph; Evans, Leland B.; Berger, Taylor Eugen

    2013-09-01

    Infectious diseases can spread rapidly through healthcare facilities, resulting in widespread illness among vulnerable patients. Computational models of disease spread are useful for evaluating mitigation strategies under different scenarios. This report describes two infectious disease models built for the US Department of Veteran Affairs (VA) motivated by a Varicella outbreak in a VA facility. The first model simulates disease spread within a notional contact network representing staff and patients. Several interventions, along with initial infection counts and intervention delay, were evaluated for effectiveness at preventing disease spread. The second model adds staff categories, location, scheduling, and variable contact rates to improve resolution. This model achieved more accurate infection counts and enabled a more rigorous evaluation of comparative effectiveness of interventions.

  2. Modeling earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Charpentier, Arthur; Durand, Marilou

    2015-07-01

    In this paper, we investigate questions arising in Parsons and Geist (Bull Seismol Soc Am 102:1-11, 2012). Pseudo-causal models connecting magnitudes and waiting times are considered through generalized regression. We use conditional models (magnitude given previous waiting time, and conversely) as an extension of the joint distribution model described in Nikoloulopoulos and Karlis (Environmetrics 19:251-269, 2008). On the one hand, we fit a Pareto distribution for earthquake magnitudes, where the tail index is a function of the waiting time following the previous earthquake; on the other hand, waiting times are modeled using a Gamma or a Weibull distribution, where the parameters are functions of the magnitude of the previous earthquake. We use these two models, alternately, to generate the dynamics of earthquake occurrence and to estimate the probability of occurrence of several earthquakes within a year or a decade.
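
    A minimal Python sketch of the alternating conditional scheme described above, assuming hypothetical link functions for the Pareto tail index and the Gamma rate (the paper estimates such dependences by generalized regression; every parameter value here is illustrative):

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical links: the tail index grows with the previous waiting time,
      # the waiting-time rate decays with the previous magnitude (illustrative forms only).
      def tail_index(prev_wait, a=1.5, b=0.05):
          return a + b * np.log1p(prev_wait)

      def gamma_rate(prev_mag, c=0.2, d=0.3):
          return c * np.exp(-d * prev_mag)

      def simulate(n_events, m_min=4.0, m0=5.0, w0=10.0):
          """Alternate between the two conditional models to build a synthetic catalogue."""
          mags, waits = [m0], [w0]
          for _ in range(n_events - 1):
              alpha = tail_index(waits[-1])
              mags.append(m_min * (1.0 + rng.pareto(alpha)))            # magnitude | previous waiting time
              waits.append(rng.gamma(1.2, 1.0 / gamma_rate(mags[-1])))  # waiting time (days) | magnitude
          return np.array(mags), np.array(waits)

      mags, waits = simulate(5000)
      decade = mags[np.cumsum(waits) <= 3650.0]
      print(f"events in the first decade: {decade.size}, of which M >= 6: {(decade >= 6.0).sum()}")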

  3. VENTILATION MODEL REPORT

    SciTech Connect

    V. Chipman

    2002-10-31

    The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of the peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of the heat produced by radionuclide decay that is carried away by the ventilation air. One minus the heat removal is called the wall heat fraction, i.e., the remaining fraction of the heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the "Multiscale Thermohydrologic Model" (BSC 2001), use the wall heat fractions output by the Ventilation Model to initialize their postclosure analyses.
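
    In symbols (the notation is ours, not the report's), the two quantities described above are complementary fractions of the decay heat:

      \eta_{\mathrm{vent}}(x,t) = \frac{Q_{\mathrm{vent}}(x,t)}{Q_{\mathrm{decay}}(x,t)},
      \qquad
      f_{\mathrm{wall}}(x,t) = 1 - \eta_{\mathrm{vent}}(x,t),

    where Q_vent is the heat carried away by the ventilation air, Q_decay the heat produced by radionuclide decay, and f_wall the wall heat fraction conducted into the surrounding rock mass; all three vary with position x along the drift and time t.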

  4. Global ice sheet modeling

    SciTech Connect

    Hughes, T.J.; Fastook, J.L.

    1994-05-01

    The University of Maine conducted this study for Pacific Northwest Laboratory (PNL) as part of a global climate modeling task for site characterization of the potential nuclear waste repository site at Yucca Mountain, NV. The purpose of the study was to develop a global ice sheet dynamics model that will forecast the three-dimensional configuration of global ice sheets for specific climate change scenarios. The objective of the third (final) year of the work was to produce ice sheet data for glaciation scenarios covering the next 100,000 years. This was accomplished using both the map-plane and flowband solutions of our time-dependent, finite-element gridpoint model. The theory and equations used to develop the ice sheet models are presented. Three future scenarios were simulated by the model and results are discussed.

  5. Hypertabastic survival model

    PubMed Central

    Tabatabai, Mohammad A; Bursac, Zoran; Williams, David K; Singh, Karan P

    2007-01-01

    A new two-parameter probability distribution called hypertabastic is introduced to model survival or time-to-event data. A simulation study was carried out to evaluate the performance of the hypertabastic distribution in comparison with popular distributions. We then demonstrate the application of the hypertabastic survival model by applying it to data from two motivating studies. The first demonstrates the proportional hazards version of the model by applying it to a data set from a multiple myeloma study. The second demonstrates an accelerated failure time version of the model by applying it to data from a randomized study of glioma patients who underwent radiotherapy treatment with and without the radiosensitizer misonidazole. Based on the results from the simulation study and the two applications, the proposed model proves to be a flexible and promising alternative for practitioners in this field. PMID:17963492

  6. Animal Models of Glaucoma

    PubMed Central

    A. Bouhenni, Rachida; Dunmire, Jeffrey; Sewell, Abby; Edward, Deepak P.

    2012-01-01

    Glaucoma is a heterogeneous group of disorders that progressively lead to blindness due to loss of retinal ganglion cells and damage to the optic nerve. It is a leading cause of blindness and visual impairment worldwide. Although research in the field of glaucoma is substantial, the pathophysiologic mechanisms causing the disease are not completely understood. A wide variety of animal models have been used to study glaucoma. These include monkeys, dogs, cats, rodents, and several other species. Although these models have provided valuable information about the disease, there is still no ideal model for studying glaucoma due to its complexity. In this paper we present a summary of most of the animal models that have been developed and used for the study of the different types of glaucoma, the strengths and limitations associated with the use of each species, and some potential criteria for developing a suitable model. PMID:22665989

  7. Slim Battery Modelling Features

    NASA Astrophysics Data System (ADS)

    Borthomieu, Y.; Prevot, D.

    2011-10-01

    Saft has developed a life prediction model for VES and MPS cells and batteries. The Saft Li-ion Model (SLIM) is a macroscopic electrochemical model based on energy (global at cell level). Its main purpose is to predict battery performance over the life of GEO, MEO and LEO missions. The model is based on electrochemical characteristics such as energy, capacity, EMF, internal resistance and end-of-charge voltage. It applies fading and calendar-law effects to energy and internal impedance as functions of time, temperature and end-of-charge voltage. Based on the mission profile and the satellite power system characteristics, the model proposes various battery configurations. For each configuration, the model gives the battery performance using mission figures and profiles: power, duration, DOD, end-of-charge voltages, temperatures during eclipses and solstices, thermal dissipation and cell failures. For GEO/MEO missions, eclipse and solstice periods can include specific profiles such as plasma propulsion firings and specific balancing operations. For LEO missions, the model is able to simulate high-power peaks such as radar pulses. Saft's main customers have been using the SLIM model, available in house, for two years. The purpose is to enable the satellite builders' power engineers to perform their own battery simulations during battery pre-dimensioning activities. The simulations can be shared with Saft engineers to refine the power system designs. The model has been correlated with existing life and calendar tests performed on all the VES and MPS cells. Compared with life tests lasting more than 10 years, the voltage accuracy of the model is within 10 mV at end of life. In addition, a comparison with in-orbit data has also been performed. This paper presents the main features of the SLIM software and compares its outputs with real-life test results.

  8. Saturn Radiation (SATRAD) Model

    NASA Technical Reports Server (NTRS)

    Garrett, H. B.; Ratliff, J. M.; Evans, R. W.

    2005-01-01

    The Saturnian radiation belts have not received as much attention as the Jovian radiation belts because they are not nearly as intense; the famous Saturnian particle rings tend to deplete the belts near where their peak would occur. As a result, there has not been a systematic development of engineering models of the Saturnian radiation environment for mission design. A primary exception is that of Divine (1990). That study used published data from several charged particle experiments aboard the Pioneer 11, Voyager 1, and Voyager 2 spacecraft during their flybys at Saturn to generate numerical models for the electron and proton radiation belts between 2.3 and 13 Saturn radii. The Divine Saturn radiation model described the electron distributions at energies between 0.04 and 10 MeV and the proton distributions at energies between 0.14 and 80 MeV. The model was intended to predict particle intensity, flux, and fluence for the Cassini orbiter. Divine carried out hand calculations using the model but never formally developed a computer program that could be used for general mission analyses. This report seeks to fill that void by formally developing a FORTRAN version of the model that can be used as a computer design tool for missions to Saturn that require estimates of the radiation environment around the planet. The results of that effort and the program listings are presented here along with comparisons with the original estimates carried out by Divine. In addition, Pioneer and Voyager data were scanned in from the original references and compared with the FORTRAN model's predictions. The results were statistically analyzed in a manner consistent with Divine's approach to provide estimates of the ability of the model to reproduce the original data. Results of a formal review of the model by a panel of experts are also presented. Their recommendations for further tests, analyses, and extensions to the model are discussed.

  9. Maximally Expressive Task Modeling

    NASA Technical Reports Server (NTRS)

    Japp, John; Davis, Elizabeth; Maxwell, Theresa G. (Technical Monitor)

    2002-01-01

    Planning and scheduling systems organize "tasks" into a timeline or schedule. The tasks are defined within the scheduling system in logical containers called models. The dictionary might define a model of this type as "a system of things and relations satisfying a set of rules that, when applied to the things and relations, produce certainty about the tasks that are being modeled." One challenging domain for a planning and scheduling system is the operation of on-board experiment activities for the Space Station. The equipment used in these experiments is some of the most complex hardware ever developed by mankind, the information sought by these experiments is at the cutting edge of scientific endeavor, and the procedures for executing the experiments are intricate and exacting. Scheduling is made more difficult by a scarcity of space station resources. The models to be fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibilities of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling space station experiment operations calls for a "maximally expressive" modeling schema. Modeling even the simplest of activities cannot be automated; no sensor can be attached to a piece of equipment that can discern how to use that piece of equipment; no camera can quantify how to operate a piece of equipment. Modeling is a human enterprise, both an art and a science. The modeling schema should allow the models to flow from the keyboard of the user as easily as works of literature flowed from the pen of Shakespeare. The Ground Systems Department at the Marshall Space Flight Center has embarked on an effort to develop a new scheduling engine that is highlighted by a maximally expressive modeling schema. This schema, presented in this paper, is a synergy of technological advances and domain-specific innovations.

  10. Tsunami Modeling: Development of Benchmarked Models

    NASA Astrophysics Data System (ADS)

    Kanoglu, U.; Synolakis, C. E.

    2008-12-01

    We discuss the progress towards the development of benchmarked models for forecasting tsunami inundation. Tsunami hydrodynamics has progressed more slowly than research in other natural hazards, because for several decades only the largest tsunamis were being reported. With the exception of the 1960 and 1964 events, there had been only qualitative information on inundation. While the basic equations for analysis have been known for decades, the existing synthesis leading to real-time forecasts as currently available had to await the development of sophisticated modeling tools, the large-scale laboratory experiments of the 1980s-1990s, and the tsunameter recordings of 2003 and since. The field survey results in the 1990s (Synolakis and Okal, 2005) served as crude proxies to free-field tsunami recordings and allowed for the validation and verification of numerical procedures. State-of-the-art inundation and forecasting codes have evolved through a painstaking process of careful validation and verification which can be traced back to the 1990 NSF Catalina workshop on Long-Wave Runup Models (Liu et al., 1991). Operational tsunami forecasting was only made possible through the availability of deep ocean measurements. We will describe this journey from the development of the basic field equations to forecasts, through the scientific milestones that served as benchmarks and reality checks. In summary, as in research on live networks, where problems and solution ideas arise spontaneously, tsunami hydrodynamic modeling was driven by milestone scientific meetings and post-tsunami surveys that kept identifying novel problem geometries and previously unrecognized phenomena. We discuss necessary validation and verification steps for numerical codes to be used for inundation mapping, design, and operations (Synolakis et al., 2007). Liu, P. L.-F., C. E. Synolakis and H. H. Yeh, 1991. Report on the International Workshop on Long-Wave Run-up. J. Fluid Mech., 229, 675-688. Synolakis, C. E. and E. A. Okal, 2005. 1992-2002: perspective on a decade of post tsunami surveys. Adv. Nat. Technol. Hazards, 23, 1-30. Synolakis, C. E., E. N. Bernard, V. V. Titov, U. Kanoglu and F. Gonzalez, 2007. Standards, criteria, and procedures for NOAA evaluation of tsunami numerical models. NOAA OAR Special Report, Contribution No 3053, NOAA/OAR/PMEL, Seattle, WA, 55 pp.

  11. Predictive models in urology.

    PubMed

    Cestari, Andrea

    2013-01-01

    Predictive modeling is emerging as an important knowledge-based technology in healthcare. The interest in the use of predictive modeling reflects advances on different fronts such as the availability of health information from increasingly complex databases and electronic health records, a better understanding of causal or statistical predictors of health, disease processes and multifactorial models of ill-health, and developments in nonlinear computer models using artificial intelligence or neural networks. These new computer-based forms of modeling are increasingly able to establish technical credibility in clinical contexts. The current state of knowledge is still quite young in understanding the likely future direction of this so-called 'machine intelligence' and therefore how current relatively sophisticated predictive models will evolve in response to improvements in technology, which is advancing along a wide front. Predictive models in urology are gaining progressive popularity not only for academic and scientific purposes but also in clinical practice, with the introduction of several nomograms dealing with the main fields of onco-urology. PMID:23423686

  12. VPPA weld model evaluation

    NASA Technical Reports Server (NTRS)

    Mccutcheon, Kimble D.; Gordon, Stephen S.; Thompson, Paul A.

    1992-01-01

    NASA uses the Variable Polarity Plasma Arc Welding (VPPAW) process extensively for fabrication of Space Shuttle External Tanks. This welding process has been in use at NASA since the late 1970's but the physics of the process have never been satisfactorily modeled and understood. In an attempt to advance the level of understanding of VPPAW, Dr. Arthur C. Nunes, Jr., (NASA) has developed a mathematical model of the process. The work described in this report evaluated and used two versions (level-0 and level-1) of Dr. Nunes' model, and a model derived by the University of Alabama at Huntsville (UAH) from Dr. Nunes' level-1 model. Two series of VPPAW experiments were done, using over 400 different combinations of welding parameters. Observations were made of VPPAW process behavior as a function of specific welding parameter changes. Data from these weld experiments was used to evaluate and suggest improvements to Dr. Nunes' model. Experimental data and correlations with the model were used to develop a multi-variable control algorithm for use with a future VPPAW controller. This algorithm is designed to control weld widths (both on the crown and root of the weld) based upon the weld parameters, base metal properties, and real-time observation of the crown width. The algorithm exhibited accuracy comparable to that of the weld width measurements for both aluminum and mild steel welds.

  13. Intraocular Lymphoma Models

    PubMed Central

    Aronow, Mary E.; Shen, Defen; Hochman, Jacob; Chan, Chi-Chao

    2015-01-01

    Primary vitreoretinal lymphoma (PVRL) is a subtype of primary central nervous system lymphoma (PCNSL), a high-grade, extranodal, non-Hodgkin's lymphoma, predominantly of B-cell origin. PVRL is an aggressive disease with a poor prognosis. Human studies are not ideally suited for the study of intraocular lymphoma pathogenesis or treatment strategies due to the rare nature of the disease, its variable presentation, limited volume of available ocular fluids, and fragility of sampled lymphoma cells. Animal models have been critical in making progress in understanding intraocular lymphoma pathogenesis and investigating potential therapeutic strategies. Early murine models for intraocular lymphoma used intraperitoneal injection of mouse T-cell lymphomas. This was followed by intravitreal T-cell murine models. More recent murine models have used B-cell lymphomas to more closely mimic human disease. The most current B-cell lymphoma models employ a combined approach of inoculating both the mouse vitreous cavity and brain. The challenge in murine models for intraocular lymphoma lies in recreating the clinical features, disease behavior, molecular profile, systemic immunity, and the microenvironment observed in human disease. In the future, animal models will continue to be central to furthering our understanding of the disease and in the investigation of potential treatment targets. PMID:27171354

  14. Invertebrate models of alcoholism.

    PubMed

    Scholz, Henrike; Mustard, Julie A

    2013-01-01

    For invertebrates to become useful models for understanding the genetic and physiological mechanisms of alcoholism related behaviors and the predisposition towards alcoholism, several general requirements must be fulfilled. The animal should encounter ethanol in its natural habitat, so that the central nervous system of the organism will have evolved mechanisms for responding to ethanol exposure. How the brain adapts to ethanol exposure depends on its access to ethanol, which can be regulated metabolically and/or by physical barriers. Therefore, a model organism should have metabolic enzymes for ethanol degradation similar to those found in humans. The neurons and supporting glial cells of the model organism that regulate behaviors affected by ethanol should share the molecular and physiological pathways found in humans, so that results can be compared. Finally, the use of invertebrate models should offer advantages over traditional model systems and should offer new insights into alcoholism-related behaviors. In this review we will summarize behavioral similarities and identified genes and mechanisms underlying ethanol-induced behaviors in invertebrates. This review mainly focuses on the use of the nematode Caenorhabditis elegans, the honey bee Apis mellifera and the fruit fly Drosophila melanogaster as model systems. We will discuss insights gained from those studies in conjunction with their vertebrate model counterparts and the implications for future research into alcoholism and alcohol-induced behaviors. PMID:21472534

  15. Multiscale Cloud System Modeling

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Moncrieff, Mitchell W.

    2009-01-01

    The central theme of this paper is to describe how cloud system resolving models (CRMs) of grid spacing approximately 1 km have been applied to various important problems in atmospheric science across a wide range of spatial and temporal scales and how these applications relate to other modeling approaches. A long-standing problem concerns the representation of organized precipitating convective cloud systems in weather and climate models. Since CRMs resolve the mesoscale to large scales of motion (i.e., 10 km to global) they explicitly address the cloud system problem. By explicitly representing organized convection, CRMs bypass restrictive assumptions associated with convective parameterization such as the scale gap between cumulus and large-scale motion. Dynamical models provide insight into the physical mechanisms involved with scale interaction and convective organization. Multiscale CRMs simulate convective cloud systems in computational domains up to global and have been applied in place of contemporary convective parameterizations in global models. Multiscale CRMs pose a new challenge for model validation, which is met in an integrated approach involving CRMs, operational prediction systems, observational measurements, and dynamical models in a new international project: the Year of Tropical Convection, which has an emphasis on organized tropical convection and its global effects.

  16. XPAL modeling and theory

    NASA Astrophysics Data System (ADS)

    Palla, Andrew D.; Carroll, David L.; Verdeyen, Joseph T.; Heaven, Michael C.

    2011-03-01

    The exciplex pumped alkali laser (XPAL) system has been demonstrated in mixtures of Cs vapor, Ar, with and without ethane, by pumping Cs-Ar atomic collision pairs and subsequent dissociation of diatomic, electronically-excited CsAr molecules (exciplexes or excimers). The blue satellites of the alkali D2 lines provide an advantageous pathway for optically pumping atomic alkali lasers on the principal series (resonance) transitions with broad linewidth (>2 nm) semiconductor diode lasers. Because of the addition of atomic collision pairs and exciplex states, modeling of the XPAL system is more complicated than classic diode pumped alkali laser (DPAL) modeling. The BLAZE-V model is utilized for high-fidelity simulations. BLAZE-V is a time-dependent finite-volume model including transport, thermal, and kinetic effects appropriate for the simulation of a cylindrical closed cell XPAL system. The model is also regularly used for flowing gas laser simulations and is easily adapted for DPAL. High fidelity calculations of pulsed XPAL operation as a function of temperature and pressure are presented along with a theoretical analysis of requirements for optical transparency in XPAL systems. The detailed modeling predicts higher XPAL performance as the rare gas pressure increases, and that higher output powers are obtainable with higher temperature. The theoretical model indicates that the choice of alkali and rare gas mixture can significantly impact the required intensities for optical transparency.

  17. Functional Generalized Additive Models

    PubMed Central

    McLean, Mathew W.; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t}, where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position t along a tract in the brain. In one example, the response is disease status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in the supplemental materials available online. PMID:24729671
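
    In formulas (consistent with the description above; the basis sizes are arbitrary and the intercept notation is ours), the FGAM specifies the link-transformed mean response as the integral of a bivariate surface F over the domain of the functional covariate, with F expanded in tensor-product B-splines:

      g\!\left( \mathrm{E}[\, Y_i \mid X_i \,] \right) = \theta_0 + \int_{\mathcal{T}} F\{ X_i(t), t \}\, dt,
      \qquad
      F(x,t) \approx \sum_{j=1}^{K_x} \sum_{k=1}^{K_t} \theta_{jk}\, B_j^{X}(x)\, B_k^{T}(t),

    so that, after numerical integration over t, estimation reduces to a penalized fit of the coefficients theta_jk with roughness penalties in both the x and t directions.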

  18. Meshfree magnetotelluric modelling

    NASA Astrophysics Data System (ADS)

    Wittke, J.; Tezkan, B.

    2014-08-01

    We present a new approach to 2-D magnetotelluric forward numerical modelling, in contrast to traditional numerical methods like finite elements or finite differences. The method used for solving the partial differential equations is based on a mesh-free technique which does not need an underlying mesh or grid. We use the Meshless Local Petrov-Galerkin (MLPG) method in combination with radial basis functions to simulate the response of a given conductivity model to a plane-wave source. We compare the mesh-free solution with known simulation programs and simple analytical solutions. Furthermore, we discuss the new magnetotelluric modelling method in terms of implementation and stability. First, we study the convergence and discretization errors of the new method with a simple half-space conductivity model. Then we compare our mesh-free simulation results for simple 2-D conductivity models with the results of a well-known finite element program. Finally, we present a smooth conductivity model calculated with the mesh-free approach. The modelling results, even with randomly distributed nodes, are in good agreement with those obtained by the finite element method.
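
    The MLPG scheme itself is beyond a few lines, but the meshfree idea it rests on, expanding the field in radial basis functions centred on scattered nodes and enforcing the governing equation at those nodes, can be illustrated with a global RBF collocation (Kansa-type) sketch in Python for the 1-D magnetotelluric equation over a homogeneous half-space, where the analytic solution is known. This is not the authors' method, and the frequency, conductivity, node count and shape parameter below are all illustrative choices:

      import numpy as np

      # 1-D MT sounding over a homogeneous half-space: E''(z) = k^2 E(z) with
      # k^2 = i*omega*mu0*sigma, E(0) = 1, and the analytic solution E(z) = exp(-k z)
      # supplying the far boundary value.  Ground model and numerical knobs are illustrative.
      mu0, sigma, freq = 4e-7 * np.pi, 0.01, 1.0
      omega = 2.0 * np.pi * freq
      k = np.sqrt(1j * omega * mu0 * sigma)            # principal root, Re(k) > 0
      skin = np.sqrt(2.0 / (omega * mu0 * sigma))      # skin depth
      L = 3.0 * skin

      # Irregularly spaced (jittered) nodes -- no mesh or grid is ever built.
      rng = np.random.default_rng(1)
      n = 32
      z = np.linspace(0.0, L, n)
      z[1:-1] += rng.uniform(-0.3, 0.3, n - 2) * (L / (n - 1))
      eps = 7.0 / skin                                 # Gaussian shape parameter (accuracy vs conditioning)

      r = z[:, None] - z[None, :]
      g = np.exp(-(eps * r) ** 2)                      # Gaussian RBF values
      d2g = (4.0 * eps**4 * r**2 - 2.0 * eps**2) * g   # their second z-derivatives

      A = d2g - k**2 * g                               # collocate E'' - k^2 E = 0 at every node ...
      rhs = np.zeros(n, dtype=complex)
      A[0], rhs[0] = g[0], 1.0                         # ... except the two boundary rows
      A[-1], rhs[-1] = g[-1], np.exp(-k * L)
      coef = np.linalg.solve(A, rhs)

      err = np.abs(g @ coef - np.exp(-k * z)).max()
      print(f"max |E_rbf - E_analytic| over the nodes: {err:.2e}")

    Turning a sketch like this into an MLPG scheme amounts to replacing the single global system with small local weak-form systems assembled node by node, which is what removes any need for a global mesh.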

  19. Acute radiation risk models

    NASA Astrophysics Data System (ADS)

    Smirnova, Olga

    Biologically motivated mathematical models, which describe the dynamics of the major hematopoietic lineages (the thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems) in acutely/chronically irradiated humans, are developed. These models are implemented as systems of nonlinear differential equations, whose variables and constant parameters have clear biological meaning. It is shown that the developed models are capable of reproducing clinical data on the dynamics of these systems in humans exposed to acute radiation as a result of incidents and accidents, as well as in humans exposed to low-level chronic radiation. Moreover, the averaged value of the "lethal" dose rates of chronic irradiation evaluated within the models of these four major hematopoietic lineages coincides with the real minimal dose rate of lethal chronic irradiation. The demonstrated ability of the models of the human thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems to predict the dynamical response of these systems to acute/chronic irradiation over wide ranges of doses and dose rates implies that these mathematical models form a universal tool for the investigation and prediction of the dynamics of the major human hematopoietic lineages for a vast pattern of irradiation scenarios. In particular, these models could be applied to radiation risk assessment for the health of astronauts exposed to space radiation during long-term space missions, such as voyages to Mars or lunar colonies, as well as for the health of people exposed to acute/chronic irradiation due to environmental radiological events.

  20. Atmospheric Models for Aerocapture

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta L.; Keller, Vernon W.

    2004-01-01

    There are eight destinations in the Solar System with sufficient atmosphere for aerocapture to be a viable aeroassist option: Venus, Earth, Mars, Jupiter, Saturn and its moon Titan, Uranus, and Neptune. Engineering-level atmospheric models for four of these targets (Earth, Mars, Titan, and Neptune) have been developed for NASA to support systems analysis studies of potential future aerocapture missions. Development of a similar atmospheric model for Venus has recently commenced. An important capability of all of these models is their ability to simulate quasi-random density perturbations for Monte Carlo analyses used in developing guidance, navigation, and control algorithms, and for thermal systems design. Similarities and differences among these atmospheric models are presented, with emphasis on the recently developed Neptune model and on planned characteristics of the Venus model. Example applications for aerocapture are also presented and illustrated. Recent updates to the Titan atmospheric model are discussed, in anticipation of applications to trajectory and atmospheric reconstruction for the Huygens probe entry at Titan.

  1. SPAR Model Structural Efficiencies

    SciTech Connect

    John Schroeder; Dan Henry

    2013-04-01

    The Nuclear Regulatory Commission (NRC) and the Electric Power Research Institute (EPRI) are supporting initiatives aimed at improving the quality of probabilistic risk assessments (PRAs). Included in these initiatives is the resolution of key technical issues that have been judged to have the most significant influence on the baseline core damage frequency of the NRC's Standardized Plant Analysis Risk (SPAR) models and licensee PRA models. Previous work addressed issues associated with support system initiating event analysis and loss of off-site power/station blackout analysis. The key technical issues were: • Development of a standard methodology and implementation of support system initiating events; • Treatment of loss of offsite power; • Development of a standard approach for emergency core cooling following containment failure. Some of the related issues were not fully resolved. This project continues the effort to resolve outstanding issues. The work scope was intended to include substantial collaboration with EPRI; however, EPRI has had other higher-priority initiatives to support. Therefore this project has addressed SPAR modeling issues. The issues addressed are: • SPAR model transparency; • Common cause failure modeling deficiencies and approaches; • AC and DC modeling deficiencies and approaches; • Instrumentation and control system modeling deficiencies and approaches.

  2. The timbre model

    NASA Astrophysics Data System (ADS)

    Jensen, Kristoffer

    2002-11-01

    A timbre model is proposed for use in multiple applications. This model, which encompasses all voiced isolated musical instruments, has an intuitive parameter set and a fixed size, and separates the sounds in dimensions akin to the timbre dimensions proposed in timbre research. The analysis of the model parameters is fully documented, and, in particular, a method is proposed for the estimation of the difficult decay/release split-point. The main parameters of the model are the spectral envelope, the attack/release durations and relative amplitudes, and the inharmonicity and the shimmer and jitter (which provide both for the slow random variations of the frequencies and amplitudes and for additive noises). Some of the applications include synthesis, where a real-time application with an intuitive GUI is being developed, classification and search of sounds based on their content, and a further understanding of acoustic musical instrument behavior. In order to present the background of the model, this presentation will start with sinusoidal A/S and some timbre perception research, then present the timbre model, show its validity for individual musical instrument sounds, and finally introduce some expression additions to the model.
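
    A minimal additive-synthesis sketch in Python of the kind of parameterization listed above (spectral envelope, attack/release envelope, inharmonicity, jitter and shimmer); the envelope shapes and all numerical values are illustrative and are not those of the proposed model:

      import numpy as np

      sr, dur, f0 = 44100, 1.0, 220.0
      t = np.arange(int(sr * dur)) / sr
      rng = np.random.default_rng(0)

      def envelope(t, attack=0.02, release=0.3, dur=1.0):
          """Piecewise-linear attack / sustain / release amplitude envelope."""
          env = np.minimum(t / attack, 1.0)
          return env * np.clip((dur - t) / release, 0.0, 1.0)

      def partial(t, freq, amp, inharm=1e-4, jitter=0.002, shimmer=0.05, n_partial=1):
          """One quasi-harmonic partial with slow random frequency (jitter) and amplitude (shimmer) modulation."""
          f = freq * np.sqrt(1.0 + inharm * n_partial**2)                    # stiff-string style inharmonicity
          slow = np.cumsum(rng.normal(0.0, 1.0, t.size)) / np.sqrt(t.size)   # slow O(1) random walk
          phase = 2.0 * np.pi * np.cumsum(f * (1.0 + jitter * slow)) / sr
          return amp * (1.0 + shimmer * slow) * np.sin(phase)

      # Spectral envelope: amplitudes decaying as 1/n over the first 12 partials.
      sound = sum(partial(t, n * f0, 1.0 / n, n_partial=n) for n in range(1, 13))
      sound *= envelope(t)
      sound /= np.abs(sound).max()
      print("synthesised", sound.size, "samples at", sr, "Hz")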

  3. SSCL groundwater model

    SciTech Connect

    Romero, V.; Bull, J.; Stapleton, G.; Baker, S.; Goss, D.; Coulson, L.

    1994-02-01

    Activation of groundwater due to accelerator operations has been a consideration since the conceptual stages of the SSC. Prior to site selection, an elementary hydrological model assuming a porous medium with a shallow well in proximity to the tunnel was used to determine the radionuclide concentrations in the water pumped from a well. The model assumed that radionuclides produced within a few feet of the tunnel would migrate to the shallow well and be diluted as the well drew water from a conically symmetric region. After the Ellis County site was selected, the compatibility of this model with the site-specific geology was evaluated. The host geology at the selected site is low-permeability rock (Austin chalk, shale, and marl); however, vertical fractures do exist. Since the host rock has a low permeability, groundwater in proximity to the tunnel would have to travel primarily through fractures. This hydrology is not compatible with the above-mentioned model, since water does not percolate uniformly from the surrounding rock into local wells. The amount of dilution of activated water will vary significantly depending on the specific relationship of the well to the activation zone. A further complication in the original model is that it assumes the high-energy particles escaping from the accelerator enclosure are localized. The model does not provide for particles being lost over a large area, as will happen with routine operational losses. These losses will be distributed along the accelerator over the life of the project. The SSCL groundwater model has been recast to account for the site-specific hydrology and for both point and distributed losses. Using the new groundwater model, the SSC accelerators are designed to limit the activation concentration in the water located one meter outside the accelerator enclosure so that federal drinking water standards are met. This technical note provides the details of this model.

  4. Turbulence Modeling Workshop

    NASA Technical Reports Server (NTRS)

    Rubinstein, R. (Editor); Rumsey, C. L. (Editor); Salas, M. D. (Editor); Thomas, J. L. (Editor); Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    Advances in turbulence modeling are needed in order to calculate high Reynolds number flows near the onset of separation and beyond. To this end, the participants in this workshop made the following recommendations. (1) A national/international database and standards for turbulence modeling assessment should be established. Existing experimental data sets should be reviewed and categorized. Advantage should be taken of other efforts already underway, such as that of the European Research Community on Flow, Turbulence, and Combustion (ERCOFTAC) consortium. Carefully selected "unit" experiments will be needed, as well as advances in instrumentation, to fill the gaps in existing datasets. A high priority should be given to documenting existing turbulence model capabilities in a standard form, including numerical implementation issues such as grid quality and resolution. (2) NASA should support long-term research on Algebraic Stress Models and Reynolds Stress Models. The emphasis should be placed on improving the length-scale equation, since it is the least understood and is a key component of two-equation and higher models. Second priority should be given to the development of improved near-wall models. Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) would provide valuable guidance in developing and validating new Reynolds-averaged Navier-Stokes (RANS) models. Although not the focus of this workshop, DNS, LES, and hybrid methods currently represent viable approaches for analysis on a limited basis. Therefore, although computer limitations require the use of RANS methods for realistic configurations at high Reynolds number in the foreseeable future, a balanced effort in turbulence modeling development, validation, and implementation should include these approaches as well.

  5. The Deep Model.

    PubMed

    Wang, Panqu; Cottrell, Garrison

    2015-01-01

    "The Model" (a.k.a. "TM", Dailey and Cottrell, 1999) is a biologically-plausible neurocomputational model designed for face and object recognition. Developed over the last 25 years, TM has been successfully used to model many cognitive phenomena, such as facial expression perception (Dailey et al., 2002), recruitment of the FFA for other categories of expertise (Tong et al., 2008), and the experience moderation effect on the correlation between face and object recognition (Wang et al., 2014). However, as TM is a "shallow" model, it cannot develop rich feature representations needed for challenging computer vision tasks. Meanwhile, the recent deep convolutional neural network techniques produce state-of-the-art results for many computer vision benchmarks, but they have not been used in cognitive modeling. The deep architecture allows the network to develop rich high level features, which generalize really well to other novel visual tasks. However, the deep learning models use a fully supervised training approach, which seems implausible for early visual system. Here, "The Deep Model" (TDM) tries to bridge TM and deep learning models together to create a "gradually" supervised deep architecture which can be both biologically-plausible and perform well on computer vision tasks. We show that, by using the sparse PCA and RICA algorithms on natural image datasets, we can obtain center surround color-opponent receptive field that represent LGN cells, and Gabor-like filters that represent V1 simple cells. This suggests that the unsupervised learning approach is what is used in the development of the early visual system. We employ this insight to develop a gradually supervised deep neural network and test it on some standard computer vision and cognitive modeling tasks. Meeting abstract presented at VSS 2015. PMID:26326779

  6. Testing modeling frameworks

    NASA Astrophysics Data System (ADS)

    Hill, Mary; Ye, Ming; Foglia, Laura; Lu, Dan

    2015-04-01

    Modeling frameworks include many ideas about, for example, how to parameterize models, conduct sensitivity analysis (including identifying observations and parameters important to calibration and prediction), quantify uncertainty, and so on. Of concern in this talk is meaningful testing of how ideas proposed for any modeling framework perform. The design of meaningful tests depends on the aspect of the framework being tested and the timing of system dynamics. Consider a situation in which the aspect being tested is prediction accuracy and the quantities of concern are readily measured and change quickly, such as for precipitation, floods, or hurricanes. In such cases meaningful tests involve comparing simulated and measured values and tests can be conducted daily, hourly or even more frequently. Though often challenged by measurement difficulties, this remains the simplest circumstance for conducting meaningful tests of modeling frameworks. If measurements are not readily available and(or) the system responds to changes over decades or centuries, as generally occurs for climate change, saltwater intrusion of groundwater systems, and dewatering of aquifers, prediction accuracy needs to be evaluated in other ways. Often these require high performance computing. For example, complex and simple models can be compared or cross-validation experiments can be conducted. Both can require massive computational resources for any but the simplest of problems. Testing other aspects of a modeling framework can require different types of tests. For example, testing methods of identifying observations or parameters important to model calibration or predictions might entail evaluation of many circumstances for methods that are themselves commonly computationally demanding. Again, high performance computing is needed even when the goal is to include computationally frugal methods in the modeling framework. In this talk we discuss the importance of such testing, stress the need to design and implement tests when any modeling framework is developed, and provide examples of tests from several recent publications.

  7. SMC: SCENIC Model Control

    NASA Technical Reports Server (NTRS)

    Srivastava, Priyaka; Kraus, Jeff; Murawski, Robert; Golden, Bertsel, Jr.

    2015-01-01

    NASA's Space Communications and Navigation (SCaN) program manages three active networks: the Near Earth Network, the Space Network, and the Deep Space Network. These networks simultaneously support NASA missions and provide communications services to customers worldwide. To efficiently manage these resources and their capabilities, a team of student interns at the NASA Glenn Research Center is developing a distributed system to model the SCaN networks. Once complete, the system shall provide a platform that enables users to perform capacity modeling of current and prospective missions with finer-grained control of information between several simulation and modeling tools. This will enable the SCaN program to access a holistic view of its networks and simulate the effects of modifications in order to provide NASA with decisional information. The development of this capacity modeling system is managed by NASA's Strategic Center for Education, Networking, Integration, and Communication (SCENIC). Three primary third-party software tools offer their unique abilities in different stages of the simulation process. MagicDraw provides UML/SysML modeling, AGI's Systems Tool Kit simulates the physical transmission parameters and de-conflicts scheduled communication, and Riverbed Modeler (formerly OPNET) simulates communication protocols and packet-based networking. SCENIC developers are building custom software extensions to integrate these components in an end-to-end space communications modeling platform. A central control module acts as the hub for report-based messaging between client wrappers. Backend databases provide information related to mission parameters and ground station configurations, while the end user defines scenario-specific attributes for the model. The eight SCENIC interns are working under the direction of their mentors to complete an initial version of this capacity modeling system during the summer of 2015. The intern team is composed of four students in Computer Science, two in Computer Engineering, one in Electrical Engineering, and one studying Space Systems Engineering.

  8. Aviation Safety Simulation Model

    NASA Technical Reports Server (NTRS)

    Houser, Scott; Yackovetsky, Robert (Technical Monitor)

    2001-01-01

    The Aviation Safety Simulation Model is a software tool that enables users to configure a terrain, a flight path, and an aircraft and simulate the aircraft's flight along the path. The simulation monitors the aircraft's proximity to terrain obstructions, and reports when the aircraft violates accepted minimum distances from an obstruction. This model design facilitates future enhancements to address other flight safety issues, particularly air and runway traffic scenarios. This report shows the user how to build a simulation scenario and run it. It also explains the model's output.
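
    The core check the tool performs, flagging any point of the flight path that comes closer to a terrain obstruction than an accepted minimum clearance, reduces to a simple comparison. The Python sketch below assumes a gridded terrain and a sampled path; the function name, the data layout and the 500 ft clearance are illustrative choices, not the model's actual configuration:

      import numpy as np

      def clearance_violations(path, terrain, min_clearance_ft=500.0):
          """Return indices of path samples whose height above terrain is below the minimum.

          path    : (N, 3) array of (x_idx, y_idx, altitude_ft) samples along the flight path
          terrain : 2-D array of terrain elevations (ft) indexed by (x_idx, y_idx)
          """
          x = path[:, 0].astype(int)
          y = path[:, 1].astype(int)
          ground = terrain[x, y]                   # terrain elevation under each sample
          clearance = path[:, 2] - ground
          return np.nonzero(clearance < min_clearance_ft)[0], clearance

      # Toy example: a ridge in the terrain and a path that descends across it.
      terrain = np.zeros((100, 100))
      terrain[40:60, :] = 2000.0                   # 2000 ft ridge
      path = np.column_stack([np.arange(100), np.full(100, 50), np.linspace(3000.0, 1500.0, 100)])
      bad, clearance = clearance_violations(path, terrain)
      print(f"{bad.size} of {path.shape[0]} samples violate the minimum clearance")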

  9. Structural Model of Eumelanin

    NASA Astrophysics Data System (ADS)

    Kaxiras, Efthimios; Tsolakidis, Argyrios; Zonios, George; Meng, Sheng

    2006-11-01

    Melanin is a ubiquitous pigment in living organisms with multiple important functions, yet its structure is not well understood. We propose a structural model for eumelanin protomolecules, consisting of 4 or 5 of the basic molecular units (hydroquinone, indolequinone, and its tautomers), in arrangements that contain an inner porphyrin ring. We use time-dependent density functional theory to calculate the optical absorption spectrum of the structural model, which reproduces convincingly the main features of the experimental spectrum of eumelanin. Our model also reproduces accurately other important properties of eumelanin, including x-ray scattering data, its ability to capture and release metal ions, and the characteristic size of the protomolecules.

  10. Dynamical model for thyroid

    NASA Astrophysics Data System (ADS)

    Rokni Lamooki, Gholam Reza; Shirazi, Amir H.; Mani, Ali R.

    2015-05-01

    The thyroid's main chemical reactions are employed to develop a mathematical model. The model is based on differential equations whose dynamics reflect many aspects of the thyroid's behavior. Our main focus here is the well-known, but not well-understood, phenomenon called the Wolff-Chaikoff effect. It is shown that the inhibitory effect of iodide intake on the rate of one single enzyme produces an effect similar to the Wolff-Chaikoff effect. Beyond this, the model is capable of revealing other complex phenomena of thyroid hormone homeostasis.
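
    A deliberately simple Python sketch of the mechanism, not the authors' equation set: if the hormone-synthesis step is given substrate-inhibition (Haldane) kinetics in intrathyroidal iodide, the steady-state hormone level first rises and then falls as iodide intake increases, which is the qualitative signature of the effect. All rate constants are illustrative:

      import numpy as np

      # Toy kinetics: hormone synthesis follows substrate-inhibition (Haldane) kinetics in
      # intrathyroidal iodide I, i.e. a single enzyme is inhibited by its own substrate at high I.
      def synthesis(I, vmax=1.0, km=0.2, ki=5.0):
          return vmax * I / (km + I + I**2 / ki)

      def steady_state_hormone(intake, k_uptake=1.0, k_loss=0.5, k_clear=0.1, dt=0.01, t_end=200.0):
          """Integrate dI/dt = uptake - loss - synthesis and dH/dt = synthesis - clearance (forward Euler)."""
          I, H = 0.0, 0.0
          for _ in range(int(t_end / dt)):
              s = synthesis(I)
              I += dt * (k_uptake * intake - k_loss * I - s)
              H += dt * (s - k_clear * H)
          return H

      for intake in (0.1, 0.5, 1.0, 5.0, 20.0):
          print(f"iodide intake {intake:5.1f}  ->  steady-state hormone {steady_state_hormone(intake):.2f}")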

  11. The Finslerian wormhole models

    NASA Astrophysics Data System (ADS)

    Rahaman, Farook; Paul, Nupur; Banerjee, Ayan; De, S. S.; Ray, Saibal; Usmani, A. A.

    2016-05-01

    We present models of wormholes under the Finslerian structure of spacetime. This is a sequel to our previous work (Eur Phys J 75:564, 2015), where we constructed a toy model for compact stars based on the Finslerian spacetime geometry. In the present investigation, a wide variety of solutions is obtained, which explore the wormhole geometry by considering different choices for the form function and energy density. The solutions, like those in the previous work, are revealed to be physically interesting and viable models for the explanation of wormholes as far as the background theory and literature are concerned.

  12. Component-specific modeling

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.

    1985-01-01

    Accomplishments are described for the second-year effort of a 3-year program to develop methodology for component-specific modeling of aircraft engine hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models; (2) geometry model generators; (3) remeshing; (4) specialty 3-D inelastic structural analysis; (5) computationally efficient solvers; (6) adaptive solution strategies; (7) engine performance parameters/component response variables decomposition and synthesis; (8) integrated software architecture and development; and (9) validation cases for the software developed.

  13. Modeling cortical spreading depression.

    PubMed Central

    Reggia, J. A.; Montgomery, D.

    1994-01-01

    Cortical spreading depression is a wave of electrical silence and biochemical changes that spreads across the cerebral cortex. Recently there has been a growing recognition that it may be an important pathophysiological event in a number of neurological disorders. In this paper, we describe a reaction-diffusion model of the extracellular potassium changes that are a central part of this process. Simulations with the model show that an appropriate stimulus evokes a moving wave of increased potassium with many similarities to that seen experimentally. The resultant model is a useful computational tool for future study of the effects of spreading depression on the cortex. PMID:7950049
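
    A toy one-dimensional Python sketch in the spirit of the reaction-diffusion description above (not the authors' model): a bistable source term plus diffusion of the normalised extracellular potassium elevation produces a self-propagating front, and the illustrative parameter choices put its speed at a few mm/min, the order of magnitude reported for spreading depression:

      import numpy as np

      # u(x,t): normalised extracellular potassium elevation (0 = resting tissue, 1 = fully depolarised).
      # Bistable source term: a supra-threshold K+ rise regenerates itself and spreads by diffusion.
      D, r, a = 2.5e-3, 12.0, 0.3                       # mm^2/s, 1/s, threshold (illustrative values)
      dx, dt, length, t_end = 0.05, 0.01, 10.0, 120.0   # mm, s, mm, s
      x = np.arange(0.0, length, dx)
      u = np.zeros_like(x)
      u[x < 0.5] = 1.0                                  # localised supra-threshold stimulus at one end

      def front(u):                                     # leftmost position where u has dropped below 0.5
          return x[np.argmax(u < 0.5)]

      pos_mid = None
      for step in range(int(t_end / dt)):
          lap = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2
          lap[0] = 2.0 * (u[1] - u[0]) / dx**2          # no-flux boundaries
          lap[-1] = 2.0 * (u[-2] - u[-1]) / dx**2
          u += dt * (D * lap + r * u * (u - a) * (1.0 - u))
          if step == round(30.0 / dt):
              pos_mid = front(u)                        # front position at t = 30 s
      speed = (front(u) - pos_mid) / (t_end - 30.0)
      print(f"simulated front speed ~ {60.0 * speed:.1f} mm/min")

    More faithful models track several ionic species and an explicit recovery mechanism; the single bistable field above only captures the initiation and propagation of the front.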

  14. Ultrafilter Extensions of Models

    NASA Astrophysics Data System (ADS)

    Saveliev, Denis I.

    We show that any model A can be extended, in a canonical way, to a model βA consisting of ultrafilters over it. The extension procedure preserves homomorphisms: any homomorphism of A into B extends to a continuous homomorphism of βA into βB. Moreover, if a model B carries a compact Hausdorff topology which is (in a certain sense) compatible, then any homomorphism of A into B extends to a continuous homomorphism of βA into B. This is also true for embeddings instead of homomorphisms.

  15. Modeling Compressed Turbulence

    SciTech Connect

    Israel, Daniel M.

    2012-07-13

    From ICE to ICF, the effect of mean compression or expansion is important for predicting the state of the turbulence. When developing combustion models, we would like to know the mix state of the reacting species. This involves density and concentration fluctuations. To date, research has focused on the effect of compression on the turbulent kinetic energy. The current work provides constraints to help development and calibration for models of species mixing effects in compressed turbulence. The Cambon, et al., re-scaling has been extended to buoyancy driven turbulence, including the fluctuating density, concentration, and temperature equations. The new scalings give us helpful constraints for developing and validating RANS turbulence models.

  16. Battle of Animal Models

    PubMed Central

    Persidsky, Yuri; Fox, Howard

    2008-01-01

    This is a brief summary of the animal models session held during the 12th Annual Meeting of the Society on NeuroImmune Pharmacology, Santa Fe, NM. This session provided important information for participants on availability and utility of animal models for the studies of HIV-1 central nervous system infection and drug abuse. It highlighted animal model relevance to human disease/condition, their utility for the studies of pathogenesis, potential importance for the development of therapeutics, and demonstrated limitations/pitfalls. PMID:18040841

  17. Stochastic ontogenetic growth model

    NASA Astrophysics Data System (ADS)

    West, B. J.; West, D.

    2012-02-01

    An ontogenetic growth model (OGM) for a thermodynamically closed system is generalized to satisfy both the first and second laws of thermodynamics. The hypothesized stochastic ontogenetic growth model (SOGM) is shown to entail the interspecies allometry relation by explicitly averaging the basal metabolic rate and the total body mass over the steady-state probability density for the total body mass (TBM). This is the first derivation of the interspecies metabolic allometric relation from a dynamical model, and the asymptotic steady-state distribution of the TBM is fit to data and shown to be an inverse power law.
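
    For orientation (the deterministic part below is the standard ontogenetic growth equation; the precise form of the fluctuation term used by the authors is not reproduced here), the two models have the structure

      \frac{dm}{dt} = a\, m^{3/4} - b\, m
      \qquad\longrightarrow\qquad
      \frac{dm}{dt} = a\, m^{3/4} - b\, m + \text{(random fluctuations)},

    and averaging the basal metabolic rate, which scales as m^{3/4} within the model, over the steady-state probability density of m yields the interspecies allometric scaling of metabolism with total body mass.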

  18. Hierarchical model of matching

    NASA Technical Reports Server (NTRS)

    Pedrycz, Witold; Roventa, Eugene

    1992-01-01

    The issue of matching two fuzzy sets becomes an essential design aspect of many algorithms including fuzzy controllers, pattern classifiers, knowledge-based systems, etc. This paper introduces a new model of matching. Its principal features involve the following: (1) matching carried out with respect to the grades of membership of fuzzy sets as well as some functionals defined on them (like energy, entropy, transom); (2) concepts of hierarchies in the matching model leading to a straightforward distinction between 'local' and 'global' levels of matching; and (3) a distributed character of the model realized as a logic-based neural network.
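
    A generic Python sketch of the two-level idea, not the specific model of the paper: the 'local' level compares two fuzzy sets grade by grade (possibility/necessity) and through functionals such as energy and entropy, and a 'global' level aggregates the local degrees, here with a simple weighted average standing in for the logic-based neural network:

      import numpy as np

      def energy(a):                       # energy measure of fuzziness: sum of membership grades
          return a.sum()

      def entropy(a, eps=1e-12):           # De Luca-Termini style entropy of the membership grades
          p = np.clip(a, eps, 1.0 - eps)
          return -(p * np.log(p) + (1.0 - p) * np.log(1.0 - p)).sum()

      def local_match(a, b):
          """'Local' level: compare two fuzzy sets grade by grade and through functionals."""
          possibility = np.max(np.minimum(a, b))            # best pointwise agreement
          necessity = np.min(np.maximum(1.0 - a, b))        # pessimistic (inclusion-like) agreement
          e = 1.0 - abs(energy(a) - energy(b)) / max(energy(a), energy(b), 1e-12)
          h = 1.0 - abs(entropy(a) - entropy(b)) / max(entropy(a), entropy(b), 1e-12)
          return np.array([possibility, necessity, e, h])

      def global_match(pairs, weights):
          """'Global' level: aggregate the local matching degrees (here a weighted average)."""
          local = np.array([local_match(a, b).mean() for a, b in pairs])
          return float(np.dot(weights, local) / np.sum(weights))

      x = np.linspace(0.0, 1.0, 11)
      A1, B1 = np.exp(-20 * (x - 0.3) ** 2), np.exp(-20 * (x - 0.35) ** 2)   # similar fuzzy sets
      A2, B2 = np.exp(-20 * (x - 0.2) ** 2), np.exp(-20 * (x - 0.8) ** 2)    # dissimilar fuzzy sets
      print("global matching degree:", round(global_match([(A1, B1), (A2, B2)], [0.7, 0.3]), 3))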

  19. Sandia Material Model Driver

    Energy Science and Technology Software Center (ESTSC)

    2005-09-28

    The Sandia Material Model Driver (MMD) software package allows users to run material models from a variety of different Finite Element Model (FEM) codes in a standalone fashion, independent of the host codes. The MMD software is designed to be run on a variety of different operating system platforms as a console application. Initial development efforts have resulted in a package that has been shown to be fast, convenient, and easy to use, with substantial growth potential.

  20. Diagnostics With Adjoint Modelling

    NASA Astrophysics Data System (ADS)

    Blessing, S.; Fraedrich, K.; Kirk, E.; Lunkeit, F.

    The potential usefulness of an adjoint primitive equations global atmospheric circu- lation model for climate diagnostics is demonstrated in a feasibility study. A daily NAO-type index is calculated as one-point correlation of the 300 hPa streamfunction anomaly. By application of the adjoint model we diagnose its temperature forcing on short timescales in terms of spatial temperature sensitivity patterns at different time lags, which, in a first order approximation, induce growth of the index. The dynamical relevance of these sensitivity patterns is confirmed by lag-correlating the index time series and the projection time series of the model temperature on these sensitivity patterns.

  1. Animal Model of Dermatophytosis

    PubMed Central

    Shimamura, Tsuyoshi; Kubota, Nobuo; Shibuya, Kazutoshi

    2012-01-01

    Dermatophytosis is a superficial fungal infection caused by dermatophytes that invade the keratinized tissue of humans and animals. Lesions from dermatophytosis exhibit an inflammatory reaction induced to eliminate the invading fungi by means of the host's normal immune function. Many scientists have attempted to establish an experimental animal model to elucidate the pathogenesis of human dermatophytosis and to evaluate drug efficacy. However, current animal models have several issues. In the present paper, we surveyed reports on the methodology of dermatophytosis animal models for tinea corporis, tinea pedis, and tinea unguium and discussed future prospects. PMID:22619489

  2. Modeling EERE Deployment Programs

    SciTech Connect

    Cort, Katherine A.; Hostick, Donna J.; Belzer, David B.; Livingston, Olga V.

    2007-11-08

    The purpose of this report is to compile information and conclusions gathered as part of three separate tasks undertaken as part of the overall project, “Modeling EERE Deployment Programs,” sponsored by the Planning, Analysis, and Evaluation office within the Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE). The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address improvements to modeling in the near term, and note gaps in knowledge where future research is needed.

  3. Temperature Dependent Pspice Model

    SciTech Connect

    Tolbert, Leon M; Cui, Yutian; Chinthavali, Madhu Sudhan

    2012-01-01

    This paper provides a behavioral model in Pspice for a silicon carbide (SiC) power MOSFET rated at 1200 V / 20 A for a wide temperature range. The Pspice model is built using device parameters extracted through experiment. The static and dynamic behavior of the SiC power MOSFET is simulated and compared to the measured data to show the accuracy of the Pspice model. The switching losses are obtained from experiment under multiple operation conditions. The temperature dependent behavior has been simulated and analyzed. Then the parasitics in the circuit have been studied and the effects on the switching behavior are simulated and discussed.

  4. Computer Modeling Of Atomization

    NASA Technical Reports Server (NTRS)

    Giridharan, M.; Ibrahim, E.; Przekwas, A.; Cheuch, S.; Krishnan, A.; Yang, H.; Lee, J.

    1994-01-01

    Improved mathematical models based on fundamental principles of conservation of mass, energy, and momentum were developed for use in computer simulation of the atomization of jets of liquid fuel in rocket engines. The models are also used to study atomization in terrestrial applications and prove especially useful in designing improved industrial sprays - humidifier water sprays, chemical process sprays, and sprays of molten metal. Because the present improved mathematical models are based on first principles, they are minimally dependent on empirical correlations and better able to represent the hot-flow conditions that prevail in rocket engines and are too severe to be accessible for detailed experimentation.

  5. Expert Models and Modeling Processes Associated with a Computer-Modeling Tool

    ERIC Educational Resources Information Center

    Zhang, BaoHui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-01-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using "think aloud" technique…

  6. Supersymmetric sigma models

    SciTech Connect

    Bagger, J.A.

    1984-09-01

    We begin to construct the most general supersymmetric Lagrangians in one, two and four dimensions. We find that the matter couplings have a natural interpretation in the language of the nonlinear sigma model.

  7. Fluidized bed combustor modeling

    NASA Technical Reports Server (NTRS)

    Horio, M.; Rengarajan, P.; Krishnan, R.; Wen, C. Y.

    1977-01-01

    A general mathematical model for the prediction of performance of a fluidized bed coal combustor (FBC) is developed. The basic elements of the model consist of: (1) hydrodynamics of gas and solids in the combustor; (2) description of gas and solids contacting pattern; (3) kinetics of combustion; and (4) absorption of SO2 by limestone in the bed. The model is capable of calculating the combustion efficiency, axial bed temperature profile, carbon hold-up in the bed, oxygen and SO2 concentrations in the bubble and emulsion phases, sulfur retention efficiency and particulate carry over by elutriation. The effects of bed geometry, excess air, location of heat transfer coils in the bed, calcium to sulfur ratio in the feeds, etc. are examined. The calculated results are compared with experimental data. Agreement between the calculated results and the observed data is satisfactory in most cases. Recommendations to enhance the accuracy of prediction of the model are suggested.

  8. Community Atmosphere Model

    Energy Science and Technology Software Center (ESTSC)

    2004-10-18

    The Community Atmosphere Model (CAM) is an atmospheric general circulation model that solves equations for atmospheric dynamics and physics. CAM is an outgrowth of the Community Climate Model at the National Center for Atmospheric Research (NCAR) and was developed as a joint collaborative effort between NCAR and several DOE laboratories, including LLNL. CAM contains several alternative approaches for advancing the atmospheric dynamics. One of these approaches uses a finite-volume method originally developed by personnel at NASA/GSFC. We have developed a scalable version of the finite-volume solver for massively parallel computing systems. FV-CAM is meant to be used in conjunction with the Community Atmosphere Model. It is not stand-alone.

  9. Refining climate models

    ScienceCinema

    Warren, Jeff; Iversen, Colleen; Brooks, Jonathan; Ricciuto, Daniel

    2014-06-26

    Using dogwood trees, Oak Ridge National Laboratory researchers are gaining a better understanding of the role photosynthesis and respiration play in the atmospheric carbon dioxide cycle. Their findings will aid computer modelers in improving the accuracy of climate simulations.

  10. Modeling Viral Capsid Assembly

    PubMed Central

    2014-01-01

    I present a review of the theoretical and computational methodologies that have been used to model the assembly of viral capsids. I discuss the capabilities and limitations of approaches ranging from equilibrium continuum theories to molecular dynamics simulations, and I give an overview of some of the important conclusions about virus assembly that have resulted from these modeling efforts. Topics include the assembly of empty viral shells, assembly around single-stranded nucleic acids to form viral particles, and assembly around synthetic polymers or charged nanoparticles for nanotechnology or biomedical applications. I present some examples in which modeling efforts have promoted experimental breakthroughs, as well as directions in which the connection between modeling and experiment can be strengthened. PMID:25663722

  11. MODELING TREE LEVEL PROCESSES

    EPA Science Inventory

    An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities, limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...

  12. Refining climate models

    SciTech Connect

    Warren, Jeff; Iversen, Colleen; Brooks, Jonathan; Ricciuto, Daniel

    2012-10-31

    Using dogwood trees, Oak Ridge National Laboratory researchers are gaining a better understanding of the role photosynthesis and respiration play in the atmospheric carbon dioxide cycle. Their findings will aid computer modelers in improving the accuracy of climate simulations.

  13. HOMER® Energy Modeling Software

    Energy Science and Technology Software Center (ESTSC)

    2000-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass and other inputs.

  14. Industrial process models

    SciTech Connect

    Pierce, B.

    1983-05-01

    The National Center for Analysis of Energy Systems at Brookhaven National Laboratory has developed process optimization models of the U.S. pulp and paper, iron and steel, and aluminum industries. Given fuel price and product demand projections, the models project modes of operation and energy consumption characteristics that minimize the cost of meeting the expected demands over a 25-year time horizon. Each model includes energy-conserving options to conventional technologies. Examples are vapor recompression, hydropyrolysis, scrap preheating, induction reheating, and titanium diboride cathodes. Model results include fuel use by type and time period as well as use of new technologies. Users can examine the effects of fuel prices, tax policies, product demands, and alternative investment scenarios on fuel and electricity use in each industry.
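
    The cost-minimizing structure described above can be illustrated with a toy linear program: choose the output of a conventional and an energy-conserving technology to meet a fixed product demand at minimum cost. The costs, capacities, and demand below are placeholders, not data from the Brookhaven models.

    ```python
    # Hedged toy sketch of a cost-minimizing process model: pick production levels for a
    # conventional and an energy-conserving technology to meet a product demand at minimum
    # cost. The costs, fuel intensities, and demand are placeholders only.
    from scipy.optimize import linprog

    cost = [4.0, 6.0]               # $/unit output: conventional vs. energy-conserving option
    # Equality constraint: total output from both technologies meets the demand.
    a_eq = [[1.0, 1.0]]
    b_eq = [100.0]
    # Capacity bounds on each technology (assumed).
    bounds = [(0.0, 80.0), (0.0, 80.0)]

    result = linprog(c=cost, A_eq=a_eq, b_eq=b_eq, bounds=bounds, method="highs")
    print("production mix:", result.x, "minimum cost:", result.fun)
    ```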

  15. Ellipsoid-cylinder model

    NASA Astrophysics Data System (ADS)

    Barberis, D.

    1994-08-01

    The data presented in this contribution were obtained in the F2 subsonic wind tunnel of the ONERA Fauga-Mauzac Center. The objective of this work was to obtain detailed experimental data on a separated vortex flow. The model shape has been chosen to be as simple as possible in order to facilitate the mathematical modeling. This model has been defined after preliminary studies in a water tunnel. The present document reports the results obtained with an axisymmetric model at incidence. Attention has been focused on the boundary layer evolution in the zone of separation and on the mechanism leading to the formation of a well detached primary vortex. The flow has been investigated in great detail by using several experimental techniques: surface flow visualizations, surface pressure measurements, field explorations by multihole pressure probes, and an LDV system.

  16. Community Atmosphere Model

    SciTech Connect

    Marin, A. A.; Sawyer, W. B.; Putnam, W. M.

    2004-10-18

    The Community Atmosphere Model (CAM) is an atmospheric general circulation model that solves equations for atmospheric dynamics and physics. CAM is an outgrowth of the Community Climate Model at the National Center for Atmospheric Research (NCAR) and was developed as a joint collaborative effort between NCAR and several DOE laboratories, including LLNL. CAM contains several alternative approaches for advancing the atmospheric dynamics. One of these approaches uses a finite-volume method originally developed by personnel at NASA/GSFC. We have developed a scalable version of the finite-volume solver for massively parallel computing systems. FV-CAM is meant to be used in conjunction with the Community Atmosphere Model. It is not stand-alone.

  17. Modelling pulmonary blood flow.

    PubMed

    Tawhai, Merryn H; Burrowes, Kelly S

    2008-11-30

    Computational model analysis has been used widely to understand and interpret complexity of interactions in the pulmonary system. Pulmonary blood transport is a multi-scale phenomenon that involves scale-dependent structure and function, therefore requiring different model assumptions for the microcirculation and the arterial or venous flows. The blood transport systems interact with the surrounding lung tissue, and are dependent on hydrostatic pressure gradients, control of vasoconstriction, and the topology and material composition of the vascular trees. This review focuses on computational models that have been developed to study the different mechanisms contributing to regional perfusion of the lung. Different models for the microcirculation and the pulmonary arteries are considered, including fractal approaches and anatomically-based methods. The studies that are reviewed illustrate the different complementary approaches that can be used to address the same physiological question of flow heterogeneity. PMID:18434260

  18. Molecular Modeling and Bioinformatics

    Cancer.gov

    Dr. Byungkook Lee, Ph.D. Head, Molecular Modeling and Bioinformatics Section Laboratory of Molecular Biology Building 37, Room 5120 37 Convent Drive, MSC 4262 Bethesda, MD 20891 We use theoretical and computational techniques to help solve biological and

  19. Modeling collective cell motility

    NASA Astrophysics Data System (ADS)

    Rappel, Wouter-Jan

    Eukaryotic cells often move in groups, a critical aspect of many biological and medical processes including wound healing, morphogenesis and cancer metastasis. Modeling can provide useful insights into the fundamental mechanisms of collective cell motility. Constructing models that incorporate the physical properties of the cells, however, is challenging. Here, I discuss our efforts to build a comprehensive cell motility model that includes cell membrane properties, cell-substrate interactions, cell polarity, and cell-cell interaction. The model will be applied to a variety of systems, including motion on micropatterned substrates and the migration of border cells in Drosophila. This work was supported by NIH Grant No. P01 GM078586 and NSF Grant No. 1068869.

  20. Multidimensional Speckle Noise Model

    NASA Astrophysics Data System (ADS)

    López-Martínez, Carlos; Fàbregas, Xavier; Pottier, Eric

    2005-12-01

    One of the main problems of SAR imagery is the presence of speckle noise, which originates from the inherently coherent nature of this type of system. For one-dimensional SAR systems it has been demonstrated that speckle can be considered as a multiplicative noise term. Nevertheless, this simple model cannot be exported when multidimensional SAR imagery is addressed. This paper is devoted to presenting the latest advances in the definition of a multidimensional speckle noise model which does not depend on the data dimensionality. Speckle noise may be modeled by multiplicative and additive noise sources, whose combination is determined by the data's correlation structure. The validity of the proposed model is demonstrated by its application to a real L-band multidimensional SAR dataset acquired by the German E-SAR sensor.
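
    As a rough illustration of the multiplicative-plus-additive combination mentioned above, the sketch below corrupts a synthetic 1-D intensity profile with both noise sources, mixed by an assumed correlation level. The noise statistics and the mixing rule are illustrative assumptions, not the model derived in the paper.

    ```python
    # Hedged sketch of a combined multiplicative-plus-additive noise structure applied to a
    # synthetic 1-D signal. The noise statistics and the mixing weight (tied here to an
    # assumed correlation level) are illustrative only.
    import numpy as np

    rng = np.random.default_rng(42)
    signal = 1.0 + 0.5 * np.sin(np.linspace(0.0, 4.0 * np.pi, 512))   # clean intensity profile
    coherence = 0.7                                                    # assumed correlation level

    n_mult = rng.gamma(shape=4.0, scale=0.25, size=signal.size)        # unit-mean speckle term
    n_add = rng.normal(0.0, 0.1, size=signal.size)                     # additive term
    observed = coherence * signal * n_mult + (1.0 - coherence) * (signal + n_add)

    print("clean vs. observed mean:", round(signal.mean(), 3), round(observed.mean(), 3))
    ```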

  1. Aerosol lenses propagation model.

    PubMed

    Tremblay, Grégoire; Roy, Gilles

    2011-09-01

    We propose a model based on the properties of cascading lenses modulation transfer function (MTF) to reproduce the irradiance of a screen illuminated through a dense aerosol cloud. In this model, the aerosol cloud is broken into multiple thin layers considered as individual lenses. The screen irradiance generated by these individual layers is equivalent to the point-spread function (PSF) of each aerosol lens. Taking the Fourier transform of the PSF as a MTF, we cascade the lenses MTF to find the cloud MTF. The screen irradiance is found with the Fourier transform of this MTF. We show the derivation of the model and we compare the results with the Undique Monte Carlo simulator for four aerosols at three optical depths. The model is in agreement with the Monte Carlo for all the cases tested. PMID:21886230
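
    A one-dimensional numerical sketch of the cascaded-MTF idea is given below: each aerosol layer is represented by a point-spread function, its Fourier transform gives a per-layer MTF, the layer MTFs are multiplied to form the cloud MTF, and the screen irradiance follows from an inverse transform. The Gaussian layer PSFs and their widths are illustrative assumptions, not the aerosol properties used in the paper.

    ```python
    # Hedged 1-D sketch of the cascaded-MTF idea: each aerosol layer is represented by a PSF,
    # its Fourier transform gives a per-layer MTF, the layer MTFs are multiplied to form the
    # cloud MTF, and the screen irradiance is obtained by an inverse transform.
    import numpy as np

    n = 1024
    x = np.linspace(-5.0, 5.0, n)

    def gaussian_psf(width):
        psf = np.exp(-0.5 * (x / width) ** 2)
        return psf / psf.sum()                      # normalize to unit energy

    layer_widths = [0.2, 0.35, 0.5]                 # assumed per-layer blur widths
    cloud_mtf = np.ones(n, dtype=complex)
    for w in layer_widths:
        cloud_mtf *= np.fft.fft(gaussian_psf(w))    # cascade: multiply layer MTFs

    source = np.zeros(n)
    source[n // 2] = 1.0                            # point source illuminating the screen
    irradiance = np.real(np.fft.ifft(np.fft.fft(source) * cloud_mtf))
    print("peak irradiance after the cloud:", irradiance.max())
    ```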

  2. NARSTO NE MODEL

    Atmospheric Science Data Center

    2014-04-25

    Instrument: Chemiluminescence UV Ozone Detector. Location: Northeastern United States. Files: NE Model Readme; Hourly Surface Air Quality Ozone & Nitrogen Measurement Sites.

  3. Modeling cooling coils

    SciTech Connect

    Theerakulpisut, S.; Priprem, S.

    1998-01-01

    Finned-tube heat exchangers commonly used as cooling coils in air conditioning systems undergo complex heat transfer and dehumidification. Due to the presence of water film on the outside surface of the coils, the general approach for an analysis of dry surface is not adequate to predict the performance of such coils. This paper presents a modeling procedure for cooling coils with dehumidification based on the approach of Threlkeld. In order to verify the calculational results of the model, experiments were conducted with an aim to determine the outlet air conditions as well as some other parameters required as the inputs to the model. Comparison between the simulation and experimental results reveals that the model is accurate and suitable for predicting the performance of cooling coils with dehumidification.

  4. Model deformation system

    NASA Technical Reports Server (NTRS)

    Holmes, H. K.

    1983-01-01

    The development of a system to measure model deflections encountered in the National Transonic Facility is discussed. The goal is to be able to measure peak deflections of up to 3 in. with accuracies to within 0.0025 in. over an area 1 m square as the model pitches through an included angle of 30 deg. Stereophotogrammetric techniques are being implemented, with the initial system being an extension of standard techniques. A second system, which will be all electronic, is under development. Both techniques require targets to be strategically placed on the model. Active targets are being developed for location in the model in order to maximize the signal-to-noise ratio and to approximate a point source. Image processing techniques and stereophotogrammetric data reduction programs are being implemented to perform the data reduction tasks.

  5. Structural Equation Model Trees

    PubMed Central

    Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman

    2015-01-01

    In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree structures that separate a data set recursively into subsets with significantly different parameter estimates in a SEM. SEM Trees provide means for finding covariates and covariate interactions that predict differences in structural parameters in observed as well as in latent space and facilitate theory-guided exploration of empirical data. We describe the methodology, discuss theoretical and practical implications, and demonstrate applications to a factor model and a linear growth curve model. PMID:22984789

  6. Orsted Initial Field Model

    NASA Technical Reports Server (NTRS)

    Olsen, N.; Holme, R.; Hulot, G.; Sabaka, T.; Neubert, T.; Toffner-Clausen, L.; Primdahl, F.; Jorgensen, J.; Leger, J.-M.; Barraclough, D.; Smith, David E. (Technical Monitor)

    2000-01-01

    Magnetic measurements taken by the Orsted satellite during geomagnetic quiet conditions around January 1, 2000 have been used to derive a spherical harmonic model of the Earth's magnetic field for epoch 2000.0. The maximum degree and order of the model is 19 for internal, and 2 for external, source fields; however, coefficients above degree 14 may not be robust. Such detailed models exist for only one previous epoch, 1980. Achieved rms misfit is 2 nT for the scalar intensity and 4 nT for the vector components perpendicular to the magnetic field. This model is of higher detail than the IGRF 2000, which for scientific purposes related to the Orsted mission it supersedes.

  7. Battery Life Predictive Model

    Energy Science and Technology Software Center (ESTSC)

    2009-12-31

    The Software consists of a model used to predict battery capacity fade and resistance growth for arbitrary cycling and temperature profiles. It allows the user to extrapolate from experimental data to predict actual life cycle.

  8. Solar Furnace Model

    ERIC Educational Resources Information Center

    Palmer, Dennis L.; Olsen, Richard W.

    1977-01-01

    Described is how to build a solar furnace model. A detailed list of materials and methods is included along with diagrams. This particular activity is part of an audiotutorial unit concerned with the energy crisis and energy alternatives. (MA)

  9. Modeling of piled foundations

    SciTech Connect

    Kriger, G.A.

    1982-10-01

    In a fixed offshore platform, the steel jacket superstructure and its supporting piled foundation are more conveniently analyzed if treated separately. There are major structural and behavioral differences between the jacket and foundation, and the two do not lend themselves to similar analytical methods. This paper presents basic techniques for constructing linear models that simulate the foundation behavior at the superstructure/foundation boundary. Use of these models permits independent superstructure analyses. Selection of the model type and its degree of refinement are described from a global overview of the structure, available data, and ramification of analytical results. Construction of the foundation simulation model follows routine procedures using results of an independent foundation analysis.

  10. Materials modelling in London

    NASA Astrophysics Data System (ADS)

    Ciudad, David

    2016-04-01

    Angelos Michaelides, Professor in Theoretical Chemistry at University College London (UCL) and co-director of the Thomas Young Centre (TYC), explains to Nature Materials the challenges in materials modelling and the objectives of the TYC.

  11. Lifting Body Model

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Scaled model of the Langley Personnel Launch System (PLS) Vehicle. Concept used to measure aerodynamic performance characteristics over a wide range of flow conditions and attitudes in several Langley wind tunnels.

  12. Software development: Stratosphere modeling

    NASA Technical Reports Server (NTRS)

    Chen, H. C.

    1977-01-01

    A more comprehensive model for stratospheric chemistry and transport theory was developed for the purpose of aiding predictions of changes in the stratospheric ozone content as a consequence of natural and anthropogenic processes. This new and more advanced stratospheric model is time dependent and the dependent variables are zonal means of the relevant meteorological quantities which are functions of latitude and height. The model was constructed by the best mathematical approach on a large IBM S360 in American National Standard FORTRAN. It will be both a scientific tool and an assessment device used to evaluate other models. The interactions of dynamics, photochemistry and radiation in the stratosphere can be governed by a set of fundamental dynamical equations.

  13. UPDATING APPLIED DIFFUSION MODELS

    EPA Science Inventory

    Most diffusion models currently used in air quality applications are substantially out of date with understanding of turbulence and diffusion in the planetary boundary layer. Under a Cooperative Agreement with the Environmental Protection Agency, the American Meteorological Socie...

  14. Dietary Exposure Potential Model

    EPA Science Inventory

    Existing food consumption and contaminant residue databases, typically products of nutrition and regulatory monitoring, contain useful information to characterize dietary intake of environmental chemicals. A PC-based model with resident database system, termed the Die...

  15. Base Flow Model Validation

    NASA Technical Reports Server (NTRS)

    Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John

    2011-01-01

    A method was developed of obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitude of relevance to NASA launcher designs. The base flow data was used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block procedure to validation, where cold, non-reacting test data was first used for validation, followed by more complex reacting base flow validation.

  16. Stochastic modelling of intermittency.

    PubMed

    Stemler, Thomas; Werner, Johannes P; Benner, Hartmut; Just, Wolfram

    2010-01-13

    Recently, methods have been developed to model low-dimensional chaotic systems in terms of stochastic differential equations. We tested such methods in an electronic circuit experiment. We aimed to obtain reliable drift and diffusion coefficients even without a pronounced time-scale separation of the chaotic dynamics. By comparing the analytical solutions of the corresponding Fokker-Planck equation with experimental data, we show here that crisis-induced intermittency can be described in terms of a stochastic model which is dominated by state-space-dependent diffusion. Further on, we demonstrate and discuss some limits of these modelling approaches using numerical simulations. This enables us to state a criterion that can be used to decide whether a stochastic model will capture the essential features of a given time series. PMID:19948556
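
    The kind of stochastic description referred to above can be sketched as a Langevin equation with a drift term and a state-dependent diffusion term, integrated by Euler-Maruyama. The bistable drift and the diffusion function below are illustrative assumptions, not the coefficients extracted from the circuit experiment.

    ```python
    # Hedged sketch of a Langevin model with drift and state-dependent diffusion, integrated
    # by Euler-Maruyama. The specific drift and diffusion functions are illustrative only.
    import numpy as np

    def euler_maruyama(drift, diffusion, x0=0.1, dt=1e-3, n_steps=50000, seed=7):
        rng = np.random.default_rng(seed)
        x = np.empty(n_steps)
        x[0] = x0
        for k in range(1, n_steps):
            dw = np.sqrt(dt) * rng.standard_normal()
            x[k] = x[k-1] + drift(x[k-1]) * dt + diffusion(x[k-1]) * dw
        return x

    series = euler_maruyama(drift=lambda x: -x * (x**2 - 1.0),       # bistable drift (assumed)
                            diffusion=lambda x: 0.2 + 0.3 * abs(x))  # state-dependent diffusion
    print("sample variance of the simulated series:", round(series.var(), 3))
    ```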

  17. Using the Partnership Model

    ERIC Educational Resources Information Center

    Wilks, Bob

    1977-01-01

    Demonstrates how the Partnership Model can be utilized in the real world by showing how it served as a guide during the production of a film on female menopause for the College of Human Medicine. (MH)

  18. Contact dynamics math model

    NASA Technical Reports Server (NTRS)

    Glaese, John R.; Tobbe, Patrick A.

    1986-01-01

    The Space Station Mechanism Test Bed consists of a hydraulically driven, computer controlled six degree of freedom (DOF) motion system with which docking, berthing, and other mechanisms can be evaluated. Measured contact forces and moments are provided to the simulation host computer to enable representation of orbital contact dynamics. This report describes the development of a generalized math model which represents the relative motion between two rigid orbiting vehicles. The model allows motion in six DOF for each body, with no vehicle size limitation. The rotational and translational equations of motion are derived. The method used to transform the forces and moments from the sensor location to the vehicles' centers of mass is also explained. Two math models of docking mechanisms, a simple translational spring and the Remote Manipulator System end effector, are presented along with simulation results. The translational spring model is used in an attempt to verify the simulation with compensated hardware in the loop results.
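
    The transformation of measured loads from the sensor location to a vehicle's center of mass mentioned above is, in the standard rigid-body form, a cross-product correction to the moment with the force left unchanged. The sketch below implements that relation; the function name and the numerical force, moment, and offset vectors are illustrative only.

    ```python
    # Hedged sketch of the standard rigid-body transfer of a measured load from the sensor
    # location to a vehicle's center of mass: the force is unchanged and the moment picks up
    # the cross-product term r x F, where r points from the center of mass to the sensor.
    import numpy as np

    def transfer_load_to_cm(force_sensor, moment_sensor, r_cm_to_sensor):
        force_cm = force_sensor                                   # force is unchanged
        moment_cm = moment_sensor + np.cross(r_cm_to_sensor, force_sensor)
        return force_cm, moment_cm

    F = np.array([10.0, 0.0, -5.0])          # N, measured contact force (assumed values)
    M = np.array([0.0, 2.0, 0.0])            # N*m, measured moment at the sensor
    r = np.array([0.5, 0.1, 0.0])            # m, center-of-mass-to-sensor offset
    print(transfer_load_to_cm(F, M, r))
    ```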

  19. Maximally Expressive Modeling

    NASA Technical Reports Server (NTRS)

    Jaap, John; Davis, Elizabeth; Richardson, Lea

    2004-01-01

    Planning and scheduling systems organize tasks into a timeline or schedule. Tasks are logically grouped into containers called models. Models are a collection of related tasks, along with their dependencies and requirements, that when met will produce the desired result. One challenging domain for a planning and scheduling system is the operation of on-board experiments for the International Space Station. In these experiments, the equipment used is among the most complex hardware ever developed; the information sought is at the cutting edge of scientific endeavor; and the procedures are intricate and exacting. Scheduling is made more difficult by a scarcity of station resources. The models to be fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibilities of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling International Space Station experiment operations calls for a maximally expressive modeling schema.

  20. Bonabeau hierarchy models revisited

    NASA Astrophysics Data System (ADS)

    Lacasa, Lucas; Luque, Bartolo

    2006-07-01

    What basic processes generate hierarchy in a collective? The Bonabeau model provides a simple mechanism based on randomness, which develops self-organization through both winner/loser effects and a relaxation process. A phase transition between egalitarian and hierarchic states has been found both analytically and numerically in previous works. In this paper we present a different approach: by means of a discrete scheme we develop a mean field approximation that not only reproduces the phase transition but also allows us to characterize the complexity of the hierarchic phase. In the same spirit, we study a new version of the Bonabeau model, developed by Stauffer et al. Several previous works have described numerically the presence of a similar phase transition in this latter version. We find surprising results in this model that are properly interpreted not as a phase transition but as a change in the fixed-point structure of this version of the Bonabeau model.
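
    A minimal, well-mixed sketch of Bonabeau-type winner/loser dynamics with relaxation is given below: random pairs fight, the win probability is a sigmoid of the fitness difference, winners gain and losers lose fitness, and all scores relax toward zero. The well-mixed pairing (no lattice) and the parameter values are simplifications for illustration, not the discrete scheme developed in the paper.

    ```python
    # Hedged, well-mixed sketch of Bonabeau-type winner/loser dynamics with relaxation.
    import numpy as np

    def bonabeau(n_agents=200, eta=2.0, mu=0.01, n_rounds=20000, seed=3):
        rng = np.random.default_rng(seed)
        h = np.zeros(n_agents)                      # fitness / hierarchy score of each agent
        for _ in range(n_rounds):
            i, j = rng.choice(n_agents, size=2, replace=False)
            p_i_wins = 1.0 / (1.0 + np.exp(-eta * (h[i] - h[j])))
            if rng.random() < p_i_wins:
                h[i] += 1.0
                h[j] -= 1.0
            else:
                h[i] -= 1.0
                h[j] += 1.0
            h *= (1.0 - mu)                         # relaxation toward the egalitarian state
        return h.std()                              # spread of scores as a hierarchy measure

    print("hierarchy measure (std of scores):", round(bonabeau(), 2))
    ```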

  1. Argentina wheat yield model

    NASA Technical Reports Server (NTRS)

    Callis, S. L.; Sakamoto, C.

    1984-01-01

    Five models based on multiple regression were developed to estimate wheat yields for the five wheat growing provinces of Argentina. Meteorological data sets were obtained for each province by averaging data for stations within each province. Predictor variables for the models were derived from monthly total precipitation, average monthly mean temperature, and average monthly maximum temperature. Buenos Aires was the only province for which a trend variable was included, because of an increasing trend in yield due to technology from 1950 to 1963.
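
    A minimal sketch of the regression form described above is given below: yield is regressed on monthly weather-derived predictors plus a linear technology trend by ordinary least squares. The arrays are synthetic placeholders, not the historical Argentine data used in the report.

    ```python
    # Hedged sketch of a multiple-regression yield model: yield regressed on weather-derived
    # predictors plus a linear technology trend. The arrays are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1950, 1984)
    precip = rng.uniform(20, 120, size=years.size)        # monthly total precipitation proxy
    tmean = rng.uniform(10, 22, size=years.size)          # monthly mean temperature proxy
    trend = np.clip(years - 1950, 0, 13)                  # trend variable, e.g. 1950-1963 only
    yield_obs = (1.2 + 0.004 * precip - 0.02 * tmean + 0.03 * trend
                 + rng.normal(0, 0.1, years.size))

    X = np.column_stack([np.ones(years.size), precip, tmean, trend])
    coeffs, *_ = np.linalg.lstsq(X, yield_obs, rcond=None)   # ordinary least squares
    print("fitted coefficients (intercept, precip, tmean, trend):", np.round(coeffs, 4))
    ```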

  2. Apache Scale Model Helicopter

    NASA Technical Reports Server (NTRS)

    1995-01-01

    NASA Langley Research Center's (LaRC) Electromagnetics Research Branch (ERB) performs antenna radiation pattern measurements on a communications antenna mounted on a 1/7th scale model of a US Army Apache helicopter. The NASA LaRC ERB participates in a government, industry, and university sponsored helicopter consortium to advance computational electromagnetics (CEM) code development for antenna radiation pattern predictions. Scale model antenna measurements serve as verification tools and are an integral part of the CEM code development process.

  3. Modeling using optimization routines

    NASA Technical Reports Server (NTRS)

    Thomas, Theodore

    1995-01-01

    Modeling using mathematical optimization dynamics is a design tool used in magnetic suspension system development. MATLAB (software) is used to calculate minimum cost and other desired constraints. The parameters to be measured are programmed into mathematical equations. MATLAB will calculate answers for each set of inputs; inputs cover the boundary limits of the design. A Magnetic Suspension System using Electromagnets Mounted in a Planar Array is a design system that makes use of optimization modeling.

  4. The GRAM-3 model

    NASA Technical Reports Server (NTRS)

    Justus, C. G.

    1987-01-01

    The Global Reference Atmosphere Model (GRAM) is under continuous development and improvement. GRAM data were compared with Middle Atmosphere Program (MAP) predictions and with shuttle data. An important note: users should employ only step sizes in altitude that give vertical density gradients consistent with shuttle-derived density data. Using too small a vertical step size (finer than 1 km) will result in what appears to be unreasonably high values of density shears but what in reality is noise in the model.

  5. Modeling armed conflicts.

    PubMed

    Kress, Moshe

    2012-05-18

    Armed conflicts have been prevalent throughout history, in some cases having very great consequences. To win, one needs to understand the characteristics of an armed conflict and be prepared with resources and capabilities for responding to its specific challenges. An important tool for understanding these characteristics and challenges is a model--an abstraction of the field of conflict. Models have evolved through the years, addressing different conflict scenarios with varying techniques. PMID:22605764

  6. CO2 laser modeling

    NASA Technical Reports Server (NTRS)

    Johnson, Barry

    1992-01-01

    The topics covered include the following: (1) CO2 laser kinetics modeling; (2) gas lifetimes in pulsed CO2 lasers; (3) frequency chirp and laser pulse spectral analysis; (4) LAWS A' Design Study; and (5) discharge circuit components for LAWS. The appendices include LAWS Memos, computer modeling of pulsed CO2 lasers for lidar applications, discharge circuit considerations for pulsed CO2 lidars, and presentation made at the Code RC Review.

  7. Geysers injection modeling

    SciTech Connect

    Pruess, K.

    1994-04-01

    Our research is concerned with mathematical modeling techniques for engineering design and optimization of water injection in vapor-dominated systems. The emphasis in the project has been on the understanding of physical processes and mechanisms during injection, applications to field problems, and on transfer of numerical simulation capabilities to the geothermal community. This overview summarizes recent work on modeling injection interference in the Southeast Geysers, and on improving the description of two-phase flow processes in heterogeneous media.

  8. Models of multiquark states

    SciTech Connect

    Lipkin, H.J.

    1986-01-01

    The success of simple constituent quark models in single-hadron physics and their failure in multiquark physics is discussed, emphasizing the relation between meson and baryon spectra, hidden color and the color matrix, breakup decay modes, coupled channels, and hadron-hadron interactions via flipping and tunneling of flux tubes. Model-independent predictions for possible multiquark bound states are considered and the most promising candidates suggested. A quark approach to baryon-baryon interactions is discussed.

  9. OCH Strap Model Test

    SciTech Connect

    Weber, K.; /Fermilab

    1987-08-26

    The OCH model was stacked using the appropriate spacers between each absorber plate. Steel bars measuring 3 inches wide by 1/4 inch thick were welded, using a 1/8-inch fillet weld, along all the corner edges except the outer radius edges. On the outer radius, the straps were bolted to the end plates and to plates 9 and 17. The straps on the outer radius were also set in towards the center by approximately 3 inches. The spacers were then knocked out. Twelve strain gauges were mounted on the model. See figure 1 and the OCH strap model log book for locations. Each rosette was centered in the gap between two absorber plates. The finite element plate model can predict the primary deformations of the OH module in both the cantilever and crushing modes to within 11% of the measured values. The primary stresses away from the support plate for the cantilever mode can be predicted to within 13% by this model. Near the support plate, where large shear stresses exist, ANSYS will overpredict the measured stresses substantially. This is probably due to the model's inherent inability to allow for shear stress concentrations at the welds. The same over-prediction was seen in the side straps during the OH crush test comparison and is probably attributable to the high shear force in this mode. The simple finite element plate model will provide a suitable model of OH module stiffness for use in the analysis of the module assembly. The calculation of shear stresses can be improved by applying the ANSYS-calculated inter-element forces to traditional weld strength calculations.

  10. Modelling heart rate kinetics.

    PubMed

    Zakynthinaki, Maria S

    2015-01-01

    The objective of the present study was to formulate a simple and at the same time effective mathematical model of heart rate kinetics in response to movement (exercise). Based on an existing model, a system of two coupled differential equations which give the rate of change of heart rate and the rate of change of exercise intensity is used. The modifications introduced to the existing model are justified and discussed in detail, while models of blood lactate accumulation in respect to time and exercise intensity are also presented. The main modification is that the proposed model has now only one parameter which reflects the overall cardiovascular condition of the individual. The time elapsed after the beginning of the exercise, the intensity of the exercise, as well as blood lactate are also taken into account. Application of the model provides information regarding the individual's cardiovascular condition and is able to detect possible changes in it, across the data recording periods. To demonstrate examples of successful numerical fit of the model, constant intensity experimental heart rate data sets of two individuals have been selected and numerical optimization was implemented. In addition, numerical simulations provided predictions for various exercise intensities and various cardiovascular condition levels. The proposed model can serve as a powerful tool for a complete means of heart rate analysis, not only in exercise physiology (for efficiently designing training sessions for healthy subjects) but also in the areas of cardiovascular health and rehabilitation (including application in population groups for which direct heart rate recordings at intense exercises are not possible or not allowed, such as elderly or pregnant women). PMID:25876164
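
    A hedged sketch of a two-equation structure of the kind described above is given below: heart rate relaxes toward a demand set by the exercise intensity, while intensity follows a prescribed protocol, and a single parameter stands in for cardiovascular condition. The specific right-hand sides, the intensity-to-demand mapping, and the parameter values are illustrative assumptions, not the published equations.

    ```python
    # Hedged sketch of a two-ODE heart rate kinetics structure: heart rate relaxes toward a
    # demand set by exercise intensity; a single parameter lam stands in for cardiovascular
    # condition. The right-hand sides and parameter values are illustrative assumptions.
    import numpy as np
    from scipy.integrate import solve_ivp

    HR_REST, HR_MAX = 60.0, 190.0

    def rhs(t, y, lam, target_intensity):
        hr, v = y                                   # heart rate and exercise intensity
        dv = 0.5 * (target_intensity - v)           # intensity ramps toward the protocol value
        demand = HR_REST + (HR_MAX - HR_REST) * v   # assumed intensity-to-demand mapping
        dhr = lam * (demand - hr)                   # lam reflects cardiovascular condition
        return [dhr, dv]

    sol = solve_ivp(rhs, (0.0, 600.0), [HR_REST, 0.0], args=(0.05, 0.7), max_step=1.0)
    print("heart rate after 10 min of constant-intensity exercise:", round(sol.y[0, -1], 1))
    ```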

  11. Argentina soybean yield model

    NASA Technical Reports Server (NTRS)

    Callis, S. L.; Sakamoto, C.

    1984-01-01

    A model based on multiple regression was developed to estimate soybean yields for the country of Argentina. A meteorological data set was obtained for the country by averaging data for stations within the soybean growing area. Predictor variables for the model were derived from monthly total precipitation and monthly average temperature. A trend variable was included for the years 1969 to 1978 since an increasing trend in yields due to technology was observed between these years.

  12. ADVANCED CHEMISTRY BASINS MODEL

    SciTech Connect

    William Goddard III; Lawrence Cathles III; Mario Blanco; Paul Manhardt; Peter Meulbroek; Yongchun Tang

    2004-05-01

    The Advanced Chemistry Basin Model project has been operative for 48 months. During this period, about half the project tasks are on the projected schedule. On average the project is somewhat behind schedule (90%). Unanticipated issues are causing model integration to take longer than scheduled, delaying final debugging and manual development. It is anticipated that a short extension will be required to fulfill all contract obligations.

  13. SSUSI Aurora Forecast Model

    NASA Astrophysics Data System (ADS)

    Hsieh, S. W.; Zhang, Y.; Schaefer, R. K.; Romeo, G.; Paxton, L.

    2013-12-01

    A new capability has been developed at JHU/APL for forecasting the global aurora quantities based on the DMSP SSUSI data and the TIMED/GUVI Global Aurora Model. The SSUSI Aurora Forecast Model predicts the electron energy flux, mean energy, and equatorward boundary in the auroral oval for up to 1 day or 15 DMSP orbits in advance. In our presentation, we will demonstrate this newly implemented capability and its results. The future improvement plan will be discussed too.

  14. Argentina corn yield model

    NASA Technical Reports Server (NTRS)

    Callis, S. L.; Sakamoto, C.

    1984-01-01

    A model based on multiple regression was developed to estimate corn yields for the country of Argentina. A meteorological data set was obtained for the country by averaging data for stations within the corn-growing area. Predictor variables for the model were derived from monthly total precipitation, average monthly mean temperature, and average monthly maximum temperature. A trend variable was included for the years 1965 to 1980 since an increasing trend in yields due to technology was observed between these years.

  15. Modelling Heart Rate Kinetics

    PubMed Central

    Zakynthinaki, Maria S.

    2015-01-01

    The objective of the present study was to formulate a simple and at the same time effective mathematical model of heart rate kinetics in response to movement (exercise). Based on an existing model, a system of two coupled differential equations which give the rate of change of heart rate and the rate of change of exercise intensity is used. The modifications introduced to the existing model are justified and discussed in detail, while models of blood lactate accumulation in respect to time and exercise intensity are also presented. The main modification is that the proposed model has now only one parameter which reflects the overall cardiovascular condition of the individual. The time elapsed after the beginning of the exercise, the intensity of the exercise, as well as blood lactate are also taken into account. Application of the model provides information regarding the individual’s cardiovascular condition and is able to detect possible changes in it, across the data recording periods. To demonstrate examples of successful numerical fit of the model, constant intensity experimental heart rate data sets of two individuals have been selected and numerical optimization was implemented. In addition, numerical simulations provided predictions for various exercise intensities and various cardiovascular condition levels. The proposed model can serve as a powerful tool for a complete means of heart rate analysis, not only in exercise physiology (for efficiently designing training sessions for healthy subjects) but also in the areas of cardiovascular health and rehabilitation (including application in population groups for which direct heart rate recordings at intense exercises are not possible or not allowed, such as elderly or pregnant women). PMID:25876164

  16. Theory Modeling and Simulation

    SciTech Connect

    Shlachter, Jack

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  17. Model compilation: An approach to automated model derivation

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Baudin, Catherine; Iwasaki, Yumi; Nayak, Pandurang; Tanaka, Kazuo

    1990-01-01

    An approach is introduced to automated model derivation for knowledge based systems. The approach, model compilation, involves procedurally generating the set of domain models used by a knowledge based system. With an implemented example, how this approach can be used to derive models of different precision and abstraction is illustrated, and models are tailored to different tasks, from a given set of base domain models. In particular, two implemented model compilers are described, each of which takes as input a base model that describes the structure and behavior of a simple electromechanical device, the Reaction Wheel Assembly of NASA's Hubble Space Telescope. The compilers transform this relatively general base model into simple task specific models for troubleshooting and redesign, respectively, by applying a sequence of model transformations. Each transformation in this sequence produces an increasingly more specialized model. The compilation approach lessens the burden of updating and maintaining consistency among models by enabling their automatic regeneration.

  18. Models, Metadata and Metafor

    NASA Astrophysics Data System (ADS)

    Lawrence, B.; Guilyardi, E.; Treshansky, A.; Valcke, S.

    2009-04-01

    The EU project Metafor (Metadata For Climate Modelling Digital Repositories) is developing a common information structure for describing complex climate models, their runtime technical and scientific context, and their data outputs. It is also building tools to help create, discover, view and manipulate the resulting metadata, helping scientists to make the most out of the huge volume of climate model data routinely produced. The project will leave a legacy of services deployed to maintain the infrastructure developed within the project. This presentation will introduce the main concepts defined by model metadata within Metafor (aka the Common Information Model or CIM), and how the CIM is building on other metadata initiatives both in Europe and worldwide - ranging from the exploitation of ISO standards, to support for the European spatial data infrastructure (INSPIRE) and the Global Earth Observation System of Systems (GEOSS). It will also outline how Metafor is contributing to the metadata requirements and collection of the forthcoming fifth Coupled Model Intercomparison Project (CMIP5) led by the World Climate Research Programme (WCRP) in support of the next IPCC assessment.

  19. Meteoroid ablation models

    NASA Astrophysics Data System (ADS)

    Popova, Olga

    2004-12-01

    The fate of meteoroids entering the atmosphere is determined by their size, velocity and material properties. Material ablated from small meteors (roughly R ≤ 0.01-1 cm) is mostly deposited between 120 and 80 km altitude. Larger bodies (up to meter sizes) penetrate deeper into the atmosphere (down to 20 km altitude). Meteoroids of cometary origin typically have a higher termination altitude owing to their material properties and higher entry velocity. Fast meteoroids (V > 30-40 km/s) may lose part of their material at higher altitudes due to sputtering. The local flow regime around the falling body determines the heat transfer and mass loss processes. The classical approach to meteor interaction with the atmosphere describes two limiting cases: a large meteoroid at relatively low altitude, where a shock wave is formed (hydrodynamical models), and a small meteoroid or high altitudes, the free-molecule regime of interaction, which assumes no collisions between evaporated meteoroid particles. These evaporated particles form the initial train, which then spreads into the ambient air by diffusion. Ablation models should make it possible to describe the physical conditions that occur around the meteor body. Several self-consistent hydrodynamical models have been developed, but similar models for the transition and free-molecule regimes are still under study. This paper reviews existing ablation models and discusses model boundaries.

  20. Ion thruster performance model

    NASA Technical Reports Server (NTRS)

    Brophy, J. R.

    1984-01-01

    A model of ion thruster performance is developed for high flux density, cusped magnetic field thruster designs. This model is formulated in terms of the average energy required to produce an ion in the discharge chamber plasma and the fraction of these ions that are extracted to form the beam. The direct loss of high energy (primary) electrons from the plasma to the anode is shown to have a major effect on thruster performance. The model provides simple algebraic equations enabling one to calculate the beam ion energy cost, the average discharge chamber plasma ion energy cost, the primary electron density, the primary-to-Maxwellian electron density ratio and the Maxwellian electron temperature. Experiments indicate that the model correctly predicts the variation in plasma ion energy cost for changes in propellant gas (Ar, Kr and Xe), grid transparency to neutral atoms, beam extraction area, discharge voltage, and discharge chamber wall temperature. The model and experiments indicate that thruster performance may be described in terms of only four thruster configuration dependent parameters and two operating parameters. The model also suggests that improved performance should be exhibited by thruster designs which extract a large fraction of the ions produced in the discharge chamber, which have good primary electron and neutral atom containment and which operate at high propellant flow rates.

  1. Animal models of scoliosis.

    PubMed

    Bobyn, Justin D; Little, David G; Gray, Randolph; Schindeler, Aaron

    2015-04-01

    Multiple techniques designed to induce scoliotic deformity have been applied across many animal species. We have undertaken a review of the literature regarding experimental models of scoliosis in animals to discuss their utility in comprehending disease aetiology and treatment. Models of scoliosis in animals can be broadly divided into quadrupedal and bipedal experiments. Quadrupedal models, in the absence of an axial gravitational force, depend upon the development of a mechanical asymmetry along the spine to initiate a scoliotic deformity. Bipedal models more accurately mimic human posture and consequently are subject to similar forces due to gravity, which have long been appreciated to be a contributing factor in the development of scoliosis. Many effective models of scoliosis in smaller animals have not been successfully translated to primates and humans. Though these models may not clarify the aetiology of human scoliosis, by providing a reliable and reproducible deformity in the spine they are a useful means with which to test interventions designed to correct and prevent deformity. PMID:25492698

  2. Structure of Coset Models

    NASA Astrophysics Data System (ADS)

    Koester, Soeren

    2003-08-01

    We study inclusions of local, chiral, conformal quantum theories C which are contained in an ambient theory B and commute with another given subtheory A. These subtheories C are called Coset models. Most of our results are model-independent, although our analysis is motivated by the inclusions of current algebras and their Coset models. We prove that to every given A contained in B there is a unique, inner representation U^A which implements conformal symmetry on the subnet. The local observables of B which commute with U^A form the maximal Coset model C_max. Assuming U^A to be generated by integrals of a quantum field affiliated with the subnet A, we show: The inclusion of the subnet and of its Coset models is directly analogous to the inclusion of chiral observables in a local, conformal theory in 1+1 dimensions. The local observables of the maximal Coset model associated with a given region are found to be characterised by their commuting with the local observables of A associated with the very same region. We give applications and discuss possible generalisations of our methods.

  3. Causal Rasch models

    PubMed Central

    Stenner, A. Jackson; Fisher, William P.; Stone, Mark H.; Burdick, Donald S.

    2013-01-01

    Rasch's unidimensional models for measurement show how to connect object measures (e.g., reader abilities), measurement mechanisms (e.g., machine-generated cloze reading items), and observational outcomes (e.g., counts correct on reading instruments). Substantive theory shows what interventions or manipulations to the measurement mechanism can be traded off against a change to the object measure to hold the observed outcome constant. A Rasch model integrated with a substantive theory dictates the form and substance of permissible interventions. Rasch analysis, absent construct theory and an associated specification equation, is a black box in which understanding may be more illusory than not. Finally, the quantitative hypothesis can be tested by comparing theory-based trade-off relations with observed trade-off relations. Only quantitative variables (as measured) support such trade-offs. Note that to test the quantitative hypothesis requires more than manipulation of the algebraic equivalencies in the Rasch model or descriptively fitting data to the model. A causal Rasch model involves experimental intervention/manipulation on either reader ability or text complexity or a conjoint intervention on both simultaneously to yield a successful prediction of the resultant observed outcome (count correct). We conjecture that when this type of manipulation is introduced for individual reader text encounters and model predictions are consistent with observations, the quantitative hypothesis is sustained. PMID:23986726
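
    The dichotomous Rasch form underlying this discussion, P(correct) = exp(theta - b) / (1 + exp(theta - b)), makes the trade-off property easy to demonstrate: shifting reader ability and item (text) difficulty by the same amount leaves the predicted success probability, and hence the expected count correct, unchanged. A minimal sketch follows; the numerical values are illustrative only.

    ```python
    # Hedged sketch of the dichotomous Rasch form: P(correct) = exp(theta - b)/(1 + exp(theta - b)).
    # The example shows the trade-off: a joint shift of ability and difficulty leaves the
    # predicted success probability unchanged. Numerical values are illustrative only.
    import math

    def rasch_p(theta, b):
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    theta, b, shift = 1.2, 0.5, 0.8
    print(round(rasch_p(theta, b), 4))                  # baseline probability
    print(round(rasch_p(theta + shift, b + shift), 4))  # identical after the joint shift
    ```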

  4. Learning planar ising models

    SciTech Connect

    Johnson, Jason K; Chertkov, Michael; Netrapalli, Praneeth

    2010-11-12

    Inference and learning of graphical models are both well-studied problems in statistics and machine learning that have found many applications in science and engineering. However, exact inference is intractable in general graphical models, which suggests the problem of seeking the best approximation to a collection of random variables within some tractable family of graphical models. In this paper, we focus our attention on the class of planar Ising models, for which inference is tractable using techniques of statistical physics [Kac and Ward; Kasteleyn]. Based on these techniques and recent methods for planarity testing and planar embedding [Chrobak and Payne], we propose a simple greedy algorithm for learning the best planar Ising model to approximate an arbitrary collection of binary random variables (possibly from sample data). Given the set of all pairwise correlations among variables, we select a planar graph and optimal planar Ising model defined on this graph to best approximate that set of correlations. We present the results of numerical experiments evaluating the performance of our algorithm.
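
    A simplified sketch of the greedy selection idea is given below: candidate edges are ranked by the strength of their pairwise correlation and added only if the graph remains planar, using networkx's planarity test. Ranking by absolute correlation is a simplification of the paper's criterion, and the Ising parameter fit itself is not reproduced.

    ```python
    # Hedged sketch of greedy planar-graph selection: rank edges by |pairwise correlation|
    # and keep an edge only if the graph stays planar. This simplifies the paper's criterion
    # and omits the Ising parameter estimation.
    import itertools
    import numpy as np
    import networkx as nx

    def greedy_planar_graph(corr):
        n = corr.shape[0]
        candidates = sorted(itertools.combinations(range(n), 2),
                            key=lambda e: abs(corr[e]), reverse=True)
        g = nx.Graph()
        g.add_nodes_from(range(n))
        for u, v in candidates:
            g.add_edge(u, v)
            is_planar, _ = nx.check_planarity(g)
            if not is_planar:
                g.remove_edge(u, v)                 # reject edges that break planarity
        return g

    rng = np.random.default_rng(0)
    samples = rng.choice([-1, 1], size=(500, 8))    # placeholder binary data
    corr = np.corrcoef(samples, rowvar=False)
    print("edges kept:", greedy_planar_graph(corr).number_of_edges())
    ```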

  5. Modeling of transitional flows

    NASA Technical Reports Server (NTRS)

    Lund, Thomas S.

    1988-01-01

    An effort directed at developing improved transitional models was initiated. The focus of this work was concentrated on the critical assessment of a popular existing transitional model developed by McDonald and Fish in 1972. The objective of this effort was to identify the shortcomings of the McDonald-Fish model and to use the insights gained to suggest modifications or alterations of the basic model. In order to evaluate the transitional model, a compressible boundary layer code was required. Accordingly, a two-dimensional compressible boundary layer code was developed. The program was based on a three-point fully implicit finite difference algorithm where the equations were solved in an uncoupled manner with second order extrapolation used to evaluate the non-linear coefficients. Iteration was offered as an option if the extrapolation error could not be tolerated. The differencing scheme was arranged to be second order in both spatial directions on an arbitrarily stretched mesh. A variety of boundary condition options were implemented including specification of an external pressure gradient, specification of a wall temperature distribution, and specification of an external temperature distribution. Overall the results of the initial phase of this work indicate that the McDonald-Fish model does a poor job at predicting the details of the turbulent flow structure during the transition region.

  6. Modeling of diastole.

    PubMed

    Kovács, S J; Meisner, J S; Yellin, E L

    2000-08-01

    Modeling methods have been employed to further characterize the physical and physiologic processes of filling and diastolic function. They have led to more detailed understanding of the effect of alteration of physiologic parameters on the Doppler E-wave contour as well as pulmonary vein flow. Depending on the modeling approach, different aspects of the filling process have been considered from AV gradient and net compliance to atrial appendage function to the mechanical suction pump attribute of the heart. The models have been applied for further characterization of diastolic function and elucidation of novel basic physiologic relations. We trust that readers recognize that this article could not serve as a comprehensive and global review of the state-of-the-art in physiologic modeling, but rather as a selective overview, with emphasis on the main modeling principles and options currently in use. Modeling of systems physiology, especially as it relates to the function of the four-chamber heart, remains a fertile area of investigation. Future progress is likely to have profound influence on (noninvasive) diagnosis and quantitation of the effect of therapy and lead to continued discovery of "new" (macroscopic, cellular, and molecular biologic) physiology. PMID:10986584

  7. Integrated Watershed Modeling

    NASA Astrophysics Data System (ADS)

    Bagulho Galvão, P.; Neves, R.; Silva, A.; Chambel Leitão, P.; Braunschweig, F.

    2004-05-01

    Integrated systems that bring together EO data, local measurements, and modeling tools are a fundamental instrument to support decision making in watershed and land-use management. The BASINS system (EPA, http://www.epa.gov/OST/BASINS/) follows this philosophy, merging data from local measurements with modeling tools (HSPF, SWAT, PLOAD, QUAL2E). However, remotely sensed data are still used in a very static way (usually to define land cover, as in the CORINE land cover project). This approach is being replaced by operational methods that use EO data (such as land surface temperature, vegetation state, soil moisture, and surface roughness) both as inputs and for validation. The development of integrated watershed models that dynamically interact with remotely sensed data opens interesting prospects for the validation and improvement of such models. This paper describes the possible contribution of remote sensing data to the needs of state-of-the-art watershed models, including well-known systems (such as SWAT or HSPF) and a system still under development (MOHID LAND). Application of such models is shown at two pilot sites, selected under the EU projects TempQsim and Interreg II B - ICRW.

  8. Seismic wave propagation modeling

    SciTech Connect

    Jones, E.M.; Olsen, K.B.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). A hybrid, finite-difference technique was developed for modeling nonlinear soil amplification from three-dimensional, finite-fault radiation patterns for earthquakes in arbitrary earth models. The method was applied to the 17 January 1994 Northridge earthquake. Particle velocities were computed on a plane at 5-km depth, immediately above the causative fault. Time series of the strike-perpendicular, lateral velocities were then propagated vertically through a soil column typical of the San Fernando Valley. Suitable material models were adapted from a suite used to model ground motions at the US Nevada Test Site. The effects of nonlinearity reduced relative spectral amplitudes by about 40% at frequencies above 1.5 Hz but by only 10% at lower frequencies. Runs made with source-depth amplitudes increased by a factor of two showed relative amplitudes reduced by a total of 70% above 1.5 Hz and 20% at lower frequencies. Runs made with elastic-plastic material models showed behavior similar to runs made with Masing-rule models.
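
    A minimal, purely linear-elastic sketch of vertically propagating SH waves through a uniform soil column is given below to illustrate the kind of site-response calculation described above. It is not the LANL hybrid code: it includes no nonlinearity or Masing-type hysteresis, uses a rigid prescribed-velocity base rather than a radiating boundary, and all layer properties and the input velocity pulse are assumed.

    ```python
    import numpy as np

    def propagate_sh_column(vel_input, dt, dz, vs, rho, nsteps):
        """Linear 1-D SH velocity-stress finite differences in a soil column.

        vel_input : particle-velocity time series prescribed at the column base.
        vs, rho   : shear-wave velocity and density per grid point.
        Returns the particle-velocity history at the free surface (index 0).
        """
        n = len(vs)
        mu = rho * vs**2          # shear modulus per node
        v = np.zeros(n)           # particle velocity
        tau = np.zeros(n - 1)     # shear stress between nodes
        surface = np.zeros(nsteps)

        for it in range(nsteps):
            # Update stresses from velocity gradients.
            tau += dt * 0.5 * (mu[:-1] + mu[1:]) * (v[1:] - v[:-1]) / dz
            # Update velocities from stress gradients (interior nodes).
            v[1:-1] += dt * (tau[1:] - tau[:-1]) / (rho[1:-1] * dz)
            # Free surface at the top: zero stress above node 0.
            v[0] += dt * tau[0] / (rho[0] * dz)
            # Prescribed input motion at the base of the column.
            v[-1] = vel_input[it] if it < len(vel_input) else 0.0
            surface[it] = v[0]
        return surface

    # Soft 30 m soil column, 1 m grid spacing, assumed properties.
    nz = 31
    vs = np.full(nz, 250.0)        # m/s
    rho = np.full(nz, 1800.0)      # kg/m^3
    dz, dt = 1.0, 0.001            # satisfies the CFL condition (vs*dt/dz < 1)
    t = np.arange(0, 0.5, dt)
    pulse = np.exp(-((t - 0.1) / 0.02) ** 2)   # smooth input velocity pulse
    surf = propagate_sh_column(pulse, dt, dz, vs, rho, nsteps=len(t))
    print(f"peak surface velocity: {surf.max():.3f} (input peak 1.0)")
    ```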

  9. MARR: active vision model

    NASA Astrophysics Data System (ADS)

    Podladchikova, Lubov N.; Gusakova, Valentina I.; Shaposhnikov, Dmitry G.; Faure, Alain; Golovan, Alexander V.; Shevtsova, Natalia A.

    1997-09-01

    Earlier, a biologically plausible active vision model for multiresolutional attentional representation and recognition (MARR) was developed. The model is based on the scanpath theory of Noton and Stark and provides invariant recognition of gray-level images. In the present paper, the algorithm of automatic image viewing trajectory formation in the MARR model, the results of psychophysical experiments, and possible applications of the model are considered. The algorithm of automatic image viewing trajectory formation is based on imitation of the scanpath formed by a human operator. Several propositions about possible mechanisms for the consecutive selection of fixation points in human visual perception, inspired by computer simulation results and known psychophysical data, have been tested and confirmed in our psychophysical experiments. In particular, we have found that a gaze switch may be directed (1) to a peripheral part of the visual field which contains an edge oriented orthogonally to the edge at the point of fixation, and (2) to a peripheral part of the visual field containing crossing edges. These experimental results have been used to optimize the automatic image-viewing algorithm in the MARR model. The modified model demonstrates an ability to recognize complex real-world images invariantly with respect to scale, shift, rotation, illumination conditions, and, in part, point of view, and can be used to solve some robot vision tasks.
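
    A toy version of the first fixation-selection rule suggested by these experiments (prefer a peripheral region whose edge orientation is orthogonal to the edge at the current fixation) can be sketched as follows. The window size, scoring rule, and synthetic image are illustrative assumptions and do not reproduce the MARR model itself.

    ```python
    import numpy as np
    from scipy import ndimage

    def edge_orientation(image):
        """Per-pixel edge magnitude and gradient orientation from Sobel filters."""
        gy = ndimage.sobel(image, axis=0)
        gx = ndimage.sobel(image, axis=1)
        magnitude = np.hypot(gx, gy)
        orientation = np.arctan2(gy, gx)     # radians
        return magnitude, orientation

    def next_fixation(image, fixation, min_shift=10):
        """Pick the peripheral point whose edge is strongest and most
        orthogonal to the edge orientation at the current fixation."""
        mag, ori = edge_orientation(image)
        fy, fx = fixation
        theta0 = ori[fy, fx]

        yy, xx = np.mgrid[0:image.shape[0], 0:image.shape[1]]
        peripheral = np.hypot(yy - fy, xx - fx) > min_shift

        # Orthogonality weight is 1 when orientations differ by 90 degrees.
        score = mag * np.abs(np.sin(ori - theta0)) * peripheral
        return np.unravel_index(np.argmax(score), score.shape)

    # Synthetic image: a horizontal bar and a vertical bar.
    img = np.zeros((64, 64))
    img[30:34, 5:30] = 1.0     # horizontal bar near the current fixation
    img[10:50, 45:49] = 1.0    # vertical bar in the periphery
    # Fixation sits on the top edge of the horizontal bar; the selected point
    # should land on the orthogonally oriented vertical bar.
    print(next_fixation(img, fixation=(30, 10)))
    ```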

  10. Robust Flood Frequency Models

    NASA Astrophysics Data System (ADS)

    Kuczera, George

    1982-04-01

    The concept of a robust model is briefly explored. In the context of flood frequency analysis, two necessary properties of a robust model are advanced, namely, resistance and efficiency. Strategies for seeking more robust models are discussed. Because of its versatility, the five-parameter Wakeby distribution can credibly be considered a parent flood distribution. Four regionalized Wakeby parents are employed in simulation studies to search for robust models. These parents were shown by Houghton to be representative of U.S. flood experience in the sense that certain raw flood data characteristics could be reproduced. A limited range of sampling experiments were undertaken. The results suggest that of the site-specific estimators considered, the two-parameter log normal maximum likelihood estimator is most resistant, with Gumbel estimators employing either maximum likelihood or probability-weighted moments displaying comparable resistance. Several estimators which utilize regional flood information were compared. Included were empirical Bayes estimators which are structurally similar to James-Stein rules and regionalized estimators based on the flood index method. These estimators exhibited substantial improvements in aggregate risk performance over their site-specific counterparts, particularly for short record lengths. Regionalized estimators appear to be preferable for short record lengths, while estimators which combine both site and regional flood information are preferable for longer record lengths. When such estimation procedures are considered, other distributional models such as log Pearson type III and Wakeby become practical alternatives to the two-parameter log normal model.
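
    As a small illustration of the kind of site-specific estimators compared above, the sketch below fits a two-parameter log-normal distribution (maximum likelihood on the log-transformed flows) and a Gumbel (EV1) distribution to a synthetic annual-maximum series and contrasts their 100-year flood quantiles. The data and record length are invented, and scipy's Gumbel fit is maximum likelihood rather than the probability-weighted-moment variant discussed in the abstract.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Synthetic annual maximum flood series (m^3/s), short record of 30 years.
    annual_max = rng.lognormal(mean=6.0, sigma=0.5, size=30)

    # Two-parameter log-normal: ML estimates are the mean/std of the log flows.
    log_q = np.log(annual_max)
    mu_hat, sigma_hat = log_q.mean(), log_q.std(ddof=0)

    # Gumbel (EV1) fit; scipy uses maximum likelihood here.
    gum_loc, gum_scale = stats.gumbel_r.fit(annual_max)

    # 100-year flood: quantile with annual exceedance probability 0.01.
    p = 1.0 - 1.0 / 100.0
    q100_lognorm = np.exp(mu_hat + sigma_hat * stats.norm.ppf(p))
    q100_gumbel = stats.gumbel_r.ppf(p, loc=gum_loc, scale=gum_scale)

    print(f"log-normal 100-yr flood: {q100_lognorm:8.1f} m^3/s")
    print(f"Gumbel     100-yr flood: {q100_gumbel:8.1f} m^3/s")
    ```

    With short records such as this, the spread between the two quantile estimates is itself a rough indicator of the resistance issues discussed in the abstract.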

  11. Extended chameleon models

    NASA Astrophysics Data System (ADS)

    Brax, Philippe; Tamanini, Nicola

    2016-05-01

    We extend chameleon models by considering scalar-fluid theories where the coupling between matter and the scalar field can be represented by a quadratic effective potential with density-dependent minimum and mass. In this context, we study the effects of the scalar field on Solar System tests of gravity and show that models passing these stringent constraints can still induce large modifications of Newton's law on galactic scales. On these scales, we analyze models which could lead to a percent-level deviation from Newton's law outside the virial radius. We then model the dark matter halo as a Navarro-Frenk-White profile and explicitly find that the fifth force can give large contributions around the galactic core in a particular model where the scalar field mass is constant and the minimum of its potential varies linearly with the matter density. At cosmological distances, we find that this model does not alter the growth of large-scale structures and therefore would be best tested on galactic scales, where interesting signatures might arise in the galaxy rotation curves.
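
    Schematically, and with symbols that are assumptions rather than the authors' notation, the density-dependent quadratic effective potential and the resulting fifth force described above can be written as:

    ```latex
    % Schematic form only; V_min, phi_min, m, beta and M_Pl are assumed notation.
    V_{\mathrm{eff}}(\phi;\rho) \;\simeq\; V_{\mathrm{min}}(\rho)
       \;+\; \tfrac{1}{2}\, m^{2}(\rho)\,\bigl[\phi - \phi_{\mathrm{min}}(\rho)\bigr]^{2},
    \qquad
    \vec{F}_{\phi} \;=\; -\,\frac{\beta(\rho)}{M_{\mathrm{Pl}}}\,\vec{\nabla}\phi .
    ```

    In the particular case highlighted in the abstract, m(rho) is constant and phi_min(rho) varies linearly with the matter density, so the scalar profile, and hence the fifth force, tracks the local density of the halo.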

  12. Nonparametric Streamflow Disaggregation Model

    NASA Astrophysics Data System (ADS)

    Lee, T.; Salas, J. D.; Prairie, J. R.

    2009-05-01

    Stochastic streamflow generation is generally utilized for planning and management of water resources systems. For this purpose, a number of parametric and nonparametric modeling alternatives have been suggested in the literature. Among them, temporal and spatial disaggregation approaches play an important role, particularly in ensuring that historical variance-covariance properties are preserved at various temporal and spatial scales. In this paper, we review the underlying features of nonparametric disaggregation, identify some of their pros and cons, and propose a disaggregation algorithm that is capable of surmounting some of the shortcomings of current models. The proposed model hinges on k-nearest-neighbor resampling, an accurate adjusting procedure, and a genetic algorithm. The model has been tested and compared to an existing nonparametric disaggregation approach using data from the Colorado River system. It has been shown that the model is capable of (i) reproducing the season-to-season correlations, including the correlation between the last season of the previous year and the first season of the current year, (ii) minimizing or avoiding the generation of flow patterns across the year that are literally the same as those of the historical records, and (iii) minimizing or avoiding the generation of negative flows. In addition, it is applicable to intermittent river regimes. Suggestions for further improving the model are discussed.
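
    A stripped-down version of the k-nearest-neighbor resampling step with a proportional adjustment (so the seasonal values sum exactly to the generated annual flow) is sketched below. The kernel weights, choice of k, and data are illustrative assumptions, and the accurate adjusting procedure and genetic-algorithm component of the model described above are not reproduced here.

    ```python
    import numpy as np

    def knn_disaggregate(annual_flow, hist_annual, hist_seasonal, k=5, rng=None):
        """Disaggregate an annual flow into seasonal flows by KNN resampling.

        A historical year is drawn from the k nearest neighbors of the generated
        annual flow (with decreasing 1/i kernel weights), its seasonal pattern is
        borrowed, and the seasons are rescaled so they sum exactly to the target
        annual flow.
        """
        rng = rng or np.random.default_rng()
        order = np.argsort(np.abs(hist_annual - annual_flow))[:k]
        weights = 1.0 / np.arange(1, k + 1)
        weights /= weights.sum()
        chosen = rng.choice(order, p=weights)

        pattern = hist_seasonal[chosen]
        # Proportional adjustment: keep the seasonal shape, match the total.
        return pattern * (annual_flow / pattern.sum())

    # Illustrative record: 40 historical years, 4 seasons each.
    rng = np.random.default_rng(7)
    hist_seasonal = rng.gamma(shape=2.0, scale=50.0, size=(40, 4))
    hist_annual = hist_seasonal.sum(axis=1)

    seasons = knn_disaggregate(320.0, hist_annual, hist_seasonal, k=5, rng=rng)
    print(np.round(seasons, 1), "sum =", round(seasons.sum(), 1))
    ```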

  13. Zebrafish models of Tauopathy

    PubMed Central

    Bai, Qing; Burton, Edward A.

    2016-01-01

    Tauopathies are a group of incurable neurodegenerative diseases in which loss of neurons is accompanied by intracellular deposition of fibrillar material composed of hyperphosphorylated forms of the microtubule-associated protein Tau. A zebrafish model of Tauopathy could complement existing murine models by providing a platform for genetic and chemical screens, in order to identify novel therapeutic targets and compounds with disease-modifying potential. In addition, Tauopathy zebrafish would be useful for hypothesis-driven experiments, especially those exploiting the potential to deploy in vivo imaging modalities. Several considerations, including conservation of specialized neuronal and other cellular populations and of biochemical pathways implicated in disease pathogenesis, suggest that the zebrafish brain is an appropriate setting in which to model these complex disorders. Novel transgenic zebrafish lines expressing wild-type and mutant forms of human Tau in CNS neurons have recently been reported. These studies show evidence that human Tau undergoes disease-relevant changes in zebrafish neurons, including somato-dendritic relocalization, hyperphosphorylation, and aggregation. In addition, preliminary evidence suggests that Tau transgene expression can precipitate neuronal dysfunction and death. These initial studies are encouraging that the zebrafish holds considerable promise as a model in which to study Tauopathies. Further studies are necessary to clarify the phenotypes of transgenic lines and to develop assays and models suitable for unbiased high-throughput screening approaches. This article is part of a Special Issue entitled Zebrafish Models of Neurological Diseases. PMID:20849952

  14. Climate and atmospheric modeling studies

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The climate and atmosphere modeling research programs have concentrated on the development of appropriate atmospheric and upper ocean models, and preliminary applications of these models. Principal models are a one-dimensional radiative-convective model, a three-dimensional global model, and an upper ocean model. Principal applications were the study of the impact of CO2, aerosols, and the solar 'constant' on climate.

  15. Model Comparison of Bayesian Semiparametric and Parametric Structural Equation Models

    ERIC Educational Resources Information Center

    Song, Xin-Yuan; Xia, Ye-Mao; Pan, Jun-Hao; Lee, Sik-Yum

    2011-01-01

    Structural equation models have wide applications. One of the most important issues in analyzing structural equation models is model comparison. This article proposes a Bayesian model comparison statistic, namely the "L[subscript nu]"-measure for both semiparametric and parametric structural equation models. For illustration purposes, we consider…

  17. Hybrid Model of IRT and Latent Class Models.

    ERIC Educational Resources Information Center

    Yamamoto, Kentaro

    This study developed a hybrid of item response theory (IRT) models and latent class models, which combined the strengths of each type of model. The primary motivation for developing the new model is to describe characteristics of examinees' knowledge at the time of the examination. Hence, the application of the model lies mainly in so-called…

  18. A High Precision Prediction Model Using Hybrid Grey Dynamic Model

    ERIC Educational Resources Information Center

    Li, Guo-Dong; Yamaguchi, Daisuke; Nagai, Masatake; Masuda, Shiro

    2008-01-01

    In this paper, we propose a new prediction analysis model which combines the first-order, one-variable grey differential equation model (abbreviated as the GM(1,1) model) from grey system theory and the time-series autoregressive integrated moving average (ARIMA) model from statistics theory. We abbreviate the combined GM(1,1) ARIMA model as ARGM(1,1)…

  20. Integrated Assessment Model Evaluation

    NASA Astrophysics Data System (ADS)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course, strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course, we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside and outside the IAM. All IAM projections to date are conditional on assumed inputs such as population dynamics and economic growth. A key part of evaluation exercises will be the substantial effort needed to develop the necessary historical datasets. Given the fundamentally uncertain characteristics of the socio-economic system, alternative formulations of the evaluation question may turn out to be useful. For example, it is likely useful to ask: how much needs to be specified in order to reproduce historical trends to within a given accuracy? There is also a close, and fundamental, link between evaluation and diagnostic exercises that aim to evaluate the characteristics of future scenarios (rates of growth, technology diffusion, etc.) against historical behavior. These exercises are currently being conducted by individual groups due, in part, to the large diversity of IAM designs and goals. While all climate models are, to first order, modeling the same system, boundary conditions, and physical laws, this is not true for IAMs. The structure, and even feasibility, of a hindcast-style evaluation exercise can therefore be very different depending on the structure of each specific integrated assessment model.