The Scission-Point Configuration within the Two-Center Shell Model Shape Parameterization
F. A. Ivanyuk; S. Chiba; Y. Aritomo
2014-10-28
Within the two-center shell model parameterization we have defined the optimal shape that fissioning nuclei attain just before scission and calculated the total deformation energy (liquid-drop part plus shell correction) as a function of mass asymmetry and elongation at the scission point. Three minima, corresponding to the mass-symmetric and the two mass-asymmetric peaks in the mass distribution of fission fragments, are found in the deformation energy at the scission point. The calculated deformation energy is used in the quasi-static approximation to estimate the total kinetic and excitation energy of the fission fragments and the total number of emitted prompt neutrons. The calculated results reproduce rather well the experimental data on the positions of the peaks in the mass distribution and on the total kinetic and excitation energy of the fission fragments. The calculated neutron multiplicity is somewhat larger than the experimental values.
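In the quasi-static picture described above, the total kinetic energy (TKE) of the fragments is dominated by their mutual Coulomb repulsion at the scission configuration. A minimal point-charge sketch of that estimate follows; the charge split and the centre-to-centre distance are illustrative assumptions, not values from the paper:

```python
# Point-charge estimate of fission-fragment TKE from Coulomb repulsion
# at scission: TKE ~ Z1*Z2*e^2 / D, with e^2/(4*pi*eps0) = 1.44 MeV*fm.

E2_MEV_FM = 1.44  # e^2/(4*pi*eps0) in MeV*fm

def coulomb_tke(z1, z2, d_fm):
    """Coulomb repulsion energy (MeV) of two point charges z1, z2
    separated by d_fm femtometres."""
    return z1 * z2 * E2_MEV_FM / d_fm

# Illustrative numbers: a Z = 40 / Z = 52 split of a uranium nucleus
# with an assumed centre-to-centre distance of 18 fm at scission.
tke = coulomb_tke(40, 52, 18.0)
print(round(tke, 1))  # of order 170 MeV, the right scale for U fission
```

The point-charge value is only a back-of-the-envelope check; deformed-fragment shapes and neck corrections shift it by several MeV.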
SPY: a new scission-point model based on microscopic inputs to predict fission fragment properties
NASA Astrophysics Data System (ADS)
Panebianco, Stefano; Dubray, Noël; Goriely, Stéphane; Hilaire, Stéphane; Lemaître, Jean-François; Sida, Jean-Luc
2014-04-01
Despite the difficulty of describing the full fission dynamics, the main fragment characteristics can be determined in a static approach based on a so-called scission-point model. Within this framework, a new Scission-Point model for the calculation of fission fragment Yields (SPY) has been developed. This model, initially based on the approach developed by Wilkins in the late seventies, consists of performing a static energy balance at scission, where the two fragments are assumed to be completely separated so that their macroscopic properties (mass and charge) can be considered fixed. Given the state density of the system, averaged quantities such as mass and charge yields and mean kinetic and excitation energies can then be extracted in the framework of a microcanonical statistical description. The main advantage of the SPY model is the introduction of one of the most up-to-date microscopic descriptions of the nucleus for the individual energy of each fragment and, in the future, for their state densities. These quantities are obtained from HFB calculations using the Gogny nucleon-nucleon interaction, ensuring the overall coherence of the model. Starting from a description of the SPY model and its main features, a comparison between SPY predictions and experimental data will be discussed for some specific cases, from light nuclei around mercury to the major actinides. Moreover, extensive predictions over the whole chart of nuclides will be discussed, with particular attention to their implications for stellar nucleosynthesis. Finally, future developments, mainly concerning the introduction of microscopic state densities, will be briefly discussed.
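The statistical core of a Wilkins-type scission-point model can be caricatured in a few lines: each mass split is weighted by its energy balance at scission through a Boltzmann-like factor. The toy energy landscape and the collective "temperature" below are invented for illustration; SPY itself uses microscopic HFB energies and a microcanonical description:

```python
import math

# Toy scission-point yield calculation: relative mass yields from a
# Boltzmann weight exp(-E(A)/T) over the energy at scission.  The
# energy values and T are illustrative assumptions, not SPY inputs.

def yields(energies, T=1.0):
    """Map {mass_number: energy_at_scission_MeV} -> normalized yields."""
    w = {a: math.exp(-e / T) for a, e in energies.items()}
    s = sum(w.values())
    return {a: v / s for a, v in w.items()}

# Toy landscape with an asymmetric minimum near A = 140
toy = {120: 4.0, 130: 2.0, 140: 0.0, 150: 2.5}
y = yields(toy)
print(max(y, key=y.get))  # prints 140: the lowest-energy split dominates
```

The same weighting idea, applied over all (A, Z) splits with realistic energies and state densities, is what turns a static energy balance into predicted yields.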
On the Scission Point Configuration of Fissioning Nuclei
Fedir Ivanyuk
2013-03-29
The scission of a nucleus into two fragments is at present the least understood part of the fission process, though the most important for the formation of the observables. To investigate the potential-energy landscape at the largest possible deformations, i.e. at the scission point (line, hypersurface), Strutinsky's optimal-shape approach is applied. For an accurate description of the mass-asymmetric nuclear shape at the scission point, it turned out to be necessary to construct an interpolation between the two sets of constraints on elongation and mass asymmetry that are applied successfully at small deformations (quadrupole and octupole moments) and for separated fragments (the distance between the centers of mass and the difference of the fragment masses). In addition, a constraint on the neck radius was added, which makes it possible to introduce the so-called super-short and super-long shapes at the scission point and to consider the contributions of different fission modes to the observables. The calculated results for the mass distribution of the fission fragments and for the Coulomb repulsion energy "immediately after scission" are in reasonable agreement with experimental data.
Kadmensky, S. G., E-mail: kadmensky@phys.vsu.ru [Voronezh State University (Russian Federation); Bunakov, V. E. [Russian Academy of Sciences, Petersburg Nuclear Physics Institute (Russian Federation); Kadmensky, S. S. [Voronezh State University (Russian Federation)
2012-11-15
It is shown that the emergence of anisotropies in the angular distributions of fragments originating from the spontaneous and induced fission of oriented actinide nuclei is possible only if nonuniformities in the population of the projections M (K) of the fissile-nucleus spin onto the z axis of the laboratory frame (fissile-nucleus symmetry axis) appear in the vicinity of the scission point rather than in the vicinity of the outer saddle point of the deformation potential. The possibilities for creating the orientation of fissile nuclei for spontaneous and induced fission and the effect of these orientations on the anisotropies under analysis are considered. The role of the Coriolis interaction as a unique source of the mixing of different-K fissile-nucleus states at all stages of the fission process is studied with allowance for the dynamical enhancement of this interaction for excited thermalized states of the nucleus involved, which are characterized by a high energy density. It is shown that the absence of thermalization of excited states of the fissile nucleus, which appear because of the nonadiabaticity of its collective deformation motion in the vicinity of the scission point, is a condition for conservation of the influence that transition fission states formed at the inner and outer fission barriers exert on the distribution of the spin projections K for low-energy spontaneous nuclear fission. It is confirmed that the anisotropies observed in the angular distributions of fragments originating from nuclear fission induced by fast light particles (multiply charged ions) are due to the appearance of strongly excited equilibrium (nonequilibrium) states of the fissile nucleus in the vicinity of its scission point that have a Gibbs (non-Gibbs) distribution of projections K.
Diffusion model of the formation of fission-fragment distributions
Adeev, G. D.; Gonchar, I. I.; Pashkevich, V. V.; Pischasov, N. I.; Serdyuk, O. I.
1988-11-01
Calculations of the mass-energy fission-fragment distributions of heated nuclei in a wide range of the fissility parameter are reviewed. The calculations were made in a diffusion model based on the Fokker-Planck equation for the distribution function of the three most important collective coordinates that describe the deformation of a fissioning nucleus (the elongation, mass asymmetry, and neck parameter) and their conjugate momenta. The review demonstrates the capabilities of the diffusion model in describing the fragment distributions as functions of various parameters of the compound nucleus: the fissility parameter, the excitation energy, and the angular momentum. The part played by the dynamics of the descent of the fissioning nucleus from the saddle point to the scission point in the formation of the observed fragment distributions is shown. The dependence of the predictions of the diffusion model on the choice of the coefficients of the Fokker-Planck equation is investigated.
Ryabov, E. G.; Nadtochy, P. N.; Adeev, G. D. [Omsk State University, Prospect Mira 55-A, RU-644077, Omsk (Russian Federation); Karpov, A. V. [Flerov Laboratory of Nuclear Reactions, Joint Institute for Nuclear Research, Dubna RU-141980 (Russian Federation)
2008-10-15
A stochastic approach to fission dynamics based on three-dimensional Langevin equations was applied to calculation of the mass-energy and angular distributions of fission fragments. The dependence of the mass-energy distribution parameters on the angular momentum and the anisotropy of the fission-fragment angular distribution on excitation energy have been studied in a wide range of the fissility parameter. A temperature-dependent finite-range liquid-drop model was used in a consistent way to calculate the functional of the Helmholtz free energy and level-density parameter. The modified one-body mechanism of nuclear dissipation (the so-called surface-plus-window dissipation) was used to determine the dissipative forces in Langevin equations. The evaporation of light prescission particles was taken into account on the basis of a statistical model combined with Langevin dynamics. The calculated parameters of the mass-energy distribution and their angular dependencies are in good quantitative agreement with the available experimental data at the value of the reduction coefficient of the contribution from the wall formula equal to 0.25. Analysis of the anisotropy of the fission-fragment angular distribution performed with the saddle-point transition state model and scission-point transition state model indicates that it is necessary to take into account the dynamical aspects of the fission-fragment angular distribution formation.
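The stochastic descent from saddle to scission described above can be illustrated with a one-dimensional Langevin sketch; the toy potential, friction coefficient, and temperature below are placeholders, not the three-dimensional, temperature-dependent model of the paper:

```python
import math, random

# One-dimensional Langevin sketch of descent from saddle to scission:
#   dq = (p/m) dt
#   dp = (-dV/dq - (gamma/m) p) dt + sqrt(2*gamma*T*dt) * N(0,1)
# Toy potential V(q) = -q^2/2 + q^4/40: barrier top at q = 0,
# "scission valleys" near |q| = 3.2.  All parameters are illustrative.

def langevin_trajectory(steps=20000, dt=1e-3, m=1.0, gamma=2.0, T=0.5,
                        seed=1):
    random.seed(seed)
    q, p = 0.0, 0.0                       # start at the saddle, at rest
    dVdq = lambda x: -x + 0.1 * x**3      # derivative of the toy potential
    for _ in range(steps):
        q += p / m * dt
        p += (-dVdq(q) - gamma / m * p) * dt \
             + math.sqrt(2.0 * gamma * T * dt) * random.gauss(0.0, 1.0)
        if abs(q) > 3.0:                  # call this elongation "scission"
            return q
    return q

q_final = langevin_trajectory()
print(q_final)
```

Averaging many such trajectories over several collective coordinates, with particle evaporation coupled in, is what produces the mass-energy distributions discussed in the abstract.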
Fission fragment mass distribution for nuclei in the r-process region
Tatsuda, S.; Hashizume, K.; Wada, T.; Ohta, M. [Department of Physics, Konan University, 8-9-1 Okamoto, Kobe 658-8501 (Japan); Sumiyoshi, K. [Numazu College of Technology, NAO (Japan); Otsuki, K. [Univ. of Chicago (United States); Kajino, T. [NAO, GUSA, Univ. of Tokyo (Japan); Koura, H.; Chiba, S. [JAEA (Japan); Aritomo, Y. [FLNR (JINR) (Russian Federation)
2007-02-26
The fission fragment mass distribution is estimated theoretically for about 2000 nuclides which might play a critical role in r-process nucleosynthesis through fission (Z>85). The mass distribution of fission fragments is derived by considering the location and depth of the valleys of the potential energy surface near the scission point, calculated by means of the liquid drop model with the shell energy correction of the two-center shell model. The guiding principle for determining the fission mass asymmetry is the behavior of the fission paths from the saddle to the scission point given by Langevin calculations.
Preliminary calculations of medium-energy fission cross sections and spectra
Bozoian, M.; Arthur, E.D.; George, D.C.; Madland, D.G.; Young, P.G.
1988-01-01
Nucleon-induced fission cross sections determined from a statistical preequilibrium model are used in conjunction with a new scission-point model of fission fragment mass, charge and excitation energy distributions to produce evaporation model calculations of particle and gamma spectra and multiplicities from fission. Comparisons are made to experiment for the 14.5-MeV neutron-induced fission of ²³⁸U. In addition, calculated particle and gamma spectra will be compared with the ENDF/B library for 2- and 5-MeV neutron-induced fission of ²³⁵U and ²³⁸U, respectively. Initial predictions for these same quantities for proton-induced fission reactions at energies up to 100 MeV will be presented and discussed. 6 refs., 3 figs.
Study of Fission Barrier Heights of Uranium Isotopes by the Macroscopic-Microscopic Method
NASA Astrophysics Data System (ADS)
Zhong, Chun-Lai; Fan, Tie-Shuan
2014-09-01
Potential energy surfaces of uranium nuclei in the range of mass numbers 229 through 244 are investigated in the framework of the macroscopic-microscopic model, and the heights of static fission barriers are obtained in terms of a double-humped structure. The macroscopic part of the nuclear energy is calculated according to the Lublin-Strasbourg-drop (LSD) model. Shell and pairing corrections as the microscopic part are calculated with a folded-Yukawa single-particle potential. The calculation is carried out in a five-dimensional parameter space of the generalized Lawrence shapes. In order to extract saddle points on the potential energy surface, a new algorithm is developed which can effectively find an optimal fission path leading from the ground state to the scission point. The comparison of our results with available experimental data and other theoretical results confirms the reliability of our calculations.
van Sinderen, Marten
Chapter 6: Design Model. This chapter presents a design model that allows refinement. Design concepts are identified, and their relevance to design steps in application protocol design is indicated; interaction and causality relation are the elementary design, or architectural, concepts of our design model
Input modeling: input modeling
Lawrence Leemis
2003-01-01
Most discrete-event simulation models have stochastic elements that mimic the probabilistic nature of the system under consideration. A close match between the input model and the true underlying probabilistic mechanism associated with the system is required for successful input modeling. The general question considered here is how to model an element (e.g., arrival process, service times) in a discrete-event simulation
ERIC Educational Resources Information Center
Lesh, Richard; Carmona, Guadalupe; Post, Thomas
In this workshop, we will continue to reflect on a models and modeling perspective to understand how students and teachers learn and reason about real life situations encountered in a mathematics and science classroom. We will discuss the idea of a model as a conceptual system that is expressed by using external representational media, and that is…
Mental Models, Conceptual Models, and Modelling.
ERIC Educational Resources Information Center
Greca, Ileana Maria; Moreira, Marco Antonio
2000-01-01
Reviews science education research into representations constructed by students in their interactions with the world, its phenomena, and artefacts. Features discussions of mental models, conceptual models, and the activity of modeling. (Contains 30 references.) (Author/WRM)
MODEL DEVELOPMENT - DOSE MODELS
Model Development Humans are exposed to mixtures of chemicals from multiple pathways and routes. These exposures may result from a single event or may accumulate over time if multiple exposure events occur. The traditional approach of assessing risk from a single chemica...
NSDL National Science Digital Library
Angela B. Shiflet
In this module, we develop models of the effects of malaria on various populations of humans and mosquitoes. After considering differential equations to model a system, we create a model using the systems modeling tool STELLA. Projects involve various refinements of the model.
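A compartmental malaria model of the kind built in STELLA can equally be integrated with a few lines of explicit Euler stepping. The sketch below uses a Ross-style two-compartment model with invented rate constants, not the module's calibrated STELLA model:

```python
# Euler-integration sketch of a Ross-style malaria model: infected
# fractions x (humans) and y (mosquitoes).  All rate constants are
# illustrative assumptions, not calibrated to any real population.
#   a  - biting rate          b, c - transmission probabilities
#   r  - human recovery rate  mu   - mosquito death rate
#   m  - mosquitoes per human

def ross_model(x0=0.01, y0=0.01, steps=20000, dt=0.01,
               a=0.3, b=0.5, c=0.5, r=0.05, mu=0.1, m=2.0):
    x, y = x0, y0
    for _ in range(steps):
        dx = m * a * b * y * (1.0 - x) - r * x    # human infection/recovery
        dy = a * c * x * (1.0 - y) - mu * y       # mosquito infection/death
        x += dx * dt
        y += dy * dt
    return x, y

x_eq, y_eq = ross_model()
print(round(x_eq, 3), round(y_eq, 3))  # endemic equilibrium (R0 > 1 here)
```

With these constants the basic reproduction number exceeds 1, so the trajectory settles onto a stable endemic equilibrium; halving the mosquito density m below threshold would instead drive both fractions to zero, the classic refinement exercise for such models.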
NSDL National Science Digital Library
Betty Blecha
The Fair model web site includes a freely available United States macroeconomic econometric model and a multicountry econometric model. The models run on the Windows OS. Instructors can use the models to teach forecasting, run policy experiments, and evaluate historical episodes of macroeconomic behavior. The web site includes extensive documentation for both models. The simulation is for upper-division economics courses in macroeconomics or econometrics. The principal developer is Ray Fair at Yale University.
Katharina Morik
1987-01-01
In this paper, I would like to present a unifying view on knowledge acquisition and machine learning. In this view, knowledge acquisition systems should support the user in doing the modeling of a domain, and machine learning systems are those which perform part of the modeling autonomously. Taking the notion of modeling as the central point, some aspects of modeling
ERIC Educational Resources Information Center
Levenson, Harold E.; Hurni, Andre
1978-01-01
Suggests building models as a way to reinforce and enhance related subjects such as architectural drafting, structural carpentry, etc., and discusses time, materials, scales, tools or equipment needed, how to achieve realistic special effects, and the types of projects that can be built (model of complete building, a panoramic model, and model…
Y. Aritomo; K. Hagino; K. Nishio; S. Chiba
2012-03-12
In order to describe heavy-ion fusion reactions around the Coulomb barrier with an actinide target nucleus, we propose a model which combines the coupled-channels approach and a fluctuation-dissipation model for dynamical calculations. This model takes into account couplings to the collective states of the interacting nuclei in the penetration of the Coulomb barrier and the subsequent dynamical evolution of a nuclear shape from the contact configuration. In the fluctuation-dissipation model with a Langevin equation, the effect of nuclear orientation at the initial impact on the prolately deformed target nucleus is considered. Fusion-fission, quasi-fission and deep quasi-fission are separated as different Langevin trajectories on the potential energy surface. Using this model, we analyze the experimental data for the mass distribution of fission fragments (MDFF) in the reactions of $^{34,36}$S+$^{238}$U and $^{30}$Si+$^{238}$U at several incident energies around the Coulomb barrier. We find that the time scale in the quasi-fission as well as the deformation of fission fragments at the scission point are different between the $^{30}$Si+$^{238}$U and $^{36}$S+$^{238}$U systems, causing different mass asymmetries of the quasi-fission.
NSDL National Science Digital Library
Shirley Watt Ireton
2003-01-01
Chapter 1 defines and discusses models in a broad, and perhaps unusual, way. In particular, the chapter stresses the framework of personal models that underlie science and learning across fields. Subsequent chapters will deal more with particular kinds of expressed models that are important in science and science teaching: physical models, analog models and plans, mathematical models, and computer simulations. Throughout, the book examines how all models are important to science, how they are used, and how to use them effectively. They can and should be used not only to teach science, but also to teach students something about the process of learning and about the nature of knowledge itself.
MODEL ABSTRACTION IN HYDROLOGIC MODELING
Technology Transfer Automated Retrieval System (TEKTRAN)
Model abstraction (MA) is a methodology for reducing the complexity of a simulation model while maintaining the validity of the simulation results with respect to the question that the simulation is being used to address. The MA explicitly deals with uncertainties in model structure and in model par...
R. Arnowitt; Pran Nath
1993-11-24
Theoretical and experimental motivations behind supergravity grand unified models are described. The basic ideas of supergravity, and the origin of the soft breaking terms are reviewed. Effects of GUT thresholds and predictions arising from models possessing proton decay are discussed. Speculations as to which aspects of the Standard Model might be explained by supergravity models and which may require Planck scale physics to understand are mentioned.
Lawrence Leemis
2000-01-01
Discrete-event simulation models typically have stochastic elements that mimic the probabilistic nature of the system under consideration. Successful input modeling requires a close match between the input model and the true underlying probabilistic mechanism associated with the system. The general question considered here is how to model an element (e.g., arrival process, service times) in a discrete-event simulation given a
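As a concrete instance of the input-modeling question, the sketch below fits an exponential model to synthetic service times by maximum likelihood and sanity-checks the choice with the coefficient of variation, which equals 1 for an exponential; the data and the candidate family are illustrative assumptions:

```python
import random, statistics

# Input-modeling sketch: fit an exponential distribution to observed
# service times by maximum likelihood (rate = 1/sample mean) and check
# the fit with a simple moment comparison.  The data are synthetic.

random.seed(7)
observed = [random.expovariate(0.5) for _ in range(5000)]  # true rate 0.5

rate_hat = 1.0 / statistics.mean(observed)  # MLE of the exponential rate

# For an exponential, mean == standard deviation, so the coefficient of
# variation should be near 1; a CV far from 1 would argue for a
# different input model (e.g. Erlang or hyperexponential).
cv = statistics.stdev(observed) / statistics.mean(observed)
print(round(rate_hat, 2), round(cv, 2))
```

In practice this moment check would be followed by a goodness-of-fit test (chi-square or Kolmogorov-Smirnov) before the fitted model drives the simulation.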
NSDL National Science Digital Library
Amanda Schulz
2004-09-01
Typically, teachers use simple models that employ differences in temperature and density to help students visualize convection. However, most of these models are incomplete or merely hint at (instead of model) convective circulation. To make the use of models more effective, the authors developed an alternative system, a simple, low-cost apparatus that not only maintains dynamic convective circulation but also illustrates two adjacent convection cells, teaching students about Earth's processes.
Phoenix (formerly referred to as the Second Generation Model or SGM) is a global general equilibrium model designed to analyze energy-economy-climate related questions and policy implications in the medium- to long-term. This model disaggregates the global economy into 26 industr...
ERIC Educational Resources Information Center
James, W. G. G.
1970-01-01
Discusses the historical development of both the wave and the corpuscular photon model of light. Suggests that students should be informed that the two models are complementary and that each model successfully describes a wide range of radiation phenomena. Cites 19 references which might be of interest to physics teachers and students. (LC)
Coppola, Antonietta; Moshé, Solomon L
2012-01-01
Epilepsy accounts for a significant portion of the dis-ease burden worldwide. Research in this field is fundamental and mandatory. Animal models have played, and still play, a substantial role in understanding the patho-physiology and treatment of human epilepsies. A large number and variety of approaches are available, and they have been applied to many animals. In this chapter the in vitro and in vivo animal models are discussed,with major emphasis on the in vivo studies. Models have used phylogenetically different animals - from worms to monkeys. Our attention has been dedicated mainly to rodents.In clinical practice, developmental aspects of epilepsy often differ from those in adults. Animal models have often helped to clarify these differences. In this chapter, developmental aspects have been emphasized.Electrical stimulation and chemical-induced models of seizures have been described first, as they represent the oldest and most common models. Among these models, kindling raised great interest, especially for the study of the epileptogenesis. Acquired focal models mimic seizures and occasionally epilepsies secondary to abnormal cortical development, hypoxia, trauma, and hemorrhage.Better knowledge of epileptic syndromes will help to create new animal models. To date, absence epilepsy is one of the most common and (often) benign forms of epilepsy. There are several models, including acute pharmacological models (PTZ, penicillin, THIP, GBL) and chronic models (GAERS, WAG/Rij). Although atypical absence seizures are less benign, thus needing more investigation, only two models are so far available (AY-9944,MAM-AY). Infantile spasms are an early childhood encephalopathy that is usually associated with a poor out-come. The investigation of this syndrome in animal models is recent and fascinating. 
Different approaches have been used including genetic (Down syndrome,ARX mutation) and acquired (multiple hit, TTX, CRH,betamethasone-NMDA) models.An entire section has been dedicated to genetic models, from the older models obtained with spontaneous mutations (GEPRs) to the new engineered knockout, knocking, and transgenic models. Some of these models have been created based on recently recognized patho-genesis such as benign familial neonatal epilepsy, early infantile encephalopathy with suppression bursts, severe myoclonic epilepsy of infancy, the tuberous sclerosis model, and the progressive myoclonic epilepsy. The contribution of animal models to epilepsy re-search is unquestionable. The development of further strategies is necessary to find novel strategies to cure epileptic patients, and optimistically to allow scientists first and clinicians subsequently to prevent epilepsy and its consequences. PMID:22938964
Model Reduction in Groundwater Modeling
NASA Astrophysics Data System (ADS)
Yeh, W. W. G.
2014-12-01
Model reduction has been shown to be a very effective method for reducing the computational burden of large-scale simulations. Model reduction techniques preserve much of the physical knowledge of the system and primarily seek to remove components from the model that do not provide significant information of interest. Proper Orthogonal Decomposition (POD) is a model reduction technique by which a system of ordinary differential equations is projected onto a much smaller subspace in such a way that the span of the subspace approximates the span of the original full model space. Basically, the POD technique selects a small number of orthonormal basis functions (principal components) that span the spatial variability of the solutions. In this way the state variable (head) is approximated by a linear combination of these basis functions and, using a Galerkin projection, the dimension of the problem is significantly reduced. It has been shown that for a highly discretized model, the reduced model can be two to three orders of magnitude smaller than the original model and runs 1,000 times faster. More importantly, the reduced model captures the dominating characteristics of the full model and produces sufficiently accurate solutions. One of the major tasks in the development of the reduced model is the selection of the snapshots which are used to determine the dominant eigenvectors. This paper discusses ways to optimize the snapshot selection. Additionally, the paper discusses applications of the reduced model to parameter estimation, Monte Carlo simulation, and experimental design in groundwater modeling.
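A minimal POD-style sketch, with synthetic snapshots rather than a groundwater model: the dominant orthonormal basis vector of the snapshot set is extracted by power iteration on the snapshot covariance, which is what an SVD of the snapshot matrix would also deliver:

```python
import math, random

# POD sketch: recover the dominant spatial mode from snapshots by
# power iteration on the snapshot covariance C = (1/k) sum s s^T.
# Snapshots are synthetic (one true mode plus small noise), not heads
# from a groundwater simulation.

random.seed(3)
n = 50                                                    # full dimension
mode = [math.sin(math.pi * (i + 1) / (n + 1)) for i in range(n)]

snapshots = []
for _ in range(40):
    amp = random.gauss(0.0, 1.0)                          # modal amplitude
    snapshots.append([amp * m + 0.01 * random.gauss(0.0, 1.0)
                      for m in mode])

def cov_apply(v):
    """Apply the snapshot covariance to vector v without forming C."""
    out = [0.0] * n
    for s in snapshots:
        c = sum(si * vi for si, vi in zip(s, v)) / len(snapshots)
        for i in range(n):
            out[i] += c * s[i]
    return out

v = [1.0] * n
for _ in range(100):                                      # power iteration
    v = cov_apply(v)
    norm = math.sqrt(sum(x * x for x in v))
    v = [x / norm for x in v]

# Alignment of the recovered basis vector with the true mode (1 = perfect)
mnorm = math.sqrt(sum(m * m for m in mode))
alignment = abs(sum(vi * mi / mnorm for vi, mi in zip(v, mode)))
print(round(alignment, 3))
```

A real reduced-order model keeps several such basis vectors and Galerkin-projects the governing equations onto them; the quality of the basis depends directly on how the snapshots are chosen, which is the optimization question the abstract raises.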
NSDL National Science Digital Library
Mr. Ertl
2007-11-03
This project will allow users to become acquainted with station models that are found on weather maps. Students will study the various atmospheric variables that are depicted on a station model and then practice on an interactive station model program. Part 1 - Being able to read and interpret weather maps is a very important skill in meteorology. One of the most basic skills of predicting the weather is being able to interpret a station model of a given location. A station model is a bundle of information that ...
NSDL National Science Digital Library
John Nielsen-Gammon
1996-01-01
This undergraduate meteorology tutorial from Texas A&M University focuses on computer models that are run by the National Weather Service (NWS) National Centers for Environmental Prediction (NCEP) and are used for forecasting day-to-day weather in the United States. NCEP has four basic models: the Eta Model, the Nested Grid model (NGM), the Rapid Update Cycle (RUC), and the Global Forecast System (GFS). Each model is a self-contained set of computer programs, which include means of analyzing data and computing the evolution of the atmosphere's winds, temperature, pressure, and moisture based on the analyses. Students are given some basic terminology and learn to identify the models and to read model output.
Model Selection in Acoustic Modeling
S. S. Chen; R. A. Gopinath
2001-01-01
Recently several classes of models have been suggested for use in continuous-density HMMs for speech recognition. This paper proposes to choose both the model type and model size (number of parameters) by optimizing the Bayesian information criterion. Specifically, we apply this to Gaussian mixture density estimation to determine both the number of Gaussians and the covariance structure of each Gaussian, and decision-tree clustering of
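The selection rule itself is easy to sketch: compute BIC = k ln n - 2 ln L for each candidate model size and keep the minimum. The log-likelihood values below are invented placeholders, not fitted acoustic models:

```python
import math

# Bayesian-information-criterion model selection sketch:
#   BIC = k * ln(n) - 2 * ln(L)
# where k = number of parameters, n = number of observations, and
# ln(L) = maximized log-likelihood.  The candidate values are
# illustrative, not real Gaussian-mixture fits.

def bic(log_likelihood, k, n):
    return k * math.log(n) - 2.0 * log_likelihood

n = 10_000  # training frames
# {number of parameters: maximized log-likelihood} for each candidate
candidates = {1: -15000.0, 4: -14200.0, 16: -14150.0, 64: -14140.0}

scores = {k: bic(ll, k, n) for k, ll in candidates.items()}
best = min(scores, key=scores.get)
print(best)  # prints 4: beyond k=4 the likelihood gain no longer
             # outweighs the k*ln(n) complexity penalty
```

The same scoring applies whether the candidates differ in mixture size or in covariance structure (diagonal vs. full), which is exactly the joint choice the abstract describes.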
Functions and Models: Mathematical Models
NSDL National Science Digital Library
Michael Freeze
Describe the process of mathematical modeling; name and describe some methods of modeling; classify a symbolically represented function as one of the elementary algebraic or transcendental functions; appraise the suitability of different models for interpreting a given set of data.
H. Yang
1999-11-04
The purpose of this analysis and model report (AMR) for the Ventilation Model is to analyze the effects of pre-closure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts and provide heat removal data to support EBS design. It will also provide input data (initial conditions, and time varying boundary conditions) for the EBS post-closure performance assessment and the EBS Water Distribution and Removal Process Model. The objective of the analysis is to develop, describe, and apply calculation methods and models that can be used to predict thermal conditions within emplacement drifts under forced ventilation during the pre-closure period. The scope of this analysis includes: (1) Provide a general description of effects and heat transfer process of emplacement drift ventilation. (2) Develop a modeling approach to simulate the impacts of pre-closure ventilation on the thermal conditions in emplacement drifts. (3) Identify and document inputs to be used for modeling emplacement ventilation. (4) Perform calculations of temperatures and heat removal in the emplacement drift. (5) Address general considerations of the effect of water/moisture removal by ventilation on the repository thermal conditions. The numerical modeling in this document will be limited to heat-only modeling and calculations. Only a preliminary assessment of the heat/moisture ventilation effects and modeling method will be performed in this revision. Modeling of moisture effects on heat removal and emplacement drift temperature may be performed in the future.
NASA Astrophysics Data System (ADS)
Wang, R. S.; Zhang, Y.; Xiao, Z. G.; Tian, J. L.; Zhang, Y. X.; Wu, Q. H.; Duan, L. M.; Jin, G. M.; Hu, R. J.; Wang, S. F.; Li, Z. Y.; Wang, H. W.; Zhang, Z.; Yi, H.; Li, H. J.; Cheng, W. J.; Huang, Y.; Lü, L. M.
2014-06-01
Fission fragments resulting from the fission of target-like nuclei produced in the ⁴⁰Ar+¹⁹⁷Au reaction at 35 MeV/u are measured in coincidence with the emitted light charged particles (LCPs). Comparison of the N/Z composition of the LCPs at middle and large angles in the laboratory frame shows that particles emitted at smaller angles, which contain a larger contribution from dynamical emission, are more neutron rich. A moving-source model is used to fit the energy spectra of the hydrogen isotopes. A hierarchy from proton to deuteron and triton is observed in the multiplicity ratio between the intermediate-velocity source and the compound-nucleus source. This ratio is sensitive to the dynamical emission at early stages of the reaction and to statistical emission lasting up to the scission point. Calculations with the improved quantum molecular dynamics (ImQMD) transport model qualitatively support the picture that more free and bound neutrons are emitted during the early stage, showing a clear dependence of N/Z on the parametrization of the symmetry energy. The time-dependent isospin composition of the emitted particles thus may be used to probe the symmetry energy at subsaturation densities.
New Fission Fragment Distributions and r-Process Origin of the Rare-Earth Elements
NASA Astrophysics Data System (ADS)
Goriely, S.; Sida, J.-L.; Lemaître, J.-F.; Panebianco, S.; Dubray, N.; Hilaire, S.; Bauswein, A.; Janka, H.-T.
2013-12-01
Neutron star (NS) merger ejecta offer a viable site for the production of heavy r-process elements with nuclear mass numbers A ≳ 140. The crucial role of fission recycling is responsible for the robustness of this site against many astrophysical uncertainties, but calculations sensitively depend on nuclear physics. In particular, the fission fragment yields determine the creation of 110 ≲ A ≲ 170 nuclei. Here, we apply a new scission-point model, called SPY, to derive the fission fragment distribution (FFD) of all relevant neutron-rich, fissioning nuclei. The model predicts a doubly asymmetric FFD in the abundant A ∼ 278 mass region that is responsible for the final recycling of the fissioning material. Using ejecta conditions based on relativistic NS merger calculations, we show that this specific FFD leads to a production of the A ∼ 165 rare-earth peak that is nicely compatible with the abundance patterns in the Sun and metal-poor stars. This new finding further strengthens the case of NS mergers as a possible dominant origin of r nuclei with A ≳ 140.
NASA Technical Reports Server (NTRS)
Druyan, Leonard M.
2012-01-01
Climate modeling is a very broad topic, so a single volume can only offer a small sampling of relevant research activities. This volume of 14 chapters includes descriptions of a variety of modeling studies for a variety of geographic regions by an international roster of authors. The climate research community generally uses the rubric climate models to refer to organized sets of computer instructions that produce simulations of climate evolution. The code is based on physical relationships that describe the shared variability of meteorological parameters such as temperature, humidity, precipitation rate, circulation, radiation fluxes, etc. Three-dimensional climate models are integrated over time in order to compute the temporal and spatial variations of these parameters. Model domains can be global or regional and the horizontal and vertical resolutions of the computational grid vary from model to model. Considering the entire climate system requires accounting for interactions between solar insolation, atmospheric, oceanic and continental processes, the latter including land hydrology and vegetation. Model simulations may concentrate on one or more of these components, but the most sophisticated models will estimate the mutual interactions of all of these environments. Advances in computer technology have prompted investments in more complex model configurations that consider more phenomena interactions than were possible with yesterday's computers. However, not every attempt to add to the computational layers is rewarded by better model performance. Extensive research is required to test and document any advantages gained by greater sophistication in model formulation. One purpose for publishing climate model research results is to present purported advances for evaluation by the scientific community.
Edmund M. Clarke
1997-01-01
Model checking is an automatic technique for verifying finite-state reactive systems, such as sequential circuit designs and communication protocols. Specifications are expressed in temporal logic, and the reactive system is modeled as a state-transition graph. An efficient search procedure is used to determine whether or not the state-transition graph satisfies the specifications. We describe the basic model checking algorithm and
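The search procedure the abstract refers to can be illustrated with a minimal explicit-state sketch (not the symbolic algorithm the authors describe): a breadth-first search over a toy state-transition graph that checks a simple safety property and returns a counterexample trace if a "bad" state is reachable. The state names and graph below are invented for illustration.

```python
from collections import deque

def check_safety(transitions, init, bad):
    """Explicit-state search: return a counterexample path from an
    initial state to a 'bad' state, or None if the safety property
    (no bad state is reachable) holds."""
    parent = {s: None for s in init}
    queue = deque(init)
    while queue:
        s = queue.popleft()
        if s in bad:
            # reconstruct the counterexample trace back to an initial state
            path = [s]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return list(reversed(path))
        for t in transitions.get(s, ()):
            if t not in parent:
                parent[t] = s
                queue.append(t)
    return None  # property holds

# toy state-transition graph
T = {"s0": ["s1", "s2"], "s1": ["s3"], "s2": [], "s3": []}
print(check_safety(T, ["s0"], {"s3"}))  # ['s0', 's1', 's3']
print(check_safety(T, ["s0"], {"s9"}))  # None
```

Real model checkers explore such graphs symbolically (e.g. with BDDs) rather than state by state, but the reachability question answered here is the same.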
NSDL National Science Digital Library
Bill Locke
SCARP is the first in a sequence of spreadsheet modeling exercises (SCARP2, LONGPRO, and GLACPRO). In this exercise, students use a simple arithmetic model (a running mean) to simulate the evolution of a scarp (escarpment) across time. Although the output closely resembles an evolving scarp, no real variables are included in the model. The purpose of the exercise, in addition to the simulation, is to develop basic skills in spreadsheeting and especially in graphical display.
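The running-mean scheme the SCARP exercise is built on translates directly out of a spreadsheet; the profile values and number of passes below are arbitrary illustrations, not the exercise's actual data.

```python
def running_mean_step(profile):
    """One smoothing pass: each interior point becomes the mean of
    itself and its two neighbors; the endpoints are held fixed."""
    new = profile[:]
    for i in range(1, len(profile) - 1):
        new[i] = (profile[i - 1] + profile[i] + profile[i + 1]) / 3.0
    return new

# initial scarp: a sharp step in elevation (arbitrary units)
scarp = [0.0] * 5 + [10.0] * 5
for step in range(50):  # each pass stands in for one time interval
    scarp = running_mean_step(scarp)
print([round(h, 2) for h in scarp])
```

As in the exercise, the output resembles a degrading scarp even though no real process variables (diffusivity, time step, material strength) appear anywhere in the model.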
V. Chipman
2002-10-05
The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, or the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the ''Multiscale Thermohydrologic Model'' (BSC 2001), use the wall heat fractions as output from the Ventilation Model to initialize their post-closure analyses. The Ventilation Model report was initially developed to analyze the effects of preclosure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts, and to provide heat removal data to support EBS design. Revision 00 of the Ventilation Model included documentation of the modeling results from the ANSYS-based heat transfer model. The purposes of Revision 01 of the Ventilation Model are: (1) To validate the conceptual model for preclosure ventilation of emplacement drifts and verify its numerical application in accordance with new procedural requirements as outlined in AP-SIII-10Q, Models (Section 7.0). (2) To satisfy technical issues posed in KTI agreement RDTME 3.14 (Reamer and Williams 2001a). Specifically to demonstrate, with respect to the ANSYS ventilation model, the adequacy of the discretization (Section 6.2.3.1), and the downstream applicability of the model results (i.e., wall heat fractions) to initialize post-closure thermal models (Section 6.6). (3) To satisfy the remainder of KTI agreement TEF 2.07 (Reamer and Williams 2001b). Specifically to provide the results of post-test ANSYS modeling of the Atlas Facility forced convection tests (Section 7.1.2). This portion of the model report also serves as a validation exercise per AP-SIII.10Q, Models, for the ANSYS ventilation model. (4) To further satisfy KTI agreements RDTME 3.01 and 3.14 (Reamer and Williams 2001a) by providing the source documentation referred to in the KTI Letter Report, ''Effect of Forced Ventilation on Thermal-Hydrologic Conditions in the Engineered Barrier System and Near Field Environment'' (Williams 2002). Specifically to provide the results of the MULTIFLUX model which simulates the coupled processes of heat and mass transfer in and around waste emplacement drifts during periods of forced ventilation. This portion of the model report is presented as an Alternative Conceptual Model with a numerical application, and also provides corroborative results used for model validation purposes (Section 6.3 and 6.4).
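The wall-heat-fraction bookkeeping described in the abstract is simple arithmetic; the sketch below uses invented heat rates, not values from the report.

```python
def wall_heat_fraction(q_ventilation, q_decay):
    """Fraction of decay heat conducted into the surrounding rock mass:
    one minus the fraction carried away by the ventilation air."""
    heat_removal = q_ventilation / q_decay
    return 1.0 - heat_removal

# hypothetical example: 0.84 kW carried away per 1.4 kW of decay heat
print(wall_heat_fraction(0.84, 1.4))  # ≈ 0.4
```

In the report this quantity varies in time and along the drift, so downstream post-closure models consume it as a temporally and spatially resolved table rather than a single number.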
Model Selection for Geostatistical Models
Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.
2006-02-01
We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the importance of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often-used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.
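The AIC comparison the abstract describes can be illustrated with a small sketch. The log-likelihoods and parameter counts below are hypothetical, not from the paper; in practice the log-likelihood would come from fitting the geostatistical model (including its covariance parameters) itself.

```python
def aic(log_likelihood, n_params):
    """Akaike Information Criterion: 2k - 2 ln L (lower is better)."""
    return 2 * n_params - 2 * log_likelihood

# hypothetical fits: adding covariates raises the log-likelihood,
# but AIC penalizes the extra parameters
m1 = aic(log_likelihood=-120.3, n_params=4)  # fewer explanatory variables
m2 = aic(log_likelihood=-118.9, n_params=6)  # two extra covariates
print(m1, m2)  # ≈ 248.6 vs ≈ 249.8, so m1 is preferred despite its lower likelihood
```

The paper's point is that when the likelihood is computed under an independence assumption instead of the spatial model, comparisons like this one can select the wrong set of explanatory variables.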
NASA Technical Reports Server (NTRS)
Bardina, Jorge E.
1995-01-01
The objective of this work is to develop, verify, and incorporate the baseline two-equation turbulence models which account for the effects of compressibility into the three-dimensional Reynolds averaged Navier-Stokes (RANS) code and to provide documented descriptions of the models and their numerical procedures so that they can be implemented into 3-D CFD codes for engineering applications.
NSDL National Science Digital Library
David Bice
Daisyworld is a classic model of complex feedbacks in a simple climate system; this activity guides students through the construction of a STELLA model that can be used to experiment with the system, exploring the somewhat surprising dynamics that arise from the interplay of positive and negative feedbacks between daisies and the temperature of their environment.
1988-01-01
PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: 1) chemical flooding; 2) carbon dioxide miscible flooding; 3)
1986-01-01
PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: 1) chemical flooding, where soap-like surfactants are injected into
NSDL National Science Digital Library
2012-06-26
In this activity, learners create models of bugs. Learners use household materials like plastic cups and straws to create models of bugs like centipedes and spiders. The activity is covered in the first 5 pages of the document. There are also a number of related activities that introduce learners to the world of invertebrates.
ERIC Educational Resources Information Center
Budiansky, Stephen
1980-01-01
This article discusses the need for more accurate and complete input data and field verification of the various models of air pollutant dispersion. Consideration should be given to changing the form of air quality standards based on enhanced dispersion modeling techniques. (Author/RE)
NSDL National Science Digital Library
James Lovelock
The simulation exercise uses a STELLA-based model called Daisyworld to explore concepts associated with Earth's energy balance and climate change. Students examine the evolution of a simplified model of an imaginary planet with only two species of life on its surface -- white and black daisies -- with different albedos. The daisies can alter the temperature of the surface where they are growing.
NSDL National Science Digital Library
Bill Locke
In the GLACPRO exercise student teams (1-3 members) use a numerical model to reconstruct a former glacial flowline from moraines to source. They must interact with teams studying adjacent flowlines to accurately place ice divides. They can calculate average thicknesses, volumes, ice loading, and sea level equivalent from the class model.
NSDL National Science Digital Library
2012-06-26
In this activity, learners explore the relative sizes and distances of objects in the solar system. Without being informed of the expected product, learners will make a Play-doh model of the Earth-Moon system, scaled to size and distance. The facilitator reveals the true identity of the system at the conclusion of the activity. During the construction phase, learners try to guess what members of the solar system their model represents. Each group receives different amounts of Play-doh, with each group assigned a color (red, blue, yellow, white). At the end, groups set up their models and inspect the models of other groups. They report patterns of scale that they notice; as the amount of Play-doh increases, for example, so do the size and distance of the model. This resource guide includes background information about the Earth to Moon ratio and solar eclipses.
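The scaling arithmetic behind the Earth-Moon activity can be sketched as follows; the 10 cm model Earth is an arbitrary example, and the astronomical values are rounded.

```python
# true values (approximate)
EARTH_DIAM_KM = 12742.0
MOON_DIAM_KM = 3475.0
EARTH_MOON_DIST_KM = 384400.0

def scale_model(earth_model_diam_cm):
    """Scale the Moon's diameter and the Earth-Moon distance to match
    a model Earth of the given diameter."""
    scale = earth_model_diam_cm / EARTH_DIAM_KM  # model cm per real km
    return {
        "moon_diam_cm": MOON_DIAM_KM * scale,
        "distance_cm": EARTH_MOON_DIST_KM * scale,
    }

m = scale_model(10.0)  # a 10 cm Play-doh Earth
print(round(m["moon_diam_cm"], 1))  # ~2.7 cm Moon
print(round(m["distance_cm"], 0))   # ~302 cm away, about 30 Earth diameters
```

The result usually surprises learners: at a consistent scale the Moon sits roughly thirty Earth diameters away, much farther than most models on a tabletop suggest.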
Veronica J. Rutledge
2013-01-01
The absence of industrial scale nuclear fuel reprocessing in the U.S. has precluded the necessary driver for developing the advanced simulation capability now prevalent in so many other countries. Thus, it is essential to model complex series of unit operations to simulate, understand, and predict inherent transient behavior and feedback loops. A capability of accurately simulating the dynamic behavior of advanced fuel cycle separation processes will provide substantial cost savings and many technical benefits. The specific fuel cycle separation process discussed in this report is the off-gas treatment system. The off-gas separation consists of a series of scrubbers and adsorption beds to capture constituents of interest. Dynamic models are being developed to simulate each unit operation involved so each unit operation can be used as a stand-alone model and in series with multiple others. Currently, an adsorption model has been developed within Multi-physics Object Oriented Simulation Environment (MOOSE) developed at the Idaho National Laboratory (INL). Off-gas Separation and REcoverY (OSPREY) models the adsorption of off-gas constituents for dispersed plug flow in a packed bed under non-isothermal and non-isobaric conditions. Inputs to the model include gas, sorbent, and column properties, equilibrium and kinetic data, and inlet conditions. The simulation outputs component concentrations along the column length as a function of time from which breakthrough data is obtained. The breakthrough data can be used to determine bed capacity, which in turn can be used to size columns. It also outputs temperature along the column length as a function of time and pressure drop along the column length. Experimental data and parameters were input into the adsorption model to develop models specific for krypton adsorption. The same can be done for iodine, xenon, and tritium. The model will be validated with experimental breakthrough curves. 
Customers will be given access to OSPREY to use and evaluate the model.
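One generic way breakthrough data can be turned into a bed capacity, as the report describes, is to integrate the difference between inlet and outlet concentrations over time. This is a sketch with invented numbers and units, not OSPREY code.

```python
def bed_capacity(times, c_out, c_in, flow_rate, sorbent_mass):
    """Integrate the breakthrough curve (trapezoid rule) to estimate
    the amount of adsorbate captured per unit mass of sorbent."""
    captured = 0.0  # concentration-time integral of (c_in - c_out)
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        removed = (c_in - c_out[i]) + (c_in - c_out[i - 1])
        captured += 0.5 * removed * dt
    return flow_rate * captured / sorbent_mass

# hypothetical breakthrough curve: outlet concentration rises toward inlet
t = [0, 1, 2, 3, 4]              # time (hours)
c = [0.0, 0.0, 0.2, 0.8, 1.0]    # outlet concentration (mol/m^3)
print(bed_capacity(t, c, c_in=1.0, flow_rate=0.5, sorbent_mass=2.0))
```

A capacity estimated this way can then be used to size columns, which is the workflow the report outlines for krypton adsorption.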
Pre-saddle neutron multiplicity for fission reactions induced by heavy ions and light particles
S. Soheyli; M. K. Khalili
2013-06-03
Pre-saddle neutron multiplicity has been calculated for several fission reactions induced by heavy ions and light particles. Experimentally, it is impossible to determine the contribution of neutrons being emitted before the saddle point and those emitted between the saddle and the scission points. Determination of the pre-saddle neutron multiplicity in our research is based on the comparison between the experimental anisotropies and those predicted by the standard saddle-point statistical model. Analysis of the results shows that the pre-saddle neutron multiplicity depends on the fission barrier height and stability of the compound nucleus. In heavy-ion-induced fission, the number of pre-saddle neutrons decreases with increasing excitation energy of the compound nucleus. A main cause of this behavior is a reduction in the ground-state-to-saddle-point transition time with increasing excitation energy of the compound nucleus. In fission induced by light particles, by contrast, the number of pre-saddle neutrons increases with increasing excitation energy of the compound nucleus.
Thierens, H.; De Clercq, A.; Jacobs, E.; De Frenne, D.; D'hondt, P.; De Gelder, P.; Deruytter, A.J.
1981-05-01
Energy correlation measurements were performed for ²⁴⁰Pu(sf), ²³⁹Pu(n_th,f), and the photofission of ²⁴⁰Pu with 12-, 15-, 20-, and 30-MeV bremsstrahlung. The photofission cross section for ²⁴⁰Pu was determined up to 30 MeV, which permitted the calculation of the average excitation energy
Thierens, H.; Jacobs, E.; D'hondt, P.; De Clercq, A.; Piessens, M.; De Frenne, D.
1984-02-01
Energy correlation measurements were performed for the spontaneous fission of ²⁴²Pu, the thermal-neutron-induced fission of ²⁴¹Pu, and the photofission of ²⁴²Pu with 12-, 15-, 20-, and 30-MeV bremsstrahlung. The photofission cross section for ²⁴²Pu was determined up to 30 MeV. For ²⁴²Pu(sf) the overall kinetic energy distribution is strongly asymmetric and the overall mass distribution has a very high peak yield (9%). Important deviations of the average total kinetic energy release
Modular Modeling System Model Builder
McKim, C.S.; Matthews, M.T. [Framatome Technologies, Lynchburg, VA (United States)
1996-12-31
The latest release of the Modular Modeling System (MMS) Model Builder adds still more time-saving features to an already powerful MMS dynamic-simulation tool set. The Model Builder takes advantage of 32-bit architecture within the Microsoft Windows 95/NT™ Operating Systems to better integrate a mature library of power-plant components. In addition, the MMS Library of components can now be modified and extended with a new tool named MMS CompGen™. The MMS Model Builder allows the user to quickly build a graphical schematic representation for a plant by selecting from a library of predefined power plant components to dynamically simulate their operation. In addition, each component has a calculation subroutine stored in a dynamic-link library (DLL), which facilitates the determination of a steady-state condition and performance of routine calculations for the component. These calculations, termed auto-parameterization, help avoid repetitive and often tedious hand calculations for model initialization. In striving to meet the needs for large models and increase user productivity, the MMS Model Builder has been completely revamped to make power plant model creation and maintainability easier and more efficient.
Penny, Will
Hierarchical Dynamic Models (lecture slides by Will Penny covering: OU processes, embedding an OU(2) process, dynamic models, state and observation equations, generative models, energies and actions, the linear convolution model, filtering, triple estimation, and references)
NASA Astrophysics Data System (ADS)
Regardt, Olle; Rönnbäck, Lars; Bergholtz, Maria; Johannesson, Paul; Wohed, Petia
Maintaining and evolving data warehouses is a complex, error prone, and time consuming activity. The main reason for this state of affairs is that the environment of a data warehouse is in constant change, while the warehouse itself needs to provide a stable and consistent interface to information spanning extended periods of time. In this paper, we propose a modeling technique for data warehousing, called anchor modeling, that offers non-destructive extensibility mechanisms, thereby enabling robust and flexible management of changes in source systems. A key benefit of anchor modeling is that changes in a data warehouse environment only require extensions, not modifications, to the data warehouse. This ensures that existing data warehouse applications will remain unaffected by the evolution of the data warehouse, i.e. existing views and functions will not have to be modified as a result of changes in the warehouse model.
Insepov, Z.; Norem, J. [Argonne National Lab, Argonne, IL 60439 (United States); Vetizer, S.; Mahalingam, S. [Tech-X Corp., Boulder, CO (United States)
2011-12-23
Although vacuum arcs were first identified over 110 years ago, they are not yet well understood. We have since developed a model of breakdown and gradient limits that tries to explain, in a self-consistent way: arc triggering, plasma initiation, plasma evolution, surface damage and gradient limits. We use simple PIC codes for modeling plasmas, molecular dynamics for modeling surface breakdown and surface damage, and mesoscale surface thermodynamics and finite element electrostatic codes to evaluate surface properties. Since any given experiment seems to have more variables than data points, we have tried to consider a wide variety of arcing (rf structures, e beam welding, laser ablation, etc.) to help constrain the problem, and concentrate on common mechanisms. While the mechanisms can be comparatively simple, modeling can be challenging.
NSDL National Science Digital Library
2009-04-14
A human is a complicated organism, and it is considered unethical to do many kinds of experiments on human subjects. For these reasons, biologists often use simpler 'model' organisms that are easy to keep and manipulate in the laboratory. Despite obvious differences, model organisms share with humans many key biochemical and physiological functions that have been conserved (maintained) by evolution. Each of the following model organisms has its advantages and disadvantages in different research applications. This tool allows you to examine the similarities between different systems by comparing the proteins they share and the proportion of DNA they have in common. Choose a gene from the drop-down menu and select the species you want to compare. Rolling over the images will give you a more detailed description of each model. Clicking on a gene's name will take you to the National Center for Biological Information, where you can explore the latest relevant scientific literature.
Daniel, David J [Los Alamos National Laboratory; Mc Pherson, Allen [Los Alamos National Laboratory; Thorp, John R [Los Alamos National Laboratory; Barrett, Richard [SNL; Clay, Robert [SNL; De Supinski, Bronis [LLNL; Dube, Evi [LLNL; Heroux, Mike [SNL; Janssen, Curtis [SNL; Langer, Steve [LLNL; Laros, Jim [SNL
2011-01-14
A programming model is a set of software technologies that support the expression of algorithms and provide applications with an abstract representation of the capabilities of the underlying hardware architecture. The primary goals are productivity, portability and performance.
Alper Demir; Alberto Sangiovanni-Vincentelli
To reach the final goal of simulating and characterizing the effect of noise on the performance of an electronic circuit or system, we first need to investigate the actual noise sources in the system and develop models for these noise sources in the framework of the theory of signals and systems we will be operating with. The models we are
Model selection for geostatistical models.
Hoeting, Jennifer A; Davis, Richard A; Merton, Andrew A; Thompson, Sandra E
2006-02-01
We consider the problem of model selection for geospatial data. Spatial correlation is often ignored in the selection of explanatory variables, and this can influence model selection results. For example, the importance of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often-used traditional approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also apply the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored. R software to implement the geostatistical model selection methods described in this paper is available in the Supplement. PMID:16705963
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1990-01-01
Although powerful computers have allowed complex physical and manmade hardware systems to be modeled successfully, we have encountered persistent problems with the reliability of computer models for systems involving human learning, human action, and human organizations. This is not a misfortune; unlike physical and manmade systems, human systems do not operate under a fixed set of laws. The rules governing the actions allowable in the system can be changed without warning at any moment, and can evolve over time. That the governing laws are inherently unpredictable raises serious questions about the reliability of models when applied to human situations. In these domains, computers are better used, not for prediction and planning, but for aiding humans. Examples are systems that help humans speculate about possible futures, offer advice about possible actions in a domain, systems that gather information from the networks, and systems that track and support work flows in organizations.
C Sivaram
2007-07-07
An alternate model for gamma ray bursts is suggested. For a white dwarf (WD) and neutron star (NS) very close binary system, the WD (close to M_Ch, the Chandrasekhar mass) can detonate due to tidal heating, leading to an SN. Material falling onto the NS at relativistic velocities can cause its collapse to a magnetar or quark star or black hole, leading to a GRB. As the material smashes onto the NS, it is dubbed the Smashnova model. Here the SN is followed by a GRB. An NS impacting a red giant (RG) or red supergiant (RSG) (as in Thorne-Zytkow objects) can also cause an SN outburst followed by a GRB. Other variations are explored.
NASA Astrophysics Data System (ADS)
Backman, Juha Reinhold
This chapter discusses the basic models with emphasis on audio applications. Loudspeakers are most commonly used as an example of electroacoustic transducers yet, from a modelling point of view, they present the broadest range of challenges to the theoreticians. The fundamental principles are, however, applicable to all transducer problems (microphones, hydrophones, ultrasonics). The reader is assumed to be reasonably familiar with the fundamental concepts of electroacoustics; introductory summaries have been presented by, e.g., Poldy (1994) and Hickson and Busch-Vishniac (1997).
Network epistemology Discrete models
Zollman, Kevin
Network epistemology (lecture slides covering: discrete models, continuous models, and social structure and social influence)
ERIC Educational Resources Information Center
Goodwyn, Lauren; Salm, Sarah
2007-01-01
Teaching the anatomy of the muscle system to high school students can be challenging. Students often learn about muscle anatomy by memorizing information from textbooks or by observing plastic, inflexible models. Although these mediums help students learn about muscle placement, the mediums do not facilitate understanding regarding integration of…
NSDL National Science Digital Library
Alexei Sharov
Web-based instructional material describing the use of diffusion models in population ecology. This page is part of a set of on-line lectures on Quantitative Population Ecology produced by Alexei Sharov in the Department of Entomology at Virginia Tech.
ERIC Educational Resources Information Center
Casey, Katherine
2011-01-01
As teachers learn new pedagogical strategies, they crave explicit demonstrations that show them how the new strategies will work with their students in their classrooms. Successful instructional coaches, therefore, understand the importance of modeling lessons to help teachers develop a vision of effective instruction. The author, an experienced…
Although air quality models have been applied historically to address issues specific to ambient air quality standards (i.e., one criteria pollutant at a time) or welfare (e.g., acid deposition or visibility impairment), they are inherently multipollutant based. Therefore, in pri...
NSDL National Science Digital Library
In this activity, students build a model to demonstrate how aquifers are formed and ground water becomes polluted. For younger students, the teacher can perform this activity as a demonstration, or older students can perform it themselves. A materials list, instructions, and extension activities are provided.
ERIC Educational Resources Information Center
Ebert, James R.; Elliott, Nancy A.; Hurteau, Laura; Schulz, Amanda
2004-01-01
Students must understand the fundamental process of convection before they can grasp a wide variety of Earth processes, many of which may seem abstract because of the scales on which they operate. Presentation of a very visual, concrete model prior to instruction on these topics may facilitate students' understanding of processes that are largely…
NSDL National Science Digital Library
Kirsten Menking
The Daisyworld model created by Andrew Watson and James Lovelock (1983, Tellus, v. 35B, p. 284-289) is a wonderful example of a self-regulating system incorporating positive and negative feedbacks. The model consists of a planet on which black and white daisies are growing. The growth of these daisies is governed by a parabolic shaped growth function regulated by planetary temperature and is set to zero for temperatures less than 5 °C or greater than 40 °C and optimized at 22.5 °C. The model explores the effect of a steadily increasing solar luminosity on the growth of daisies and the resulting planetary temperature. The growth function for the daisies allows them to modulate the planet's temperature for many years, warming it early on as black daisies grow, and cooling it later as white daisies grow. Eventually, the solar luminosity increases beyond the daisies' capability to modulate the temperature and they die out, leading to a rapid rise in the planetary temperature. Students read Watson and Lovelock's original paper, and then use STELLA to create their own Daisyworld model with which they can experiment. Experiments include changing the albedos of the daisies, changing their death rates, and changing the rate at which energy is conducted from one part of the planet to another. In all cases, students keep track of daisy populations and of planetary temperature over time.
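The parabolic growth function described above can be written directly. The quadratic form below is one parameterization consistent with the stated zeros (5 °C and 40 °C) and optimum (22.5 °C); Watson and Lovelock's paper expresses the same curve with a numerical coefficient.

```python
def daisy_growth(T_celsius):
    """Parabolic growth factor: zero at 5 °C and 40 °C,
    maximum (1.0) at the optimum temperature of 22.5 °C."""
    if T_celsius < 5.0 or T_celsius > 40.0:
        return 0.0
    # 17.5 °C is the half-width of the viable temperature band
    return 1.0 - ((T_celsius - 22.5) / 17.5) ** 2

print(daisy_growth(22.5))  # 1.0 (optimum)
print(daisy_growth(5.0))   # 0.0 (edge of the viable range)
print(daisy_growth(30.0))  # ≈ 0.816
```

In a full Daisyworld model this factor multiplies each daisy population's growth term, closing the feedback loop between daisy coverage, albedo, and planetary temperature.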
Graphical models, causal inference, and econometric models
Spirtes, Peter
Graphical models, causal inference, and econometric models. Peter Spirtes. Abstract: Graphical causal modeling has historical ties to causal modeling in econometrics and other social sciences. There have been developments in graphical causal modeling, and this paper discusses their relevance to econometrics and other social sciences. The use of graphs
ATMOSPHERIC MODELING: MODEL AND ACCURACY
The development of models to assess the emission control requirements of primary precursor pollutants in the production of photochemical oxidants has been underway for approximately 20 years. Over the period there has been a considerable increase in our understanding of the basic...
Students' Models of Curve Fitting: A Models and Modeling Perspective
ERIC Educational Resources Information Center
Gupta, Shweta
2010-01-01
The Models and Modeling Perspectives (MMP) has evolved out of research that began 26 years ago. MMP researchers use Model Eliciting Activities (MEAs) to elicit students' mental models. In this study MMP was used as the conceptual framework to investigate the nature of students' models of curve fitting in a problem-solving environment consisting of…
10. MOVABLE BED SEDIMENTATION MODELS. DOGTOOTH BEND MODEL (MODEL SCALE: 1' = 400' HORIZONTAL, 1' = 100' VERTICAL), AND GREENVILLE BRIDGE MODEL (MODEL SCALE: 1' = 360' HORIZONTAL, 1' = 100' VERTICAL). - Waterways Experiment Station, Hydraulics Laboratory, Halls Ferry Road, 2 miles south of I-20, Vicksburg, Warren County, MS
NSDL National Science Digital Library
2009-09-10
This interactive simulation gives students practice in the operation and the physical parts of a real micrometer, a measuring device that employs a screw to amplify distances that are too small to measure easily. The accuracy of a micrometer derives from the accuracy of the thread that is at its heart. The basic operating principle of a micrometer is that the rotation of an accurately made screw can be directly and precisely correlated to a certain amount of axial movement (and vice-versa), through the constant known as the screw's lead. The Micrometer model was created using the Easy Java Simulations (EJS) modeling tool. It is distributed as a ready-to-run (compiled) Java archive. Double click the ejs_ntnu_Micrometer.jar file to run the program (Java must be installed).
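The operating principle the simulation illustrates, axial travel equals thimble rotations times the screw's lead, is a one-line calculation. The 0.5 mm lead and the thimble reading below are typical illustrative values, not parameters taken from the EJS model.

```python
def micrometer_reading(rotations, lead_mm=0.5):
    """Axial travel of the spindle: rotations of the thimble times the
    screw's lead (0.5 mm per turn is typical for a metric micrometer)."""
    return rotations * lead_mm

# 7 full turns plus 23 of the 50 divisions on the thimble
print(micrometer_reading(7 + 23 / 50))  # 3.73 mm
```

Because one thimble division corresponds to 1/50 of a 0.5 mm turn, each division reads 0.01 mm, which is where the instrument's resolution comes from.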
NASA Technical Reports Server (NTRS)
2000-01-01
The molecule modeling method known as Multibody Order (N) Dynamics, or MBO(N)D, was developed by Moldyn, Inc. at Goddard Space Flight Center through funding provided by the SBIR program. The software can model the dynamics of molecules through technology which simulates low-frequency molecular motions and properties, such as movements among a molecule's constituent parts. With MBO(N)D, a molecule is substructured into a set of interconnected rigid and flexible bodies. These bodies replace the computation burden of mapping individual atoms. Moldyn's technology cuts computation time while increasing accuracy. The MBO(N)D technology is available as Insight II 97.0 from Molecular Simulations, Inc. Currently the technology is used to account for forces on spacecraft parts and to perform molecular analyses for pharmaceutical purposes. It permits the solution of molecular dynamics problems on a moderate workstation, as opposed to on a supercomputer.
NSDL National Science Digital Library
2012-07-12
In this quick activity about pollutants and groundwater (page 2 of PDF), learners build a model well with a toilet paper tube. Learners use food coloring to simulate pollutants and observe how they can be carried by groundwater and eventually enter water sources such as wells, rivers, and streams. This activity is associated with nanotechnology and relates to linked video, DragonflyTV Nano: Water Clean-up.
Plimpton, Steven James; Heffernan, Julieanne; Sasaki, Darryl Yoshio; Frischknecht, Amalie Lucile; Stevens, Mark Jackson; Frink, Laura J. Douglas
2005-11-01
Understanding the properties and behavior of biomembranes is fundamental to many biological processes and technologies. Microdomains in biomembranes or ''lipid rafts'' are now known to be an integral part of cell signaling, vesicle formation, fusion processes, protein trafficking, and viral and toxin infection processes. Understanding how microdomains form, how they depend on membrane constituents, and how they act not only has biological implications, but also will impact Sandia's effort in development of membranes that structurally adapt to their environment in a controlled manner. To provide such understanding, we created physically-based models of biomembranes. Molecular dynamics (MD) simulations and classical density functional theory (DFT) calculations using these models were applied to phenomena such as microdomain formation, membrane fusion, pattern formation, and protein insertion. Because lipid dynamics and self-organization in membranes occur on length and time scales beyond atomistic MD, we used coarse-grained models of double tail lipid molecules that spontaneously self-assemble into bilayers. DFT provided equilibrium information on membrane structure. Experimental work was performed to further help elucidate the fundamental membrane organization principles.
Rainer Koschke; Daniel Simon
2003-01-01
The reflexion model originally proposed by Murphy and Notkin allows one to structurally validate a descriptive or prescriptive architecture model against a source model. First, the entities in the source model are mapped onto the architectural model; then discrepancies between the architecture model and the source model are computed automatically. The original reflexion model allows an analyst to specify
Vincent, Julian F V
2003-01-01
Biomimetics is seen as a path from biology to engineering. The only path from engineering to biology in current use is the application of engineering concepts and models to biological systems. However, there is another pathway: the verification of biological mechanisms by manufacture, leading to an iterative process between biology and engineering in which the new understanding that the engineering implementation of a biological system can bring is fed back into biology, allowing a more complete and certain understanding and the possibility of further revelations for application in engineering. This is a pathway as yet unformalized, and one that offers the possibility that engineers can also be scientists. PMID:14561351
The Standard Model Beyond the Standard Model
Lecture slides: new physics with the top quark and searches for extra dimensions at the LHC. Ritesh Singh, January 13, 2010. Outline: 1. The Standard Model: building blocks; new physics with the top quark; search for extra dimensions; conclusions.
Technological Forecasting---Model Selection, Model Stability, and Combining Models
Nigel Meade; Towhidul Islam
1998-01-01
The paper identifies 29 models that the literature suggests are appropriate for technological forecasting. These models are divided into three classes according to the timing of the point of inflexion in the innovation or substitution process. Faced with a given data set and such a choice, the issue of model selection needs to be addressed. Evidence used to aid model
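The three classes differ in when the S-curve inflects. A sketch contrasting two standard growth curves by the fraction of the saturation level reached at their inflection points (logistic: 1/2; Gompertz: 1/e); these two curves are illustrative stand-ins, not the paper's full set of 29 models:

```python
import math

def logistic(t, L=1.0, k=1.0, t0=0.0):
    # Symmetric S-curve: inflects at t = t0, at L/2 of the saturation level.
    return L / (1.0 + math.exp(-k * (t - t0)))

def gompertz(t, L=1.0, b=1.0, t0=0.0):
    # Asymmetric S-curve: inflects at t = t0, but at only L/e of saturation,
    # i.e. earlier in the substitution process than the logistic curve.
    return L * math.exp(-math.exp(-b * (t - t0)))

# Fraction of the saturation level reached at the inflection point (t = t0):
print(logistic(0.0))            # prints 0.5
print(round(gompertz(0.0), 4))  # prints 0.3679  (= 1/e)
```

Classifying models by this inflection fraction is one concrete way to operationalize the paper's "timing of the point of inflexion" criterion when choosing among candidate diffusion models.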
Modelling intonational structure using hidden Markov models.
Wright, Helen; Taylor, Paul A
1997-01-01
A method is introduced for using hidden Markov models (HMMs) to model intonational structure. HMMs are probabilistic and can capture the variability in structure which previous finite state network models lack. We show ...
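As an illustration of the probabilistic machinery involved, here is a toy Viterbi decoder over two hypothetical intonation states; the state labels, observations, and probabilities are invented for illustration and are not taken from the paper:

```python
# Toy HMM over hypothetical intonation states. All labels and numbers are
# illustrative assumptions, not the authors' trained model.
states = ["rise", "fall"]
start = {"rise": 0.6, "fall": 0.4}
trans = {"rise": {"rise": 0.7, "fall": 0.3},
         "fall": {"rise": 0.4, "fall": 0.6}}
emit = {"rise": {"high": 0.8, "low": 0.2},
        "fall": {"high": 0.1, "low": 0.9}}

def viterbi(obs):
    """Most likely state sequence for an observation sequence."""
    # V[s] = probability of the best path ending in state s.
    V = {s: start[s] * emit[s][obs[0]] for s in states}
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V2, path2 = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[p] * trans[p][s])
            V2[s] = V[prev] * trans[prev][s] * emit[s][o]
            path2[s] = path[prev] + [s]
        V, path = V2, path2
    best = max(states, key=lambda s: V[s])
    return path[best]

print(viterbi(["high", "high", "low"]))  # prints ['rise', 'rise', 'fall']
```

The probabilistic transitions are what let an HMM absorb the structural variability that, as the abstract notes, deterministic finite-state network models of intonation cannot capture.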
ECOBAS — modelling and documentation
J. Benz; R. Hoch; T. Legović
2001-01-01
Until now, modelling and model documentation were two different processes. As a consequence, model documentation was prone to error, and it was rarely possible to run larger models from their documentation. Model exchange was limited to simple models due to the different languages in which they were created. To facilitate more efficient model creation, documentation and exchange, we introduce the ECOBAS system.
Sumner, Walton; Xu, Jin Zhong
2002-01-01
The American Board of Family Practice is developing a patient simulation program to evaluate diagnostic and management skills. The simulator must give temporally and physiologically reasonable answers to symptom questions such as "Have you been tired?" A three-step process generates symptom histories. In the first step, the simulator determines points in time where it should calculate instantaneous symptom status. In the second step, a Bayesian network implementing a roughly physiologic model of the symptom generates a value on a severity scale at each sampling time. Positive, zero, and negative values represent increased, normal, and decreased status, as applicable. The simulator plots these values over time. In the third step, another Bayesian network inspects this plot and reports how the symptom changed over time. This mechanism handles major trends, multiple and concurrent symptom causes, and gradually effective treatments. Other temporal insights, such as observations about short-term symptom relief, require complementary mechanisms. PMID:12463924
CISNET lung models: Comparison of model assumptions and model structures
McMahon, Pamela M.; Hazelton, William; Kimmel, Marek; Clarke, Lauren
2012-01-01
Sophisticated modeling techniques can be powerful tools to help us understand the effects of cancer control interventions on population trends in cancer incidence and mortality. Readers of journal articles are however rarely supplied with modeling details. Six modeling groups collaborated as part of the National Cancer Institute’s Cancer Intervention and Surveillance Modeling Network (CISNET) to investigate the contribution of US tobacco control efforts towards reducing lung cancer deaths over the period 1975 to 2000. The models included in this monograph were developed independently and use distinct, complementary approaches towards modeling the natural history of lung cancer. The models used the same data for inputs and agreed on the design of the analysis and the outcome measures. This article highlights aspects of the models that are most relevant to similarities of or differences between the results. Structured comparisons can increase the transparency of these complex models. PMID:22882887
Multivariate Receptor Models and Model Uncertainty
University of Washington at Seattle
that are unobservable (latent variables or factors); P is the unknown q×p factor loading matrix, and q is the unknown number of sources. Applications to air pollution data are presented. Key words: latent variable models; factor analysis models; model uncertainty. Introduction: multivariate receptor modeling aims to identify the pollution sources and assess the amounts
Rossignac & Requicha: Solid Modeling
Rossignac, Jarek
Solid Modeling. Jarek R. Rossignac, GVU Center, Georgia Institute of Technology, and Aristides A. G. Requicha, University of Southern California, Los Angeles. 1 Introduction: A solid model is a digital representation of the geometry of an existing or envisioned physical object. Solid models are used in many industries, from
CISNET: Standardized Model Documents
Modeling is a complex endeavor, and often it is very difficult to reconcile results from different models. To aid in this process of model description and comparison, CISNET has developed and implemented standardized model documentation. Model profiles are standardized descriptions that facilitate the comparison of models and their results. Users can read documentation about a single model or read side-by-side descriptions that contrast how models address different components of the process.
Comparative Protein Structure Modeling Using Modeller
Eswar, Narayanan; Marti-Renom, Marc A.; Madhusudhan, M.S.; Eramian, David; Shen, Min-yi; Pieper, Ursula
2014-01-01
Functional characterization of a protein sequence is one of the most frequent problems in biology. This task is usually facilitated by accurate three-dimensional (3-D) structure of the studied protein. In the absence of an experimentally determined structure, comparative or homology modeling can sometimes provide a useful 3-D model for a protein that is related to at least one known protein structure. Comparative modeling predicts the 3-D structure of a given protein sequence (target) based primarily on its alignment to one or more proteins of known structure (templates). The prediction process consists of fold assignment, target-template alignment, model building, and model evaluation. This unit describes how to calculate comparative models using the program MODELLER and discusses all four steps of comparative modeling, frequently observed errors, and some applications. Modeling lactate dehydrogenase from Trichomonas vaginalis (TvLDH) is described as an example. The download and installation of the MODELLER software is also described. PMID:18428767
Modelling project management performance
David James Bryde
2003-01-01
This paper presents an argument that it is appropriate to develop a model of project management (PM) performance from models for assessing quality management. The paper presents a model, labelled the project management performance assessment (PMPA) model, based upon the EFQM business excellence model. The model proposes six criteria for assessing PM performance: project management leadership; project management staff; project
Loong, Lim Yaw; Yusof, Norhasliza; Kassim, Hasan Abu [Physics Department, Faculty of Science, University of Malaya, 50603 Kuala Lumpur (Malaysia)
2008-05-20
Solar models are important in our understanding of stars and stellar evolution. Solar models have been constructed using different methods. In this work, a solar model will be built using the fitting method. The model will incorporate the most recent input data. The model will be evolved to the current epoch starting from the zero-age main sequence model.
NASA Astrophysics Data System (ADS)
Burger, M. H.; Killen, R. M.; Sarantos, M.; Crider, D. H.; Vervack, R. J.
2009-04-01
Mercury has a tenuous exosphere created by the combined effects of solar radiation and micrometeoroid bombardment on the surface and the interaction of the solar wind with Mercury's magnetic field and surface. Observations of this exosphere provide essential data necessary for understanding the composition and evolution of Mercury's surface, as well as the interaction between Mercury's magnetosphere with the solar wind. The sodium component of the exosphere has been well observed from the ground (see review by Killen et al., 2007). These observations have revealed a highly variable and inhomogeneous exosphere with emission often peaking in the polar regions. Radiation acceleration drives exospheric escape producing a sodium tail pointing away from the sun which has been detected up to 1400 Mercury radii from the planet (Potter et al. 2002; Baumgardner et al. 2008). Calcium has also been observed in Mercury's exosphere showing a distribution distinct from sodium, although also variable (Killen et al. 2005). During the first two encounters with Mercury by MESSENGER, observations of the exosphere were made by the UltraViolet and Visible Spectrometer (UVVS) channel of the Mercury Atmospheric and Surface Composition Spectrometer (MASCS). Sodium and calcium emission were detected during both flybys, and magnesium was detected for the first time in Mercury's exosphere during the second flyby. The spatial distributions of these species showed significant, unexpected differences which suggest differences in the mechanisms responsible for releasing them from the surface. We present a Monte-Carlo model of sodium, magnesium, and calcium in Mercury's exosphere. The important source mechanisms for ejecting these species from the surface are sputtering by solar wind ions, photon-stimulated desorption, and micrometeoroid impact vaporization. 
Thermal desorption on the dayside does not supply enough energy to significantly populate the exosphere, although it does play a role in redistributing volatiles over the surface. In addition, atomic calcium can be produced from the dissociation of Ca-bearing molecules, such as CaO, which can be formed in impact vapors. The primary loss processes are the escape of neutrals ejected with sufficient energy and photoionization. The former process is supplemented by radiation pressure which accelerates neutrals anti-sunward such that escaping neutrals form a tail pointing away from the sun. Because Mercury's heliocentric distance and radial velocity vary during its orbit, both loss processes are functions of Mercury's true anomaly. We also consider the spatial distribution of the surface source. Impact vaporization is roughly isotropic over the surface, although there may be a leading/trailing asymmetry in the impact rate due to Mercury's orbital motion. Sputtering is confined to regions where the solar wind can impact the surface, which is shielded somewhat by the internal magnetic field. The surface regions vulnerable depend on the solar wind conditions. References: Baumgardner et al., GRL, 35, L03201, 2008. Killen, R.M. et al., Space Sci. Rev. 132, 433-509, 2007. Killen, R.M. et al., Icarus, 173, 300-311, 2005. Potter et al., Meteoritics & Planetary Sci., 37, 1165, 2002.
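The competition between ejection energy and gravitational escape described above can be caricatured in a few lines. In this toy Monte Carlo, the exponential energy spectrum and the mean energies are illustrative assumptions standing in for the source-process spectra (sputtering, desorption, impact vaporization); only Mercury's escape speed is a physical constant:

```python
import math
import random

MERCURY_V_ESC = 4.25e3  # m/s, Mercury's surface escape speed

def escape_fraction(mean_energy_ev, mass_amu, n=20_000, seed=1):
    """Toy Monte Carlo: draw ejection speeds from an exponential energy
    distribution (an illustrative stand-in, not the authors' model) and
    count the fraction of test particles exceeding escape speed."""
    rng = random.Random(seed)
    amu = 1.660539e-27  # kg per atomic mass unit
    ev = 1.602177e-19   # J per electron-volt
    m = mass_amu * amu
    escaped = 0
    for _ in range(n):
        e = rng.expovariate(1.0 / (mean_energy_ev * ev))  # energy in J
        v = math.sqrt(2.0 * e / m)                        # speed in m/s
        if v > MERCURY_V_ESC:
            escaped += 1
    return escaped / n

# Hotter source processes eject a larger escaping fraction of sodium (23 amu):
print(escape_fraction(0.1, 23.0) < escape_fraction(2.0, 23.0))  # prints True
```

For sodium the escape energy is roughly 2 eV, so a 0.1 eV thermal-like source contributes essentially nothing to the tail, while an eV-scale sputtering-like source loses a substantial fraction; this is the energetic reason thermal desorption redistributes volatiles rather than populating the escaping exosphere.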
Longini, Ira M.; Morris, J. Glenn
2014-01-01
Mathematical modeling can be a valuable tool for studying infectious disease outbreak dynamics and simulating the effects of possible interventions. Here, we describe approaches to modeling cholera outbreaks and how models have been applied to explore intervention strategies, particularly in Haiti. Mathematical models can play an important role in formulating and evaluating complex cholera outbreak response options. Major challenges to cholera modeling are insufficient data for calibrating models and the need to tailor models for different outbreak scenarios. PMID:23412687
Protein Structure Modeling With MODELLER. Narayanan Eswar
Sali, Andrej
Key Words: comparative modeling, fold assignment, sequence-structure alignment, model assessment. In the absence of structures determined experimentally by X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy, comparative modeling proceeds by (i) identification of a related template structure (see Section on Materials for definitions of these terms); (ii) alignment of the target
Model selection for logistic regression models
NASA Astrophysics Data System (ADS)
Duller, Christine
2012-09-01
Model selection for logistic regression models decides which of some given potential regressors have an effect and hence should be included in the final model. The second interesting question is whether a certain factor is heterogeneous among some subsets, i.e. whether the model should include a random intercept or not. In this paper these questions will be answered with classical as well as with Bayesian methods. The applications show some results of recent research projects in medicine and business administration.
ERIC Educational Resources Information Center
Frees, Edward W.; Kim, Jee-Seon
2006-01-01
Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…
Browne, James C.
Naming Models and Parallel Programming (lecture notes, 9/11/01). Overview of the name-model lectures: 1. Roles of names; 2. An instance of the taxonomy; 3. The associative broadcast model of parallel programming; ... 8. Propagation: an algorithm for moving names among objects.
James T. Kajiya
1985-01-01
We present a new set of lighting models derived from the equations of electromagnetism. These models describe the reflection and refraction of light from surfaces which exhibit anisotropy, that is, surfaces with preferred directions. The model allows a new mapping technique. We also discuss the general relationship between geometric models, surface mapping of all types, and lighting models in
Alan N. Beard
1997-01-01
Over the last 20 years there has been a great increase in the construction of computer-based models related to fire risk. Both probabilistic and deterministic models have been produced. Many existing models are in a state of development and new models are being created continually. However, how such models are to be efficaciously employed as part of the design process
Modeling transient rootzone salinity (SWS Model)
Technology Transfer Automated Retrieval System (TEKTRAN)
The combined water quality criteria for irrigation, water and ion processes in soils, and plant and soil response are sufficiently complex that adequate analysis requires computer models. Models for management are also needed, but these models must consider that the input requirements must be reasona...
Cosmological Models Generalising Robertson-Walker Models
Abdussattar
2003-08-07
Considering the physical 3-space t = constant of the spacetime metrics as spheroidal and pseudo-spheroidal, cosmological models which are generalizations of the Robertson-Walker models are obtained. Specific forms of these general models, as solutions of Einstein's field equations, are also discussed in the radiation- and the matter-dominated eras of the universe.
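For reference, the standard Robertson-Walker line element that these models generalize is (textbook form, with a(t) the scale factor and k the spatial curvature constant; the spheroidal generalizations modify the spatial part):

```latex
ds^2 = -c^2\,dt^2 + a^2(t)\left[\frac{dr^2}{1-kr^2}
       + r^2\left(d\theta^2 + \sin^2\theta\,d\phi^2\right)\right]
```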
Bayesian Model Averaging for Linear Regression Models
Adrian E. Raftery; David Madigan; Jennifer A. Hoeting
1998-01-01
We consider the problem of accounting for model uncertainty in linear regression models. Conditioning on a single selected model ignores model uncertainty, and thus leads to the underestimation of uncertainty when making inferences about quantities of interest. A Bayesian solution to this problem involves averaging over all possible models (i.e., combinations of predictors) when making inferences about quantities of interest.
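A common large-sample device behind such averaging is to weight each candidate model by exp(-BIC/2), which approximates its posterior probability under equal prior model probabilities. A stdlib sketch (this weight formula is a standard approximation from the BMA literature, not necessarily the authors' exact computation, and the BIC values are hypothetical):

```python
import math

def bma_weights(bics):
    """Approximate posterior model probabilities from BIC values:
    w_i proportional to exp(-BIC_i / 2), assuming equal prior model
    probabilities. A large-sample approximation, shown as a sketch."""
    m = min(bics)  # subtract the minimum for numerical stability
    w = [math.exp(-(b - m) / 2.0) for b in bics]
    total = sum(w)
    return [x / total for x in w]

# Three candidate regressions with hypothetical BIC scores:
weights = bma_weights([100.0, 102.0, 110.0])
print([round(w, 3) for w in weights])  # prints [0.727, 0.268, 0.005]
```

An averaged prediction is then the weight-sum of the per-model predictions, which is exactly how conditioning on a single model (weight 1 on the BIC-best model) understates uncertainty relative to the mixture.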
Cognitive Modelling: The Nature of Connectionism and Notes on Computability
Bremen, Universität
Mathias Hinz, Universität Bremen, November 17, 2014. Slide topics: comparing PDP and nature; properties of PDP; computability; discussion.
Iman, R.L.
1986-01-01
Computer models for various applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model, and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results obtained from simulation studies. Model characteristics are reviewed that have a direct bearing on the model input process and reasons are given for using probabilistic based modeling with the inputs. Discussions are presented on how to model distributions for individual inputs and how to model multivariate input structures when dependence and other constraints may be present. 12 refs.
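The multivariate-input question raised above (dependence among simulation inputs) can be illustrated with the simplest case: inducing a target correlation between two standard normal inputs via the 2-D Cholesky factorization. A sketch with invented names, not a method taken from the report:

```python
import math
import random

def correlated_normals(rho, n=20_000, seed=7):
    """Generate pairs of standard normal inputs with target correlation
    rho via x2 = rho*x1 + sqrt(1 - rho^2)*z, the 2-D Cholesky factor."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        x1 = rng.gauss(0.0, 1.0)
        z = rng.gauss(0.0, 1.0)
        x2 = rho * x1 + math.sqrt(1.0 - rho * rho) * z
        pairs.append((x1, x2))
    return pairs

def sample_corr(pairs):
    """Pearson sample correlation of a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    sxy = sum((a - mx) * (b - my) for a, b in pairs)
    sxx = sum((a - mx) ** 2 for a, _ in pairs)
    syy = sum((b - my) ** 2 for _, b in pairs)
    return sxy / math.sqrt(sxx * syy)

# The sample correlation should land close to the target:
print(abs(sample_corr(correlated_normals(0.8)) - 0.8) < 0.02)  # prints True
```

Ignoring such dependence and sampling each input independently is precisely the kind of input-modeling shortcut the abstract warns can undermine the credibility of simulation results.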
MODEL CONSERVATION STANDARD INTRODUCTION
As directed by the Northwest Power Act, the Council has designed model conservation standards to produce all electricity savings that are cost-effective. The Council believes the measures used to achieve the model conservation standards should provide reliable savings.
Editor's Roundtable: Model behavior
NSDL National Science Digital Library
Inez Liftig
2010-11-01
Models are manageable representations of objects, concepts, and phenomena, and are everywhere in science. Models are "thinking tools" for scientists and have always played a key role in the development of scientific knowledge. Models of the solar system,
Educating with Aircraft Models
ERIC Educational Resources Information Center
Steele, Hobie
1976-01-01
Described is utilization of aircraft models, model aircraft clubs, and model aircraft magazines to promote student interest in aerospace education. The addresses for clubs and magazines are included. (SL)
N. W. Brimicombe
1991-01-01
Hot air balloons can be modelled in a number of different ways. The most satisfactory, but least useful model is at a microscopic level. Macroscopic models are easier to use but can be very misleading.
Nonlinear models Nonlinear Regression
Penny, Will
Nonlinear Models. Will Penny, Bayesian Inference Course, WTCN, UCL, March 2013. Slide topics: nonlinear regression; priors; energies; posterior; Metropolis-Hastings proposal density; references.
NSDL National Science Digital Library
Christine Lotter
2011-02-01
In their research, scientists generate, test, and modify scientific models. These models can be shared with others and demonstrate a scientist's understanding of how the natural world works. Similarly, students can generate and modify models to gain a bet
Maintenance & reliability models
Richard E. Barlow; Carsten Boe; Tor Heimly; Tor-Chr. Mathliesen; Aridaman K. Jain; V. P. Sobczynski; C. J. Pearson
1973-01-01
Three models are presented in which stochastic simulation models are used to construct simulations of systems composed of unreliable components. The reliability of the resultant systems is inferred from the behavior of these simulation models.
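A minimal sketch of the approach the abstract describes, Monte Carlo estimation of the reliability of a system built from unreliable components; the structure functions here are generic illustrations, not the paper's three models:

```python
import random

def simulate_system(p_fail, structure, n=50_000, seed=3):
    """Monte Carlo reliability estimate for a system of unreliable
    components (a generic sketch). p_fail gives each component's failure
    probability; structure(up) maps a tuple of component up/down states
    to True if the system as a whole works."""
    rng = random.Random(seed)
    working = 0
    for _ in range(n):
        up = tuple(rng.random() >= p for p in p_fail)
        if structure(up):
            working += 1
    return working / n

# A 2-component series system with 10% failure probability per component;
# the exact reliability is 0.9 * 0.9 = 0.81, so the estimate should be close:
series = simulate_system([0.1, 0.1], lambda up: all(up))
print(abs(series - 0.81) < 0.01)  # prints True
```

Swapping `all` for `any` models the parallel (redundant) configuration instead; more complex structure functions cover the mixed series-parallel systems typical of maintenance and reliability studies.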
Mathematics and Statistics Models
NSDL National Science Digital Library
Developed by Bob MacKay, Clark College. What are mathematical and statistical models? These types of models are obviously related, but there are also real differences between them. Mathematical models: grow out of ...
NASA Technical Reports Server (NTRS)
Liou, J. C.
2012-01-01
Presentation outline: (1) The NASA Orbital Debris (OD) Engineering Model, a mathematical model capable of predicting OD impact risks for the ISS and other critical space assets; (2) The NASA OD Evolutionary Model, a physical model capable of predicting the future debris environment based on user-specified scenarios; (3) The NASA Standard Satellite Breakup Model, a model describing the outcome of a satellite breakup (explosion or collision).
Modeling of geothermal systems
Bodvarsson, G.S.; Pruess, K.; Lippmann, M.J.
1985-03-01
During the last decade the use of numerical modeling for geothermal resource evaluation has grown significantly, and new modeling approaches have been developed. In this paper we present a summary of the present status in numerical modeling of geothermal systems, emphasizing recent developments. Different modeling approaches are described and their applicability discussed. The various modeling tasks, including natural-state, exploitation, injection, multi-component and subsidence modeling, are illustrated with geothermal field examples. 99 refs., 14 figs.
NASA Technical Reports Server (NTRS)
Cellier, Francois E.
1991-01-01
A comprehensive and systematic introduction is presented for the concepts associated with 'modeling', involving the transition from a physical system down to an abstract description of that system in the form of a set of differential and/or difference equations, and basing its treatment of modeling on the mathematics of dynamical systems. Attention is given to the principles of passive electrical circuit modeling, planar mechanical systems modeling, hierarchical modular modeling of continuous systems, and bond-graph modeling. Also discussed are modeling in equilibrium thermodynamics, population dynamics, and system dynamics, inductive reasoning, artificial neural networks, and automated model synthesis.
NSDL National Science Digital Library
COMET
2008-02-05
The Fire Model Matrix is an on-line resource that presents four fire community models in a matrix that facilitates the exploration of the characteristics of each model. As part of the Advanced Fire Weather Forecasters Course, this matrix is meant to sensitize forecasters to the use of weather data in these fire models to forecast potential fire activity.
Connectionist models of development
Yuko Munakata; James L. McClelland
2003-01-01
How have connectionist models informed the study of development? This paper considers three contributions from specific models. First, connectionist models have proven useful for exploring nonlinear dynamics and emergent properties, and their role in nonlinear developmental trajectories, critical periods and developmental disorders. Second, connectionist models have informed the study of the representations that lead to behavioral dissociations. Third, connectionist
Madanlal Musuvathi; Shaz Qadeer
2008-01-01
Stateless model checking is a useful state-space exploration technique for systematically testing complex real-world software. Existing stateless model checkers are limited to the verification of safety properties on terminating programs. However, realistic concurrent programs are nonterminating, a property that significantly reduces the efficacy of stateless model checking in testing them. Moreover, existing stateless model checkers are unable
Generalized additive mixed models
Colin Chen
2000-01-01
Following the extension from linear mixed models to additive mixed models, the extension from generalized linear mixed models to generalized additive mixed models is made. Algorithms are developed to compute the MLEs of the nonlinear effects and the covariance structures based on the penalized marginal likelihood. Convergence of the algorithms and selection of the smoothing parameters are discussed.
Building Credible Input Models
Lawrence M. Leemis
2004-01-01
Most discrete-event simulation models have stochastic elements that mimic the probabilistic nature of the system under consideration. A close match between the input model and the true underlying probabilistic mechanism associated with the system is required for successful input modeling. The general question considered here is how to model an element (e.g., arrival process, service times) in a discrete-
Generalized Weibull Linear Models
Andrea A. Prudente; Gauss M. Cordeiro
2010-01-01
For the first time, a new class of generalized Weibull linear models is introduced to be competitive to the well-known generalized (gamma and inverse Gaussian) linear models which are adequate for the analysis of positive continuous data. The proposed models have a constant coefficient of variation for all observations similar to the gamma models and may be suitable for a
Generative Models of Disfluency
ERIC Educational Resources Information Center
Miller, Timothy A.
2010-01-01
This thesis describes a generative model for representing disfluent phenomena in human speech. This model makes use of observed syntactic structure present in disfluent speech, and uses a right-corner transform on syntax trees to model this structure in a very natural way. Specifically, the phenomenon of speech repair is modeled by explicitly…
PREDICTIVE MODELS. Enhanced Oil Recovery Model
Ray, R.M. [DOE Bartlesville Energy Technology Center, Bartlesville, OK (United States)
1992-02-26
PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: (1) chemical flooding; (2) carbon dioxide miscible flooding; (3) in-situ combustion; (4) polymer flooding; and (5) steamflood. CFPM, the Chemical Flood Predictive Model, models micellar (surfactant)-polymer floods in reservoirs that have been previously waterflooded to residual oil saturation; thus, only true tertiary floods are considered. An option allows a rough estimate of oil recovery by caustic or caustic-polymer processes. CO2PM, the Carbon Dioxide miscible flooding Predictive Model, is applicable to both secondary (mobile oil) and tertiary (residual oil) floods, and to either continuous CO2 injection or water-alternating-gas processes. ICPM, the In-situ Combustion Predictive Model, computes the recovery and profitability of an in-situ combustion project from generalized performance predictive algorithms. PFPM, the Polymer Flood Predictive Model, is switch-selectable for either polymer or waterflooding, and an option allows the calculation of the incremental oil recovery and economics of polymer relative to waterflooding. SFPM, the Steamflood Predictive Model, is applicable to the steam drive process, but not to cyclic steam injection (steam soak) processes. The IBM PC/AT version includes a plotting capability to produce a graphic picture of the predictive model results.
NSDL National Science Digital Library
2014-09-14
Ocean waves near shore impact public safety, commerce, navigation, and, of course, recreation. Predicting these waves has driven efforts to model them for more than two decades. This module introduces forecasters to different nearshore wave models, including phase-resolving and 1- and 2-dimensional spectral models. It describes the processes that wave models simulate, the assumptions they make, the initial and boundary conditions required to run the models, and potential sources of error in model forecasts. While focusing on SWAN, the module also examines the Navy Standard Surf Model and Bouss-2D.
Jeremy Marozeau
Any chapter dedicated to reviewing models should first try to define what a model is. The question is more difficult than it first appears. Whereas most scientists agree that a model should represent a real-world phenomenon, many disagree on the level of complexity that a model should have. In his chapter on pitch models, De Cheveigné (2004) cites Norbert Wiener:
Stable Models of superacceleration
Manoj Kaplinghat; Arvind Rajaraman
2007-05-16
We discuss an instability in a large class of models where dark energy is coupled to matter. In these models the mass of the scalar field is much larger than the expansion rate of the Universe. We find models in which this instability is absent, and show that these models generically predict an apparent equation of state for dark energy smaller than -1, i.e., superacceleration. These models have no acausal behavior or ghosts.
Modernizing Our Cognitive Model
David J. Bryant
Although still popular, the Observe-Orient-Decide-Act (OODA) Loop is outdated as a model of human cognition. Based on advances in the cognitive sciences since the 1950s, the Critique-Explore-Compare-Adapt (CECA) Loop is proposed as a better descriptive model. The model rests on two mental representations: the conceptual model established through operational planning, and the situation model, which represents the state of the
14. QUARK MODEL
Krusche, Bernd
14. QUARK MODEL. Revised December 2005 by C. Amsler (University of Zürich), T. DeGrand (University of Colorado, Boulder) and B. Krusche (University of Basel). 14.1. Quantum numbers of the quarks: Quarks are strongly interacting fermions with spin 1/2 and, by convention, positive parity.
14. QUARK MODEL
14. QUARK MODEL. Revised August 2011 by C. Amsler (University of Zürich), T. DeGrand (University of Colorado, Boulder), and B. Krusche (University of Basel). 14.1. Quantum numbers of the quarks. ... its constituents are a set of fermions, the quarks, and gauge bosons, the gluons. Strongly interacting
14. Quark Model
Revised September 2009 by C. Amsler (University of Zürich) et al. 14.1. Quantum numbers of the quarks: Quarks are strongly interacting fermions with spin 1/2 and, by convention, positive parity. Antiquarks have negative parity. Quarks have the additive baryon number 1/3, antiquarks -1/3
PREDICTIVE MODELS. Enhanced Oil Recovery Model
1992-01-01
PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: (1) chemical flooding, where soap-like surfactants are injected into ...; (2) carbon dioxide miscible flooding; (3) ...
WASP TRANSPORT MODELING AND WASP ECOLOGICAL MODELING
A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interfa...
Efficient Model Determination for Discrete Graphical Models
Paolo Giudici; Peter Green; Claudia Tarantola
2000-01-01
We present a novel methodology for Bayesian model determination in discrete decomposable graphical models. We assign, for each given graph, a hyper Dirichlet distribution on the matrix of cell probabilities. To ensure compatibility across models, such prior distributions are obtained by marginalisation from the prior conditional on the complete graph. This leads to a prior distribution automatically satisfying the hyperconsistency criterion. Our contribution is twofold.
Geochemistry Model Validation Report: External Accumulation Model
K. Zarrabi
2001-09-27
The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached WP. Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high-pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate.
In the final section of the model, the outputs from PHREEQC are processed to produce the mass of accumulation, the density of accumulation, and the geometry of the accumulation zone. The density of accumulation and the geometry of the accumulation zone are calculated using a characterization of the fracture system based on field measurements made in the proposed repository (BSC 2001k). The model predicts that accumulation would spread out in a conical accumulation volume. The accumulation volume is represented with layers as shown in Figure 1. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance.
jModelTest: phylogenetic model averaging.
Posada, David
2008-07-01
jModelTest is a new program for the statistical selection of models of nucleotide substitution based on "Phyml" (Guindon and Gascuel 2003. A simple, fast, and accurate algorithm to estimate large phylogenies by maximum likelihood. Syst Biol. 52:696-704.). It implements five different selection strategies, including "hierarchical and dynamical likelihood ratio tests," the "Akaike information criterion," the "Bayesian information criterion," and a "decision-theoretic performance-based" approach. This program also calculates the relative importance and model-averaged estimates of substitution parameters, including a model-averaged estimate of the phylogeny. jModelTest is written in Java and runs under Mac OS X, Windows, and Unix systems with a Java Runtime Environment installed. The program, including documentation, can be freely downloaded from the software section at http://darwin.uvigo.es. PMID:18397919
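The information criteria this abstract names can be sketched with a toy calculation; the candidate models, log-likelihoods, and parameter counts below are illustrative numbers for three common substitution models, not jModelTest output:

```python
import math

# Hypothetical maximized log-likelihoods and free-parameter counts
models = {
    "JC69": {"loglik": -3450.2, "k": 1},
    "HKY85": {"loglik": -3390.7, "k": 5},
    "GTR": {"loglik": -3388.1, "k": 9},
}
n_sites = 1200  # alignment length, used only by BIC

def aic(loglik, k):
    # Akaike information criterion: -2 lnL + 2k
    return -2.0 * loglik + 2.0 * k

def bic(loglik, k, n):
    # Bayesian information criterion: -2 lnL + k ln(n)
    return -2.0 * loglik + k * math.log(n)

aics = {name: aic(m["loglik"], m["k"]) for name, m in models.items()}
bics = {name: bic(m["loglik"], m["k"], n_sites) for name, m in models.items()}
best = min(aics, key=aics.get)

# Akaike weights: relative support for each candidate, used for model averaging
min_aic = aics[best]
raw = {name: math.exp(-0.5 * (a - min_aic)) for name, a in aics.items()}
total = sum(raw.values())
weights = {name: r / total for name, r in raw.items()}
```

With these numbers GTR has the highest likelihood but HKY85 wins on AIC because of its smaller parameter penalty; the weights quantify how close the call is.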
Trapped Radiation Model Uncertainties: Model-Data and Model-Model Comparisons
NASA Technical Reports Server (NTRS)
Armstrong, T. W.; Colborn, B. L.
2000-01-01
The standard AP8 and AE8 models for predicting trapped proton and electron environments have been compared with several sets of flight data to evaluate model uncertainties. Model comparisons are made with flux and dose measurements made on various U.S. low-Earth orbit satellites (APEX, CRRES, DMSP, LDEF, NOAA) and Space Shuttle flights, on Russian satellites (Photon-8, Cosmos-1887, Cosmos-2044), and on the Russian Mir space station. This report gives the details of the model-data comparisons -- summary results in terms of empirical model uncertainty factors that can be applied for spacecraft design applications are given in a companion report. The results of model-model comparisons are also presented from standard AP8 and AE8 model predictions compared with the European Space Agency versions of AP8 and AE8 and with Russian trapped radiation models.
Model Validation Status Review
E.L. Hardin
2001-11-28
The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. 
The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and engineered barriers, plus the TSPA model itself. Description of the model areas is provided in Section 3, and the documents reviewed are described in Section 4. The responsible manager for the Model Validation Status Review was the Chief Science Officer (CSO) for Bechtel-SAIC Co. (BSC). The team lead was assigned by the CSO. A total of 32 technical specialists were engaged to evaluate model validation status in the 21 model areas. The technical specialists were generally independent of the work reviewed, meeting technical qualifications as discussed in Section 5.
Inflation models and observation
Laila Alabidi; David Lyth
2005-12-01
We consider small-field models which invoke the usual framework for the effective field theory, and large-field models which go beyond that. Present and future possibilities for discriminating between the models are assessed, on the assumption that the primordial curvature perturbation is generated during inflation. With PLANCK data, the theoretical and observational uncertainties on the spectral index will be comparable, providing useful discrimination between small-field models. Further discrimination between models may come later through the tensor fraction, the running of the spectral index and non-gaussianity. The prediction for the trispectrum in a generic multi-field inflation model is given for the first time.
NSDL National Science Digital Library
COMET
2006-05-16
The Marine Wave Model Matrix provides information on the formulation of wave models developed by the National Centers for Environmental Prediction (NCEP) and other modeling centers, including how these models forecast the generation, propagation, and dissipation of ocean waves using NWP model forecasts for winds and near-surface temperature and stability. Additionally, information is provided on data assimilation, post-processing of data, and verification of wave models currently in operation. Within the post-processing pages are links to forecast output both in graphical and raw form, including links for data downloads. Links to COMET training on wave processes are also provided.
Thatcher, R.M.
1984-05-01
The Surface-to-Air Missile (SAM) Electromagnetic Pulse (SEMP) model simulates the illumination of an entire SAM brigade by an EMP weapon. It computes probability distributions of SAM brigade performance levels after an EMP attack has occurred. Brigade performance is determined by the combination of components that survive the EMP. Accordingly, the SEMP model is separated into the component failure model and the condition model. The component failure model computes the failure probability of each component in the brigade from data supplied by two input data files. The condition model converts component failure probabilities into brigade performance in the form of missile availability probability tables.
NASA Astrophysics Data System (ADS)
Cendrowski, S. K.
1996-12-01
The paper describes a mathematical model of the transformation of a light stimulus by the neural layers of the retina. The model includes descriptions of the photoreceptors and of the horizontal and bipolar cells belonging to the outer plexiform layer of the retina, and is based on the physical mechanisms of neuronal excitation. The model predicts a set of well-known visual phenomena, for example the Broca-Sulzer temporal and spatial effects and the Mach bands effect. A quantitative validation of the model has been carried out. The model may serve as a basis for designing the sensory layer of networks.
NASA Astrophysics Data System (ADS)
Geller, Michael; Telem, Ofri
2015-05-01
We present the first realization of a "twin Higgs" model as a holographic composite Higgs model. Uniquely among composite Higgs models, the Higgs potential is protected by a new standard model (SM) singlet elementary "mirror" sector at the sigma model scale f and not by the composite states at mKK, naturally allowing for mKK beyond the LHC reach. As a result, naturalness in our model cannot be constrained by the LHC, but may be probed by precision Higgs measurements at future lepton colliders, and by direct searches for Kaluza-Klein excitations at a 100 TeV collider.
Reiter, E.R.
1980-01-01
A highly sophisticated and accurate approach is described to compute on an hourly or daily basis the energy consumption for space heating by individual buildings, urban sectors, and whole cities. The need for models, specifically weather-sensitive models, composite models, and space-heating models, is discussed. Development of the Colorado State University Model, based on heat-transfer equations and on a heuristic, adaptive, self-organizing computational learning approach, is described. Results of modeling energy consumption by the cities of Minneapolis and Cheyenne are given. Some data on energy consumption in individual buildings are included.
NSDL National Science Digital Library
National Centers for Environmental Prediction, National Oceanic and Atmospheric Administration
The Marine Modeling and Analysis Branch (MMAB) of the Environmental Modeling Center is responsible for the development of improved numerical weather and marine prediction modeling systems. These models provide analysis and real-time forecast guidance on marine meteorological, oceanographic, and cryospheric parameters over the global oceans and coastal areas of the US. This site provides access to MMAB modeling tools for ocean waves (including an interactive presentation), sea ice, marine meteorology, sea surface temperature and more. The site also features a mailing list, bibliography of publications, and information about modeling products still in the experimental and development phases.
D.W. Wu; A.J. Smith
2004-11-08
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), TSPA-LA. The ERMYN provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs) (Section 6.2), the reference biosphere (Section 6.1.1), the human receptor (Section 6.1.2), and approximations (Sections 6.3.1.4 and 6.3.2.4); (3) Building a mathematical model using the biosphere conceptual model (Section 6.3) and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); (8) Validating the ERMYN by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
Discriminative Language and Acoustic Modeling for Large Vocabulary Continuous Speech Recognition
Mohri, Mehryar
Lecture slides covering discriminative language modeling (DLM) and discriminative training of acoustic models for large vocabulary continuous speech recognition (LVCSR).
Modeling error in Approximate Deconvolution Models
Adrian Dunca; Roger Lewandowski
2012-10-09
We investigate the asymptotic behaviour of the modeling error in the approximate deconvolution model in the 3D periodic case, when the order $N$ of deconvolution goes to $\infty$. We consider successively the generalised Helmholtz filters of order $p$ and the Gaussian filter. For Helmholtz filters, we estimate the rate of convergence to zero thanks to energy budgets, Gronwall's lemma and sharp inequalities for the Fourier coefficients of the residual stress. We next show why the same analysis does not allow one to conclude that the modeling error converges to zero in the case of the Gaussian filter, leaving open issues.
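For reference, the objects named in this abstract have a standard form in the approximate deconvolution literature (van Cittert construction); the notation below is the generic one, not taken from this paper's equations:

```latex
\bar u = G_p\, u, \qquad
G_p = \bigl(I + \alpha^{2p}(-\Delta)^{p}\bigr)^{-1}
\quad \text{(generalised Helmholtz filter of order } p\text{)},

D_N = \sum_{n=0}^{N} (I - G_p)^n, \qquad
u \approx D_N \bar u ,
```

so the modeling error studied here measures how far the deconvolved field $D_N \bar u$ is from $u$ as $N \to \infty$.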
Dingle, Brent Michael
2007-09-17
This dissertation presents a robust method of modeling objects and forces for computer animation. Within this method objects and forces are represented as particles. As in most modeling systems, the movement of objects is driven by physically based...
Simpson, A Hamish; Murray, Iain R
2015-02-01
Animal models are widely used to investigate the pathogenesis of osteoporosis and for the clinical testing of anti-resorptive drugs. However, osteoporotic fracture models designed to investigate novel ways to treat fractures of osteoporotic bone must fulfil requirements distinct from those of pharmacological testing. Bone strength and toughness, implant fixation and osteointegration, and fracture repair are of particular interest. Osteoporotic models should reflect the underlying clinical scenario, be it primary type 1 (post-menopausal) osteoporosis, primary type 2 (senile) osteoporosis or secondary osteoporosis. In each scenario, small and large animal models have been developed. While rodent models facilitate the study of fractures in strains specifically established to aid understanding of the pathologic basis of disease, concerns remain about the relevance of small animal fracture models to the human situation. There is currently no all-encompassing model, and the choice of species and model must be individualized to the scientific question being addressed. PMID:25388154
NSDL National Science Digital Library
David Joiner
Monte Carlo modeling refers to the solution of mathematical problems with the use of random numbers. This can include both function integration and the modeling of stochastic phenomena using random processes.
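The function-integration half of this definition fits in a few lines; a minimal sketch estimating a definite integral from uniform random samples:

```python
import random

random.seed(0)

def mc_integrate(f, a, b, n=100_000):
    """Estimate the integral of f over [a, b] as (b - a) times the
    mean of f at n uniformly sampled points."""
    total = sum(f(random.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

# Integrate x^2 over [0, 1]; the exact value is 1/3.
estimate = mc_integrate(lambda x: x * x, 0.0, 1.0)
```

The error of such an estimate shrinks like 1/sqrt(n), independent of dimension, which is why the same idea extends to modeling stochastic phenomena with random processes.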
Chou, Danielle, 1981-
2004-01-01
The drive behind improved friction models has been better prediction and control of dynamic systems. The earliest model was of classical Coulomb friction; however, the discontinuity during force reversal of the Coulomb ...
Endoh, Shinsuke
1982-01-01
Introduction: The threat of midair collisions is one of the most serious problems facing the air traffic control system and has been studied by many researchers. The gas model is one of the models which describe the expected ...
NSDL National Science Digital Library
American Museum of Natural History
2012-06-26
In this activity, learners make a 3-D model of DNA using paper and toothpicks. While constructing this model, learners will explore the composition and structure of DNA. The activity also gives suggestions for alternate materials and challenges to explore.
Models of scientific explanation
Sutton, Peter Andrew
2005-08-29
Ever since Hempel and Oppenheim's development of the Deductive Nomological model of scientific explanation in 1948, a great deal of philosophical energy has been dedicated to constructing a viable model of explanation that concurs both with our...
NASA Technical Reports Server (NTRS)
1998-01-01
Model support system and instrumentation cabling of the 1% scale X-33 reaction control system model. Installed in the Unitary Plan Wind Tunnel for supersonic testing. In building 1251, test section #2.
Bounding Species Distribution Models
NASA Technical Reports Server (NTRS)
Stohlgren, Thomas J.; Jarnevich, Catherine S.; Morisette, Jeffrey T.; Esaias, Wayne E.
2011-01-01
Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].
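The "clamping" alteration described here, bounding extrapolations to the training minimum and maximum of each predictor, can be sketched as follows; the predictor values below are invented for illustration, not the honey-bee data:

```python
import numpy as np

# Hypothetical training sites: columns are environmental predictors,
# e.g. temperature (deg C) and annual precipitation (mm).
rng = np.random.default_rng(42)
train = rng.uniform([0.0, 100.0], [40.0, 2000.0], size=(200, 2))

# Environmental bounds of the data used in model development
lo, hi = train.min(axis=0), train.max(axis=0)

def clamp(env):
    """Bound predictor values to the training min/max before the fitted
    model (CART, Maxent, ...) is projected onto them."""
    return np.clip(env, lo, hi)

# A projection site hotter and drier than anything seen during training
site = np.array([55.0, 50.0])
bounded = clamp(site)
```

Projecting the model at `bounded` rather than `site` keeps the prediction inside the environmental envelope of the training data, which is the conservative behavior the authors recommend.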
We developed a simplified spreadsheet modeling approach for characterizing and prioritizing sources of sediment loadings from watersheds in the United States. A simplified modeling approach was developed to evaluate sediment loadings from watersheds and selected land segments. ...
Exposure Analysis Modeling System
The Exposure Analysis Modeling System (EXAMS) is an interactive software application for formulating aquatic ecosystem models and evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals including pesticides, industrial materials, and leachates f...
NSDL National Science Digital Library
2014-09-14
The goal of this training module is to help you increase your understanding of how mesoscale models work. Such understanding, in turn, can help you more efficiently and accurately evaluate model-generated forecast products.
METEOROLOGICAL AND TRANSPORT MODELING
Advanced air quality simulation models, such as CMAQ, as well as other transport and dispersion models, require accurate and detailed meteorology fields. These meteorology fields include primary 3-dimensional dynamical and thermodynamical variables (e.g., winds, temperature, mo...
ERIC Educational Resources Information Center
Brinner, Bonnie
1992-01-01
Presents an activity in which models help students visualize both the DNA process and transcription. After constructing DNA, messenger RNA, and transfer RNA molecules, students model cells, protein synthesis, codons, and RNA movement. (MDH)
NSDL National Science Digital Library
Visualization of output from mathematical or statistical models is one of the best ways to introduce introductory geoscience students to the results and behavior of sophisticated models. Examples of good sites ...
NASA Astrophysics Data System (ADS)
Grøn, Øyvind; Darian, Diako
We give a review of viscous relativistic universe models that have been presented during the period from 1990 until the present time. In particular we discuss the properties of isotropic and homogeneous universe models, and of anisotropic and homogeneous Bianchi type I models. We consider these types of models both in the context of the non-causal Eckart theory and the causal Israel-Stewart theory.
ACES terminal model enhancement
George J. Couluris; Paul C. Davis; Nathan C. Mittler; Aditya P. Saraf; Sebastian D. Timar
2009-01-01
Terminal model enhancement (TME) is an advanced modeling capability for simulating terminal area airport and airspace traffic operations. More importantly, TME is a platform for testing advanced air traffic management concepts using plug-and-play modeling. TME augments the existing airport surface and terminal airspace modeling capabilities of NASA's Airspace Concept Evaluation System (ACES), an agent-based fast-time National Airspace System simulation. TME supports
NASA Technical Reports Server (NTRS)
Figueroa-Feliciano, Enectali
2004-01-01
We have developed a software suite that models complex calorimeters in the time and frequency domain. These models can reproduce all measurements that we currently make in a lab setting, such as IV curves, impedance measurements, noise measurements, and pulse generation. Since all these measurements are modeled from one set of parameters, we can fully describe a detector and characterize its behavior. This leads to a model that can be used effectively for engineering and design of detectors for particular applications.
Tashiro, Tohru
2013-01-01
We propose a new model of the diffusion of a product which includes a memory of how many adopters or advertisements a non-adopter has met, where (non-)adopters are people who do (not) possess the product. This effect is lacking in the Bass model. As an application, we use the model to fit iPod sales data, obtaining better agreement than the Bass model.
Modelling of cells bioenergetics.
Kasperski, Andrzej
2008-09-01
This paper presents an integrated model describing the control of the bioenergetics of Saccharomyces cerevisiae yeast cells. The model describes oxidative and respiro-fermentative metabolism. It assumes that the mitochondria of the Saccharomyces cerevisiae cells are charged with NADH during the tricarboxylic acid cycle, and that NADH is later discharged from the mitochondria in the electron transport system. Selected effects observed in Saccharomyces cerevisiae eucaryotic cells, including the Pasteur and Crabtree effects, are also modeled. PMID:18379882
NSDL National Science Digital Library
2014-09-14
Oceans cover over 70% of the surface of the earth, yet many details of their workings are not fully understood. To better understand and forecast the state of the ocean, we rely on numerical ocean models. Ocean models combine observations and physics to predict the ocean temperature, salinity, and currents at any time and any place across the ocean basins. This module will discuss what goes into numerical ocean models, including model physics, coordinate systems, parameterization, initialization, and boundary conditions.
Trevor Hastie; Robert Tibshirani
1986-01-01
Likelihood-based regression models such as the normal linear regression model and the linear logistic model assume a linear (or some other parametric) form for the covariates $X_1, X_2, \cdots, X_p$. We introduce the class of generalized additive models which replaces the linear form $\sum \beta_j X_j$ by a sum of smooth functions $\sum s_j(X_j)$. The $s_j(\cdot)$'s are unspecified functions that are
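An additive model of this form is classically fitted by backfitting: cycle over the predictors, smoothing the partial residuals against each one. A minimal sketch, with a crude running-mean smoother standing in for the unspecified $s_j(\cdot)$ (this is the general idea, not the authors' algorithm in detail):

```python
import numpy as np

def smooth(x, y, frac=0.1):
    """Running-mean smoother: the fit at x_i is the mean of y over the
    nearest (in sorted order) neighbourhood of x_i."""
    n = len(x)
    k = max(1, int(frac * n))
    order = np.argsort(x)
    ys = y[order]
    fitted_sorted = np.array([ys[max(0, i - k):i + k + 1].mean() for i in range(n)])
    fitted = np.empty(n)
    fitted[order] = fitted_sorted  # map back to the original ordering
    return fitted

def backfit(X, y, iters=20):
    """Backfitting: repeatedly smooth each predictor against the partial
    residuals that exclude its own current contribution."""
    n, p = X.shape
    alpha = y.mean()
    s = np.zeros((p, n))
    for _ in range(iters):
        for j in range(p):
            partial = y - alpha - s.sum(axis=0) + s[j]
            s[j] = smooth(X[:, j], partial)
            s[j] -= s[j].mean()  # centre each s_j for identifiability
    return alpha, s

# Demo on synthetic additive data: y = sin(x1) + x2^2 + noise
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(300, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(0.0, 0.1, 300)
alpha, s = backfit(X, y)
fitted = alpha + s.sum(axis=0)
```

Each recovered $s_j$ is an estimate of the corresponding smooth component up to an additive constant, which is why the fits are re-centred on every pass.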
Aerosol Modeling for the Global Model Initiative
NASA Technical Reports Server (NTRS)
Weisenstein, Debra K.; Ko, Malcolm K. W.
2001-01-01
The goal of this project is to develop an aerosol module to be used within the framework of the Global Modeling Initiative (GMI). The model development work will be performed jointly by the University of Michigan and AER, using existing aerosol models at the two institutions as starting points. The GMI aerosol model will be tested, evaluated against observations, and then applied to assessment of the effects of aircraft sulfur emissions as needed by the NASA Subsonic Assessment in 2001. The work includes the following tasks: 1. Implementation of the sulfur cycle within GMI, including sources, sinks, and aqueous conversion of sulfur. Aerosol modules will be added as they are developed and the GMI schedule permits. 2. Addition of aerosol types other than sulfate particles, including dust, soot, organic carbon, and black carbon. 3. Development of new and more efficient parameterizations for treating sulfate aerosol nucleation, condensation, and coagulation among different particle sizes and types.
Salience and Contour Models
Peters, Rob
Slide fragments comparing human observers with a salience model and a contour model, by Robert J. Peters, T. Nathan Mundhenk, Laurent Itti, and Christof Koch. The salience architecture (winner-take-all with inhibition of return determining the attended location) is adapted from Itti & Koch (2001), Nat. Rev. Neurosci.
Nonlinear Modeling by Assembling Piecewise Linear Models
NASA Technical Reports Server (NTRS)
Yao, Weigang; Liou, Meng-Sing
2013-01-01
To preserve nonlinearity of a full order system over a parameters range of interest, we propose a simple modeling approach by assembling a set of piecewise local solutions, including the first-order Taylor series terms expanded about some sampling states. The work by Rewienski and White inspired our use of piecewise linear local solutions. The assembly of these local approximations is accomplished by assigning nonlinear weights, through radial basis functions in this study. The efficacy of the proposed procedure is validated for a two-dimensional airfoil moving at different Mach numbers and pitching motions, under which the flow exhibits prominent nonlinear behaviors. All results confirm that our nonlinear model is accurate and stable for predicting not only aerodynamic forces but also detailed flowfields. Moreover, the model remains accurate for inputs considerably different from the base trajectory in form and magnitude. This modeling preserves nonlinearity of the problems considered in a rather simple and accurate manner.
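The assembly described here, local first-order Taylor models blended by normalized radial-basis-function weights, can be sketched in one dimension; sin(x) stands in for the full-order system and all numbers are illustrative:

```python
import numpy as np

# Sampling states of a hypothetical full-order response f(x) = sin(x),
# with a first-order Taylor (piecewise linear) local model at each state.
centers = np.linspace(0.0, 2.0 * np.pi, 9)
values = np.sin(centers)                 # f at each sampling state
slopes = np.cos(centers)                 # df/dx at each sampling state
width = 0.5 * (centers[1] - centers[0])  # RBF width ~ half the center spacing

def blended(x):
    """Assemble the local linear models with normalized Gaussian RBF weights."""
    w = np.exp(-((x - centers) ** 2) / (2.0 * width ** 2))
    w /= w.sum()                              # weights form a partition of unity
    local = values + slopes * (x - centers)   # each local first-order model at x
    return float(np.dot(w, local))
```

Near each sampling state the nearest local model dominates, and between states the Gaussian weights interpolate smoothly, which is how the assembly preserves the nonlinearity that any single linear model would lose.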
Jeremy Manson; William Pugh; Sarita V. Adve
2005-01-01
This paper describes the new Java memory model, which has been revised as part of Java 5.0. The model specifies the legal behaviors for a multithreaded program; it defines the semantics of multithreaded Java programs and partially determines legal implementations of Java virtual machines and compilers. The new Java model provides a simple interface for correctly synchronized programs -- it guarantees
K. Bardakci; M. B. Halpern
1971-01-01
On the basis of new representations of the projective group, we construct some new dual quark models whose spin and internal symmetry are not multiplicative. One model is a factorized theory of exotic states with broken exchange degeneracy, ninth mesons being optional. The exotic states are suppressed three units below the Pomeranchon. In another model, with spin-orbit coupling and curved
NASA Astrophysics Data System (ADS)
Pasquier, V.
1988-09-01
We show that a class of 2D statistical mechanics models known as IRF models can be viewed as a subalgebra of the operator algebra of vertex models. Extending the Wigner calculus to quantum groups, we obtain an explicit intertwiner between two representations of this subalgebra.
Harrison, A.K.
1997-03-14
We have identified the Cranfill multifluid turbulence model (Cranfill, 1992) as a starting point for development of subgrid models of instability, turbulent and mixing processes. We have differenced the closed system of equations in conservation form, and coded them in the object-oriented hydrodynamics code FLAG, which is to be used as a testbed for such models.
Marker, David
Model Theory of Separably Closed Fields, Margit Messmer. The model theory of fields is a fascinating area; the aim is to give an introduction to it, concentrating on connections to stability theory. The paper "Model theory of differential fields" is based on a course given at the University of Illinois.
PARAMETRIC MODEL SELECTION TECHNIQUES
GARY L. BECK
Many parametric statistical models are available for modelling lifetime data. Given a data set of lifetimes, which may or may not be censored, which parametric model should be used to conduct statistical tests? In only a few cases can analytical expressions be found to answer this question in some optimal fashion. Various measures of discrepancy and other functionals of the
NSDL National Science Digital Library
2012-08-03
This is an activity about scale model building. Learners will use mathematics to determine the scale model size, construct a pattern, and build a paper scale model of the IMAGE (Imager for Magnetopause-to-Aurora Global Exploration) satellite, the first satellite mission to image the Earth's magnetosphere. This is the second activity in the Solar Storms and You: Exploring Satellite Design educator guide.
D A Kirzhnits; Yurii E Lozovik; Galina V Shpatakovskaya
1975-01-01
This review is devoted to the development of the statistical model of matter over the last twenty years. The ranges of applicability of the model for electron-nuclear systems (atoms, solids, plasmas) are considered. Effects lying beyond the scope of the statistical model (exchange, correlation, quantum and shell effects) are analyzed. The relative roles of the effects enumerated are estimated in different
Piet A. Slats; Bis Bhola; Joseph J. M. Evers; Gert Dijkhuizen
1995-01-01
Logistic chain modelling is very important in improving the overall performance of the total logistic chain. Logistic models provide support for a large range of applications, such as analysing bottlenecks, improving customer service, configuring new logistic chains and adapting existing chains to new products and markets. Modelling the logistic chain is the main topic dealt with in this article. Recent
Models for Repeated Measurements
J. K. Lindsey
1993-01-01
This second edition of Models for Repeated Measurements has been comprehensively revised and updated, taking into account the huge amount of research that has been carried out in the subject in recent years. A wide variety of useful new models is now available, models that have revolutionized the analysis of such data. The second edition contains three new chapters on
ERIC Educational Resources Information Center
Meara, Paul
2004-01-01
This paper describes some simple simulation models of vocabulary attrition. The attrition process is modelled using a random autonomous Boolean network model, and some parallels with real attrition data are drawn. The paper argues that applying a complex systems approach to attrition can provide some important insights, which suggest that real…
M. Hagee; L. Gordon
2003-01-01
For detection, tracking, and identification of time sensitive targets (TSTs), i.e., to find, fix, track, target, and kill an enemy vehicle within minutes, Tactical Sensor Models (TSMs) are utilized in the targeting phase of this process. A Tactical Sensor Model is a math model that takes known errors in a sensor and corrects for them to get a better idea
Darwyn R. Peachey
1986-01-01
Although modeling natural phenomena is recognized as one of the greatest challenges of computer graphics, relatively little time has been spent on modeling ocean waves. The model presented in this paper is suitable for the rendering and animation of waves approaching and breaking on a sloping beach. Waveforms consist of a phase function which correctly produces wave refraction and other
Modelling airport congestion charges
Milan Janic
2005-01-01
This article deals with modelling congestion charges at an airport. In this context, congestion charging represents internalizing the cost of marginal delays that a flight imposes on other flights due to congestion. The modelling includes estimating congestion and flight delays, the cost of these delays and the efficiency of particular flights following the introduction of a congestion charge. The models
PRZM3 is a modeling system that links two subordinate models - PRZM and VADOFT to predict pesticide transport and transformation down through the crop root and unsaturated zone. PRZM3 includes modeling capabilities for such phenomena as soil temperature simulation, vo...
PTOLEMY II: Heterogeneous Concurrent Modeling and Design in Java. Edited by Christopher Hylands, University of California at Berkeley. http://ptolemy.eecs.berkeley.edu Document Version 2.0.1, for use with Ptolemy II 2.0. Contents, Part 1 (Using Ptolemy II): 1. Introduction; 1.1 Modeling and Design.
Electronic materials process modeling
Dimitrios Maroudasa; Sadasivan Shankar
1996-01-01
This report focuses on current needs in the process modeling of materials used in electronic and optoelectronic device fabrication and provides specific recommendations in addressing these needs. The establishment of relationships between materials structure and processing is identified as the critical modeling need in the electronics industry. A hierarchical modeling approach is suggested aiming at the development of efficient
This presentation provides a general overview of SHEDS model features, describes algorithms in the SHEDS-Air Toxics model that focus on mobile source exposures and multipathway exposures, and presents examples of results from application of the SHEDS-Air Toxics model to benzene i...
ERIC Educational Resources Information Center
Goodman, Richard E.
1970-01-01
Describes types of molecular models (ball-and-stick, framework, and space-filling) and evaluates commercially available kits. Gives instructions for constructing models from polystyrene balls and pipe-cleaners. Models are useful for class demonstrations although not sufficiently accurate for research use. Illustrations show biologically important…
Solar information process model
R. Hewett; P. Spewak
1978-01-01
The MITRE Solar Information Process Model (SIP) is a computerized model that simulates information processes in solar markets. As such, it represents a useful tool in the formulation of solar information outreach programs. For each market investigated, SIP model outputs include prioritized listings of the information needs of key decision makers and other strategically important market participants, and related information
B. Drossel; A. J. McKane
2002-01-01
We review theoretical approaches to the understanding of food webs. After an overview of the available food web data, we discuss three different classes of models. The first class comprises static models, which assign links between species according to some simple rule. The second class are dynamical models, which include the population dynamics of several interacting species. We focus on
Danil Sokolov; Ivan Poliakov; Alexandre Yakovlev
2007-01-01
A token-based model for asynchronous data path is formally defined and three token game semantics, spread token, antitoken and counterflow, are introduced. These semantics are studied and their advantages and drawbacks are highlighted. For analysis and comparison a software tool is developed which integrates these models into a consistent framework. The models are verified by mapping them into Petri nets
Penny, Will
Lecture slides (3rd March) covering: linear models, fMRI analysis, gradient ascent, online learning, the delta rule, Newton's method, Bayesian linear models, MAP learning, MEG source reconstruction, empirical Bayes, maximum likelihood, the augmented form, ReML, and the objective function.
ERIC Educational Resources Information Center
Speiser, Bob; Walter, Chuck
2011-01-01
This paper explores how models can support productive thinking. For us a model is a "thing", a tool to help make sense of something. We restrict attention to specific models for whole-number multiplication, hence the wording of the title. They support evolving thinking in large measure through the ways their users redesign them. They assume new…
Modeling for Tsunami Forecast. Vasily Titov, NOAA Center for Tsunami Research, Pacific Marine Environmental Laboratory, Seattle, WA. Outline: tsunami modeling development; toward real-time tsunami forecast; challenges; modeling development in 1990-2000; short-term inundation forecast for tsunamis; forecast system.
John T. Baldwin
We give a model theoretic proof, replacing admissible set theory by the Lopez-Escobar theorem, of Makkai's theorem: every counterexample to Vaught's conjecture has an uncountable model which realizes only countably many L_{ω1,ω}-types. The following result is new. Theorem: if a first order theory is a counterexample to the Vaught conjecture, then it has 2^{ℵ1} models of cardinality ℵ1.
NASA Astrophysics Data System (ADS)
Schwartz, Sidney H.
1987-09-01
Information on a tank slosh model for Peacekeeper missiles is given in viewgraph form. Allowable vehicle errors for nose cone ejection clearance, vehicle maneuver sloshing problems, slosh/moment prediction, slosh surface specification, code validation, experimental model-computational model comparison and the propellant storage assembly are covered.
Reasoning and Formal Modelling
Löwe, Benedikt
Reasoning and Formal Modelling for Forensic Science, Lecture 2. Prof. Dr. Benedikt Löwe. Slide residue on grading: averaging of components; a component is replaced with the new grade.
Sharlemann, E.T.
1994-07-01
We are developing a DIAL performance model for CALIOPE at LLNL. The intent of the model is to provide quick and interactive parameter sensitivity calculations with immediate graphical output. A brief overview of the features of the performance model is given, along with an example of performance calculations for a non-CALIOPE application.
NSDL National Science Digital Library
2012-06-26
In this activity, learners create a model of a neuron by using colored clay or play dough. Learners use diagrams to build the model and then label the parts on a piece of paper. This resource guide includes extension ideas like using fruit or candy instead of clay. See the "Modeling the Nervous System" page for a recipe for play dough.
General Graded Response Model.
ERIC Educational Resources Information Center
Samejima, Fumiko
This paper describes the graded response model. The graded response model represents a family of mathematical models that deal with ordered polytomous categories, such as: (1) letter grading; (2) an attitude survey with "strongly disagree, disagree, agree, and strongly agree" choices; (3) partial credit given in accord with an individual's degree…
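In the graded response model, each ordered category's probability is the difference between adjacent cumulative logistic curves. A minimal sketch with hypothetical item parameters (discrimination a and ordered thresholds b):

```python
import math

def graded_response_probs(theta, a, b):
    """Category probabilities in Samejima's graded response model for ability
    theta, discrimination a, and ordered category thresholds b[0] < b[1] < ...
    Each cumulative curve P(X >= k) is a two-parameter logistic."""
    cum = [1.0] + [1.0 / (1.0 + math.exp(-a * (theta - bk))) for bk in b] + [0.0]
    return [cum[k] - cum[k + 1] for k in range(len(b) + 1)]

# A 4-category attitude item ("strongly disagree" ... "strongly agree");
# the parameter values here are illustrative, not from any calibrated item.
probs = graded_response_probs(theta=0.0, a=1.5, b=[-1.0, 0.0, 1.0])
```

With thresholds symmetric about the ability level, the middle two categories come out equally likely, and the four probabilities always sum to one.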
NSDL National Science Digital Library
Nicole LaDue
In this activity, candy models are used to demonstrate the features of the Earth, including its internal structure and layers. Students learn why models are essential in Earth science and answer questions about how their candy models do and do not compare with the actual Earth.
NSDL National Science Digital Library
Betty J. Blecha
This site contains 21 modular, easy to use economic models, that are appropriate for class assignments or in-class demonstrations. Students can simulate all the standard models taught in most economics courses. EconModel uses the Windows OS. The simulations were developed by William R. Parke of the University of North Carolina at Chapel Hill.
Phyloclimatic Modelling Workshop
Yesson, Christopher
2012-11-13
Slide residue (Alastair Culham, "Gathering the evidence"): palaeohistory draws on fossil history (mostly pollen; generally poor and patchy even in the best recorded cases), the geological record (continental drift, climate), ice cores, and computer climate models. Modelling relies on knowing continental positions, altitudes, sea levels, and atmospheric gas concentrations, and can be validated against fossil evidence (pollen and macrofossils, 'fossil' atmospheres).
Introduction Statistical Models
Paciorek, Chris
Slide residue: a statistical model represents air pollution from line sources based on Gaussian diffusion, including emissions information. Disadvantages: observations are expensive; model errors, in particular error from model 'extrapolation'.
The Constructivist Learning Model
NSDL National Science Digital Library
Robert E. Yager
2000-01-01
Much cognitive science research has been used to support a new model of learning. This most promising new model is called the Constructivist Learning Model (CLM). Russell Yeany (University of Georgia) has called CLM the most exciting idea of the past 50 years.
Two Cognitive Modeling Frontiers
NASA Astrophysics Data System (ADS)
Ritter, Frank E.
This paper reviews three hybrid cognitive architectures (Soar, ACT-R, and CoJACK) and how they can support the inclusion of models of emotions. There remain problems creating models in these architectures, which is a research and engineering problem. Thus, the term cognitive science engineering is introduced as an area that would support making models easier to create, understand, and re-use.
Johnson, J L; Padgett, M L
1999-01-01
The pulse coupled neural network (PCNN) models are described. The linking field modulation term is shown to be a universal feature of any biologically grounded dendritic model. Applications and implementations of PCNNs are reviewed. Application-based variations and simplifications are summarized. The PCNN image decomposition (factoring) model is described in new detail. PMID:18252547
Succession Model Landscape Stochasticity
Figure residue: a simple model of species viability; disturbance is accomplished by incrementing the patch birth rate (Control: s = a = 10); landscape stochasticity levels range from Low through Control and High to Very High; threshold multipliers 0.1 to 10; patch sizes 100 to 10000.
Rasmus Ejlers Møgelberg; Lars Birkedal; Rasmus Lerchedahl Petersen
2005-01-01
Abstract: We review the theory of adjunctions and comonads in the 2-category of symmetric monoidal adjunctions. This leads to the definitions of linear adjunctions, linear categories, and models of DILL as in [1, 6, 7]. This theory is generalized to the fibred case, and we define models of PILL and morphisms between them.
NSDL National Science Digital Library
Ethel D. Stanley (Beloit College; Biology)
2006-05-20
Humans have been producing wines for thousands of years. How did wine making get started? How has it changed? The Wine Mini-Model simulation enables us to explore the basic fermentation process as well as model enhancements such as the higher alcohol tolerance of cultivated yeasts used in modern wine making. Learners model the fermentation process in early and modern wines.
GTM is an economic model capable of examining global forestry land-use, management, and trade responses to policies. In responding to a policy, the model captures afforestation, forest management, and avoided deforestation behavior. The model estimates harvests in industrial fore...
ERIC Educational Resources Information Center
Harris, Mary B.
To investigate the effect of modeling on altruism, 156 third and fifth grade children were exposed to a model who either shared with them, gave to a charity, or refused to share. The test apparatus, identified as a game, consisted of a box with signal lights and a chute through which marbles were dispensed. Subjects and the model played the game…
Mathew Hahn; David Rogers
A number of methods, called receptor mapping techniques, attempt to provide insight about the putative active site and to characterize receptor binding requirements. Often, receptor mapping techniques are used to generate a hypothetical model of the actual receptor site. This is known as a receptor site model. In this chapter, we describe a specific type of receptor site model called
Computational modeling of growth
E. Kuhl; A. Menzel; P. Steinmann
2003-01-01
The present contribution is dedicated to the computational modeling of growth phenomena typically encountered in modern biomechanical applications. We set the basis by critically reviewing the relevant literature and classifying the existing models. Next, we introduce a geometrically exact continuum model of growth which is not a priori restricted to applications in hard tissue biomechanics. The initial boundary value problem
Modeling and Remodeling Writing
ERIC Educational Resources Information Center
Hayes, John R.
2012-01-01
In Section 1 of this article, the author discusses the succession of models of adult writing that he and his colleagues have proposed from 1980 to the present. He notes the most important changes that differentiate earlier and later models and discusses reasons for the changes. In Section 2, he describes his recent efforts to model young…
Timothy F. Cootes; Gareth J. Edwards; Christopher J. Taylor
1998-01-01
We describe a new method of matching statistical models of appearance to images. A set of model parameters control modes of shape and gray-level variation learned from a training set. We construct an efficient iterative matching algorithm by learning the relationship between perturbations in the model parameters and the induced image errors.
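The core idea, learning a linear map from induced image errors to parameter perturbations and applying it iteratively, can be sketched on a toy linear "appearance model". The generative matrix and training scheme below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def render(p):
    """Toy linear 'appearance model': the image is G @ p (G is hypothetical)."""
    G = np.array([[2.0, 0.0], [0.0, 3.0], [1.0, 1.0]])
    return G @ p

p_true = np.array([0.5, -0.2])
target = render(p_true)

# Training: apply random parameter perturbations, record induced image errors.
dP = rng.normal(scale=0.1, size=(200, 2))
dI = np.array([render(p_true + d) - target for d in dP])

# Learn the linear map dp ~= R @ di by least squares.
R = np.linalg.lstsq(dI, dP, rcond=None)[0].T

# Matching: iteratively correct a wrong initial guess with the learned map.
p = np.zeros(2)
for _ in range(5):
    p = p - R @ (render(p) - target)
```

Because the toy generator is exactly linear, the learned update recovers the true parameters almost immediately; with a real appearance model the same loop converges only locally, which is why the method starts from a reasonable initial estimate.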
Focus Article Electrophysiological models
Nelson, Mark E.
Brain modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing. The capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward
Technology Transfer Automated Retrieval System (TEKTRAN)
Adsorption-desorption reactions are important processes that affect the transport of contaminants in the environment. Surface complexation models are chemical models that can account for the effects of variable chemical conditions, such as pH, on adsorption reactions. These models define specific ...
Appendix W to 40CFR Part 51 (Guideline on Air Quality Models) specifies the models to be used for purposes of permitting, PSD, and SIPs. Through a formal regulatory process this modeling guidance is periodically updated to reflect current science. In the most recent action, thr...
The Modal Aerosol Dynamics (MAD) model is a computationally efficient model for solving the General Dynamics Equation of Aerosols (GDE) (Friedlander, 1977). The simplifying assumption in the model is that aerosol size distributions can be approximated by overlapping modes, each r...
ERIC Educational Resources Information Center
Walsh, Jim; McGehee, Richard
2013-01-01
A dynamical systems approach to energy balance models of climate is presented, focusing on low order, or conceptual, models. Included are global average and latitude-dependent, surface temperature models. The development and analysis of the differential equations and corresponding bifurcation diagrams provides a host of appropriate material for…
Crushed Salt Constitutive Model
Callahan, G.D.
1999-02-01
The constitutive model used to describe the deformation of crushed salt is presented in this report. Two mechanisms -- dislocation creep and grain boundary diffusional pressure solution -- are combined to form the basis for the constitutive model governing the deformation of crushed salt. The constitutive model is generalized to represent three-dimensional states of stress. Upon complete consolidation, the crushed-salt model reproduces the Multimechanism Deformation (M-D) model typically used for the Waste Isolation Pilot Plant (WIPP) host geological formation salt. New shear consolidation tests are combined with an existing database that includes hydrostatic consolidation and shear consolidation tests conducted on WIPP and southeastern New Mexico salt. Nonlinear least-squares model fitting to the database produced two sets of material parameter values for the model -- one for the shear consolidation tests and one for a combination of the shear and hydrostatic consolidation tests. Using the parameter values determined from the fitted database, the constitutive model is validated against constant strain-rate tests. Shaft seal problems are analyzed to demonstrate model-predicted consolidation of the shaft seal crushed-salt component. Based on the fitting statistics, the ability of the model to predict the test data, and the ability of the model to predict load paths and test data outside of the fitted database, the model appears to capture the creep consolidation behavior of crushed salt reasonably well.
Hierarchical Models of Attitude.
ERIC Educational Resources Information Center
Reddy, Srinivas K.; LaBarbera, Priscilla A.
1985-01-01
The application and use of hierarchical models is illustrated, using the example of the structure of attitudes toward a new product and a print advertisement. Subjects were college students who responded to seven-point bipolar scales. Hierarchical models were better than nonhierarchical models in conceptualizing attitude but not intention. (GDC)
Modeling Realistic Virtual Hairstyles
Yizhou Yu
2001-01-01
In this paper we present an effective method for modeling realistic curly hairstyles, taking into account both artificial hairstyling processes and natural curliness. The result is a detailed geometric model of hairs that can be rendered and animated via existing methods. Our technique exploits the analogy between hairs and a vector field; interactively and efficiently models global and local hair
QUALITATIVE ECOLOGICAL MODELING
Technology Transfer Automated Retrieval System (TEKTRAN)
Students construct qualitative models of an ecosystem and use the models to evaluate the direct and indirect effects that may result from perturbations to the ecosystem. Qualitative modeling is described for use in two procedures, each with different educational goals and student backgrounds in min...
PREDICTIVE MODELS. Enhanced Oil Recovery Model
Ray, R.M. [DOE Bartlesville Energy Technology Center, Bartlesville, OK (United States)]
1992-02-26
PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: (1) chemical flooding, where soap-like surfactants are injected into the reservoir to wash out the oil; (2) carbon dioxide miscible flooding, where carbon dioxide mixes with the lighter hydrocarbons, making the oil easier to displace; (3) in-situ combustion, which uses the heat from burning some of the underground oil to thin the product; (4) polymer flooding, where thick, cohesive material is pumped into a reservoir to push the oil through the underground rock; and (5) steamflood, where pressurized steam is injected underground to thin the oil. CFPM, the Chemical Flood Predictive Model, models micellar (surfactant)-polymer floods in reservoirs which have been previously waterflooded to residual oil saturation. Thus, only true tertiary floods are considered. An option allows a rough estimate of oil recovery by caustic or caustic-polymer processes. CO2PM, the Carbon Dioxide miscible flooding Predictive Model, is applicable to both secondary (mobile oil) and tertiary (residual oil) floods, and to either continuous CO2 injection or water-alternating-gas processes. ICPM, the In-situ Combustion Predictive Model, computes the recovery and profitability of an in-situ combustion project from generalized performance predictive algorithms. PFPM, the Polymer Flood Predictive Model, is switch-selectable for either polymer or waterflooding, and an option allows the calculation of the incremental oil recovery and economics of polymer relative to waterflooding. SFPM, the Steamflood Predictive Model, is applicable to the steam drive process, but not to cyclic steam injection (steam soak) processes.
Bayesian Model Comparison of Structural Equation Models
Sik-Yum Lee; Xin-Yuan Song
Structural equation modeling is a multivariate method for establishing meaningful models to investigate the relationships of some latent (causal) and manifest (control) variables with other variables. In the past quarter of a century, it has drawn a great deal of attention in psychometrics and sociometrics, both in terms of theoretical developments and practical applications (see Bentler and Wu, 2002; Bollen,
Hoff, Peter
Multiway Array Models for Dynamic Relational Data. Peter Hoff, Statistics, Biostatistics and the CSSS, University of Washington. Outline: introduction and examples; modeling mean structure; modeling covariance structure.
Pediatric Computational Models
NASA Astrophysics Data System (ADS)
Soni, Bharat K.; Kim, Jong-Eun; Ito, Yasushi; Wagner, Christina D.; Yang, King-Hay
A computational model is a computer program that attempts to simulate a behavior of a complex system by solving mathematical equations associated with principles and laws of physics. Computational models can be used to predict the body's response to injury-producing conditions that cannot be simulated experimentally or measured in surrogate/animal experiments. Computational modeling also provides means by which valid experimental animal and cadaveric data can be extrapolated to a living person. Widely used computational models for injury biomechanics include multibody dynamics and finite element (FE) models. Both multibody and FE methods have been used extensively to study adult impact biomechanics in the past couple of decades.
T. W. Brown
2011-10-10
The same complex matrix model calculates both tachyon scattering for the c=1 non-critical string at the self-dual radius and certain correlation functions of half-BPS operators in N=4 super-Yang-Mills. It is dual to another complex matrix model where the couplings of the first model are encoded in the Kontsevich-like variables of the second. The duality between the theories is mirrored by the duality of their Feynman diagrams. Analogously to the Hermitian Kontsevich-Penner model, the correlation functions of the second model can be written as sums over discrete points in subspaces of the moduli space of punctured Riemann surfaces.
Sorin Vlad; Paul Pascu; Nicolae Morariu
2010-01-20
The paper discusses the main ideas of chaos theory and highlights the importance of nonlinearities in mathematical models. Chaos and order are apparently two opposite terms. The fact that a precise regularity (the Feigenbaum numbers) can be found within chaos is even more surprising. As an illustration of the ubiquity of chaos, three models among the many that exhibit chaotic features are presented here: the nonlinear feedback profit model, a model for the simulation of the exchange rate, and an application of chaos theory to the capital markets.
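The sensitivity to initial conditions that makes such models chaotic is easiest to see in the textbook logistic map, which is a standard illustration and not one of the three models discussed in the paper:

```python
def logistic_orbit(x0, r, n):
    """Iterate the logistic map x_{t+1} = r * x_t * (1 - x_t) for n steps."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# In the chaotic regime (r = 4), two orbits starting 1e-9 apart typically
# separate to order one within a few dozen iterations.
a = logistic_orbit(0.2, 4.0, 50)
b = logistic_orbit(0.2 + 1e-9, 4.0, 50)
divergence = max(abs(x - y) for x, y in zip(a, b))
```

The Feigenbaum numbers mentioned above describe how this same map's period-doubling bifurcations accumulate as r increases toward the chaotic regime.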
NSDL National Science Digital Library
Ocean-Modeling.org
The purpose of this web site is to facilitate the development and testing of the Terrain-following Ocean Modeling System (TOMS) and to provide a forum to the ocean community at large. The site provides an explanation of three-dimensional modeling, as well as an overview of the four primary types of ocean modeling methods currently in use and links to labs around the country using these modeling techniques. A collection of links to freely downloadable ocean modeling tools is provided. The site also includes links to data sources, publications, bulletin boards, chat rooms and other relevant sites.
Pilot model hypothesis testing
NASA Technical Reports Server (NTRS)
Broussard, J. R.; Berry, P. W.
1982-01-01
The aircraft control time history predicted by the optimal control pilot model and actual pilot tracking data obtained from NASA Langley's differential maneuvering simulator (DMS) are analyzed. The analysis is performed using a hypothesis testing scheme modified to allow for changes in the true hypothesis. A finite number of pilot models, each with different hypothesized internal model representations of the aircraft dynamics, are constructed. The hypothesis testing scheme determines the relative probability that each pilot model best matches the DMS data. By observing the changes in probabilities, it is possible to determine when the pilot changes control strategy and which hypothesized pilot model best represents the pilot's control behavior.
Ritchie, L.T.; Alpert, D.J.; Burke, R.P.; Johnson, J.D.; Ostmeyer, R.M.; Aldrich, D.C.; Blond, R.M.
1984-03-01
The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions.
Brown, T. W. [DESY, Hamburg, Theory Group, Notkestrasse, 85, D-22603 Hamburg (Germany)
2011-04-15
The same complex matrix model calculates both tachyon scattering for the c=1 noncritical string at the self-dual radius and certain correlation functions of operators which preserve half the supersymmetry in N=4 super-Yang-Mills theory. It is dual to another complex matrix model where the couplings of the first model are encoded in the Kontsevich-like variables of the second. The duality between the theories is mirrored by the duality of their Feynman diagrams. Analogously to the Hermitian Kontsevich-Penner model, the correlation functions of the second model can be written as sums over discrete points in subspaces of the moduli space of punctured Riemann surfaces.
Whiting, R C
1995-11-01
Predictive food microbiology is a field of study that combines elements of microbiology, mathematics, and statistics to develop models that describe and predict the growth or decline of microbes under specified environmental conditions. Models can be thought of as having three levels: primary level models describe changes in microbial numbers with time, secondary level models show how the parameters of the primary model vary with environmental conditions, and the tertiary level combines the first two types of models with user-friendly application software or expert systems that calculate microbial behavior under the specified conditions. Primary models include time-to-growth, Gompertz function, exponential growth rate, and inactivation/survival models. Commonly used secondary models are response surface equations and the square root and Arrhenius relationships. Microbial models are valuable tools in planning Hazard Analysis, Critical Control Point (HACCP) programs and making decisions, as they provide the first estimates of expected changes in microbial populations when exposed to a specific set of conditions. This review describes the models currently being developed for food-borne microorganisms, particularly pathogens, and discusses their uses. PMID:8777014
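The Gompertz-function primary model mentioned above can be sketched as follows. This is a minimal illustration using the common Zwietering reparameterization; the parameter values are invented for demonstration, not taken from the review.

```python
import math

def gompertz(t, n0, c, mu_max, lag):
    """Reparameterized (Zwietering) Gompertz growth curve for log10 counts.

    n0     : initial log10 count
    c      : total log10 increase (asymptote minus n0)
    mu_max : maximum specific growth rate (log10 units per hour)
    lag    : lag time (hours)
    """
    return n0 + c * math.exp(-math.exp(mu_max * math.e / c * (lag - t) + 1))

# Counts rise sigmoidally from n0 toward n0 + c as time increases
# (hypothetical parameter values).
curve = [gompertz(t, n0=3.0, c=6.0, mu_max=0.5, lag=5.0) for t in range(0, 49, 8)]
```

A secondary model would then describe how `mu_max` and `lag` vary with temperature, pH, or water activity.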
Johnson, Douglas H.; Cook, R.D.
2013-01-01
In her AAAS News & Notes piece "Can the Southwest manage its thirst?" (26 July, p. 362), K. Wren quotes Ajay Kalra, who advocates a particular method for predicting Colorado River streamflow "because it eschews complex physical climate models for a statistical data-driven modeling approach." A preference for data-driven models may be appropriate in this individual situation, but it is not so generally. Data-driven models often come with a warning against extrapolating beyond the range of the data used to develop the models. When the future is like the past, data-driven models can work well for prediction, but it is easy to over-model local or transient phenomena, often leading to predictive inaccuracy (1). Mechanistic models are built on established knowledge of the process that connects the response variables with the predictors, using information obtained outside of an extant data set. One may shy away from a mechanistic approach when the underlying process is judged to be too complicated, but good predictive models can be constructed with statistical components that account for ingredients missing in the mechanistic analysis. Models with sound mechanistic components are more generally applicable and robust than data-driven models.
T. Ghezzehej
2004-10-04
The purpose of this model report is to document the calibrated properties model that provides calibrated property sets for unsaturated zone (UZ) flow and transport process models (UZ models). The calibration of the property sets is performed through inverse modeling. This work followed, and was planned in, ''Technical Work Plan (TWP) for: Unsaturated Zone Flow Analysis and Model Report Integration'' (BSC 2004 [DIRS 169654], Sections 1.2.6 and 2.1.1.6). Direct inputs to this model report were derived from the following upstream analysis and model reports: ''Analysis of Hydrologic Properties Data'' (BSC 2004 [DIRS 170038]); ''Development of Numerical Grids for UZ Flow and Transport Modeling'' (BSC 2004 [DIRS 169855]); ''Simulation of Net Infiltration for Present-Day and Potential Future Climates'' (BSC 2004 [DIRS 170007]); ''Geologic Framework Model'' (GFM2000) (BSC 2004 [DIRS 170029]). Additionally, this model report incorporates errata of the previous version and closure of the Key Technical Issue agreement TSPAI 3.26 (Section 6.2.2 and Appendix B), and it is revised for improved transparency.
Multiscale Modeling of Recrystallization
Godfrey, A.W.; Holm, E.A.; Hughes, D.A.; Lesar, R.; Miodownik, M.A.
1998-12-07
We propose a multi length scale approach to modeling recrystallization which links a dislocation model, a cell growth model and a macroscopic model. Although this methodology and linking framework will be applied to recrystallization, it is also applicable to other types of phase transformations in bulk and layered materials. Critical processes such as the dislocation structure evolution, nucleation, the evolution of crystal orientations into a preferred texture, and grain size evolution all operate at different length scales. In this paper we focus on incorporating experimental measurements of dislocation substructures, misorientation measurements of dislocation boundaries, and dislocation simulations into a mesoscopic model of cell growth. In particular, we show how feeding information from the dislocation model into the cell growth model can create realistic initial microstructure.
Toward Scientific Numerical Modeling
NASA Technical Reports Server (NTRS)
Kleb, Bil
2007-01-01
Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and verifying that numerical models are translated into code correctly, however, are necessary first steps toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. To address these two shortcomings, two proposals are offered: (1) an unobtrusive mechanism to document input parameter uncertainties in situ and (2) an adaptation of the Scientific Method to numerical model development and deployment. Because these two steps require changes in the computational simulation community to bear fruit, they are presented in terms of the Beckhard-Harris-Gleicher change model.
NASA Technical Reports Server (NTRS)
Guenther, D. B.; Demarque, P.; Kim, Y.-C.; Pinsonneault, M. H.
1992-01-01
A set of solar models has been constructed, each based on a single modification to the physics of a reference solar model. In addition, a model combining several of the improvements has been calculated to provide a best solar model. Improvements were made to the nuclear reaction rates, the equation of state, the opacities, and the treatment of the atmosphere. The impact of these improvements on both the structure and the frequencies of the low-l p-modes of the model is discussed. It is found that the combined solar model, which is based on the best physics available (and does not contain any ad hoc assumptions), reproduces the observed oscillation spectrum (for low-l) within the errors associated with the uncertainties in the model physics (primarily opacities).
Andrew R. Liddle; Pia Mukherjee; David Parkinson
2006-08-09
Model selection aims to determine which theoretical models are most plausible given some data, without necessarily asking about the preferred values of the model parameters. A common model selection question is to ask when new data require introduction of an additional parameter, describing a newly-discovered physical effect. We review several model selection statistics, and then focus on use of the Bayesian evidence, which implements the usual Bayesian analysis framework at the level of models rather than parameters. We describe our CosmoNest code, which is the first computationally-efficient implementation of Bayesian model selection in a cosmological context. We apply it to recent WMAP satellite data, examining the need for a perturbation spectral index differing from the scale-invariant (Harrison-Zel'dovich) case.
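The use of the Bayesian evidence at the level of models can be illustrated with a toy example. This is not the CosmoNest nested-sampling machinery; it is a direct quadrature over a single hypothetical free parameter, with made-up data, just to show how an extra parameter pays an Occam penalty through the prior volume.

```python
import math

def log_gauss(x, mu, sigma):
    """Log of a normal density."""
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

# Toy data: measurements slightly offset from zero, known noise level.
data = [0.3, 0.5, 0.1, 0.4, 0.2, 0.6]
sigma = 0.5

# Model 0: mean fixed at 0 (no free parameter) -> evidence is just the likelihood.
log_Z0 = sum(log_gauss(x, 0.0, sigma) for x in data)

# Model 1: mean mu free, uniform prior on [-1, 1] (prior density 1/2).
# Evidence Z1 = (1/2) * integral of the likelihood over mu (trapezoid rule).
n = 2000
mus = [-1.0 + 2.0 * i / n for i in range(n + 1)]
like = [math.exp(sum(log_gauss(x, mu, sigma) for x in data)) for mu in mus]
Z1 = 0.5 * sum(0.5 * (like[i] + like[i + 1]) * (2.0 / n) for i in range(n))
log_Z1 = math.log(Z1)

# Positive log Bayes factor favors introducing the extra parameter.
log_bayes_factor = log_Z1 - log_Z0
```

Even though the best-fit mean improves the likelihood, the evidence only weakly prefers the extra parameter here, because the prior range dilutes the gain.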
Generalized Multilevel Structural Equation Modeling
ERIC Educational Resources Information Center
Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew
2004-01-01
A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…
V. Chipman; J. Case
2002-12-20
The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, or the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the ''Multiscale Thermohydrologic Model'' (BSC 2001), use the wall heat fractions output from the Ventilation Model to initialize their post-closure analyses. The Ventilation Model report was initially developed to analyze the effects of preclosure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts, and to provide heat removal data to support EBS design. Revision 00 of the Ventilation Model included documentation of the modeling results from the ANSYS-based heat transfer model. Revision 01 ICN 01 included the results of the unqualified software code MULTIFLUX to assess the influence of moisture on the ventilation efficiency. The purposes of Revision 02 of the Ventilation Model are: (1) To validate the conceptual model for preclosure ventilation of emplacement drifts and verify its numerical application in accordance with new procedural requirements as outlined in AP-SIII-10Q, Models (Section 7.0). (2) To satisfy technical issues posed in KTI agreement RDTME 3.14 (Reamer and Williams 2001a).
Specifically to demonstrate, with respect to the ANSYS ventilation model, the adequacy of the discretization (Section 6.2.3.1), and the downstream applicability of the model results (i.e. wall heat fractions) to initialize post-closure thermal models (Section 6.6). (3) To satisfy the remainder of KTI agreement TEF 2.07 (Reamer and Williams 2001b). Specifically to provide the results of post-test ANSYS modeling of the Atlas Facility forced convection tests (Section 7.1.2). This portion of the model report also serves as a validation exercise per AP-SIII.10Q, Models, for the ANSYS ventilation model. (4) To assess the impacts of moisture on the ventilation efficiency.
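The energy-balance bookkeeping described above (ventilation heat removal versus wall heat fraction) amounts to the following. The numbers are purely illustrative and are not results from the report.

```python
def wall_heat_fraction(heat_removed_by_air, heat_from_decay):
    """One minus the ventilation heat-removal fraction: the share of
    radionuclide decay heat conducted into the surrounding rock mass."""
    efficiency = heat_removed_by_air / heat_from_decay
    return 1.0 - efficiency

# Illustrative only: if ventilation air carried away 70 kW of a 100 kW
# decay-heat load, 30% would remain to be conducted into the drift wall.
f = wall_heat_fraction(70.0, 100.0)
```

In the report this fraction is computed as a function of time and of position along the drift, then handed to post-closure thermal models as an initial condition.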
NASA Technical Reports Server (NTRS)
Schoeberl, Mark; Rood, Richard B.; Hildebrand, Peter; Raymond, Carol
2003-01-01
The Earth System Model is the natural evolution of current climate models and will be the ultimate embodiment of our geophysical understanding of the planet. These models are constructed from component models - atmosphere, ocean, ice, land, chemistry, solid earth, etc. - merged together through a coupling program which is responsible for the exchange of data among the components. Climate models and future earth system models will have standardized modules, and these standards are now being developed by the ESMF project funded by NASA. The Earth System Model will have a variety of uses beyond climate prediction. The model can be used to build climate data records, making it the core of an assimilation system, and it can be used in OSSE experiments to evaluate proposed observing systems. The computing and storage requirements for the ESM appear to be daunting. However, the Japanese ES theoretical computing capability is already within 20% of the minimum requirements needed for some 2010 climate model applications. Thus it seems very possible that a focused effort to build an Earth System Model will achieve success.
Model checking of healthcare domain models.
Baksi, Dibyendu
2009-12-01
This paper shows the application of a type of formal software verification technique known as lightweight model checking to a domain model in healthcare informatics in general and public health surveillance systems in particular. One of the most complex use cases of such a system is checked using assertions to verify one important system property. This use case is one of the major justifications for the complexity of the domain model. Alloy Analyzer verification tool is utilized for this purpose. Such verification work is very effective in either uncovering design flaws or in providing guarantees on certain desirable system properties in the earlier phases of the development lifecycle of any critical project. PMID:19640605
Radiation Environment Modeling for Spacecraft Design: New Model Developments
NASA Technical Reports Server (NTRS)
Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray
2006-01-01
A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.
Software Maintenance Maturity Model (SMmm): The software maintenance process model
Hayes, Jane E.
Software Maintenance Maturity Model (SMmm): The software maintenance process model, by Alain April, describing improvements to the software maintenance standards and introducing a proposed maturity model for daily software maintenance activities: the Software Maintenance Maturity Model (SMmm). The software maintenance function suffers
Larson, J. W.; Jacob, R. L.; Foster, I.; Guo, J.
2001-04-13
The advent of coupled earth system models has raised an important question in parallel computing: What is the most effective method for coupling many parallel models to form a high-performance coupled modeling system? We present our solution to this problem--The Model Coupling Toolkit (MCT). We explain how our effort to construct the Next-Generation Coupler for NCAR Community Climate System Model motivated us to create this toolkit. We describe in detail the conceptual design of the MCT and explain its usage in constructing parallel coupled models. We present preliminary performance results for the toolkit's parallel data transfer facilities. Finally, we outline an agenda for future development of the MCT.
Carcinogenesis models: An overview
Moolgavkar, S.H. [Fred Hutchinson Cancer Research Center, Seattle, WA (United States)
1992-12-31
Biologically based mathematical models of carcinogenesis are not only an essential part of a rational approach to quantitative cancer risk assessment but also raise fundamental questions about the nature of the events leading to malignancy. In this paper two such models are reviewed. The first is the multistage model proposed by Armitage and Doll in the 1950s; most of the paper is devoted to a discussion of the two-mutation model proposed by the author and his colleagues. This model is a generalization of the idea of recessive oncogenesis proposed by Knudson and has been shown to be consistent with a large body of epidemiologic and experimental data. The usefulness of the model is illustrated by analyzing a large experimental data set in which rats exposed to radon developed malignant lung tumors.
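The Armitage-Doll multistage model reviewed above has a well-known approximate hazard, h(t) ≈ a·t^(k−1) for k rate-limiting stages, so incidence plotted against age on log-log axes is a straight line of slope k − 1. A short sketch (the parameter values are hypothetical, not fitted to any data set):

```python
import math

def multistage_hazard(t, a, k):
    """Armitage-Doll approximation: hazard of a k-stage process ~ a * t**(k-1)."""
    return a * t ** (k - 1)

# With k = 6 stages, log incidence vs. log age has slope 5
# (hypothetical rate constant a).
ages = [40, 50, 60, 70]
h = [multistage_hazard(t, a=1e-11, k=6) for t in ages]
slopes = [
    (math.log(h2) - math.log(h1)) / (math.log(t2) - math.log(t1))
    for (t1, h1), (t2, h2) in zip(zip(ages, h), zip(ages[1:], h[1:]))
]
```

The two-mutation (Moolgavkar-Venzon-Knudson) model replaces this power-law form with explicit initiation, clonal expansion, and malignant conversion rates.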
Macklin, Paul; Cristini, Vittorio
2013-01-01
Simulating cancer behavior across multiple biological scales in space and time, i.e., multiscale cancer modeling, is increasingly being recognized as a powerful tool to refine hypotheses, focus experiments, and enable more accurate predictions. A growing number of examples illustrate the value of this approach in providing quantitative insight on the initiation, progression, and treatment of cancer. In this review, we introduce the most recent and important multiscale cancer modeling works that have successfully established a mechanistic link between different biological scales. Biophysical, biochemical, and biomechanical factors are considered in these models. We also discuss innovative, cutting-edge modeling methods that are moving predictive multiscale cancer modeling toward clinical application. Furthermore, because the development of multiscale cancer models requires a new level of collaboration among scientists from a variety of fields such as biology, medicine, physics, mathematics, engineering, and computer science, an innovative Web-based infrastructure is needed to support this growing community. PMID:21529163
Extended frequency turbofan model
NASA Technical Reports Server (NTRS)
Mason, J. R.; Park, J. W.; Jaekel, R. F.
1980-01-01
The fan model was developed using two dimensional modeling techniques to add dynamic radial coupling between the core stream and the bypass stream of the fan. When incorporated into a complete TF-30 engine simulation, the fan model greatly improved compression system frequency response to planar inlet pressure disturbances up to 100 Hz. The improved simulation also matched engine stability limits at 15 Hz, whereas the one dimensional fan model required twice the inlet pressure amplitude to stall the simulation. With verification of the two dimensional fan model, this program formulated a high frequency F-100(3) engine simulation using row by row compression system characteristics. In addition to the F-100(3) remote splitter fan, the program modified the model fan characteristics to simulate a proximate splitter version of the F-100(3) engine.
Probabilistic Mesomechanical Fatigue Model
NASA Technical Reports Server (NTRS)
Tryon, Robert G.
1997-01-01
A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
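The long-crack stage of such a model, Paris-law growth combined with Monte Carlo sampling of microstructural scatter, can be sketched as below. The material constants, stress range, and crack-size distribution are invented for illustration, and the nucleation and small-crack stages of the actual model are omitted.

```python
import math
import random

def paris_life(a0, af, C, m, delta_sigma, Y=1.0):
    """Cycles to grow a crack from a0 to af under the Paris law
    da/dN = C * (dK)**m, with dK = Y * delta_sigma * sqrt(pi * a).
    Closed-form integral, valid for m != 2."""
    factor = C * (Y * delta_sigma * math.sqrt(math.pi)) ** m
    p = 1 - m / 2
    return (af ** p - a0 ** p) / (factor * p)

random.seed(1)
# Monte Carlo over scatter in initial crack size
# (lognormal distribution and all constants are hypothetical).
lives = [
    paris_life(a0=random.lognormvariate(math.log(1e-4), 0.3), af=0.01,
               C=1e-11, m=3.0, delta_sigma=100.0)
    for _ in range(1000)
]
```

The spread of `lives` is the model's statistical scatter in long-crack life; the full model would add the nucleation and small-crack lives sampled per microelement.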
NSDL National Science Digital Library
Steve Ackerman
This applet tests the sensitivity of a barotropic model to time step, grid spacing, and initial conditions. The site explains the CFL (Courant-Friedrichs-Lewy) criterion (that the speed of fastest winds in the model must be less than or equal to grid spacing divided by the time step) and how a finite-difference weather prediction model blows up if this criterion is not met. The user of this applet will learn what the model looks like when it blows up, that a modeler cannot arbitrarily choose a horizontal grid spacing without also taking into account the time step of the model, and that if fine horizontal resolution is desired to see small-scale weather, there must be fine time resolution, too.
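The CFL criterion the applet demonstrates (fastest wind speed must not exceed grid spacing divided by time step, i.e. Courant number at most 1) is a one-line check; the numbers below are illustrative, not taken from the applet.

```python
def cfl_ok(wind_speed, dx, dt):
    """Courant-Friedrichs-Lewy criterion: the fastest signal must not cross
    more than one grid cell per time step (Courant number <= 1)."""
    return wind_speed * dt / dx <= 1.0

# A 50 m/s jet on a 100 km grid is stable with a 30-minute step...
stable = cfl_ok(50.0, 100_000.0, 1800.0)
# ...but refining the grid to 10 km without shrinking dt violates the criterion,
# which is exactly the "blow up" behavior the applet shows.
unstable = cfl_ok(50.0, 10_000.0, 1800.0)
```

This is why finer horizontal resolution forces finer time resolution as well.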
Zhou, Yongquan; Xie, Jian; Li, Liangliang; Ma, Mingzhi
2014-01-01
Bat algorithm (BA) is a novel stochastic global optimization algorithm. Cloud model is an effective tool in transforming between qualitative concepts and their quantitative representation. Based on the bat echolocation mechanism and excellent characteristics of cloud model on uncertainty knowledge representation, a new cloud model bat algorithm (CBA) is proposed. This paper focuses on remodeling echolocation model based on living and preying characteristics of bats, utilizing the transformation theory of cloud model to depict the qualitative concept: “bats approach their prey.” Furthermore, Lévy flight mode and population information communication mechanism of bats are introduced to balance the advantage between exploration and exploitation. The simulation results show that the cloud model bat algorithm has good performance on functions optimization. PMID:24967425
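For orientation, a minimal standard bat algorithm (frequency-tuned velocity updates plus a local random walk around the best solution, after Yang's original formulation) might look like the sketch below. This is not the cloud-model CBA variant of the paper, and the loudness and pulse rate are held fixed rather than adapted.

```python
import random

def sphere(x):
    """Benchmark objective: minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def bat_algorithm(obj, dim=2, n_bats=20, iters=200, seed=0):
    """Minimal standard bat algorithm (illustrative constants)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_bats)]
    vel = [[0.0] * dim for _ in range(n_bats)]
    best = min(pos, key=obj)
    loudness, pulse_rate = 0.9, 0.5
    for _ in range(iters):
        for i in range(n_bats):
            # Frequency-tuned move toward the current best solution.
            freq = rng.uniform(0.0, 2.0)
            vel[i] = [v + (x - b) * freq for v, x, b in zip(vel[i], pos[i], best)]
            cand = [x + v for x, v in zip(pos[i], vel[i])]
            # Local random walk around the best solution (exploitation).
            if rng.random() > pulse_rate:
                cand = [b + 0.01 * rng.gauss(0, 1) for b in best]
            # Accept improving moves with probability given by the loudness.
            if obj(cand) < obj(pos[i]) and rng.random() < loudness:
                pos[i] = cand
            if obj(pos[i]) < obj(best):
                best = list(pos[i])
    return best

best = bat_algorithm(sphere)
```

The CBA of the paper replaces the echolocation update with cloud-model transformations and adds Lévy flights; the exploration/exploitation balance is the same design concern.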
Excitation energy at scission in thermal-neutron-induced fission
G. Mantzouranis; J. R. Nix
1982-01-01
For the thermal-neutron-induced fission of 235U and 233U, we extract the internal excitation energy at the scission point by use of two different methods. The first method uses experimental data for 235U + nth on the neutrons and gamma rays emitted from doubly magic fission fragments, where the extra stability associated with shell closures makes the deformation energy small. Under
Quantum and thermodynamic properties of spontaneous and low-energy induced fission of nuclei
S. G. Kadmensky
2005-01-01
It is shown that A. Bohr's concept of transition fission states can be matched with the properties of Coriolis interaction if an axisymmetric fissile nucleus near the scission point remains cold despite a nonadiabatic character of nuclear collective deformation motion. The quantum and thermodynamic properties of various stages of binary and ternary fission after the descent of a fissile nucleus
Collective modes associated with the proton-induced fission of 209Bi
L. Nowicki; M. Berlanger; B. Borderie; C. Cabot; P. del Marmol; Y. El Masri; C. Grégoire; F. Hanappe; C. Ngô; B. Tamain
1982-01-01
Gamma-ray multiplicity (first and second moments) has been measured in the 60 MeV proton-induced fission of 209Bi. From this work we have evidence that shell effects play an effective role at the scission point, even at an excitation energy of 65 MeV. Collective modes seem to be more easily excited when the nascent fragments are far from spherical shapes. The
Bayesian Data-Model Fit Assessment for Structural Equation Modeling
ERIC Educational Resources Information Center
Levy, Roy
2011-01-01
Bayesian approaches to modeling are receiving an increasing amount of attention in the areas of model construction and estimation in factor analysis, structural equation modeling (SEM), and related latent variable models. However, model diagnostics and model criticism remain relatively understudied aspects of Bayesian SEM. This article describes…
Using Markov Models and Hidden Markov Models to Find Repetitive
Karplus, Kevin
Using Markov Models and Hidden Markov Models to Find Repetitive Extragenic Palindromic Sequences. Methods for using simple Markov models and hidden Markov models (HMMs) to search for interesting sequences, and for automatically constructing simple Markov models and hidden Markov models from small training sets
Human Motor Computational Model Through Iterative Model Reference Adaptive Control
Melbourne, University of
Keywords: Human Motor Computational Model; Iterative Model Reference Adaptive Control. Abstract: A computational model using mechanical impedance control in combination with an iterative model
System Modeling with Mixed Object and Data Models
Hessam Sarjoughian; Robert Flasher (Arizona Center)
An example illustrating mixed component and data modeling is described using an extended realization for component-based model development [12]. The SES emphasizes modeling concrete alternative model structures
NASA Astrophysics Data System (ADS)
Chakrabarti, Bikas K.; Dasgupta, Prabir K.
1992-07-01
We review here briefly some of our recent studies on neural network modelling. We discuss the studies on relaxation and growth of correlation in the Hopfield model, increase in memory loading capacity with an extended Hopfield-like model with delayed dynamics, the prediction capability of time series with a multi-layered network with supervised learning and studies on some generalised versions of the travelling salesman problem.
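The Hopfield model mentioned above can be sketched with Hebbian storage and asynchronous recall. This is the generic textbook version, not the delayed-dynamics extension the authors study, and the stored pattern is arbitrary.

```python
def train_hopfield(patterns):
    """Hebbian outer-product weights with zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, steps=10):
    """Asynchronous threshold updates; the state relaxes toward a stored pattern."""
    n = len(state)
    state = list(state)
    for _ in range(steps):
        for i in range(n):
            h = sum(w[i][j] * state[j] for j in range(n))
            state[i] = 1 if h >= 0 else -1
    return state

# Store one +/-1 pattern, corrupt one bit, and recover it.
pattern = [1, -1, 1, -1, 1, -1, 1, -1]
w = train_hopfield([pattern])
noisy = list(pattern)
noisy[0] = -noisy[0]
restored = recall(w, noisy)
```

Memory-loading capacity questions like those in the review concern how many such patterns can be stored before recall of corrupted inputs fails.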
Competitive facility location models
A. V. Kononov; Yu. A. Kochetov; A. V. Plyasunov
2009-01-01
Two classes of competitive facility location models are considered, in which several persons (players) sequentially or simultaneously open facilities for serving clients. The first class consists of discrete two-level programming models. The second class consists of game models with several independent players pursuing selfish goals. For the first class, its relationship with pseudo-Boolean functions is established and a novel method
Modeling and Network Organization
Cynthia Stokes; Adam Arkin
The use of mathematical modeling and analysis of networks has a long history in biological research. Perhaps the best-known early example of insightful modeling is the work of Hodgkin and Huxley in 1952 describing how sodium and potassium ion channels could function together to produce the membrane action potential in neurons (Hodgkin and Huxley, 1952). For several decades, models and
Inflation models and observation
Laila Alabidi; David H. Lyth
2005-01-01
We consider small-field models which invoke the usual framework for the effective field theory, and large-field models which go beyond that. Present and future possibilities for discriminating between the models are assessed, on the assumption that the primordial curvature perturbation is generated during inflation. With PLANCK data, the theoretical and observational uncertainties on the spectral index will be comparable, providing
Trapped Radiation Model Uncertainties: Model-Data and Model-Model Comparisons
T. W. Armstrong; B. L. Colborn
2000-01-01
The standard AP8 and AE8 models for predicting trapped proton and electron environments have been compared with several sets of flight data to evaluate model uncertainties. Model comparisons are made with flux and dose measurements made on various U.S. low-Earth orbit satellites (APEX, CRRES, DMSP, LDEF, NOAA) and Space Shuttle flights, on Russian satellites (Photon-8, Cosmos-1887, Cosmos-2044), and on the
NASA Technical Reports Server (NTRS)
Kerstman, Eric; Minard, Charles; Saile, Lynn; Freiere deCarvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Iyengar, Sriram; Johnson-Throop, Kathy; Baumann, David
2010-01-01
The goals of the Integrated Medical Model (IMM) are to develop an integrated, quantified, evidence-based decision support tool useful to crew health and mission planners and to help align science, technology, and operational activities intended to optimize crew health, safety, and mission success. Presentation slides address scope and approach, beneficiaries of IMM capabilities, history, risk components, conceptual models, development steps, and the evidence base. Space adaptation syndrome is used to demonstrate the model's capabilities.
Modelling Message Handling System
Insu Song; Pushkar Piggott
2003-01-01
This paper introduces a new approach to designing a Message Handling Assistant (MA). It provides a generic model of an MA and an intention extraction function for text messages using speech act theory and the belief-desire-intention (BDI) model of rational agency. The model characterizes the desired behaviors of an MA and the relationships between the MA, its user, and other
Global Atmospheric Aerosol Modeling
NASA Technical Reports Server (NTRS)
Hendricks, Johannes; Aquila, Valentina; Righi, Mattia
2012-01-01
Global aerosol models are used to study the distribution and properties of atmospheric aerosol particles as well as their effects on clouds, atmospheric chemistry, radiation, and climate. The present article provides an overview of the basic concepts of global atmospheric aerosol modeling and shows some examples from a global aerosol simulation. Particular emphasis is placed on the simulation of aerosol particles and their effects within global climate models.
Isad Šarić; Nedžad Repčić; Adil Muminović
2010-01-01
In this paper, the results of research on three-dimensional (3D) parameter modelling of different types of standard catalogue gears using the CATIA V5 software system are presented. Gear modelling by computer is based on geometric and perspective transformations, which are not examined in more detail in the paper because of their large scope. The application of parameter modelling makes possible the control of the created 3D
Atmospheric prediction model survey
NASA Technical Reports Server (NTRS)
Wellck, R. E.
1976-01-01
As part of the SEASAT Satellite program of NASA, a survey of representative primitive equation atmospheric prediction models that exist in the world today was written for the Jet Propulsion Laboratory. Seventeen models developed by eleven different operational and research centers throughout the world are included in the survey. The surveys are tutorial in nature describing the features of the various models in a systematic manner.
Mark S Miesch
2008-01-01
Mean-field dynamo models and 3D magnetohydrodynamic (MHD) simulations of global-scale solar convection provide complementary insights into the origins of cyclic magnetic activity in the Sun. One particular class of mean-field dynamo models, known as the Babcock-Leighton Flux-Transport (BL-FT) modeling approach, has enjoyed much success recently in reproducing many aspects of the solar activity cycle. We review the essential ingredients of
Malcolm Carr
1984-01-01
Conclusion: This paper has been a preliminary discussion of model confusion about acids and bases, presenting evidence (some of it to be elaborated) that the Arrhenius and the Lowry-Bronsted models are confused in some textbooks, and in many students' minds. A similar analysis of other concepts in chemistry (are some problems about ions a result of carrying Daltonian and Newtonian models
NSDL National Science Digital Library
Robert MacKay
In this JAVA-based interactive modeling activity, students are introduced to the concept of mass balance, flow rates, and equilibrium using a simple water bucket model. Students can vary flow rate into the bucket, initial water level in the bucket, and residence time of water in the bucket. After running the model, the bucket's water level as a function of time is presented graphically and in tabular form.
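The bucket activity's mass balance is the linear ODE dV/dt = Q_in − V/τ, whose equilibrium level is Q_in·τ. A forward-Euler sketch (with made-up numbers, not the applet's defaults) reproduces this:

```python
def bucket_model(inflow, level0, residence_time, dt=0.1, t_end=100.0):
    """Mass balance dV/dt = inflow - V/residence_time, forward Euler.
    Returns a list of (time, level) pairs; the equilibrium level is
    inflow * residence_time."""
    level, t, history = level0, 0.0, [(0.0, level0)]
    while t < t_end:
        level += dt * (inflow - level / residence_time)
        t += dt
        history.append((t, level))
    return history

# Illustrative run: inflow 2 units/s, residence time 5 s,
# so the level approaches the equilibrium value 2 * 5 = 10.
history = bucket_model(inflow=2.0, level0=0.0, residence_time=5.0)
final_level = history[-1][1]
```

Varying `inflow`, `level0`, and `residence_time` reproduces the experiments the applet invites students to run.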
NASA Astrophysics Data System (ADS)
Jöckel, Patrick
Earth system models are important research tools for improving understanding of the climate system and for simulating climate projections. This chapter is devoted to the basic construction principles and challenges of such models, whereas application examples are provided in companion chapters. Since they still do not incorporate the full complexity of the real climate system (and maybe never will), Earth system models nowadays typically focus on specific aspects, for instance on the role of chemically active substances in the climate system.
Multistage models for carcinogenesis.
Freedman, D A; Navidi, W C
1989-01-01
The multistage model is tested on several human and animal data sets. It fits in some cases but not in others. With human lung cancer data, there is a drop in risk for ex-smokers quite different from the predictions of the model. The results are not conclusive but are compatible with the view that the multistage model provides a family of curves that often fit cancer incidence data, but may not capture the underlying biological reality. PMID:2667978
Mathematical Models of Narcolepsy
Cecilia Diniz Behn
Mathematical modeling offers a way to critically test experimentally derived theories, integrate experimental results across spatial and temporal scales, and generate predictions to drive bench science and influence clinical practice. Although, in general, mathematical modeling approaches have been applied to narcolepsy only recently, developments in modeling normal sleep/wake behavior have laid an excellent foundation for linking the experimental insights about
Modeling Skeletal Muscle Contraction
Suresh R. Devasahayam
Skeletal muscles in amphibians and mammals have been subjected to extensive investigation due to their relatively easy access and simple structure compared to other physiological systems. Consequently fairly detailed models exist for skeletal muscle behavior. Several competing models have been proposed that succeed to varying extents in explaining skeletal muscle behavior. Although abundant experimental data exists, no existing model is
NSDL National Science Digital Library
The Princeton Ocean Model (POM), a sigma coordinate, free surface, ocean model, can be "used for modeling estuaries, coastal regions, basin, and global oceans." Users can find helpful guides on how to use the freely distributed POM. The web site offers downloads of the proceedings of past meetings. Researchers can find links to data sources, national agencies and labs, and organizations. The Applications link offers numerous examples of organizations that have used POM in their research projects.
Making Mendel's Model Manageable
NSDL National Science Digital Library
Karen Mesmer
2006-01-01
Genetics is often a fascinating but difficult subject for middle level students. This engaging activity presents an approach that helps students understand how genotypes can translate into phenotypes using Gummi Bears and Gummi Dolphins to solve problems using Mendel's model, and then revising the model as necessary. Developing a model gives students a sense of how science works and how data translate into scientific ideas.
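The genotype-to-phenotype bookkeeping in Mendel's model can be sketched as a small Punnett-square calculation (the allele symbols below are illustrative, not taken from the lesson):

```python
from itertools import product

def punnett(parent1, parent2):
    """Genotype counts for a single-gene cross, e.g. 'Gg' x 'Gg'."""
    counts = {}
    for a, b in product(parent1, parent2):
        genotype = "".join(sorted(a + b))  # normalize 'gG' to 'Gg'
        counts[genotype] = counts.get(genotype, 0) + 1
    return counts

print(punnett("Gg", "Gg"))  # classic 1:2:1 ratio: {'GG': 1, 'Gg': 2, 'gg': 1}
```

Students revising the model after data collection corresponds to adjusting which genotypes map to which phenotype (e.g. whether 'Gg' shows the dominant trait).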
NSDL National Science Digital Library
2013-05-15
This activity helps learners visualize the Human Immunodeficiency Virus (HIV) by constructing three-dimensional HIV particle models from paper. The model to be used is a 20-sided polyhedron (icosahedron) and represents a complete viral particle. Learners combine their finished models into one mass. This is a first step toward estimating how many HIV particles could be contained inside a white blood cell before being released into the blood stream to attack new cells.
System Design Conceptual Model
Kari Tiensyrjä; Jean Mermet
This chapter presents the foundations of the System Design Conceptual Model (SDCM). The SDCM is a meta-model that serves as a reference model of, and gives a global view and perspective on, system design. The SDCM is used to describe system design from the viewpoints of the System Design Process (SDP) and the System Under Design (SUD). The SDP and
Marks, C.H.
1987-12-01
To evaluate proposed incineration systems, designers, operators, and regulatory personnel have had to resort to general computerized combustion models. These, however, have only limited application to incinerators. This article describes one way to build a computerized model specially designed for incinerators. If the model is built to cover a wide spectrum of incinerator applications, it can provide a consistent means of quickly comparing design configurations. The author built his model in the Lotus 1-2-3 spreadsheet, with which more people are familiar.
Models of scientific explanation
Sutton, Peter Andrew
2005-08-29
well to each branch of physical science. I present a number of different explanatory models in this paper, and some might argue that as each has different strengths and weaknesses, each should be eclectically applied according to where it best... the inductive-statistical (I-S) model to augment the explanatory power given us by the D-N model. Here is the form of the I-S model:

(I-S)
  p(Gx | Fx) = r
  Fi
  ======== [r]
  Gi

This is an adequate explanation relative to a given "knowledge...
Railway switch transport model.
Horvat, Martin; Prosen, Tomaž; Benenti, Giuliano; Casati, Giulio
2012-11-01
We propose a simple model of coupled heat and particle transport based on zero-dimensional classical deterministic dynamics, which is reminiscent of a railway switch whose action is a function only of the particle's energy. It is shown that already in the minimal three-terminal model, where the second terminal is considered as a probe with zero net particle and heat currents, one can find extremely asymmetric Onsager matrices as a consequence of time-reversal symmetry breaking of the model. This minimalistic transport model provides a better understanding of thermoelectric heat engines in the presence of time-reversal symmetry breaking. PMID:23214829
Generalised integrable Hubbard models
James Drummond; Giovanni Feverati; Luc Frappat; Eric Ragoucy
2007-12-12
We construct the XX and Hubbard-like models based on unitary superalgebras gl(N|M) generalizing Shastry's and Maassarani's approach. We introduce the R-matrix of the gl(N|M) XX-type model; the one of the Hubbard-like model is defined by "coupling" two independent XX models. In both cases, we show that the R-matrices satisfy the Yang-Baxter equation. We derive the corresponding local Hamiltonian in the transfer matrix formalism and we determine its symmetries. A perturbative calculation "à la Klein and Seitz" is performed. Some explicit examples are worked out. We give a description of the two-particle scattering.
Models of holographic superconductivity
Aprile, Francesco [Institute of Cosmos Sciences and Estructura i Constituents de la Materia Facultat de Fisica, Universitat de Barcelona, Avenida Diagonal 647, 08028 Barcelona (Spain); Russo, Jorge G. [Institute of Cosmos Sciences and Estructura i Constituents de la Materia Facultat de Fisica, Universitat de Barcelona, Avenida Diagonal 647, 08028 Barcelona (Spain); Institucio Catalana de Recerca i Estudis Avancats (ICREA), Paseo Lluis Companys, 23, 08010 Barcelona (Spain)
2010-01-15
We construct general models for holographic superconductivity parametrized by three couplings which are functions of a real scalar field and show that under general assumptions they describe superconducting phase transitions. While some features are universal and model independent, important aspects of the quantum critical behavior strongly depend on the choice of couplings, such as the order of the phase transition and critical exponents of second-order phase transitions. In particular, we study a one-parameter model where the phase transition changes from second to first order above some critical value of the parameter and a model with tunable critical exponents.
NASA Technical Reports Server (NTRS)
Sapyta, Joe; Reid, Hank; Walton, Lew
1993-01-01
The topics are presented in viewgraph form and include the following: particle bed reactor (PBR) core cross section; PBR bleed cycle; fuel and moderator flow paths; PBR modeling requirements; characteristics of PBR and nuclear thermal propulsion (NTP) modeling; challenges for PBR and NTP modeling; thermal hydraulic computer codes; capabilities for PBR/reactor application; thermal/hydraulic codes; limitations; physical correlations; comparison of predicted friction factor and experimental data; frit pressure drop testing; cold frit mask factor; decay heat flow rate; startup transient simulation; and philosophy of systems modeling.
Visualizing Risk Prediction Models
Van Belle, Vanya; Van Calster, Ben
2015-01-01
Objective Risk prediction models can assist clinicians in making decisions. To boost the uptake of these models in clinical practice, it is important that end-users understand how the model works and can efficiently communicate its results. We introduce novel methods for interpretable model visualization. Methods The proposed visualization techniques are applied to two prediction models from the Framingham Heart Study for the prediction of intermittent claudication and stroke after atrial fibrillation. We represent models using color bars, and visualize the risk estimation process for a specific patient using patient-specific contribution charts. Results The color-based model representations provide users with an attractive tool to instantly gauge the relative importance of the predictors. The patient-specific representations allow users to understand the relative contribution of each predictor to the patient’s estimated risk, potentially providing insightful information on which to base further patient management. Extensions towards non-linear models and interactions are illustrated on an artificial dataset. Conclusion The proposed methods summarize risk prediction models and risk predictions for specific patients in an alternative way. These representations may facilitate communication between clinicians and patients. PMID:26176945
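The patient-specific contribution charts described above can be illustrated with a toy logistic risk model, in which each predictor contributes beta_i * x_i to the linear score (the coefficients and predictor names below are made up for illustration, not Framingham values):

```python
import math

def risk_contributions(betas, x, intercept=0.0):
    """Per-predictor contributions to the linear score, plus the resulting risk."""
    contribs = {name: betas[name] * x[name] for name in betas}
    score = intercept + sum(contribs.values())
    risk = 1.0 / (1.0 + math.exp(-score))  # logistic link
    return contribs, risk

contribs, risk = risk_contributions(
    {"age": 0.04, "smoker": 0.9, "sbp": 0.02},   # hypothetical coefficients
    {"age": 60, "smoker": 1, "sbp": 140},        # one patient's values
    intercept=-6.0,
)
```

A contribution chart is then just a bar per entry of `contribs`, which is what lets a patient see which predictor is driving their estimated risk.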
Lightning return stroke models
NASA Technical Reports Server (NTRS)
Lin, Y. T.; Uman, M. A.; Standler, R. B.
1980-01-01
We test the two most commonly used lightning return stroke models, Bruce-Golde and transmission line, against subsequent stroke electric and magnetic field wave forms measured simultaneously at near and distant stations and show that these models are inadequate to describe the experimental data. We then propose a new return stroke model that is physically plausible and that yields good approximations to the measured two-station fields. Using the new model, we derive return stroke charge and current statistics for about 100 subsequent strokes.
NSDL National Science Digital Library
NASA: Challenger Center
With this carbon/temperature interactive model, students investigate the role of atmospheric carbon in the greenhouse effect using a relationship between atmospheric carbon dioxide and global temperature.
NONE
1997-04-01
Western Research Institute (WRI) has developed a numerical model (TCROW) to describe CROW™ processing of contaminated aquifers. CROW is a patented technology for the removal of contaminant organics from water-saturated formations by injection of hot water or low-temperature steam. TCROW is based on a fully implicit, thermal, compositional model (TSRS) previously developed by WRI. TCROW's formulation represents several enhancements and simplifications over TSRS and results in a model specifically tailored to model the CROW process.
General background on modeling and specifics of modeling vapor intrusion are given. Three classical model applications are described and related to the problem of petroleum vapor intrusion. These indicate the need for model calibration and uncertainty analysis. Evaluation of Bi...
Model selection in compositional spaces
Grosse, Roger Baker
2014-01-01
We often build complex probabilistic models by composing simpler models, using one model to generate parameters or latent variables for another model. This allows us to express complex distributions over the observed data ...
Epidemic modeling techniques for smallpox
McLean, Cory Y. (Cory Yuen Fu)
2004-01-01
Infectious disease models predict the impact of outbreaks. Discrepancies between model predictions stem from both the disease parameters used and the underlying mathematics of the models. Smallpox has been modeled extensively ...
The Anderson Model as a matrix model
J. Magnen; G. Poirot; V. Rivasseau
1997-01-01
In this paper we describe a strategy to study the Anderson model of an electron in a random potential at weak coupling by a renormalization group analysis. There is an interesting technical analogy between this problem and the theory of random matrices. In d = 2 the random matrices which appear are approximately of the free type well known to
Modeling Imports in a Keynesian Expenditure Model
ERIC Educational Resources Information Center
Findlay, David W.
2010-01-01
The author discusses several issues that instructors of introductory macroeconomics courses should consider when introducing imports in the Keynesian expenditure model. The analysis suggests that the specification of the import function should partially, if not completely, be the result of a simple discussion about the spending and import…
Geostatistical Modeling: Model Selection and Parameter Estimation
Asymptotics (prediction) for spatial models: Mardia and Marshall (1984); Zimmerman and Zimmerman (1991); Zimmerman and Cressie (1992); Cressie (1993); Abt (1999); Lark (2000); Zimmerman (2005). Goals of research, Part 1: What are the implications of changing
Biosphere Process Model Report
J. Schmitt
2000-05-25
To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consists of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor. 
Collectively, the potential human receptor and exposure pathways form the biosphere model. More detailed technical information and data about potential human receptor groups and the characteristics of exposure pathways have been developed in a series of AMRs and Calculation Reports.
Active Shape Models - 'Smart Snakes
T. F. Cootes; C. J. Taylor
1992-01-01
We describe 'Active Shape Models' which iteratively adapt to refine estimates of the pose, scale and shape of models of image objects. The method uses flexible models derived from sets of training examples. These models, known as Point Distribution Models, represent objects as sets of labelled points. An initial estimate of the location of the model points in an
Acoustic models and sonar systems
Michael B. Porter
1993-01-01
The basic types of acoustic models are reviewed. These include ray models, spectral integral models, normal mode models, parabolic equation modeling, and 3-D acoustic modeling. Their application to conventional sonar simulation problems is demonstrated. Examples of their use in more advanced signal processing applications are presented
Mathematical modeling of scroll compressors
Yu Chen
2000-01-01
This thesis presents the development of a comprehensive R-410A scroll compressor model. The model is based on a previously existing R-22 scroll compressor model. This comprehensive compressor model combines a detailed compression process model and an overall compressor model and was used to investigate the compressor's performance under different operating conditions and subject to design changes. The governing mass
Bayesian Model Averaging: A Tutorial
Jennifer A. Hoeting; David Madigan; Adrian E. Raftery; Chris T. Volinsky
Standard statistical practice ignores model uncertainty. Data analysts typically select a model from some class of models and then proceed as if the selected model had generated the data. This approach ignores the uncertainty in model selection, leading to over-confident inferences and decisions that are more risky than one thinks they are. Bayesian model averaging (BMA) provides a coherent
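The BMA idea sketched above is commonly approximated by weighting each candidate model by exp(−BIC_k/2) and averaging predictions under those weights. A minimal sketch under that approximation (the model names and BIC values are illustrative):

```python
import math

def bma_weights(bic_scores):
    """Approximate posterior model probabilities from BIC scores, w_k ∝ exp(-BIC_k/2)."""
    best = min(bic_scores.values())  # subtract the minimum for numerical stability
    raw = {m: math.exp(-(b - best) / 2.0) for m, b in bic_scores.items()}
    total = sum(raw.values())
    return {m: r / total for m, r in raw.items()}

weights = bma_weights({"linear": 102.3, "quadratic": 100.1, "cubic": 104.8})
# A BMA prediction is then the weight-averaged prediction across the candidate models,
# rather than the prediction of the single selected model.
```

The point of the tutorial's critique is visible here: acting as if the best model had weight 1 discards the spread over the other candidates.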
Complexity regularized hydrological model selection
NASA Astrophysics Data System (ADS)
Arkesteijn, Liselot; Pande, Saket; Savenije, Hubert
2014-05-01
Ill-posed hydrological model selection problems (that may be unstable or have non-unique solutions) are regularized with hydrological model complexity as the stabilizer. We propose and apply a notion of model complexity, based on Vapnik-Chervonenkis generalization theory, to complexity regularized hydrologic model selection. Better hydrologic models (better performance on future unseen data) on small sample sizes are identified using complexity regularized model selection than when using traditional model selection (without regularization) while both converge in performance for large samples (i.e. regularized model selection is 'consistent'). Case studies using SAC-SMA, SIXPAR and flexible model structures are used to 1) compute and compare model complexities of different model structures, 2) demonstrate the 'consistency' of complexity regularized model selection and 3) demonstrate that regularized model selection identifies the best model structure (out of a set of competing structures) on small sample sizes better than un-regularized model selection.
Interharmonics: Theory and Modeling
A. Testa; M. F. Akram; R. Burch; G. Carpinelli; G. Chang; V. Dinavahi; C. Hatziadoniu; W. M. Grady; E. Gunther; M. Halpin; P. Lehn; Y. Liu; R. Langella; M. Lowenstein; A. Medina; T. Ortmeyer; S. Ranade; P. Ribeiro; N. Watson; J. Wikston; W. Xu
2007-01-01
Some of the most remarkable issues related to interharmonic theory and modeling are presented. Starting from the basic definitions and concepts, attention is first devoted to interharmonic sources. Then, the interharmonic assessment is considered with particular attention to the problem of the frequency resolution and of the computational burden associated with the analysis of periodic steady-state waveforms. Finally, modeling of
Peter Hasenfratz; Julius Kuti
1978-01-01
The quark bag model is reviewed here, with particular emphasis on spectroscopic applications and the discussion of exotic objects such as baryonium, gluonium, and the quark phase of matter. The physical vacuum is pictured in the model as a two-phase medium. In the normal phase of the vacuum, outside hadrons, the propagation of quark and gluon fields is forbidden. When small bubbles
Jan Bartlema
1988-01-01
A combined macro-micro model is applied to a population similar to that forecast for 2035 in the Netherlands in order to simulate the effect on kinship networks of a mating system of serial monogamy. The importance of incorporating a parameter for the degree of concentration of childbearing over the female population is emphasized. The inputs to the model are vectors
ERIC Educational Resources Information Center
Zwaan, Rolf A.; Madden, Carol J.
2004-01-01
The authors examined how situation models are updated during text comprehension. If comprehenders keep track of the evolving situation, they should update their models such that the most current information, the here and now, is more available than outdated information. Contrary to this updating hypothesis, E. J. O'Brien, M. L. Rizzella, J. E.…
ERIC Educational Resources Information Center
Buggey, Tom; Ogle, Lindsey
2012-01-01
Video self-modeling (VSM) first appeared on the psychology and education stage in the early 1970s. The practical applications of VSM were limited by lack of access to tools for editing video, which is necessary for almost all self-modeling videos. Thus, VSM remained in the research domain until the advent of camcorders and VCR/DVD players and,…
NASA Astrophysics Data System (ADS)
Taniguchi, Tadahiro; Sawaragi, Tetsuo
In this paper, a new machine-learning method, called the Dual-Schemata model, is presented. The Dual-Schemata model is a self-organizing machine-learning method for an autonomous robot interacting with an unknown dynamical environment. It is based on Piaget's schema model, a classical psychological model that explains memory and cognitive development in human beings. Our Dual-Schemata model is developed as a computational model of Piaget's schema model, focusing especially on the sensorimotor developmental period. This developmental process is characterized by two mutually interacting dynamics: one formed by assimilation and accommodation, and the other formed by equilibration and differentiation. Through these dynamics, the schema system enables an agent to act well in the real world. The schema differentiation process corresponds to a symbol-formation process occurring within an autonomous agent as it interacts with an unknown, dynamically changing environment. Experimental results obtained from an autonomous facial robot in which our model is embedded are presented; the robot becomes able to chase a ball moving in various ways without any rewards or teaching signals from outside. Moreover, the emergence of concepts about the target movements within the robot is shown and discussed in terms of fuzzy logic on set-subset inclusion relationships.
NSDL National Science Digital Library
Juliann Garza (University of Texas-Pan American Physician Assistant Studies)
2010-08-16
Lesson is designed to introduce students to cranial nerves through the use of an introductory lecture. Students will then create a three-dimensional model of the cranial nerves. An information sheet will accompany the model in order to help students learn crucial aspects of the cranial nerves.
Model Checking Contractual Protocols
Aspassia Daskalopulu
2001-01-01
Abstract. This paper discusses how model checking, a technique used for the verification of behavioural requirements of dynamic systems, can be usefully deployed for the verification of contracts. A process view of agreements between parties is taken, whereby a contract is modelled as it evolves over time in terms of actions, or more generally events, that effect changes
J. C. Kips; G. P. Anderson; J. J. Fredberg; U. Herz; M. D. Inman; M. Jordana; D. M. Kemeny; J. Lotvall; R. A. Pauwels; C. G. Plopper; D. Schmidt; P. J. Sterk; A. J. M. Van Oosterhout; B. B. Vargaftig; K. F. Chung
2003-01-01
In vivo animal models can offer valuable information on several aspects of asthma pathogenesis and treatment. The mouse is increasingly used in these models, mainly because this species allows for the application in vivo of a broad range of immunological tools, including gene deletion technology. Mice, therefore, seem particularly useful to further elucidate factors influencing the response to inhaled allergens.
K. I. Calvert; M. B. Doar; E. W. Zegura
1997-01-01
The topology of a network, or a group of networks such as the Internet, has a strong bearing on many management and performance issues. Good models of the topological structure of a network are essential for developing and analyzing internetworking technology. This article discusses how graph-based models can be used to represent the topology of large networks, particularly aspects of
Spreadsheet Modeling for Insight
Stephen G. Powell
It is widely recognized that spreadsheets are error-filled, their creators are over-confident, and the process by which they are developed is chaotic. It is less well understood that spreadsheet users generally lack the skills needed to derive practical insights from their models. Modeling for insight requires skills in establishing a base case, performing sensitivity analysis, using back-solving, and (when
ERIC Educational Resources Information Center
Gustafson, B. Kerry; Hample, Stephen R.
General documentation for the Enrollment Projection Model used by the Maryland Council for Higher Education (MCHE) is provided. The manual is directed toward both the potential users of the model as well as others interested in enrollment projections. The first four chapters offer administrators or planners insight into the derivation of the…
Unitary Response Regression Models
ERIC Educational Resources Information Center
Lipovetsky, S.
2007-01-01
The dependent variable in a regular linear regression is a numerical variable, and in a logistic regression it is a binary or categorical variable. In these models the dependent variable has varying values. However, there are problems yielding an identity output of a constant value which can also be modelled in a linear or logistic regression with…
ERIC Educational Resources Information Center
Parks, Melissa
2014-01-01
Model-eliciting activities (MEAs) are not new to those in engineering or mathematics, but they were new to Melissa Parks. Model-eliciting activities are simulated real-world problems that integrate engineering, mathematical, and scientific thinking as students find solutions for specific scenarios. During this process, students generate solutions…
Winslow
1988-01-01
The BOILER PERFORMANCE MODEL is a package of eleven programs for predicting the heat transfer performance of fossil-fired utility boilers. The programs can model a wide variety of boiler designs, provide boiler performance estimates for coal, oil or gaseous fuels, determine the influence of slagging and fouling characteristics on boiler performance, and calculate performance factors for tradeoff analyses comparing boilers
Multilevel Mixture Factor Models
ERIC Educational Resources Information Center
Varriale, Roberta; Vermunt, Jeroen K.
2012-01-01
Factor analysis is a statistical method for describing the associations among sets of observed variables in terms of a small number of underlying continuous latent variables. Various authors have proposed multilevel extensions of the factor model for the analysis of data sets with a hierarchical structure. These Multilevel Factor Models (MFMs)…
Ion Thruster Performance Model
John Raymond Brophy
1984-01-01
A model of ion thruster performance is developed for high flux density cusped magnetic field thruster designs. This model is formulated in terms of the average energy required to produce an ion in the discharge chamber plasma and the fraction of these ions that are extracted to form the beam. The direct loss of high energy (primary) electrons from the
Eric S. K. Yu
2009-01-01
Many different types of models are used in various scientific and engineering fields, reflecting the subject matter and the kinds of understanding that are sought in each field. Conceptual modeling techniques in software and information systems engineering have in the past focused mainly on describing and analyzing behaviours and structures that are implementable in software. As software systems become ever
Parsimony and Model Evaluation.
ERIC Educational Resources Information Center
Mulaik, Stanley A.
1998-01-01
Argues that H. W. Marsh and K.-T. Hau (1996) misunderstood parsimony and its role in testing a hypothesis about an incompletely specified model to establish its objective validity. More parsimonious models represent more complete hypotheses having more ways of being tested and confirmed. Marsh and Hau could also have used more parsimonious…
Models for composition dependence
Ibrahim Ansara; Ben Burton; Qing Chen; Mats Hillert; Armando Fernandez-Guillermet; Suzana G. Fries; Hans Leo Lukas; Hans-Jürgen Seifert; W. Alan Oates
2000-01-01
A variety of problems related to modelling of composition dependent Gibbs energies and other thermodynamic properties is discussed. The current procedure adopted in the CALPHAD method for modelling intermediate metallic nonstoichiometric phases is described and analysed. Some possibilities towards more physical approaches are pointed out.
Composite Load Model Evaluation
Lu, Ning; Qiao, Hong (Amy)
2007-09-30
The WECC load modeling task force has dedicated its effort in the past few years to developing a composite load model that can represent behaviors of different end-user components. The modeling structure of the composite load model is recommended by the WECC load modeling task force. GE Energy has implemented this composite load model with a new function CMPLDW in its power system simulation software package, PSLF. For the last several years, Bonneville Power Administration (BPA) has taken the lead and collaborated with GE Energy to develop the new composite load model. Pacific Northwest National Laboratory (PNNL) and BPA joined forces and conducted the evaluation of the CMPLDW and tested its parameter settings to make sure that: • the model initializes properly, • all the parameter settings are functioning, and • the simulation results are as expected. The PNNL effort focused on testing the CMPLDW in a 4-bus system. Exhaustive testing of each parameter setting has been performed to guarantee that each setting works. This report is a summary of the PNNL testing results and conclusions.
Career Education Personnel Model.
ERIC Educational Resources Information Center
Odbert, John T.; Trotter, Eugene E.
The purpose of the Career Education Personnel Model (CEPM) was to develop competency-based models for the preparation and training of education personnel (K-12) who will plan and implement career education programs at the local level. The CEPM staff identified educational personnel competencies essential to local career education programs by…
Technology Transfer Automated Retrieval System (TEKTRAN)
Pigeonpea (Cajanus cajan (L.) Millsp.) is a widely grown legume in tropical and subtropical areas. A crop simulation model that can assist in farmer decision-making was developed. The phenological module is one of the major elements of the crop model because accurate prediction of the timing of gr...
Stiff magnetofluid cosmological model
Bali, R.; Tyagi, A.
1988-05-01
We investigate the behavior of the magnetic field in a cosmological model filled with a stiff perfect fluid in general relativity. The magnetic field is due to an electric current along the x axis. The behavior of the model when a magnetic field is absent is also discussed.
Models of technology diffusion
P. A. Geroski
2000-01-01
The literature on new technology diffusion is vast, and it spills over many conventional disciplinary boundaries. This paper surveys the literature by focusing on alternative explanations of the dominant stylized fact: that the usage of new technologies over time typically follows an S-curve. The most commonly found model which is used to account for this model is the so-called epidemic
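The epidemic model mentioned above generates the S-curve through logistic growth: the adoption fraction follows y(t) = 1 / (1 + exp(−b(t − t0))). A minimal sketch (parameter names and values are illustrative):

```python
import math

def epidemic_adoption(t, b=0.8, t0=10.0):
    """Logistic (epidemic) diffusion curve: adoption fraction at time t.

    b sets how fast adoption spreads; t0 is the inflection point where
    half the eventual adopters have adopted.
    """
    return 1.0 / (1.0 + math.exp(-b * (t - t0)))

curve = [epidemic_adoption(t) for t in range(21)]
# Near t=0 adoption is negligible, growth is fastest at t0, and the curve
# saturates toward 1.0 -- the stylized S-shape the survey discusses.
```

In the epidemic interpretation, the growth rate at any moment is proportional to contacts between adopters and non-adopters, y'(t) = b·y·(1−y), which is what produces the slow-fast-slow shape.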
Automated Student Model Improvement
ERIC Educational Resources Information Center
Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.
2012-01-01
Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…
Nishimura, Hiroshi
1993-05-01
Object-Oriented Programming has been used extensively to model the LBL Advanced Light Source 1.5 GeV electron storage ring. This paper describes the present status of the class library construction, with emphasis on dynamic modeling.
Harry J. Lipkin; MARIA GOEPPERT-MAYER
1984-01-01
A general overview of quark models of hadrons is presented. Experimental results and theoretical attempts to explain the data are discussed. Bag models and pion clouds, hadron masses and baryon magnetic moments, quark clustering and NN interactions, and the GA/GV ratio are topics included in this review. (AIP)
Edward L. Wright
2001-06-22
Models of the zodiacal light are necessary to convert measured data taken from low Earth orbit into the radiation field outside the solar system. The uncertainty in these models dominates the overall uncertainty in determining the extragalactic background light for wavelengths < 100 microns.
Jacob J. Jacobson; Gretchen Matthern
2007-04-01
System Dynamics is a computer-aided approach to evaluating the interrelationships of different components and activities within complex systems. Recently, System Dynamics models have been developed in areas such as policy design, biological and medical modeling, energy and environmental analysis, and various other areas in the natural and social sciences. The real power of System Dynamics modeling is gaining insights into total system behavior as time and system parameters are adjusted and the effects are visualized in real time. System Dynamics models allow decision makers and stakeholders to explore the long-term behavior and performance of complex systems, especially in the context of dynamic processes and changing scenarios, without having to wait decades to obtain field data or risk failure if a poor management or design approach is used. The Idaho National Laboratory has recently been developing a System Dynamics model of the US nuclear fuel cycle. The model is intended to be used to identify and understand interactions throughout the entire nuclear fuel cycle and to suggest sustainable development strategies. This paper describes the basic framework of the current model and presents examples of useful insights gained from the model thus far with respect to sustainable development of nuclear power.
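The basic mechanic behind System Dynamics tools is a set of stocks updated by flows. A toy stock-and-flow sketch in that style (a single spent-fuel stock with illustrative numbers; this is our own example, not the INL fuel-cycle model):

```python
def simulate_spent_fuel(years=50, dt=1.0, fleet_gw=100.0,
                        discharge_per_gw=20.0, reprocess_rate=0.01):
    """One stock (spent-fuel inventory, tonnes) with an inflow
    (discharge from a fixed reactor fleet) and an outflow
    (a fraction reprocessed each year), stepped with Euler
    integration as System Dynamics tools do internally.
    """
    inventory = 0.0
    history = []
    for _ in range(int(years / dt)):
        inflow = fleet_gw * discharge_per_gw      # tonnes/year
        outflow = reprocess_rate * inventory      # tonnes/year
        inventory += (inflow - outflow) * dt
        history.append(inventory)
    return history

# The stock rises toward the equilibrium inflow / reprocess_rate,
# letting a decision maker see long-term behavior immediately.
history = simulate_spent_fuel()
```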
Penny, Will
Slides for the Bayesian Inference Course, WTCN, UCL, March 2013, covering linear models, empirical Bayes, isotropic covariances, the EM algorithm, coding, MAP learning, self-inhibition, and receptive fields.
NASA Technical Reports Server (NTRS)
Sellers, Piers
2012-01-01
Model results will be reviewed to assess different methods for bounding the terrestrial role in the global carbon cycle. It is proposed that a series of climate model runs could be scoped that would tighten the limits on the "missing sink" of terrestrial carbon and could also direct future satellite image analyses to search for its geographical location and understand its seasonal dynamics.
Neutrosophic Relational Data Model
Haibin Wang; Rajshekhar Sunderraman; Florentin Smarandache; Andre Rogatko
2007-01-01
In this paper, we present a generalization of the relational data model based on interval neutrosophic set (1). Our data model is capable of manipulating incomplete as well as inconsistent information. Fuzzy relation or intuitionistic fuzzy relation can only handle incomplete information. Associated with each relation are two membership functions one is called truth-membership function T which keeps track of
Unsupervised acoustic model training
Lori Lamel; Jean-Luc Gauvain; Gilles Adda
2002-01-01
This paper describes some recent experiments using unsupervised techniques for acoustic model training in order to reduce the system development cost. The approach uses a speech recognizer to transcribe unannotated raw broadcast news data. The hypothesized transcription is used to create labels for the training data. Experiments providing supervision only via the language model training materials show that including texts
Mathematical models of hysteresis
NONE
1998-08-01
The ongoing research has largely been focused on the development of mathematical models of hysteretic nonlinearities with nonlocal memories. The distinct feature of these nonlinearities is that their current states depend on past histories of input variations. It turns out that memories of hysteretic nonlinearities are quite selective. Indeed, experiments show that only some past input extrema (not the entire input variations) leave their marks upon future states of hysteretic nonlinearities. Thus special mathematical tools are needed in order to describe nonlocal selective memories of hysteretic nonlinearities. The origin of such tools can be traced back to the landmark paper of Preisach. The research has been primarily concerned with Preisach-type models of hysteresis. All these models have a common generic feature: they are constructed as superpositions of the simplest hysteretic nonlinearities, rectangular loops. During the past four years, the study has been by and large centered around the following topics: (1) further development of scalar and vector Preisach-type models of hysteresis; (2) experimental testing of Preisach-type models of hysteresis; (3) development of new models for viscosity (aftereffect) in hysteretic systems; (4) development of mathematical models for superconducting hysteresis in the case of gradual resistive transitions; (5) software implementation of Preisach-type models of hysteresis; and (6) development of new ideas which have emerged in the course of the research work. The author briefly describes the main scientific results obtained in the areas outlined above.
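The superposition of rectangular loops described above can be sketched in a few lines. A minimal discrete scalar Preisach model (thresholds and grid size are our own illustrative choices), showing that the output remembers past input extrema, not just the final input value:

```python
def preisach_output(input_history, n=20):
    """Discrete scalar Preisach sketch: a triangular grid of
    rectangular-loop hysterons, each with switch-up threshold alpha
    and switch-down threshold beta (beta <= alpha). The output is
    the mean hysteron state after applying the whole input history.
    """
    levels = [-1.0 + 2.0 * k / (n - 1) for k in range(n)]
    hysterons = [(a, b) for a in levels for b in levels if b <= a]
    state = {h: -1 for h in hysterons}        # start fully "down"
    for u in input_history:
        for (a, b) in hysterons:
            if u >= a:
                state[(a, b)] = +1            # above alpha: switch up
            elif u <= b:
                state[(a, b)] = -1            # below beta: switch down
            # otherwise the hysteron keeps its state (the memory)
    return sum(state.values()) / len(hysterons)

# Two input histories that end at the same value 0.0 give different
# outputs, because the model remembers the intervening extremum.
up_then_back = preisach_output([-1.0, 0.8, 0.0])
down_then_back = preisach_output([-1.0, 0.8, -0.8, 0.0])
```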
Michael Beer
2007-01-01
In this paper a novel technique for random vector sampling starting from rare data is presented. This model-free sampling technique is developed to operate without a probabilistic model. Instead of estimating a distribution function, the information contained in a given small sample is extracted directly to produce the sampling result as a second sample of considerably larger size that completely
LONGPRO Stream Modeling Exercise
NSDL National Science Digital Library
Bill Locke
The purpose of this exercise is to integrate modeling with field data. The activity includes links to a "virtual field trip" of maps and photographs. Data from a creek is included in the field trip and students use an Excel spreadsheet model to analyze the data.
B. F. Myers; F. C. Montgomery; R. N. Morris
1993-01-01
The equivalent sphere model, which is widely used in calculating the release of fission gases from nuclear fuel, is idealized. The model is based on the diffusion of fission products in and their escape from a homogeneous sphere of fuel; the fission products are generated at a constant rate and undergo radiodecay. The fuel is assumed to be a set
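For a radioactive fission gas generated at a constant rate in a homogeneous sphere, the classical equivalent-sphere (Booth) treatment yields a closed-form steady-state release-to-birth ratio. A sketch of that standard formula (the numerical inputs below are illustrative, not taken from the report):

```python
import math

def release_to_birth(diffusivity, decay_const, radius):
    """Steady-state R/B for the classical equivalent-sphere (Booth)
    model: gas generated uniformly in a sphere of the given radius,
    diffusing outward while undergoing radiodecay.
    With mu = radius * sqrt(decay_const / diffusivity):
        R/B = (3 / mu) * (coth(mu) - 1 / mu)
    so R/B -> 1 when diffusion is fast relative to decay (mu -> 0)
    and R/B -> 0 for large mu.
    """
    mu = radius * math.sqrt(decay_const / diffusivity)
    return (3.0 / mu) * (1.0 / math.tanh(mu) - 1.0 / mu)

# Illustrative numbers only (SI units: m^2/s, 1/s, m):
fast = release_to_birth(diffusivity=1e-8, decay_const=1e-6, radius=1e-3)
slow = release_to_birth(diffusivity=1e-14, decay_const=1e-6, radius=1e-3)
```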
VENTURI SCRUBBER PERFORMANCE MODEL
The paper presents a new model for predicting the particle collection performance of venturi scrubbers. It assumes that particles are collected by atomized liquid only in the throat section. The particle collection mechanism is inertial impaction, and the model uses a single drop...
Econometric disequilibrium models
Richard E. Quandt
1982-01-01
Four basic strands in the disequilibrium literature are identified. Some examples are discussed and the canonical econometric disequilibrium model and its estimation are dealt with in detail. Specific criticisms of the canonical model, dealing with price and wage rigidity, with the nature of the min condition and the price-adjustment equation, are considered and a variety of modifications is entertained. Tests of
Sinusoids: Applications and Modeling
NSDL National Science Digital Library
Roberts, Lila F.
2004-07-21
This demo actively involves students via the software simulations so that the determination of the sinusoidal model has a geometric flavor that complements the algebraic tools stressed in texts. This approach also introduces a modeling aspect since in some situations we may only be able to obtain a "close" approximation to the actual curve or data. Animations and Excel routines are included.
NSDL National Science Digital Library
NASA
2005-01-01
In this activity, learners build a paper model of the spacecraft and photometer (telescope) used during NASA's Kepler Mission. This resource includes files for the model, which can be printed out on heavy stock paper, and instructions for assembling the various parts.
Multiplying Fractions (Area Model)
NSDL National Science Digital Library
Audrey Pruitt - on Hotchalk Lesson Plans Page
2012-04-22
In this teaching idea, students will learn how to use the area model to find the product when two fractions are multiplied. NOTE: Click the Download link on the right side of the screen to display the lesson without ads and to view the graphic example of the model.
Igor Shcherback; Orly Yadid-Pecht
2001-01-01
In this paper, a unified model, based on a thorough analysis of experimental data, is developed for the overall modulation transfer function (MTF) estimation for CMOS image sensors. The model covers the physical diffusion effect together with the influence of the pixel active area geometrical shape. Comparison of both our predicted results and the MTF calculated from the point spread
NSDL National Science Digital Library
2014-01-01
This lesson plan helps students understand multi-digit division by constructing area models. The included interactive provides two example problems and helps connect the model to the traditional long division algorithm. Students solve their own problems using graph paper and/or base-10 blocks and progress toward mental strategies.
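The connection between the area model and the traditional algorithm is partial quotients: each region of the area model peels off a convenient multiple of the divisor. A small sketch of that idea (our own illustration, not the lesson's interactive):

```python
def area_model_division(dividend, divisor):
    """Partial-quotients sketch of the area model: peel off the
    largest multiples of the divisor in hundreds, then tens, then
    ones -- exactly what each region of the area model represents --
    until only the remainder is left.
    """
    pieces = []                    # widths of the area-model regions
    remaining = dividend
    chunk = 100
    while chunk >= 1:
        count = remaining // (divisor * chunk)
        if count:
            pieces.append(count * chunk)
            remaining -= count * chunk * divisor
        chunk //= 10
    return pieces, remaining

# 672 / 4 decomposes into regions of width 100 + 60 + 8 with
# remainder 0, matching the steps of long division.
pieces, remainder = area_model_division(672, 4)
```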
Probabilistic modeling of surfaces
NASA Astrophysics Data System (ADS)
Szeliski, Richard
1991-09-01
Energy-based surface models are commonly used in computer vision to interpolate sparse data, to smooth noisy depth estimates, and to integrate measurements from multiple sensors and viewpoints. Traditionally, a single surface estimate is produced with such models. Probabilistic surface modeling, which describes distributions over possible surfaces, enables us to integrate such measurements in a statistically optimal fashion, to model the uncertainty in the surfaces, and to develop sequential estimation algorithms. When applied to 2-1/2-D surfaces, probabilistic modeling allows us to incrementally estimate depth maps from motion image sequences and to integrate sparse range data using elevation maps. We show how to jointly model depth and intensity images to obtain more accurate depth models. To better represent the structure of the visual world, full 3-D surface models must be used. These are usually represented using parametric surfaces, which can create difficulties when the surface topology is unknown. To overcome these problems, an incremental patch-based 3-D surface estimation algorithm is developed. Surface and feature-based methods are compared and a unified representation which encompasses both methods is proposed.
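The energy-based interpolation mentioned above is easiest to see in one dimension: a data term pulls the surface toward sparse observations while a smoothness ("membrane") term penalizes differences between neighbors. A minimal sketch under those assumptions (solver and parameters are our own choices):

```python
def interpolate_depths(observations, n, smoothness=5.0, sweeps=500):
    """1-D sketch of an energy-based surface model: minimize
        sum over observed i of (z[i] - d[i])**2
      + smoothness * sum over i of (z[i+1] - z[i])**2
    by Gauss-Seidel relaxation. The result passes near the sparse
    data and stays smooth in between -- the membrane interpolation
    that 2-D surface models generalize.
    """
    z = [0.0] * n
    for _ in range(sweeps):
        for i in range(n):
            weight, target = 0.0, 0.0
            if i in observations:              # data term
                weight += 1.0
                target += observations[i]
            for j in (i - 1, i + 1):           # smoothness term
                if 0 <= j < n:
                    weight += smoothness
                    target += smoothness * z[j]
            z[i] = target / weight
    return z

# Sparse depth data at the two ends; the membrane fills in between.
depth = interpolate_depths({0: 1.0, 9: 3.0}, n=10)
```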
Dasymetric Modeling and Uncertainty
Nagle, Nicholas N.; Buttenfield, Barbara P.; Leyk, Stefan; Speilman, Seth
2014-01-01
Dasymetric models increase the spatial resolution of population data by incorporating related ancillary data layers. The role of uncertainty in dasymetric modeling has not been fully addressed as of yet. Uncertainty is usually present because most population data are themselves uncertain, and/or the geographic processes that connect population and the ancillary data layers are not precisely known. A new dasymetric methodology - the Penalized Maximum Entropy Dasymetric Model (P-MEDM) - is presented that enables these sources of uncertainty to be represented and modeled. The P-MEDM propagates uncertainty through the model and yields fine-resolution population estimates with associated measures of uncertainty. This methodology contains a number of other benefits of theoretical and practical interest. In dasymetric modeling, researchers often struggle with identifying a relationship between population and ancillary data layers. The P-MEDM model simplifies this step by unifying how ancillary data are included. The P-MEDM also allows a rich array of data to be included, with disparate spatial resolutions, attribute resolutions, and uncertainties. While the P-MEDM does not necessarily produce more precise estimates than do existing approaches, it does help to unify how data enter the dasymetric model, it increases the types of data that may be used, and it allows geographers to characterize the quality of their dasymetric estimates. We present an application of the P-MEDM that includes household-level survey data combined with higher spatial resolution data such as from census tracts, block groups, and land cover classifications. PMID:25067846
Mathematics for dynamic modeling
Edward Beltrami
1987-01-01
Mathematical modeling techniques for dynamical systems are presented in an introductory textbook intended for upper-undergraduate and graduate science and engineering students. Chapters are devoted to simple dynamic models, stable and unstable motion, growth and decay, motion in time and space, cycles and bifurcation, bifurcation and catastrophe, chaos, and optimal controls. Specific applications to biological systems, traffic control, and the geomagnetic
Computer Model Documentation Guide.
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.
These guidelines for communicating effectively the details of computer model design and operation to persons with varying interests in a model recommend the development of four different types of manuals to meet the needs of managers, users, analysts and programmers. The guidelines for preparing a management summary manual suggest a broad spectrum…
Computer modeling of detonators
C. M. Furnberg
1994-01-01
A mathematical model of detonators which describes the resistance of the exploding bridgewire or exploding foil initiator as a function of energy deposition will be described. This model includes many parameters that can be adjusted to obtain a close fit to experimental data. This has been demonstrated using recent experimental data taken within Sandia National Laboratories
ERIC Educational Resources Information Center
Wang, Wen-Chung; Wilson, Mark
2005-01-01
The Rasch testlet model for both dichotomous and polytomous items in testlet-based tests is proposed. It can be viewed as a special case of the multidimensional random coefficients multinomial logit model (MRCMLM). Therefore, the estimation procedures for the MRCMLM can be directly applied. Simulations were conducted to examine parameter recovery…
Thorsten Brants
1999-01-01
This paper presents a new approach to partial parsing of context-free structures. The approach is based on Markov Models. Each layer of the resulting structure is represented by its own Markov Model, and output of a lower layer is passed as input to the next higher layer. An empirical evaluation of the method yields very good results for NP/PP chunking
ERIC Educational Resources Information Center
Fischbein, Efraim
2001-01-01
Analyses several examples of tacit influences exerted by mental models on the interpretation of various mathematical concepts in the domain of actual infinity. Specifically addresses the unconscious effect of figural-pictorial models on statements about infinite sets of geometrical points and about the concepts of function and…
ERIC Educational Resources Information Center
Gabel, Dorothy; And Others
1992-01-01
Chemistry can be described on three levels: sensory, molecular, and symbolic. Proposes a particle approach to teaching chemistry that uses magnets to help students construct molecular models and solve particle problems. Includes examples of Johnstone's model of chemistry phenomena, a problem worksheet, and a student concept mastery sheet. (MDH)
This lecture will present AQUATOX, an aquatic ecosystem simulation model developed by Dr. Dick Park and supported by the U.S. EPA. The AQUATOX model predicts the fate of various pollutants, such as nutrients and organic chemicals, and their effects on the ecosystem, including fi...
Software development lifecycle models
Nayan B. Ruparelia
2010-01-01
This history column article provides a tour of the main software development life cycle (SDLC) models. (A lifecycle covers all the stages of software from its inception with requirements definition through to fielding and maintenance.) System development lifecycle models have drawn heavily on software ones, so the two terms can be used interchangeably, especially since software
Richard Colombo; Weina Jiang
1999-01-01
A central problem in database marketing is how to choose which customers in the firm's database to target with an offer. This paper presents a simple stochastic RFM model to carry out such a task. By making a few straightforward assumptions about the customers in the database, the stochastic model provides a means of (1) ranking customers in terms of
ERIC Educational Resources Information Center
Flannery, Maura C.
1997-01-01
Addresses the most popular models currently being chosen for biological research and the reasons behind those choices. Among the current favorites are zebra fish, fruit flies, mice, monkeys, and yeast. Concludes with a brief examination of the ethical issues involved, and why some animals may need to be replaced in research with model systems.…
Computational Modeling of Tires
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (compiler); Tanner, John A. (compiler)
1995-01-01
This document contains presentations and discussions from the joint UVA/NASA Workshop on Computational Modeling of Tires. The workshop attendees represented NASA, the Army and Air Force, tire companies, commercial software developers, and academia. The workshop objectives were to assess the state of technology in the computational modeling of tires and to provide guidelines for future research.
Bidimensional traffic flow models.
Boyer, Edmond
Outline of the presentation: 1. Motivation; 2. Static assignment; 3. Pedestrian models; 4. Macroscopic fundamental diagrams (Geroliminis (2006): isotropic macroscopic fundamental diagram as a behavioral law; Jiang et al. (2011): dynamic models). J.P. Lebacque, IFSTTAR-COSYS-GRETTIA, Le Descartes 2.
This task provides credible state of the art air quality models and guidance for use in implementation of National Ambient Air Quality Standards for ozone and PM. This research effort is to develop and improve air quality models, such as the Community Multiscale Air Quality (CMA...
AGRICULTURAL SIMULATION MODEL (AGSIM)
AGSIM is a large-scale econometric simulation model of regional crop and national livestock production in the United States. The model was initially developed to analyze the aggregate economic impacts of a wide variety of issues facing agriculture, such as technological change, pest...
E. Allen Emerson; A. Prasad Sistla
1993-01-01
We show how to exploit symmetry in model checking for concurrent systems containing many identical or isomorphic components. We focus in particular on those composed of many isomorphic processes. In many cases we are able to obtain significant, even exponential, savings in the complexity of model checking.
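The exponential savings can be illustrated with counter abstraction, one standard way of exploiting full symmetry among identical processes (our own illustration, not necessarily the authors' construction): a global state is reduced to a multiset recording how many processes occupy each local state.

```python
from math import comb

def full_state_count(n_processes, n_local_states):
    """Size of the naive product state space, where every ordering
    of process states is a distinct global state."""
    return n_local_states ** n_processes

def symmetric_state_count(n_processes, n_local_states):
    """After factoring out process-permutation symmetry, a global
    state is a multiset: the number of processes in each local
    state.  Count of multisets = C(n + k - 1, k - 1)."""
    return comb(n_processes + n_local_states - 1, n_local_states - 1)

# 10 identical processes, each in one of 3 local states
# (e.g. idle / trying / critical in a mutual-exclusion protocol):
full = full_state_count(10, 3)
reduced = symmetric_state_count(10, 3)
```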
Thom, Ronald M.; Judd, Chaeli
2007-07-27
Successful restoration of wetland habitats depends both on our understanding of the system and on our ability to characterize it. By developing a conceptual model, examining different spatial scales, and integrating diverse data streams (GIS datasets and NASA products), we were able to develop a dynamic model for site prioritization based on both qualitative and quantitative relationships found in the coastal environment.
Peter Comer; Jonathan Chard
1993-01-01
Existing process assessment methods are inherently limited as a tool for evaluation of any specific technical area of software development. A review of the coverage of software measurement within two examples of existing process assessment methods is presented. A model of the software measurement process is presented and discussed. The model has been used as the basis for development of
Gill Barequet; Subodh Kumar
1997-01-01
We describe an algorithm for repairing polyhedral CAD models that have errors in their B-REP. Errors like cracks, degeneracies, duplication, holes and overlaps are usually introduced in solid models due to imprecise arithmetic, model transformations, designer's fault, programming bugs, etc. Such errors often hamper further processing like finite element analysis, radiosity computation
NSDL National Science Digital Library
Ethel D. Stanley (Beloit College; Biology)
2006-05-20
Is bacterial growth always exponential? Do bacteria with the fastest rate of growth always have the largest populations? Biota models offer extended opportunities to observe population growth over time. What are the factors that affect growth? Explore continuous, chaotic, and cyclic growth models, and examine the dynamics of growth for populations of virtual bacteria with differing growth rates and carrying capacities.
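The continuous, cyclic, and chaotic regimes mentioned above all appear in the discrete logistic map, a standard toy model of growth with a carrying capacity. A minimal sketch (the growth parameters below are illustrative, not taken from the Biota materials):

```python
def iterate_logistic(r, x0=0.2, burn_in=500, keep=8):
    """Discrete logistic growth x -> r * x * (1 - x), population
    expressed as a fraction of carrying capacity. Returns the
    long-run values visited after a burn-in, revealing whether
    growth settles, cycles, or turns chaotic.
    """
    x = x0
    for _ in range(burn_in):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(keep):
        orbit.append(round(x, 6))
        x = r * x * (1.0 - x)
    return orbit

steady = iterate_logistic(2.5)   # settles to one value, 1 - 1/r
cyclic = iterate_logistic(3.2)   # alternates between two values
chaotic = iterate_logistic(3.9)  # never settles: chaotic regime
```

Note that the fastest-growing population (largest r) does not reach the largest steady value; it stops settling at all, which is the point of the activity's question.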
Stuart Raby
2009-11-06
In this talk I review some recent progress in heterotic and F theory model building. I then consider work in progress attempting to find the F theory dual to a class of heterotic orbifold models which come quite close to the MSSM.
Stuart Raby
2007-10-19
I review some of the latest directions in supersymmetric model building, focusing on SUSY breaking mechanisms in the minimal supersymmetric standard model [MSSM], the "little" hierarchy and $\mu$ problems, etc. I then discuss SUSY GUTs and UV completions in string theory.
Raby, Stuart [Physics Department, Ohio State University, 191 W. Woodruff Ave., Columbus, OH 43210 (United States)
2010-02-10
In this talk I review some recent progress in heterotic and F theory model building. I then consider work in progress attempting to find the F theory dual to a class of heterotic orbifold models which come quite close to the MSSM.
ERIC Educational Resources Information Center
Eichinger, John
2005-01-01
Models are crucial to science teaching and learning, yet they can create unforeseen and overlooked challenges for students and teachers. For example, consider the time-tested clay volcano that relies on a vinegar-and-baking-soda mixture for its "eruption." Based on a classroom demonstration of that geologic model, elementary students may interpret…
NASA Technical Reports Server (NTRS)
Knezovich, F. M.
1976-01-01
A modular structured system of computer programs is presented utilizing earth and ocean dynamical data keyed to finitely defined parameters. The model is an assemblage of mathematical algorithms with an inherent capability of maturation with progressive improvements in observational data frequencies, accuracies and scopes. The EOM in its present state is a first-order approach to a geophysical model of the earth's dynamics.
QUAL2K (or Q2K) is a river and stream water quality model that is intended to represent a modernized version of the QUAL2E (or Q2E) model (Brown and Barnwell 1987). Q2K is similar to Q2E in the following respects: One dimensional. The channel is well-mixed vertically a...
ERIC Educational Resources Information Center
Weinburgh, Molly; Silva, Cecilia
2011-01-01
For the past five summers, the authors have taught summer school to recent immigrants and refugees. Their experiences with these fourth-grade English language learners (ELL) have taught them the value of using models to build scientific and mathematical concepts. In this article, they describe the use of different forms of 2- and 3-D models to…
A hybrid receptor model is a specified mathematical procedure which uses not only the ambient species concentration measurements that form the input data for a pure receptor model, but in addition source emission rates or atmospheric dispersion or transformation information chara...
This magnetic tape contains the FORTRAN source code, sample input data, and sample output data for the Photochemical Box Model (PBM). The PBM is a simple stationary single-cell model with a variable height lid designed to provide volume-integrated hour averages of O3 and other ph...
Barchet, W.R. (Pacific Northwest Lab., Richland, WA (United States)); Dennis, R.L. (Environmental Protection Agency, Research Triangle Park, NC (United States)); Seilkop, S.K. (Analytical Sciences, Inc., Durham, NC (United States)); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. (Atmospheric Environment Service, Downsview, ON (Canada)); Byun, D.; McHenry, J.N.
1991-12-01
The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.
NASA Astrophysics Data System (ADS)
Xie, Qiong-Tao; Cui, Shuai; Cao, Jun-Peng; Amico, Luigi; Fan, Heng
2014-04-01
We define the anisotropic Rabi model as the generalization of the spin-boson Rabi model: The Hamiltonian system breaks the parity symmetry; the rotating and counterrotating interactions are governed by two different coupling constants; a further parameter introduces a phase factor in the counterrotating terms. The exact energy spectrum and eigenstates of the generalized model are worked out. The solution is obtained as an elaboration of a recently proposed method for the isotropic limit of the model. In this way, we provide a long-sought solution of a cascade of models with immediate relevance in different physical fields, including (i) quantum optics, a two-level atom in single-mode cross-electric and magnetic fields; (ii) solid-state physics, electrons in semiconductors with Rashba and Dresselhaus spin-orbit coupling; and (iii) mesoscopic physics, Josephson-junction flux-qubit quantum circuits.
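The structure described above can be written out schematically. In the sketch below, the symbols (mode frequency $\omega$, level splitting $\Delta$, rotating coupling $g_1$, counterrotating coupling $g_2$, phase $\phi$) are our own notation for the quantities named in the abstract, not necessarily the paper's:

```latex
% Anisotropic Rabi Hamiltonian (notation is ours; a sketch of the
% structure described in the abstract, not the paper's equation):
H = \omega\, a^{\dagger} a + \frac{\Delta}{2}\,\sigma_z
  + g_1 \left( a\,\sigma_+ + a^{\dagger}\sigma_- \right)
  + g_2 \left( e^{i\phi}\, a^{\dagger}\sigma_+ + e^{-i\phi}\, a\,\sigma_- \right)
```

Setting $g_2 = 0$ recovers the Jaynes-Cummings (rotating-wave) limit, while $g_1 = g_2$ and $\phi = 0$ recovers the isotropic Rabi model.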
Jones, Katherine A.; Finley, Patrick D.; Moore, Thomas W.; Nozick, Linda Karen; Martin, Nathaniel; Bandlow, Alisa; Detry, Richard Joseph; Evans, Leland B.; Berger, Taylor Eugen
2013-09-01
Infectious diseases can spread rapidly through healthcare facilities, resulting in widespread illness among vulnerable patients. Computational models of disease spread are useful for evaluating mitigation strategies under different scenarios. This report describes two infectious disease models built for the US Department of Veterans Affairs (VA) motivated by a Varicella outbreak in a VA facility. The first model simulates disease spread within a notional contact network representing staff and patients. Several interventions, along with initial infection counts and intervention delay, were evaluated for effectiveness at preventing disease spread. The second model adds staff categories, location, scheduling, and variable contact rates to improve resolution. This model achieved more accurate infection counts and enabled a more rigorous evaluation of comparative effectiveness of interventions.
Tabatabai, Mohammad A; Bursac, Zoran; Williams, David K; Singh, Karan P
2007-01-01
A new two-parameter probability distribution called hypertabastic is introduced to model the survival or time-to-event data. A simulation study was carried out to evaluate the performance of the hypertabastic distribution in comparison with popular distributions. We then demonstrate the application of the hypertabastic survival model by applying it to data from two motivating studies. The first one demonstrates the proportional hazards version of the model by applying it to a data set from a multiple myeloma study. The second one demonstrates an accelerated failure time version of the model by applying it to data from a randomized study of glioma patients who underwent radiotherapy treatment with and without the radiosensitizer misonidazole. Based on the results from the simulation study and two applications, the proposed model proves to be a flexible and promising alternative for practitioners in this field. PMID:17963492
NASA Astrophysics Data System (ADS)
Udupa, Jayaram K.; Odhner, Dewey; Falcao, Alexandre X.; Ciesielski, Krzysztof C.; Miranda, Paulo A. V.; Vaideeswaran, Pavithra; Mishra, Shipra; Grevera, George J.; Saboury, Babak; Torigian, Drew A.
2011-03-01
To make Quantitative Radiology (QR) a reality in routine clinical practice, computerized automatic anatomy recognition (AAR) becomes essential. As part of this larger goal, we present in this paper a novel fuzzy strategy for building bodywide group-wise anatomic models. They have the potential to handle uncertainties and variability in anatomy naturally and to be integrated with the fuzzy connectedness framework for image segmentation. Our approach is to build a family of models, called the Virtual Quantitative Human, representing normal adult subjects at a chosen resolution of the population variables (gender, age). Models are represented hierarchically, the descendents representing organs contained in parent organs. Based on an index of fuzziness of the models, 32 thorax data sets, and 10 organs defined in them, we found that the hierarchical approach to modeling can effectively handle the non-linear relationships in position, scale, and orientation that exist among organs in different patients.
Insepov, Z; Proslier, T; Huang, D; Mahalingam, S; Veitzer, S
2010-01-01
We are developing a model of vacuum arcs. This model assumes that arcs develop as a result of mechanical failure of the surface due to Coulomb explosions, followed by ionization of fragments by field emission and the development of a small, dense plasma that interacts with the surface primarily through self-sputtering and terminates as a unipolar arc capable of producing breakdown sites with high enhancement factors. We have attempted to produce a self-consistent picture of triggering, arc evolution, and surface damage. We are modeling these mechanisms using Molecular Dynamics (mechanical failure, Coulomb explosions, self-sputtering), Particle-In-Cell (PIC) codes (plasma evolution), mesoscale surface thermodynamics (surface evolution), and finite element electrostatic modeling (field enhancements). We present a variety of numerical results and identify where our model differs from other descriptions of this phenomenon.
Integrated Workforce Modeling System
NASA Technical Reports Server (NTRS)
Moynihan, Gary P.
2000-01-01
There are several computer-based systems, currently in various phases of development at KSC, which encompass some component, aspect, or function of workforce modeling. These systems may offer redundant capabilities and/or incompatible interfaces. A systems approach to workforce modeling is necessary in order to identify and better address user requirements. This research has consisted of two primary tasks. Task 1 provided an assessment of existing and proposed KSC workforce modeling systems for their functionality and applicability to the workforce planning function. Task 2 resulted in the development of a proof-of-concept design for a systems approach to workforce modeling. The model incorporates critical aspects of workforce planning, including hires, attrition, and employee development.
Hughes, T.J.; Fastook, J.L. [Univ. of Maine, Orono, ME (United States). Institute for Quaternary Studies
1994-05-01
The University of Maine conducted this study for Pacific Northwest Laboratory (PNL) as part of a global climate modeling task for site characterization of the potential nuclear waste repository site at Yucca Mountain, NV. The purpose of the study was to develop a global ice sheet dynamics model that will forecast the three-dimensional configuration of global ice sheets for specific climate change scenarios. The objective of the third (final) year of the work was to produce ice sheet data for glaciation scenarios covering the next 100,000 years. This was accomplished using both the map-plane and flowband solutions of our time-dependent, finite-element gridpoint model. The theory and equations used to develop the ice sheet models are presented. Three future scenarios were simulated by the model and results are discussed.
V. Chipman
2002-10-31
The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, or the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the ''Multiscale Thermohydrologic Model'' (BSC 2001), use the wall heat fractions output from the Ventilation Model to initialize their postclosure analyses.
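The heat-partition bookkeeping described above amounts to a one-line relation; the sketch below is illustrative only, and the wattage figures are hypothetical, not outputs of the Ventilation Model.

```python
# Wall heat fraction = 1 - (heat carried away by air / heat produced by decay),
# evaluated at a given time and location along the drift.

def wall_heat_fraction(heat_removed_by_air, heat_from_decay):
    """Fraction of decay heat conducted into the surrounding rock mass.

    The ratio heat_removed_by_air / heat_from_decay is the ventilation
    heat-removal fraction; one minus that ratio is the wall heat fraction
    passed to downstream thermohydrologic models.
    """
    removal_fraction = heat_removed_by_air / heat_from_decay
    return 1.0 - removal_fraction

# Hypothetical example: air carries away 70 W of the 100 W produced locally,
# so 30% of the heat is conducted into the rock.
assert abs(wall_heat_fraction(70.0, 100.0) - 0.3) < 1e-12
```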
Bird, R.; Hulstrom, R.L.
1980-01-01
Several recently published models of the direct component of the broadband insolation are compared for clear sky conditions. The comparison includes seven simple models and one rigorous model that is used as a basis for determining accuracy. Where possible, the comparison is made between the results of each model for each atmospheric constituent (H2O, CO2, O3, O2, aerosol, and molecular scattering) separately as well as for the combined effect of all of the constituents. Two optimum simple models of varying degrees of complexity are developed as a result of this comparison. The study indicates: aerosols dominate the attenuation of the direct beam for reasonable atmospheric conditions; molecular scattering is next in importance; water vapor is an important absorber; and carbon dioxide and oxygen are relatively unimportant as attenuators of the broadband solar energy.
NASA Astrophysics Data System (ADS)
Kozák, Vladislav
2008-09-01
The paper studies the prediction of crack growth in brittle and ductile fracture of structural materials. Crack extension is simulated by means of element extinction algorithms. The principal effort is concentrated on the application of the cohesive zone model with an exponential traction-separation law and on cohesive zone modelling. Determination of micro-mechanical parameters is based on the combination of static tests, microscopic observation, and numerical calibration procedures. Attention is paid to the influence of the initial value of the J-integral and the slope of the R-curve, which is modelled by 3D FEM. The aim of this paper is to verify the application of the cohesive model based on the separation law, together with the experimental and calibration procedures needed to determine the cohesive parameters for the modelling.
NASA Technical Reports Server (NTRS)
North, G. R.; Crowley, T. J.
1984-01-01
Mathematical climate modelling has matured as a discipline to the point that it is useful in paleoclimatology. As an example a new two dimensional energy balance model is described and applied to several problems of current interest. The model includes the seasonal cycle and the detailed land-sea geographical distribution. By examining the changes in the seasonal cycle when external perturbations are forced upon the climate system it is possible to construct hypotheses about the origin of midlatitude ice sheets and polar ice caps. In particular the model predicts a rather sudden potential for glaciation over large areas when the Earth's orbital elements are only slightly altered. Similarly, the drift of continents or the change of atmospheric carbon dioxide over geological time induces radical changes in continental ice cover. With the advance of computer technology and improved understanding of the individual components of the climate system, these ideas will be tested in far more realistic models in the near future.
Stratiform chromite deposit model
Schulte, Ruth F.; Taylor, Ryan D.; Piatak, Nadine M.; Seal, Robert R., II
2010-01-01
Stratiform chromite deposits are of great economic importance, yet their origin and evolution remain highly debated. Layered igneous intrusions such as the Bushveld, Great Dyke, Kemi, and Stillwater Complexes, provide opportunities for studying magmatic differentiation processes and assimilation within the crust, as well as related ore-deposit formation. Chromite-rich seams within layered intrusions host the majority of the world's chromium reserves and may contain significant platinum-group-element (PGE) mineralization. This model of stratiform chromite deposits is part of an effort by the U.S. Geological Survey's Mineral Resources Program to update existing models and develop new descriptive mineral deposit models to supplement previously published models for use in mineral-resource and mineral-environmental assessments. The model focuses on features that may be common to all stratiform chromite deposits as a way to gain insight into the processes that gave rise to their emplacement and to the significant economic resources contained in them.
NASA Astrophysics Data System (ADS)
Charpentier, Arthur; Durand, Marilou
2015-07-01
In this paper, we investigate questions arising in Parsons and Geist (Bull Seismol Soc Am 102:1-11, 2012). Pseudo-causal models connecting magnitudes and waiting times are considered, through generalized regression. We use conditional models (magnitude given previous waiting time, and conversely) as an extension of the joint distribution model described in Nikoloulopoulos and Karlis (Environmetrics 19:251-269, 2008). On the one hand, we fit a Pareto distribution for earthquake magnitudes, where the tail index is a function of the waiting time following the previous earthquake; on the other hand, waiting times are modeled using a Gamma or a Weibull distribution, where parameters are functions of the magnitude of the previous earthquake. We use those two models, alternatively, to generate the dynamics of earthquake occurrence, and to estimate the probability of occurrence of several earthquakes within a year or a decade.
NASA Astrophysics Data System (ADS)
Quimby, Robert
2008-03-01
I explore the feasibility of detecting varying or transient astrophysical sources by comparing optical images against a model representation of the fiducial sky. For this test I use the shapelet models of Massey and Refregier, which are constructed from a complete orthonormal basis set and can often represent even complex objects with a limited number of terms. The shapelet model convolved by the (empirically determined) local PSF is fit to the reference data, resulting in a sharper model without need for deconvolution. These models can then be convolved with the PSF of subsequent observations and subtracted from the data to reveal any change from the reference. I compare the quality and speed of this procedure against existing image subtraction techniques.
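The convolve-then-subtract idea can be sketched in one dimension, with a single Gaussian standing in for a full shapelet expansion; the PSFs, amplitudes, and the injected transient are all invented for illustration.

```python
# A "sharp" model of the fiducial sky is convolved with the current PSF and
# subtracted from tonight's image; the static source cancels, leaving only
# whatever changed (here, an injected transient at x = 6).
import numpy as np

x = np.arange(-20, 21, dtype=float)

def gaussian(x, amp, mu, sigma):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Sharp model of the fiducial sky, fit once to the reference imaging.
sky_model = gaussian(x, amp=100.0, mu=0.0, sigma=1.5)

# PSF of a later observation (assumed Gaussian here), normalized to unit sum.
psf = gaussian(x, amp=1.0, mu=0.0, sigma=2.0)
psf /= psf.sum()

# New observation: the same source blurred by tonight's PSF, plus a transient.
observation = np.convolve(sky_model, psf, mode="same")
observation += gaussian(x, amp=20.0, mu=6.0, sigma=2.0)   # the transient

# Convolve the sharp model with the current PSF and subtract: no deconvolution
# of the data is ever needed, and the residual reveals the transient.
residual = observation - np.convolve(sky_model, psf, mode="same")
assert abs(x[np.argmax(residual)] - 6.0) < 1.0
```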
Pupo, Amaury; Baez-Nieto, David; Martínez, Agustín; Latorre, Ramón; González, Carlos
2014-01-01
Voltage-gated proton channels are integral membrane proteins with the capacity to permeate elementary particles in a voltage and pH dependent manner. These proteins have been found in several species and are involved in various physiological processes. Although their primary topology is known, lack of details regarding their structures in the open conformation has limited analyses toward a deeper understanding of the molecular determinants of their function and regulation. Consequently, the function-structure relationships have been inferred based on homology models. In the present work, we review the existing proton channel models, their assumptions, predictions and the experimental facts that support them. Modeling proton channels is not a trivial task due to the lack of a close homolog template. Hence, there are important differences between published models. This work attempts to critically review existing proton channel models toward the aim of contributing to a better understanding of the structural features of these proteins. PMID:24755912
Peskin, M.E.
1997-05-01
These lectures constitute a short course in "Beyond the Standard Model" for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e+e- colliders.
Saturn Radiation (SATRAD) Model
NASA Technical Reports Server (NTRS)
Garrett, H. B.; Ratliff, J. M.; Evans, R. W.
2005-01-01
The Saturnian radiation belts have not received as much attention as the Jovian radiation belts because they are not nearly as intense: the famous Saturnian particle rings tend to deplete the belts near where their peak would occur. As a result, there has not been a systematic development of engineering models of the Saturnian radiation environment for mission design. A primary exception is that of Divine (1990). That study used published data from several charged particle experiments aboard the Pioneer 11, Voyager 1, and Voyager 2 spacecraft during their flybys at Saturn to generate numerical models for the electron and proton radiation belts between 2.3 and 13 Saturn radii. The Divine Saturn radiation model described the electron distributions at energies between 0.04 and 10 MeV and the proton distributions at energies between 0.14 and 80 MeV. The model was intended to predict particle intensity, flux, and fluence for the Cassini orbiter. Divine carried out hand calculations using the model but never formally developed a computer program that could be used for general mission analyses. This report seeks to fill that void by formally developing a FORTRAN version of the model that can be used as a computer design tool for missions to Saturn that require estimates of the radiation environment around the planet. The results of that effort and the program listings are presented here along with comparisons with the original estimates carried out by Divine. In addition, Pioneer and Voyager data were scanned in from the original references and compared with the FORTRAN model's predictions. The results were statistically analyzed in a manner consistent with Divine's approach to provide estimates of the ability of the model to reproduce the original data. Results of a formal review of the model by a panel of experts are also presented. Their recommendations for further tests, analyses, and extensions to the model are discussed.
Maximally Expressive Task Modeling
NASA Technical Reports Server (NTRS)
Japp, John; Davis, Elizabeth; Maxwell, Theresa G. (Technical Monitor)
2002-01-01
Planning and scheduling systems organize "tasks" into a timeline or schedule. The tasks are defined within the scheduling system in logical containers called models. The dictionary might define a model of this type as "a system of things and relations satisfying a set of rules that, when applied to the things and relations, produce certainty about the tasks that are being modeled." One challenging domain for a planning and scheduling system is the operation of on-board experiment activities for the Space Station. The equipment used in these experiments is some of the most complex hardware ever developed by mankind, the information sought by these experiments is at the cutting edge of scientific endeavor, and the procedures for executing the experiments are intricate and exacting. Scheduling is made more difficult by a scarcity of space station resources. The models to be fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibilities of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling space station experiment operations calls for a "maximally expressive" modeling schema. Modeling even the simplest of activities cannot be automated; no sensor can be attached to a piece of equipment that can discern how to use that piece of equipment; no camera can quantify how to operate a piece of equipment. Modeling is a human enterprise-both an art and a science. The modeling schema should allow the models to flow from the keyboard of the user as easily as works of literature flowed from the pen of Shakespeare. The Ground Systems Department at the Marshall Space Flight Center has embarked on an effort to develop a new scheduling engine that is highlighted by a maximally expressive modeling schema. This schema, presented in this paper, is a synergy of technological advances and domain-specific innovations.
Expert Models and Modeling Processes Associated with a Computer-Modeling Tool
ERIC Educational Resources Information Center
Zhang, BaoHui; Liu, Xiufeng; Krajcik, Joseph S.
2006-01-01
Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using "think aloud" technique…
Keinan, Alon
Sequential Bayesian analysis covering a general model, an independent normal model, and a correlated normal model. ORFE General Exam, Department of Operations Research and Financial Engineering, Princeton University, May 3, 2007.
Verified Runtime Validation of Verified CPS Models From Model Checking to Checking Models
Clarke, Edmund M.
ModelPlex: Verified Runtime Validation of Verified CPS Models. From Model Checking to Checking Models. Stefan Mitsch, André Platzer, Computer Science Department, Carnegie Mellon University. Clarke Symposium, Sept. 20, 2014. For details, see the ModelPlex paper at RV'14.
Spiral model pilot project information model
NASA Technical Reports Server (NTRS)
1991-01-01
The objective was an evaluation of the Spiral Model (SM) development approach to allow NASA Marshall to develop an experience base of that software management methodology. A discussion is presented of the Information Model (IM) that was used as part of the SM methodology. A key concept of the SM is the establishment of an IM to be used by management to track the progress of a project. The IM is the set of metrics that is to be measured and reported throughout the life of the project. These metrics measure both the product and the process to ensure the quality of the final delivery item and to ensure the project met programmatic guidelines. The beauty of the SM, along with the IM, is the ability to measure not only the correctness of the specification and implementation of the requirements but to also obtain a measure of customer satisfaction.
Dislocation Kink Chain Model Versus String Model
Georg Alefeld
1965-01-01
The Peierls stress for dislocation motion causes characteristic nonlinearities of the micro-stress-strain law of dislocations. Contrary to the string model, the nonlinear stress-strain law of kinked dislocations can be the source of amplitude-dependent internal friction and modulus defect at amplitudes which are below or comparable to the breakaway stress from pinning points. Under the assumption that double-kink generation can be
Models, Traffic Models, Simulation, and Traffic Simulation
Jaume Barceló
This introductory chapter to a book on traffic simulation fundamentals is aimed at setting up a comprehensive framework for simulation as a well-established and grounded OR technique and its specificities when applied to traffic systems; the main approaches to traffic simulation and the principles of traffic simulation model building; the fundamentals of traffic flow theory and its application to traffic
Automatic Model Selection for Partially Linear Models.
Ni, Xiao; Zhang, Hao Helen; Zhang, Daowen
2009-10-01
We propose and study a unified procedure for variable selection in partially linear models. A new type of double-penalized least squares is formulated, using the smoothing spline to estimate the nonparametric part and applying a shrinkage penalty on parametric components to achieve model parsimony. Theoretically we show that, with proper choices of the smoothing and regularization parameters, the proposed procedure can be as efficient as the oracle estimator (Fan and Li, 2001). We also study the asymptotic properties of the estimator when the number of parametric effects diverges with the sample size. Frequentist and Bayesian estimates of the covariance and confidence intervals are derived for the estimators. One great advantage of this procedure is its linear mixed model (LMM) representation, which greatly facilitates its implementation by using standard statistical software. Furthermore, the LMM framework enables one to treat the smoothing parameter as a variance component and hence conveniently estimate it together with other regression coefficients. Extensive numerical studies are conducted to demonstrate the effective performance of the proposed procedure. PMID:20160947
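A toy backfitting sketch of the double-penalized least squares idea follows: a roughness penalty smooths the nonparametric part while an L1 shrinkage penalty on the parametric coefficients performs selection. The discretized second-difference penalty, the alternating scheme, and all penalty values are illustrative stand-ins, not the authors' algorithm (which uses smoothing splines and a linear mixed model representation).

```python
# Partially linear model y = X @ beta + f(t) + noise, fit by alternating a
# penalized smoother for f with soft-thresholded coordinate descent for beta.
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fit_plm(y, X, lam_smooth=50.0, lam_shrink=0.5, n_iter=200):
    n, p = X.shape
    D = np.diff(np.eye(n), 2, axis=0)                    # second differences
    S = np.linalg.inv(np.eye(n) + lam_smooth * D.T @ D)  # smoother matrix
    beta, f = np.zeros(p), np.zeros(n)
    for _ in range(n_iter):
        f = S @ (y - X @ beta)            # smooth the partial residual
        for j in range(p):                # lasso-style coordinate descent
            r = y - f - X @ beta + X[:, j] * beta[j]
            beta[j] = soft_threshold(X[:, j] @ r, lam_shrink) / (X[:, j] @ X[:, j])
    return beta, f

# Synthetic check: one active covariate plus a smooth trend in t.
rng = np.random.default_rng(0)
n = 100
t = np.linspace(0.0, 1.0, n)
X = rng.standard_normal((n, 3))
y = 2.0 * X[:, 0] + np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(n)
beta, f = fit_plm(y, X)
```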
Alexander Belyaev; Stephen F. King; Patrik Svantesson
2013-03-04
We propose a new class of models called Little Z' models in order to reduce the fine-tuning due to the current experimental limits on the Z' mass in E_6 inspired supersymmetric models, where the Higgs doublets are charged under the extra U(1)' gauge group. The proposed Little Z' models allow a lower mass Z' due to the spontaneously broken extra U(1)' gauge group having a reduced gauge coupling. We show that reducing the value of the extra gauge coupling relaxes the experimental limits, leading to the possibility of low mass Z' resonances, for example down to 200 GeV, which may yet appear in LHC searches. Although the source of tree level fine-tuning due to the Z' mass is reduced in Little Z' models, it typically does so at the expense of increasing the vacuum expectation value of the U(1)'-breaking standard model singlet field, reducing the fine-tuning to similar levels to that in the Minimal Supersymmetric Standard Model.
Multiscale Cloud System Modeling
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo; Moncrieff, Mitchell W.
2009-01-01
The central theme of this paper is to describe how cloud system resolving models (CRMs) of grid spacing approximately 1 km have been applied to various important problems in atmospheric science across a wide range of spatial and temporal scales and how these applications relate to other modeling approaches. A long-standing problem concerns the representation of organized precipitating convective cloud systems in weather and climate models. Since CRMs resolve the mesoscale to large scales of motion (i.e., 10 km to global) they explicitly address the cloud system problem. By explicitly representing organized convection, CRMs bypass restrictive assumptions associated with convective parameterization such as the scale gap between cumulus and large-scale motion. Dynamical models provide insight into the physical mechanisms involved with scale interaction and convective organization. Multiscale CRMs simulate convective cloud systems in computational domains up to global and have been applied in place of contemporary convective parameterizations in global models. Multiscale CRMs pose a new challenge for model validation, which is met in an integrated approach involving CRMs, operational prediction systems, observational measurements, and dynamical models in a new international project: the Year of Tropical Convection, which has an emphasis on organized tropical convection and its global effects.
Computationally modeling interpersonal trust
Lee, Jin Joo; Knox, W. Bradley; Wormwood, Jolie B.; Breazeal, Cynthia; DeSteno, David
2013-01-01
We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to investigate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust. PMID:24363649
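Classifying a cue sequence by comparing its likelihood under per-trust-level hidden Markov models can be sketched with the forward algorithm; the cue alphabet, state count, and all probabilities below are invented for illustration, not the learned parameters from the study.

```python
# Scaled forward algorithm, then a two-model likelihood comparison: a
# cue-dense sequence should score higher under the "low trust" HMM.
import numpy as np

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM."""
    alpha = start * emit[:, obs[0]]
    log_p = np.log(alpha.sum())
    alpha = alpha / alpha.sum()          # rescale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        s = alpha.sum()
        log_p += np.log(s)
        alpha = alpha / s
    return log_p

cues = ["face_touch", "cross_arms", "lean_away", "hand_touch", "neutral"]
start = np.array([0.5, 0.5])
trans = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
# Toy emissions: the low-trust model emits the four target cues more often.
emit_hi = np.array([[0.05, 0.05, 0.05, 0.05, 0.80],
                    [0.10, 0.10, 0.10, 0.10, 0.60]])
emit_lo = np.array([[0.20, 0.20, 0.20, 0.20, 0.20],
                    [0.25, 0.25, 0.20, 0.20, 0.10]])

seq = [0, 1, 0, 2, 3, 0, 1]              # indices into the cue alphabet
hi = forward_loglik(seq, start, trans, emit_hi)
lo = forward_loglik(seq, start, trans, emit_lo)
assert lo > hi                           # classified as the low-trust model
```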
Multiscale modeling of chalcogenides
NASA Astrophysics Data System (ADS)
Mauro, John C.
Chalcogenide glasses exhibit unique properties applicable to a wide range of fields, including electrical and optical switching and the transmission of infrared radiation. In this thesis, we adopt a hierarchical multiscale modeling approach to investigate the fundamental physics of chalcogenide systems. Our multiscale modeling begins in Part I at the quantum mechanical level, where we use the highly accurate Moller-Plesset perturbation technique to derive interaction potentials for elemental and heterogeneous chalcogenide systems. The resulting potentials consist of two-, three-, and effective four-body terms. In Part II, we use these ab initio potentials in classical Monte Carlo simulations to investigate the structure of chalcogenide glasses. We discuss our simulation results in relation to the Phillips model of topological constraints, which predicts critical behavior in chalcogenide systems as a function of average coordination number. Finally, in Part III we address the issue of glass transition range behavior. After reviewing previous models of the glass transition, we derive a new model based on nonequilibrium statistical mechanics and an energy landscape formalism. The new model requires as input a description of inherent structure energies and the transition energies between these structures. To address this issue, we derive an eigenvector-following technique for mapping a multidimensional potential energy landscape. This technique is then extended for application to enthalpy landscapes. Our model will enable the first-ever calculation of glass transition behavior based on only ab initio physics.
NASA Technical Reports Server (NTRS)
Mccutcheon, Kimble D.; Gordon, Stephen S.; Thompson, Paul A.
1992-01-01
NASA uses the Variable Polarity Plasma Arc Welding (VPPAW) process extensively for fabrication of Space Shuttle External Tanks. This welding process has been in use at NASA since the late 1970's but the physics of the process have never been satisfactorily modeled and understood. In an attempt to advance the level of understanding of VPPAW, Dr. Arthur C. Nunes, Jr., (NASA) has developed a mathematical model of the process. The work described in this report evaluated and used two versions (level-0 and level-1) of Dr. Nunes' model, and a model derived by the University of Alabama at Huntsville (UAH) from Dr. Nunes' level-1 model. Two series of VPPAW experiments were done, using over 400 different combinations of welding parameters. Observations were made of VPPAW process behavior as a function of specific welding parameter changes. Data from these weld experiments was used to evaluate and suggest improvements to Dr. Nunes' model. Experimental data and correlations with the model were used to develop a multi-variable control algorithm for use with a future VPPAW controller. This algorithm is designed to control weld widths (both on the crown and root of the weld) based upon the weld parameters, base metal properties, and real-time observation of the crown width. The algorithm exhibited accuracy comparable to that of the weld width measurements for both aluminum and mild steel welds.
Wanner, O
1996-01-01
A mixed-culture biofilm (MCB) model is available which describes the progression of biofilm thickness and the spatial distribution and development in time of dissolved and particulate components in the biofilm. The MCB model is able to predict the physico-chemical conditions at the interface between the biofilm and the solid surface, on which the biofilm grows, as a function of the conditions in the bulk fluid, the microbial composition of the biofilm, and the transport and transformation processes which take place in the biofilm. The mass balance equations of the MCB model are generally valid and can be applied to almost any microbial system if its kinetics and stoichiometry can be provided. AQUASIM is a new computer program for the identification and simulation of aquatic systems. The program solves the equations of the MCB model. It has a window-type user interface and includes routines for simulation, sensitivity analysis, automatic parameter estimation and data fitting. The MCB model has been developed and is primarily used in the field of waste water treatment. However, under certain conditions and with some additional simplifications this model can also be used for the investigation of biofouling and biocorrosion problems. The possibilities and limitations of the application of the MCB model and of AQUASIM to this type of problem are briefly discussed. PMID:22115101
Combining thermal comfort models
Yigit, A.
1999-07-01
Two models commonly used in thermal comfort studies were combined to develop a two-dimensional computer model that estimates the resistance to dry and evaporative heat transfer for a clothing system from fabric resistance data, fabric thickness data, and information concerning the amount of body surface area covered by different fabric layers and the amount of air trapped between fabric layers. Five different clothing ensembles with different total thermal insulation and very different distributions of the insulation on the body were simulated with 16 sedentary subjects. This paper first evaluates total thermal insulation predictions from the Fanger steady-state model and then uses these data in the Gagge two-compartment (or two-node) model. The combined model uses the transient heat balance of each segment and the whole body. It estimates total insulation value and then uses this value to calculate transient temperature and wettedness. By application of the combined model, predictions of human responses to a wide range of thermal conditions are compared with the responses of human subjects as described in reports of laboratory experiments. Possible reasons for discrepancies between the observed data and predictions of the model are briefly discussed.
Model compilation: An approach to automated model derivation
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Baudin, Catherine; Iwasaki, Yumi; Nayak, Pandurang; Tanaka, Kazuo
1990-01-01
An approach is introduced to automated model derivation for knowledge based systems. The approach, model compilation, involves procedurally generating the set of domain models used by a knowledge based system. With an implemented example, how this approach can be used to derive models of different precision and abstraction is illustrated, and models are tailored to different tasks, from a given set of base domain models. In particular, two implemented model compilers are described, each of which takes as input a base model that describes the structure and behavior of a simple electromechanical device, the Reaction Wheel Assembly of NASA's Hubble Space Telescope. The compilers transform this relatively general base model into simple task specific models for troubleshooting and redesign, respectively, by applying a sequence of model transformations. Each transformation in this sequence produces an increasingly more specialized model. The compilation approach lessens the burden of updating and maintaining consistency among models by enabling their automatic regeneration.
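The compilation idea, pushing a general base model through a sequence of transformations that each yield a more specialized model, can be sketched as follows; the device description and the two transformations are invented placeholders, not the actual Reaction Wheel Assembly models.

```python
# A base structure/behavior model is specialized by composing transformations;
# each step produces a new model, leaving the base model untouched so it can
# be recompiled for other tasks.
from functools import reduce

base_model = {
    "components": {"motor": {"behavior": "ode", "params": {"inertia": 1.2}},
                   "bearing": {"behavior": "ode", "params": {"friction": 0.05}}},
    "connections": [("motor", "bearing")],
}

def abstract_behavior(model):
    """Replace detailed ODE behavior with qualitative states (troubleshooting)."""
    return {**model, "components": {
        name: {**c, "behavior": "qualitative"}
        for name, c in model["components"].items()}}

def drop_parameters(model):
    """Discard numeric parameters irrelevant to the specialized task."""
    return {**model, "components": {
        name: {k: v for k, v in c.items() if k != "params"}
        for name, c in model["components"].items()}}

# A "compiler" is just an ordered sequence of model transformations.
compile_troubleshooting = [abstract_behavior, drop_parameters]
ts_model = reduce(lambda m, t: t(m), compile_troubleshooting, base_model)

assert ts_model["components"]["motor"]["behavior"] == "qualitative"
assert "params" not in ts_model["components"]["motor"]
assert base_model["components"]["motor"]["behavior"] == "ode"  # base unchanged
```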
Modelling vegetation landslides
NASA Astrophysics Data System (ADS)
Vorpahl, Peter; Dislich, Claudia; Elsenbeer, Helmut; Märker, Michael; Schröder, Boris
2010-05-01
Shallow translational landslides are believed to represent a major ecosystem disturbance in the Andean rain forests of South Ecuador. Aiming at a better understanding of gap dynamics in this mega-diverse ecosystem, we investigated several landslides in an area of undisturbed tropical montane rain forest and found that in some cases almost no inorganic material was involved. Current physically-based landslide models cannot reproduce this type of process, since they focus on soil physical properties. Even though vegetation is incorporated in these models by its weight and by the contribution of roots to soil cohesion and hence to shear resistance, we think that the role of vegetation has to be viewed differently within this ecosystem: roots grow mainly in a thick organic layer above the mineral soil and do not penetrate sufficiently deep into the mineral soil to contribute to slope stability according to common models. To accommodate such circumstances, we formulated an extension to the widely used infinite slope model for assessing slope stability, and applied it to our research site. Biomass, root layer and soil properties before sliding events were reconstructed on and close to landslides that occurred within the preceding years. By introducing an additional factor of safety for the organic layer, we are able to mathematically describe classical shallow translational landslides as well as vegetation slides. The high spatial and temporal variability of vegetation, root layer and soil physical properties within the research area complicates model application. We therefore assumed spatial gradients for the ranges of model parameters, and stochastic parameter variations within these ranges, according to our field measurements and published data. Finally, we outline model validation by comparison to historical landslide inventories.
Possible applications of the model are located within undisturbed tropical montane rain forests and contribute to the fields of automated landslide classification as well as spatiotemporal modelling of landslides and forest gap dynamics.
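The classical infinite slope model that the authors extend can be sketched in its standard form; the paper's additional factor of safety for the organic layer is not reproduced here, and the parameter values in the example are illustrative only.

```python
import math

def infinite_slope_fs(c, gamma, z, beta_deg, phi_deg, u=0.0):
    """Classic infinite-slope factor of safety (mineral soil).

    c: effective cohesion (kPa), gamma: soil unit weight (kN/m3),
    z: depth of the slip surface (m), beta_deg: slope angle (deg),
    phi_deg: effective friction angle (deg), u: pore pressure (kPa).
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    # Resisting stress: cohesion plus frictional strength on the slip plane.
    resisting = c + (gamma * z * math.cos(beta) ** 2 - u) * math.tan(phi)
    # Driving shear stress from the weight of the soil column.
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Illustrative call: a 30-degree slope, 1 m slip depth.
fs = infinite_slope_fs(c=5.0, gamma=18.0, z=1.0, beta_deg=30.0, phi_deg=35.0)
```

For a cohesionless, dry slope this reduces to the familiar FS = tan(phi)/tan(beta), which is a useful sanity check.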
NASA Astrophysics Data System (ADS)
Hill, Mary; Ye, Ming; Foglia, Laura; Lu, Dan
2015-04-01
Modeling frameworks include many ideas about, for example, how to parameterize models, conduct sensitivity analysis (including identifying observations and parameters important to calibration and prediction), quantify uncertainty, and so on. Of concern in this talk is meaningful testing of how ideas proposed for any modeling framework perform. The design of meaningful tests depends on the aspect of the framework being tested and the timing of system dynamics. Consider a situation in which the aspect being tested is prediction accuracy and the quantities of concern are readily measured and change quickly, such as for precipitation, floods, or hurricanes. In such cases meaningful tests involve comparing simulated and measured values and tests can be conducted daily, hourly or even more frequently. Though often challenged by measurement difficulties, this remains the simplest circumstance for conducting meaningful tests of modeling frameworks. If measurements are not readily available and(or) the system responds to changes over decades or centuries, as generally occurs for climate change, saltwater intrusion of groundwater systems, and dewatering of aquifers, prediction accuracy needs to be evaluated in other ways. Often these require high performance computing. For example, complex and simple models can be compared or cross-validation experiments can be conducted. Both can require massive computational resources for any but the simplest of problems. Testing other aspects of a modeling framework can require different types of tests. For example, testing methods of identifying observations or parameters important to model calibration or predictions might entail evaluation of many circumstances for methods that are themselves commonly computationally demanding. Again, high performance computing is needed even when the goal is to include computationally frugal methods in the modeling framework. 
In this talk we discuss the importance of such testing, stress the need to design and implement tests when any modeling framework is developed, and provide examples of tests from several recent publications.
Solar radiation modeling and comparisons with current solar radiation models
A. Mujahid; W. D. Turner
1980-01-01
Solar radiation models which can be used to predict global radiation from cloud cover data, and direct normal radiation from measurements of global radiation, have been developed for Blytheville, Arkansas. These models are compared to the current NOAA models used in the SOLMET weather tapes. The accuracies of the NOAA cloud cover model and the Randall and Whitson direct normal model
Mathematical modelling in veterinary epidemiology. why model building is important
Mart C. M. de Jong
1995-01-01
Some consider modelling to be very important for (veterinary) epidemiology, others severely criticise the use of modelling. Before joining this heated debate it is worthwhile to reflect on the role of mathematical modelling. Mathematical modelling is useful for the study of complex phenomena, like the population dynamics of infectious agents, because models show how separate measurements can be seen as
High School Students' Modeling Knowledge
David Fortus
Modeling is a core scientific practice. This study probed the modeling knowledge of high school students who had not had any explicit exposure
Thurstonian Models Bradley-Terry and Plackett-Luce Models
Csiszár, Villő
EM algorithms for Thurstonian and Bradley-Terry-type random permutation models. Villő Csiszár, Department of Probability and Statistics, Eötvös Loránd University, Budapest. Prague Stochastics 2010.
Model-Based Reinforcement Learning with an Approximate, Learned Model
Sutton, Richard S.
Leonid Kuvayev and Rich Sutton report that model-based methods do indeed perform better than model-free reinforcement learning. Keywords: reinforcement learning, planning, model-based learning, function approximation, CMAC networks.
NASA Astrophysics Data System (ADS)
Tajima, Y.; Madsen, O. S.
2002-12-01
The paper will present a theoretical model for the prediction of undertow velocity profiles in the surf zone due to near-normally incident waves. The waves may be periodic or narrow-banded random waves, and the beach may be plane or barred. The theoretical model consists of three components: (i) breaking wave model; (ii) surface roller model; and (iii) undertow velocity profile model. The breaking wave model (Tajima and Madsen, 2002) is based on the concept of an equivalent linear wave and predicts linear wave characteristics for shoaling, breaking and broken waves. Non-linear wave characteristics, e.g., near-bottom orbital velocity, are obtained from equivalent linear wave characteristics and local bottom slope through use of simple transform formulae. The surface roller model is based on the same principle as Dally et al. (1985), but differs from this by transferring only the potential energy lost from the wave motion into the surface roller and calculating the decay of surface roller energy using a decay coefficient equal to that obtained for the breaking wave dissipation model. The undertow velocity profile model assumes a linearly varying shear stress over the water depth combined with an assumed form of the turbulent eddy viscosity. The shear stress at the surface is obtained from the breaking wave and surface roller models, whereas the bottom shear stress is obtained from considerations of mass conservation, i.e., depth-integrated undertow velocity must equal the volume transport of waves and surface roller above trough level. The near-bottom undertow velocity is calculated at the edge of the wave-bottom boundary layer, from knowledge of near-bottom orbital velocity, bottom shear stress and bottom roughness, using the combined wave-current bottom boundary layer theory by Madsen (1994).
Comparisons of predicted and measured undertow velocity profiles are performed for periodic and random waves normally incident on plane and barred concrete beaches as well as random waves near-normally incident on barred movable bed beach profiles. In general the agreement between predicted and observed undertow velocities is excellent. It is shown that model predictions are fairly insensitive to the choice of turbulent eddy viscosity, which is the only adjustable quantity in the model.
Zakrzewski, W.J.
1988-01-01
We discuss classical solutions of U(N) sigma models in two dimensions. We show how from these solutions we can construct solutions of the U(N) sigma model with the Wess--Zumino term (with an arbitrary coefficient). We discuss briefly various properties of these solutions. Next we consider the O(3) sigma model in 2 + 1 dimensions and describe the preliminary results of some numerical work in which we studied the time evolution of some of the previously discussed two dimensional structures (instantons and anti-instantons) under suitable assumptions about their initial values. 9 refs., 6 figs.
Bali, R.
1986-07-01
The object of this paper is to investigate the behavior of the magnetic field in a cosmological model for a perfect fluid distribution. The magnetic field is due to an electric current produced along the χ-axis. It is assumed that the expansion (θ) in the model is proportional to σ¹₁, the eigenvalue of the shear tensor σʲᵢ. The behavior of the model when the magnetic field tends to zero and other physical properties are also discussed.
NSDL National Science Digital Library
This site includes a model course from the CyberWatch Center. The site does not currently include any educational materials, but does provide a model framework for structuring a course on this topic. This course would cover the current risks to electronic data, as well as a structured way to address security problems. The course would provide a good starting point for students going on to study specialized security. A detailed course outline is included. Users must register to view the model course, but registration is free and easy.
Aviation Safety Simulation Model
NASA Technical Reports Server (NTRS)
Houser, Scott; Yackovetsky, Robert (Technical Monitor)
2001-01-01
The Aviation Safety Simulation Model is a software tool that enables users to configure a terrain, a flight path, and an aircraft and simulate the aircraft's flight along the path. The simulation monitors the aircraft's proximity to terrain obstructions, and reports when the aircraft violates accepted minimum distances from an obstruction. This model design facilitates future enhancements to address other flight safety issues, particularly air and runway traffic scenarios. This report shows the user how to build a simulation scenario and run it. It also explains the model's output.
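The core proximity check such a simulation performs might look like the minimal sketch below; the data layout, point-obstruction representation, and clearance threshold are assumptions for illustration, not the tool's actual interface.

```python
import math

def clearance_violations(path, obstructions, min_dist):
    """Return indices of flight-path points that come closer than min_dist
    to any terrain obstruction.

    path: list of (x, y, z) aircraft positions along the flight path.
    obstructions: list of (x, y, z) obstruction points.
    """
    bad = []
    for i, point in enumerate(path):
        for obstacle in obstructions:
            if math.dist(point, obstacle) < min_dist:
                bad.append(i)   # minimum-distance violation at this point
                break           # one violating obstruction is enough
    return bad

# Illustrative scenario: the aircraft descends toward a tower.
path = [(0, 0, 100), (50, 0, 90), (100, 0, 40)]
towers = [(100, 0, 30)]
violations = clearance_violations(path, towers, min_dist=50.0)
```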
Modeling EERE Deployment Programs
Cort, Katherine A.; Hostick, Donna J.; Belzer, David B.; Livingston, Olga V.
2007-11-08
The purpose of this report is to compile information and conclusions gathered as part of three separate tasks undertaken as part of the overall project, “Modeling EERE Deployment Programs,” sponsored by the Planning, Analysis, and Evaluation office within the Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE). The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address improvements to modeling in the near term, and note gaps in knowledge where future research is needed.
Nonparametric Transfer Function Models
Liu, Jun M.; Chen, Rong; Yao, Qiwei
2009-01-01
In this paper a class of nonparametric transfer function models is proposed to model nonlinear relationships between ‘input’ and ‘output’ time series. The transfer function is smooth with unknown functional forms, and the noise is assumed to be a stationary autoregressive-moving average (ARMA) process. The nonparametric transfer function is estimated jointly with the ARMA parameters. By modeling the correlation in the noise, the transfer function can be estimated more efficiently. The parsimonious ARMA structure improves the estimation efficiency in finite samples. The asymptotic properties of the estimators are investigated. The finite-sample properties are illustrated through simulations and one empirical example. PMID:20628584
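A crude, illustrative version of the idea — kernel smoothing of the transfer function, an AR(1) fit to the residuals, and one pre-whitening pass — could look like the sketch below. This is not the paper's estimator; the data, bandwidth, and AR order are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(-2, 2, n)
e = np.zeros(n)
for t in range(1, n):              # AR(1) noise with coefficient 0.6
    e[t] = 0.6 * e[t - 1] + rng.normal(scale=0.3)
y = np.sin(x) + e                  # "unknown" smooth transfer function f(x) = sin(x)

def nw_smooth(x, y, grid, h=0.3):
    """Nadaraya-Watson kernel estimate of f evaluated on `grid`."""
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

# Step 1: ignore the noise correlation and smooth y on x.
f_hat = nw_smooth(x, y, x)
# Step 2: fit AR(1) to the residuals via the lag-1 autocorrelation.
r = y - f_hat
phi_hat = np.dot(r[1:], r[:-1]) / np.dot(r[:-1], r[:-1])
# Step 3: pre-whiten and re-smooth -- a one-pass caricature of joint estimation.
y_pw = y[1:] - phi_hat * y[:-1] + phi_hat * f_hat[:-1]
f_hat2 = nw_smooth(x[1:], y_pw, x[1:])
```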
Atmospheric and Oceanic Modeling
NSDL National Science Digital Library
Adcroft, Alistair
The numerical methods, formulation and parameterizations used in models of the circulation of the atmosphere and ocean will be described in detail. Widely used numerical methods will be the focus but we will also review emerging concepts and new methods. The numerics underlying a hierarchy of models will be discussed, ranging from simple GFD models to the high-end GCMs. In the context of ocean GCMs, we will describe parameterization of geostrophic eddies, mixing and the surface and bottom boundary layers. In the atmosphere, we will review parameterizations of convection and large scale condensation, the planetary boundary layer and radiative transfer.
Constructive Epistemic Modeling: A Hierarchical Bayesian Model Averaging Method
NASA Astrophysics Data System (ADS)
Tsai, F. T. C.; Elshall, A. S.
2014-12-01
Constructive epistemic modeling is the idea that our understanding of a natural system through a scientific model is a mental construct that continually develops through learning about and from the model. Using the hierarchical Bayesian model averaging (HBMA) method [1], this study shows that segregating different uncertain model components through a BMA tree of posterior model probabilities, model prediction, within-model variance, between-model variance and total model variance serves as a learning tool [2]. First, the BMA tree of posterior model probabilities permits the comparative evaluation of the candidate propositions of each uncertain model component. Second, systemic model dissection is imperative for understanding the individual contribution of each uncertain model component to the model prediction and variance. Third, the hierarchical representation of the between-model variance facilitates the prioritization of the contribution of each uncertain model component to the overall model uncertainty. We illustrate these concepts using the groundwater modeling of a siliciclastic aquifer-fault system. The sources of uncertainty considered are from geological architecture, formation dip, boundary conditions and model parameters. The study shows that the HBMA analysis helps in advancing knowledge about the model rather than forcing the model to fit a particular understanding or merely averaging several candidate models. [1] Tsai, F. T.-C., and A. S. Elshall (2013), Hierarchical Bayesian model averaging for hydrostratigraphic modeling: Uncertainty segregation and comparative evaluation. Water Resources Research, 49, 5520-5536, doi:10.1002/wrcr.20428. [2] Elshall, A.S., and F. T.-C. Tsai (2014). Constructive epistemic modeling of groundwater flow with geological architecture and boundary condition uncertainty under Bayesian paradigm, Journal of Hydrology, 517, 105-119, doi: 10.1016/j.jhydrol.2014.05.027.
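The variance decomposition at the heart of the BMA tree is the law of total variance: total model variance splits into the averaged within-model variance and the between-model variance. A toy sketch with invented numbers:

```python
import numpy as np

# Two candidate propositions for one uncertain model component (e.g. two
# geological architectures). Probabilities, means, and variances are
# illustrative, not from the study.
p = np.array([0.7, 0.3])          # posterior model probabilities (sum to 1)
mean = np.array([10.0, 14.0])     # each model's predicted head (m)
var = np.array([1.0, 2.0])        # each model's within-model variance

bma_mean = np.sum(p * mean)                      # model-averaged prediction
within = np.sum(p * var)                         # averaged within-model variance
between = np.sum(p * (mean - bma_mean) ** 2)     # spread across candidate models
total = within + between                         # law of total variance
```

In the hierarchical version, the same split is applied recursively at every branching level of the BMA tree, which is what lets the between-model variance be attributed to individual uncertain components.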
Linking Growth & Timber Quality Models
John Fonweban
Outline: background; growth modelling; timber quality modelling; why link the two approaches; examples linking growth and timber quality models; perspectives.
NASA Astrophysics Data System (ADS)
Clark, M. P.; Gupta, H. V.; Martinez, G.; Tom, H.; Slater, A. G.; Hilary, M.
2008-12-01
Improving hydrological models requires understanding the reasons for model weaknesses. This may seem obvious, but there are surprisingly few methods available that attribute the error in model simulations to specific model components, and there are few methods available to compare alternative model parameterizations. Most model inter-comparison experiments to date evaluate model simulations using a small set of summary statistics, and there are too many degrees of freedom in each model to discern why model differences arise. The limited capacity to rigorously evaluate alternative modeling strategies severely limits progress for the discipline of hydrology. This presentation will describe use of FUSE (Framework for Understanding Structural Errors) to understand reasons for model weaknesses. FUSE provides the capability to change just one component of a hydrological model (e.g., the equations used to model surface runoff) and keep all other model components constant. This facilitates attributing the error in model simulations to a specific model component. The FUSE models are evaluated using multiple hydrological indices (diagnostic signatures), where the different diagnostic signatures have explanatory power for different components of the hydrological model. The process of using multiple hydrological indices to evaluate multiple hydrological models provides new insights on inter-model differences and new directions for model development.
Morphological modeling of neurons
Mulchandani, Kishore
1995-01-01
A formal representation of neuron morphology, adequate for the geometric modeling of manually-traced neurons, is presented. The concept of a stochastic L-system is then introduced and the critical distribution functions governing the stochastic...
Accurate numerical schemes have been developed for simulation of the condensation/evaporation processes with vapor conservation for a single component aerosol. These have been incorporated in modules which allow simulation of aerosol dynamics in models for dispersion and transpor...
Fluidized bed combustor modeling
NASA Technical Reports Server (NTRS)
Horio, M.; Rengarajan, P.; Krishnan, R.; Wen, C. Y.
1977-01-01
A general mathematical model for the prediction of the performance of a fluidized bed coal combustor (FBC) is developed. The basic elements of the model consist of: (1) hydrodynamics of gas and solids in the combustor; (2) description of the gas and solids contacting pattern; (3) kinetics of combustion; and (4) absorption of SO2 by limestone in the bed. The model is capable of calculating the combustion efficiency, axial bed temperature profile, carbon hold-up in the bed, oxygen and SO2 concentrations in the bubble and emulsion phases, sulfur retention efficiency and particulate carry-over by elutriation. The effects of bed geometry, excess air, location of heat transfer coils in the bed, calcium to sulfur ratio in the feeds, etc. are examined. The calculated results are compared with experimental data. Agreement between the calculated results and the observed data is satisfactory in most cases. Recommendations to enhance the accuracy of prediction of the model are suggested.
Modelling Immunological Memory
Garret, Simon; Walker, Joanne; Wilson, William; Aickelin, Uwe
2010-01-01
Accurate immunological models offer the possibility of performing high-throughput experiments in silico that can predict, or at least suggest, in vivo phenomena. In this chapter, we compare various models of immunological memory. We first validate an experimental immunological simulator, developed by the authors, by simulating several theories of immunological memory with known results. We then use the same system to evaluate the predicted effects of a theory of immunological memory. The resulting model has not been explored before in artificial immune systems research, and we compare the simulated in silico output with in vivo measurements. Although the theory appears valid, we suggest that there is a common set of reasons why immunological memory models are a useful support tool rather than conclusive in themselves.
Lynch, Nancy
2009-06-04
We describe a modeling framework and collection of foundational composition results for the study of probabilistic distributed algorithms in synchronous radio networks. Existing results in this setting rely on informal ...
Mandal, Esan
2004-09-30
The goal of this thesis is to introduce new methods to create intricate perforated shapes in a computing environment. Modeling shapes with a large number of holes and handles, while requiring minimal human interaction, is an unsolved research...
Yonezawa, Akinori
1977-06-01
Distributed systems are multi-processor information processing systems which do not rely on the central shared memory for communication. This paper presents ideas and techniques in modelling distributed systems and ...
NASA Technical Reports Server (NTRS)
Horwitz, James L.
1992-01-01
The purpose of this work was to assist with the development of analytical techniques for the interpretation of infrared observations. We have done the following: (1) helped to develop models for continuum absorption calculations for water vapor in the far infrared spectral region; (2) worked on models for pressure-induced absorption for O2 and N2 and their comparison with available observations; and (3) developed preliminary studies of non-local thermal equilibrium effects in the upper stratosphere and mesosphere for infrared gases. These new techniques were employed for analysis of balloon-borne far infrared data by a group at the Harvard-Smithsonian Center for Astrophysics. The empirical continuum absorption model for water vapor in the far infrared spectral region and the pressure-induced N2 absorption model were found to give satisfactory results in the retrieval of the mixing ratios of a number of stratospheric trace constituents from balloon-borne far infrared observations.
UPDATING APPLIED DIFFUSION MODELS
Most diffusion models currently used in air quality applications are substantially out of date with understanding of turbulence and diffusion in the planetary boundary layer. Under a Cooperative Agreement with the Environmental Protection Agency, the American Meteorological Socie...
... problems not easily examined in real life. The experiments consist of computer simulations—representations that closely match ... strategies may vary during a pandemic. The modeling experiments also indicated that people at risk for serious ...
NASA Technical Reports Server (NTRS)
Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John
2011-01-01
A method was developed of obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitude of relevance to NASA launcher designs. The base flow data was used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block procedure to validation, where cold, non-reacting test data was first used for validation, followed by more complex reacting base flow validation.
NSDL National Science Digital Library
Bob Mackay
Created by Bob MacKay, Clark College "Scientific practice involves the construction, validation and application of scientific models, so science instruction should be designed to engage students in making and ...
NSDL National Science Digital Library
1998-01-01
Designed by the International Monetary Fund (IMF), MULTIMOD is a modern, dynamic, multi-country macro model of the world economy designed to study the transmission of shocks across countries as well as the short-run and medium-run consequences of alternative monetary and fiscal policies. Country sub-models include the seven largest industrial countries--Canada, France, Germany, Italy, Japan, the UK, and the US--and an aggregate grouping of fourteen smaller industrial nations. Full documentation of MULTIMOD's current variant, the Mark III Econometric Model, and its sub-models is provided, along with downloading instructions and a bibliography of articles that either explain MULTIMOD, apply it, or are part of its technical implementation.
Theoretical Leonid entry modeling
ReVelle, D. O. (Douglas O.)
2001-01-01
In this work we present a model originally developed by ReVelle (1979, 1993) that has been applied to model large Leonid bolides with a few relatively minor modifications and one major modification which allows for catastrophic 'pancake' fragmentation processes as described below. The minor modifications include allowing the energy of ablation per unit mass for vaporization, Qvap, to be a free variable that is adjusted until agreement is obtained between the theoretical model and the statistically expected ablation coefficient for the Leonids (Group IIIB type bolide). It was found that Qvap had to be reduced by a factor of about five compared to the accepted value of Qvap for cometary materials. Alternative ways of achieving this degree of agreement between theory and observations are suggested as well. In a separate paper we apply this model to a specific Leonid bolide during the 1998 storm period.
Warren, Jeff; Iversen, Colleen; Brooks, Jonathan; Ricciuto, Daniel
2014-06-26
Using dogwood trees, Oak Ridge National Laboratory researchers are gaining a better understanding of the role photosynthesis and respiration play in the atmospheric carbon dioxide cycle. Their findings will aid computer modelers in improving the accuracy of climate simulations.
NASA Technical Reports Server (NTRS)
Jaap, John; Davis, Elizabeth; Richardson, Lea
2004-01-01
Planning and scheduling systems organize tasks into a timeline or schedule. Tasks are logically grouped into containers called models. Models are a collection of related tasks, along with their dependencies and requirements, that when met will produce the desired result. One challenging domain for a planning and scheduling system is the operation of on-board experiments for the International Space Station. In these experiments, the equipment used is among the most complex hardware ever developed; the information sought is at the cutting edge of scientific endeavor; and the procedures are intricate and exacting. Scheduling is made more difficult by a scarcity of station resources. The models to be fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibilities of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling International Space Station experiment operations calls for a maximally expressive modeling schema.
Bagger, J.A.
1984-09-01
We begin to construct the most general supersymmetric Lagrangians in one, two and four dimensions. We find that the matter couplings have a natural interpretation in the language of the nonlinear sigma model.
NASA Technical Reports Server (NTRS)
Olsen, N.; Holme, R.; Hulot, G.; Sabaka, T.; Neubert, T.; Toffner-Clausen, L.; Primdahl, F.; Jorgensen, J.; Leger, J.-M.; Barraclough, D.; Smith, David E. (Technical Monitor)
2000-01-01
Magnetic measurements taken by the Orsted satellite during geomagnetic quiet conditions around January 1, 2000 have been used to derive a spherical harmonic model of the Earth's magnetic field for epoch 2000.0. The maximum degree and order of the model is 19 for internal, and 2 for external, source fields; however, coefficients above degree 14 may not be robust. Such detailed models exist for only one previous epoch, 1980. Achieved rms misfit is 2 nT for the scalar intensity and 4 nT for the vector components perpendicular to the magnetic field. This model is of higher detail than the IGRF 2000, which for scientific purposes related to the Orsted mission it supersedes.
Tilted bulk cosmological model
NASA Astrophysics Data System (ADS)
Bagora Menaria, A.; Purohit, R.
2015-04-01
We investigated a tilted Bianchi type I cosmological model for a barotropic fluid (p = γρ, where p is the isotropic pressure and ρ the matter density, with 0 ≤ γ ≤ 1) in the presence and absence of a bulk viscous fluid. The effect of bulk viscosity on the evolution of the homogeneous cosmological models is considered. Solutions are found with a barotropic equation of state and a bulk viscosity coefficient; for this, we assume that ξθ = K (constant), where ξ is the coefficient of bulk viscosity and θ the expansion in the model. To determine the complete solution, we have assumed the relation A = BC between the metric potentials. The physical and geometrical aspects of the model in the presence and absence of bulk viscosity are also discussed.
Recht, Benjamin Harris, 1978-
2006-01-01
As the study of complex interconnected networks becomes widespread across disciplines, modeling the large-scale behavior of these systems becomes both increasingly important and increasingly difficult. In particular, it ...
NASA Technical Reports Server (NTRS)
1998-01-01
Part of the high pressure nitrogen system used for the 1% scale X-33 reaction control system model. Installed in the Unitary Plan Wind Tunnel for supersonic testing. In building 1251, test section #2.
Modeling of planarization technologies
Truque, Daniel
2007-01-01
The need for better planarity becomes more critical in semiconductor manufacturing as dimensions and tolerance margins keep shrinking. The purpose of this thesis is to understand and model new technologies for the planarization ...
NSDL National Science Digital Library
Yarden Livnat
This research page provides links to two animations of modeled mantle convection, showing the progression of convection over millions of years. There are also links to other work and publications by the author.
NSDL National Science Digital Library
2009-06-17
This is a standards-based simulation for middle school, developed to help students visualize how total energy is conserved in a simple pendulum. It depicts a child swinging on a swing suspended from a stationary point. Students can drag the swing to different heights, then activate the motion. As the swing moves in periodic motion, energy bar graphs are simultaneously displayed that show changing levels of kinetic and potential energy. The simulation is accompanied by a lesson plan and printable student activity guide. This item was created with Easy Java Simulations (EJS), a modeling tool that allows users without formal programming experience to generate computer models and simulations. To modify or customize the model, See Related Materials for detailed instructions on installing and running the EJS Modeling and Authoring Tool.
NSDL National Science Digital Library
David Joiner
The Pendulum Motion Model lets users change the length, mass, initial displacement, initial velocity, and acceleration due to gravity of a thin rigid pendulum capable of rotating a full 360 degrees. Different integration methods are provided.
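The pendulum dynamics behind the two simulations above can be sketched numerically. The following is an illustrative Python sketch (not the EJS models' own code, whose internals are not given here): it integrates the rigid-pendulum equation θ'' = -(g/L) sin θ with fourth-order Runge-Kutta, one integration method such a model might offer, and checks that total energy (kinetic plus potential) stays conserved:

```python
import math

def pendulum_step_rk4(theta, omega, dt, g=9.81, L=1.0):
    """One RK4 step for the rigid pendulum theta'' = -(g/L) sin(theta)."""
    def deriv(th, om):
        return om, -(g / L) * math.sin(th)
    k1 = deriv(theta, omega)
    k2 = deriv(theta + 0.5 * dt * k1[0], omega + 0.5 * dt * k1[1])
    k3 = deriv(theta + 0.5 * dt * k2[0], omega + 0.5 * dt * k2[1])
    k4 = deriv(theta + dt * k3[0], omega + dt * k3[1])
    theta += dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
    omega += dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return theta, omega

def energy(theta, omega, m=1.0, g=9.81, L=1.0):
    """Total energy: kinetic plus potential (zero at the lowest point)."""
    return 0.5 * m * (L * omega) ** 2 + m * g * L * (1 - math.cos(theta))

theta, omega = 1.0, 0.0          # released from rest at 1 rad
e0 = energy(theta, omega)
for _ in range(10000):           # 10 s of motion with dt = 1 ms
    theta, omega = pendulum_step_rk4(theta, omega, 0.001)
print(abs(energy(theta, omega) - e0))  # energy drift stays tiny with RK4
```

With a first-order Euler step instead of RK4 the same check shows the energy drifting upward, which is why a teaching model benefits from offering several integrators.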
Modeling Newspaper Advertising
ERIC Educational Resources Information Center
Harper, Joseph; And Others
1978-01-01
Presents a mathematical model for simulating a newspaper financial system. Includes the effects of advertising and circulation for predicting advertising linage as a function of population, income, and advertising rate. (RL)
Direct integration transmittance model
NASA Technical Reports Server (NTRS)
Kunde, V. G.; Maguire, W. C.
1973-01-01
A transmittance model was developed for the 200-2000/cm region for interpretation of high spectral resolution measurements of laboratory absorption and of planetary thermal emission. The high spectral resolution requires transmittances to be computed monochromatically by summing the contribution of individual molecular absorption lines. A magnetic tape atlas of H2O,O3, and CO2 molecular line parameters serves as input to the transmittance model with simple empirical representations used for continuum regions wherever suitable laboratory data exist. The theoretical formulation of the transmittance model and the computational procedures used for the evaluation of the transmittances are discussed. Application is demonstrated of the model to several homogenous path laboratory absorption examples.
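The monochromatic line-by-line summation described above can be sketched as follows. The line parameters are hypothetical, and only a pressure-broadened (Lorentz) line shape is used; the actual model also handles continuum regions with empirical representations:

```python
import math

def lorentz(nu, nu0, gamma):
    """Lorentz line shape centered at nu0 with half-width gamma,
    normalized to unit area over wavenumber."""
    return (gamma / math.pi) / ((nu - nu0) ** 2 + gamma ** 2)

def transmittance(nu, lines, u):
    """Monochromatic transmittance exp(-k(nu) u): sum the absorption
    coefficient contributions of individual molecular lines (strength S,
    center nu0, half-width gamma), then apply Beer's law for absorber
    amount u."""
    k = sum(S * lorentz(nu, nu0, gamma) for (S, nu0, gamma) in lines)
    return math.exp(-k * u)

# hypothetical two-line list: (strength, center wavenumber, half-width)
lines = [(2.0, 1000.0, 0.08), (0.5, 1000.5, 0.08)]
print(transmittance(1000.0, lines, u=1.0))  # nearly opaque at line center
print(transmittance(1010.0, lines, u=1.0))  # nearly transparent in the wing
```

The band-averaged transmittances the paper compares against laboratory data would then be integrals of this monochromatic quantity over the spectral response of the instrument.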
NSDL National Science Digital Library
Created by Bob MacKay, Clark College
People receive information, process this information, and respond accordingly many times each day. This sort of processing of information is essentially a conceptual model (or ...
An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities, limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...
Lagrangians for biological models
M. C. Nucci; K. M. Tamizhmani
2011-08-10
We show that a method presented in [S.L. Trubatch and A. Franco, Canonical Procedures for Population Dynamics, J. Theor. Biol. 48 (1974), 299-324] and later in [G.H. Paine, The development of Lagrangians for biological models, Bull. Math. Biol. 44 (1982) 749-760] for finding Lagrangians of classic models in biology, is actually based on finding the Jacobi Last Multiplier of such models. Using known properties of Jacobi Last Multiplier we show how to obtain linear Lagrangians of those first-order systems and nonlinear Lagrangian of the corresponding single second-order equations that can be derived from them, even in the case where those authors failed such as the host-parasite model.
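The key relations behind this method, summarized from standard treatments of the Jacobi Last Multiplier (a sketch of the general idea, not the paper's derivation):

```latex
% For a first-order system \dot{x}_i = f_i(t, x), a Jacobi Last Multiplier M satisfies
\frac{d}{dt}\log M + \sum_i \frac{\partial f_i}{\partial x_i} = 0 .
% For a single second-order equation \ddot{x} = F(t, x, \dot{x}) derived from the
% system, M is linked to a Lagrangian L(t, x, \dot{x}) through
M = \frac{\partial^2 L}{\partial \dot{x}^2} ,
% so once M is known, L follows by integrating twice with respect to \dot{x}
% (up to gauge terms).
```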
NASA Technical Reports Server (NTRS)
Glaese, John R.; Tobbe, Patrick A.
1986-01-01
The Space Station Mechanism Test Bed consists of a hydraulically driven, computer controlled six degree of freedom (DOF) motion system with which docking, berthing, and other mechanisms can be evaluated. Measured contact forces and moments are provided to the simulation host computer to enable representation of orbital contact dynamics. This report describes the development of a generalized math model which represents the relative motion between two rigid orbiting vehicles. The model allows motion in six DOF for each body, with no vehicle size limitation. The rotational and translational equations of motion are derived. The method used to transform the forces and moments from the sensor location to the vehicles' centers of mass is also explained. Two math models of docking mechanisms, a simple translational spring and the Remote Manipulator System end effector, are presented along with simulation results. The translational spring model is used in an attempt to verify the simulation with compensated hardware in the loop results.
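The transformation of measured loads from the sensor location to a vehicle's center of mass, mentioned in the report, can be sketched as a standard force-couple transfer (a minimal illustration; the report's actual method would also involve frame rotations, which are omitted here):

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def transfer_load(force_s, moment_s, r):
    """Move a measured force/moment pair from the sensor to the center of
    mass. The force is unchanged; the moment picks up the couple produced
    by the force acting at an offset: M_cm = M_s + r x F, where r points
    from the center of mass to the sensor location."""
    moment_cm = tuple(m + c for m, c in zip(moment_s, cross(r, force_s)))
    return force_s, moment_cm

# contact force measured 2 m from the CM along x (hypothetical numbers)
F, M = transfer_load((0.0, 100.0, 0.0), (0.0, 0.0, 0.0), (2.0, 0.0, 0.0))
print(M)  # couple about z: (0.0, 0.0, 200.0)
```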
NASA Technical Reports Server (NTRS)
Weissman, Paul R.; Kieffer, Hugh H.
1987-01-01
The past year was one of tremendous activity because of the appearance of Halley's Comet. Observations of the comet were collected from a number of sources and compared with the detailed predictions of the comet thermal modeling program. Spacecraft observations of key physical parameters for cometary nucleus were incorporated into the thermal model and new cases run. These results have led to a much better understanding of physical processes on the nucleus and have pointed the way for further improvements to the modeling program. A model for the large-scale structure of cometary nuclei was proposed in which comets were envisioned as loosely bound agglomerations of smaller icy planetesimals, essentially a rubble pile of primordial dirty snowballs. In addition, a study of the physical history of comets was begun, concentrating on processes during formation and in the Oort cloud which would alter the volatile and nonvolatile materials in cometary nuclei from their pristine state before formation.
Dietary Exposure Potential Model
Existing food consumption and contaminant residue databases, typically products of nutrition and regulatory monitoring, contain useful information to characterize dietary intake of environmental chemicals. A PC-based model with resident database system, termed the Die...
Model Fundamentals - version 2
NSDL National Science Digital Library
2014-09-14
Model Fundamentals, part of the Numerical Weather Prediction Professional Development Series and the "NWP Training Series: Effective Use of NWP in the Forecast Process", describes the components of an NWP model and how they fit into the forecast development process. It also explores why parameterization of many physical processes is necessary in NWP models. The module covers background concepts and terminology necessary for learning from the other modules in this series on NWP. Back in 2000, the subject matter expert for this module was Dr. Ralph Petersen of the National Centers for Environmental Prediction, Environmental Modeling Center (NCEP/EMC). Revisions to the module were made in 2009 by Drs. Bill Bua and Stephen Jascourt, from the NWP team at UCAR/COMET.
ADVANCED CHEMISTRY BASINS MODEL
William Goddard III; Lawrence Cathles III; Mario Blanco; Paul Manhardt; Peter Meulbroek; Yongchun Tang
2004-05-01
The advanced Chemistry Basin Model project has been operative for 48 months. During this period, about half the project tasks are on projected schedule. On average the project is somewhat behind schedule (90%). Unanticipated issues are causing model integration to take longer than scheduled, delaying final debugging and manual development. It is anticipated that a short extension will be required to fulfill all contract obligations.
José J. Canals; Steven Stern
2001-01-01
This chapter discusses various methods that have been used to estimate structural models of job search. The focus is mainly on using available data to estimate the parameters of the wage offer distribution, the reservation wage or reservation wage function, the cost of search, the offer arrival rate, and the discount rate. We describe models, estimation procedures, and issues associated
Modeling Orbital Debris Problems
NSDL National Science Digital Library
2011-01-24
This algebra lesson from Illuminations helps students develop their understanding of mathematical functions and modeling using spreadsheets, graphing calculators, and computer graphing utilities. The differences between linear, quadratic and exponential models are described. Students will also improve their understanding of how to choose the appropriate graphical representations for data. The material is intended for grades 9-12 and should require 5 class periods to complete.
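The distinction between the model families in this lesson can be illustrated with a small least-squares sketch in Python (hypothetical data; note that an exponential model becomes linear after a log transform, which is one way students can fit it with the same tools):

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares fit y = a + b*x (closed-form slope/intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Data growing by a constant factor is linear in log space, so fitting a
# line to log(y) recovers an exponential model y = A * r**x.
xs = [0, 1, 2, 3, 4]
ys = [3 * 2 ** x for x in xs]               # synthetic exponential data
a, b = fit_line(xs, [math.log(y) for y in ys])
print(round(math.exp(a)), round(math.exp(b)))  # recovers A = 3, r = 2
```

Linear and quadratic fits, by contrast, go directly through `fit_line` or a polynomial extension of it, which is the comparison the lesson asks students to make.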
Improved steamflood analytical model
Chandra, Suandy
2006-10-30
two field cases, a 45x23x8 model was used that represented 1/8 of a 10-acre 5-spot pattern unit, using typical rock and reservoir fluid properties. In the SPE project case, three models were used: 23x12x12 (2.5 ac), 31x16x12 (5 ac) and 45x23x8 (10 ac...
Computational Human Body Models
Jac Wismans; Riender Happee; J. A. W. Dommelen
Computational human body models are widely used for automotive crash safety research and design and as such have significantly contributed to a reduction of traffic injuries and fatalities. Currently crash simulations are mainly performed using models based on crash dummies. However, crash dummies differ significantly from the real human body and moreover crash dummies are only available for a limited set of
Reviewed Work: Modeling Nature
Pierotti, Raymond
1986-11-01
Shore Dr., Chicago, IL 60605. Modeling Nature. Sharon E. Kingsland. 1985. Univ. of Chicago Press, Chicago. 267 p. The use of mathematics as a tool in attempting to describe biological processes has a long and stormy history. Perhaps nowhere has... this procedure engendered more controversy than in the field of population biology. What is perhaps most interesting (some might regard it as amusing) about the repeated controversies over the use of mathematical models in population biology is that almost...
NASA Technical Reports Server (NTRS)
Callis, S. L.; Sakamoto, C.
1984-01-01
Five models based on multiple regression were developed to estimate wheat yields for the five wheat growing provinces of Argentina. Meteorological data sets were obtained for each province by averaging data for stations within each province. Predictor variables for the models were derived from monthly total precipitation, average monthly mean temperature, and average monthly maximum temperature. Buenos Aires was the only province for which a trend variable was included because of increasing trend in yield due to technology from 1950 to 1963.
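A multiple-regression yield model of the kind described can be sketched via the normal equations; the province data and the coefficients below are synthetic illustrations, not the paper's Argentine values:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_multiple(rows, ys):
    """OLS fit y = b0 + b1*x1 + ... via the normal equations X'X b = X'y."""
    X = [[1.0] + list(r) for r in rows]
    k = len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(len(X))) for b in range(k)]
           for a in range(k)]
    Xty = [sum(X[i][a] * ys[i] for i in range(len(X))) for a in range(k)]
    return solve(XtX, Xty)

# hypothetical province records: (total precipitation, mean temperature)
data = [(300, 15), (350, 14), (250, 16), (400, 13), (320, 15)]
ys = [10 + 0.02 * p - 0.5 * t for p, t in data]   # exact synthetic relation
b0, b1, b2 = fit_multiple(data, ys)
print(round(b0, 3), round(b1, 4), round(b2, 3))   # recovers 10, 0.02, -0.5
```

A technology-trend term, as used for Buenos Aires, would simply be one more column (e.g. the year) in `data`.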
Andre Krueger
2002-06-24
Recent results of tests of the Standard Model of electroweak interactions are presented. Data are used from the four LEP experiments, ALEPH, DELPHI, L3, OPAL, the SLD experiment at SLC, the TEVATRON p-pbar experiments CDF and D0 and the NuTeV neutrino experiment. chi2-fits are performed in order to study the consistency of the Standard Model of electroweak interactions.
Mathematics for dynamic modeling
Beltrami, E.
1987-01-01
Mathematical modeling techniques for dynamical systems are presented in an introductory textbook intended for upper-undergraduate and graduate science and engineering students. Chapters are devoted to simple dynamic models, stable and unstable motion, growth and decay, motion in time and space, cycles and bifurcation, bifurcation and catastrophe, chaos, and optimal controls. Specific applications to biological systems, traffic control, and the geomagnetic field are considered; and diagrams, graphs, exercises, and a review of ODEs are included. 50 references.
CONCRETE MIX TRANSPORTATION MODELLING
Edwin Koźniewski; Zygmunt Orłowski
2003-01-01
The paper presents a simulation model of the concrete mix transportation process in a system linking a batching centre with a number of construction sites. An essential part of the analysed model is a multi-phase system of queues which emerge in concrete transportation, characterised by: waiting time for loading, waiting time for discharge, the distances from the plant to the construction sites, and the number and type of truck
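The multi-phase queue the paper describes can be sketched as a minimal deterministic simulation. All times and the fleet below are hypothetical, and a single loader at the plant plus a single discharge point at the site are assumed:

```python
def simulate(arrivals, load_time, travel_time, discharge_time):
    """Two-phase queue sketch: each truck queues for a single loader at
    the batching plant, travels to the site, then queues for a single
    discharge point. Returns per-truck (wait_for_loading, wait_for_discharge)."""
    loader_free = 0.0      # time at which the loader becomes available
    discharge_free = 0.0   # time at which the discharge point becomes available
    waits = []
    for t in arrivals:                      # trucks served in arrival order
        start_load = max(t, loader_free)
        loader_free = start_load + load_time
        arrive_site = loader_free + travel_time
        start_disch = max(arrive_site, discharge_free)
        discharge_free = start_disch + discharge_time
        waits.append((start_load - t, start_disch - arrive_site))
    return waits

# hypothetical fleet: trucks reach the plant every 5 min, loading takes 8 min,
# so the loading queue grows while the discharge point never saturates
print(simulate([0, 5, 10, 15], load_time=8, travel_time=20, discharge_time=6))
```

Even this toy version shows the paper's point: which phase dominates the waiting time depends on the balance between arrival spacing, loading rate, and discharge rate.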
T. G. Forbes; J. A. Linker; J. Chen; C. Cid; J. Kóta; M. A. Lee; G. Mann; Z. Mikic; M. S. Potgieter; J. M. Schmidt; G. L. Siscoe; R. Vainio; S. K. Antiochos; P. Riley
2006-01-01
This chapter provides an overview of current efforts in the theory and modeling of CMEs. Five key areas are discussed: (1) CME initiation; (2) CME evolution and propagation; (3) the structure of interplanetary CMEs derived from flux rope modeling; (4) CME shock formation in the inner corona; and (5) particle acceleration and transport at CME-driven shocks. In the section
Theory Modeling and Simulation
Shlachter, Jack [Los Alamos National Laboratory
2012-08-23
Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.
Lipkin, H.J.
1986-01-01
The success of simple constituent quark models in single-hadron physics and their failure in multiquark physics is discussed, emphasizing the relation between meson and baryon spectra, hidden color and the color matrix, breakup decay modes, coupled channels, and hadron-hadron interactions via flipping and tunneling of flux tubes. Model-independent predictions for possible multiquark bound states are considered and the most promising candidates suggested. A quark approach to baryon-baryon interactions is discussed.
Animal models of steatohepatitis
Ayman Koteish; Anna Mae Diehl
2002-01-01
Animal models of hepatic steatosis and steatohepatitis have improved our understanding of the pathogenesis of non-alcoholic fatty liver disease (NAFLD). Three models, genetically obese ob/ob mice, lipoatrophic mice and normal rats fed choline-deficient, methionine-restricted diets, have been particularly informative. All support the multiple 'hit' hypothesis for NAFLD pathogenesis that suggests that fatty livers are unusually vulnerable to oxidants and develop
Ion Thruster Performance Model.
NASA Astrophysics Data System (ADS)
Brophy, John Raymond
A model of ion thruster performance is developed for high flux density cusped magnetic field thruster designs. This model is formulated in terms of the average energy required to produce an ion in the discharge chamber plasma and the fraction of these ions that are extracted to form the beam. The direct loss of high energy (primary) electrons from the plasma to the anode is shown to have a major effect on thruster performance. The model provides simple algebraic equations enabling one to calculate the beam ion energy cost, the average discharge chamber plasma ion energy cost, the primary electron density, the primary-to-Maxwellian electron density ratio and the Maxwellian electron temperature. Experiments indicate that the model correctly predicts the variation in plasma ion energy cost for changes in propellant gas (Ar, Kr and Xe), grid transparency to neutral atoms, beam extraction area, discharge voltage, and discharge chamber wall temperature. The model and experiments indicate that thruster performance may be described in terms of only four thruster configuration dependent parameters and two operating parameters. The model also suggests that improved performance should be exhibited by thruster designs which extract a large fraction of the ions produced in the discharge chamber, which have good primary electron and neutral atom containment and which operate at high propellant flow rates. In addition, it suggests that hollow cathode efficiency becomes increasingly important to the discharge chamber performance as the discharge voltage is reduced. Finally, the utility of the model in mission analysis calculations is demonstrated. The model makes it easy to determine which changes in thruster design or operating parameters have the greatest effect on the payload fraction and/or mission duration.
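The model's central bookkeeping, as stated in the abstract, combines the average energy spent per plasma ion with the fraction of ions extracted into the beam. A minimal reading of that relation (an illustrative sketch with hypothetical numbers, not Brophy's full equation set):

```python
def beam_ion_energy_cost(eps_plasma, f_beam):
    """Energy cost per beam ion, assuming eps_plasma eV is spent per ion
    created in the discharge plasma and only a fraction f_beam of those
    ions is extracted into the beam; energy invested in unextracted ions
    is effectively lost. (Hypothetical illustration of the abstract's two
    key parameters.)"""
    return eps_plasma / f_beam

# extracting a larger fraction of the produced ions lowers the cost per
# beam ion, which is the design direction the abstract recommends
print(beam_ion_energy_cost(150.0, 0.5))   # 300.0 eV per beam ion
print(beam_ion_energy_cost(150.0, 0.8))   # 187.5 eV per beam ion
```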
Shears, Tara
2012-02-28
The Standard Model is the theory used to describe the interactions between fundamental particles and fundamental forces. It is remarkably successful at predicting the outcome of particle physics experiments. However, the theory has not yet been completely verified. In particular, one of the most vital constituents, the Higgs boson, has not yet been observed. This paper describes the Standard Model, the experimental tests of the theory that have led to its acceptance and its shortcomings. PMID:22253237