Science.gov

Sample records for advanced modeling simulation

  1. DEVELOPMENT OF THE ADVANCED UTILITY SIMULATION MODEL

    EPA Science Inventory

The paper discusses the development of the Advanced Utility Simulation Model (AUSM), developed for the National Acid Precipitation Assessment Program (NAPAP), to forecast air emissions of pollutants from electric utilities. AUSM integrates generating unit engineering detail with d...

  2. Center for Advanced Modeling and Simulation Intern

    SciTech Connect

    Gertman, Vanessa

    2010-01-01

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  3. Center for Advanced Modeling and Simulation Intern

    ScienceCinema

    Gertman, Vanessa

    2013-05-28

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  4. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M&S Environments and Infrastructure.

  5. Gasification CFD Modeling for Advanced Power Plant Simulations

    SciTech Connect

    Zitney, S.E.; Guenther, C.P.

    2005-09-01

In this paper we have described recent progress on developing CFD models for two commercial-scale gasifiers, including a two-stage, coal slurry-fed, oxygen-blown, pressurized, entrained-flow gasifier and a scaled-up design of the PSDF transport gasifier. Also highlighted was NETL's Advanced Process Engineering Co-Simulator (APECS) for coupling high-fidelity equipment models with process simulation for the design, analysis, and optimization of advanced power plants. Using APECS, we have coupled the entrained-flow gasifier CFD model into a coal-fired, gasification-based FutureGen power and hydrogen production plant. The results for the FutureGen co-simulation illustrate how the APECS technology can help engineers better understand and optimize gasifier fluid dynamics and related phenomena that impact overall power plant performance.

  6. ADVANCED UTILITY SIMULATION MODEL DESCRIPTION OF MODIFICATIONS TO THE STATE LEVEL MODEL (VERSION 3.0)

    EPA Science Inventory

    The report documents modifications to the state level model portion of the Advanced Utility Simulation Model (AUSM), one of four stationary source emission and control cost forecasting models developed for the National Acid Precipitation Assessment Program (NAPAP). The AUSM model...

  7. Advanced modeling and simulation to design and manufacture high performance and reliable advanced microelectronics and microsystems.

    SciTech Connect

    Nettleship, Ian (University of Pittsburgh, Pittsburgh, PA); Hinklin, Thomas; Holcomb, David Joseph; Tandon, Rajan; Arguello, Jose Guadalupe, Jr.; Dempsey, James Franklin; Ewsuk, Kevin Gregory; Neilsen, Michael K.; Lanagan, Michael (Pennsylvania State University, University Park, PA)

    2007-07-01

An interdisciplinary team of scientists and engineers having broad expertise in materials processing and properties, materials characterization, and computational mechanics was assembled to develop science-based modeling/simulation technology to design and reproducibly manufacture high-performance, reliable, complex microelectronics and microsystems. The team's efforts focused on defining and developing a science-based infrastructure to enable predictive compaction, sintering, stress, and thermomechanical modeling in "real systems", including: (1) developing techniques for determining the materials properties and constitutive behavior required for modeling; (2) developing new and improved/updated models and modeling capabilities; (3) ensuring that models are representative of the physical phenomena being simulated; and (4) assessing existing modeling capabilities to identify the advances necessary to facilitate the practical application of Sandia's predictive modeling technology.

  8. Enabling Advanced Modeling and Simulations for Fuel-Flexible Combustors

    SciTech Connect

    Pitsch, Heinz

    2010-05-31

The overall goal of the present project is to enable advanced modeling and simulations for the design and optimization of fuel-flexible turbine combustors. For this purpose we use a high-fidelity, extensively tested large-eddy simulation (LES) code and state-of-the-art models for premixed/partially-premixed turbulent combustion developed in the PI's group. Within the framework of the present project, these techniques are applied, assessed, and improved for hydrogen-enriched premixed and partially premixed gas-turbine combustion. Our innovative approaches include a completely consistent description of flame propagation, a coupled progress variable/level set method to resolve the detailed flame structure, and incorporation of thermal-diffusion (non-unity Lewis number) effects. In addition, we have developed a general flamelet-type transformation holding in the limits of both non-premixed and premixed burning. As a result, a model for partially premixed combustion has been derived. The coupled progress variable/level set method and the general flamelet transformation were validated by LES of a lean-premixed low-swirl burner that has been studied experimentally at Lawrence Berkeley National Laboratory. The model is extended to include non-unity Lewis number effects, which play a critical role in fuel-flexible combustors burning high-hydrogen-content fuels. More specifically, a two-scalar model for lean hydrogen and hydrogen-enriched combustion is developed and validated against experimental and direct numerical simulation (DNS) data. Results are presented to emphasize the importance of non-unity Lewis number effects in the lean-premixed low-swirl burner of interest in this project. The proposed model gives improved results, showing that the inclusion of non-unity Lewis number effects is essential for accurate prediction of the lean-premixed low-swirl flame.
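
    For readers unfamiliar with the coupled progress variable/level set approach named above, the display below gives generic textbook forms of the two transport equations involved. This is an illustrative sketch of standard formulations, not necessarily the exact equations used in this project.

      \[
      \frac{\partial G}{\partial t} + \mathbf{u}\cdot\nabla G = s_T\,|\nabla G|,
      \qquad
      \frac{\partial(\rho c)}{\partial t} + \nabla\cdot(\rho\mathbf{u}c)
        = \nabla\cdot(\rho D_c\,\nabla c) + \dot{\omega}_c,
      \]

    where the level set G = 0 marks the flame front, s_T is the (turbulent) burning velocity, c is the reaction progress variable, D_c its diffusivity, and \dot{\omega}_c the chemical source term.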

  9. Enabling Advanced Modeling and Simulations for Fuel-Flexible Combustors

    SciTech Connect

    Heinz Pitsch

    2010-05-31

The overall goal of the present project is to enable advanced modeling and simulations for the design and optimization of fuel-flexible turbine combustors. For this purpose we use a high-fidelity, extensively tested large-eddy simulation (LES) code and state-of-the-art models for premixed/partially-premixed turbulent combustion developed in the PI's group. Within the framework of the present project, these techniques are applied, assessed, and improved for hydrogen-enriched premixed and partially premixed gas-turbine combustion. Our innovative approaches include a completely consistent description of flame propagation, a coupled progress variable/level set method to resolve the detailed flame structure, and incorporation of thermal-diffusion (non-unity Lewis number) effects. In addition, we have developed a general flamelet-type transformation holding in the limits of both non-premixed and premixed burning. As a result, a model for partially premixed combustion has been derived. The coupled progress variable/level set method and the general flamelet transformation were validated by LES of a lean-premixed low-swirl burner that has been studied experimentally at Lawrence Berkeley National Laboratory. The model is extended to include non-unity Lewis number effects, which play a critical role in fuel-flexible combustors burning high-hydrogen-content fuels. More specifically, a two-scalar model for lean hydrogen and hydrogen-enriched combustion is developed and validated against experimental and direct numerical simulation (DNS) data. Results are presented to emphasize the importance of non-unity Lewis number effects in the lean-premixed low-swirl burner of interest in this project. The proposed model gives improved results, showing that the inclusion of non-unity Lewis number effects is essential for accurate prediction of the lean-premixed low-swirl flame.

  10. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    SciTech Connect

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and ''smart'' wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow in the wellbore); and (3) accurate approaches to account for the effects of reservoir heterogeneity and for the optimization of nonconventional well deployment. An overview of our progress in each of these main areas is as follows. A general purpose object-oriented research simulator (GPRS) was developed under this project. The GPRS code is managed using modern software management techniques and has been deployed to many companies and research institutions. The simulator includes general black-oil and compositional modeling modules. The formulation is general in that it allows for the selection of a wide variety of primary and secondary variables and accommodates varying degrees of solution implicitness. Specifically, we developed and implemented an IMPSAT procedure (implicit in pressure and saturation, explicit in all other variables) for compositional modeling as well as an adaptive implicit procedure. Both of these capabilities allow for efficiency gains through selective implicitness. The code treats cell connections through a general connection list, which allows it to accommodate both structured and unstructured grids. The GPRS code was written to be easily extendable so new modeling techniques can be readily incorporated. Along these lines, we developed a new dual porosity module compatible with the GPRS framework, as well as a new discrete fracture model applicable for fractured or faulted reservoirs. Both of these methods display substantial advantages over previous implementations. 
Further, we assessed the performance of different preconditioners in an attempt to improve the efficiency of the linear solver. As a result of this investigation, substantial improvements in solver performance were achieved.
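
    The "general connection list" mentioned above is the feature that lets one flux-assembly loop serve both structured and unstructured grids. The Python sketch below is a hypothetical, heavily simplified illustration of that idea; it is not taken from GPRS, and the class and function names are invented for this example.

      # Hypothetical sketch of a cell-connection list for flux assembly (single phase).
      # Each connection stores two cell indices and a transmissibility, so the same
      # loop works whether the grid is structured or unstructured.
      from dataclasses import dataclass

      @dataclass
      class Connection:
          i: int        # index of first cell
          j: int        # index of second cell
          trans: float  # geometric transmissibility of the shared face

      def assemble_fluxes(pressure, connections, mobility):
          """Accumulate fluxes cell by cell: q_ij = T_ij * mobility * (p_i - p_j)."""
          flux = [0.0] * len(pressure)
          for c in connections:
              q = c.trans * mobility * (pressure[c.i] - pressure[c.j])
              flux[c.i] -= q   # flow leaving cell i
              flux[c.j] += q   # flow entering cell j
          return flux

      # A 1D three-cell example; an unstructured grid would differ only in the list.
      conns = [Connection(0, 1, 2.0), Connection(1, 2, 2.0)]
      print(assemble_fluxes([3.0, 2.0, 1.0], conns, mobility=1.0))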

  11. Advanced 3D Photocathode Modeling and Simulations Final Report

    SciTech Connect

    Dimitre A Dimitrov; David L Bruhwiler

    2005-06-06

High-brightness electron beams required by the proposed Next Linear Collider demand strong advances in photocathode electron gun performance. Significant improvement in the production of such beams with rf photocathode electron guns is hampered by the lack of high-fidelity simulations. The critical missing piece in existing gun codes is a physics-based, detailed treatment of the very complex and highly nonlinear photoemission process.

  12. Advancing botnet modeling techniques for military and security simulations

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    2011-06-01

Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a type of malware that is more powerful and potentially more dangerous than any other type of malware. A botnet's power derives from several capabilities, including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, it can also be extremely large, comprising tens of thousands, if not millions, of compromised computers, or as small as a few thousand targeted systems. In all botnets, their members can surreptitiously communicate with each other and their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.
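
    As a rough illustration of the compartmental part of the proposed approach, the sketch below integrates a generic MSEIR system of ordinary differential equations in Python. The compartments, rate constants, and initial conditions are made up for this example; it is not the authors' calibrated botnet model, and it omits the jump-diffusion component entirely.

      # Generic MSEIR compartment model: M (protected hosts), S (susceptible),
      # E (exposed), I (active bots), R (removed/cleaned). Parameters are illustrative.
      import numpy as np
      from scipy.integrate import solve_ivp

      delta, beta, sigma, gamma = 0.05, 0.6, 0.3, 0.1   # made-up rate constants

      def mseir(t, y):
          M, S, E, I, R = y
          N = M + S + E + I + R
          dM = -delta * M                      # protected hosts lose protection
          dS = delta * M - beta * S * I / N    # susceptibles become exposed on contact
          dE = beta * S * I / N - sigma * E    # exposed hosts turn into active bots
          dI = sigma * E - gamma * I           # bots are eventually detected and cleaned
          dR = gamma * I
          return [dM, dS, dE, dI, dR]

      y0 = [0.2, 0.79, 0.0, 0.01, 0.0]          # fractions of the host population
      sol = solve_ivp(mseir, (0, 200), y0, max_step=0.5)
      print("peak infected fraction:", sol.y[3].max())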

  13. AN ADVANCED METHODOLOGY FOR HETERODOX SIMULATION MODELS BASED ON CRITICAL REALISM

    E-print Network

    Tesfatsion, Leigh

    uncertainty. We base our advanced methodology on Critical Realism, because it deals with inherent uncertainty in different countries. Keywords Methodology, Heterodox Simulation Models, Critical Realism, Uncertainty JEL. To avoid this impression, we suggest empirically calibrating heterodox simulation models in a way

  14. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    SciTech Connect

    Durlofsky, Louis J.

    2000-08-28

    This project targets the development of (1) advanced reservoir simulation techniques for modeling non-conventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and well index (for use in simulation models), including the effects of wellbore flow; and (3) accurate approaches to account for heterogeneity in the near-well region.

  15. ADVANCED UTILITY SIMULATION MODEL, REPORT OF SENSITIVITY TESTING, CALIBRATION, AND MODEL OUTPUT COMPARISONS (VERSION 3.0)

    EPA Science Inventory

    The report gives results of activities relating to the Advanced Utility Simulation Model (AUSM): sensitivity testing. comparison with a mature electric utility model, and calibration to historical emissions. The activities were aimed at demonstrating AUSM's validity over input va...

  16. Advanced Stochastic Modeling and Simulation (SE/ME714) Spring 2009

    E-print Network

    Lin, Xi

Advanced Stochastic Modeling and Simulation (SE/ME714) Spring 2009 INSTRUCTOR Pirooz Vakili Division of Systems Engineering & Mechanical Engineering Department 15 St. Mary's Street, Room 126 Phone) Martingales (f) Stochastic order relations 2. Simulation (a) Random number generation (b) Statistical analysis

  17. Gastrointestinal Simulation Based on the Advanced Compartmental Absorption and Transit (ACAT) Model

    E-print Network

    Bolger, Michael

    2006-10-26

Excerpted slide content: Compartmental Absorption & Transit (CAT) model (Yu, LX and Amidon, GL, 1996); Heterogeneous Tube Model (Kalampokis, A and Macheras, P, 1999); Advanced CAT Model (ACAT) (Simulations Plus, Inc., 1998-2006). [Figure: data integration tool linking the gastrointestinal tract, metabolism, and blood.] Rate constants: transit, K_t(i) = 1/transit time(i); controlled release, K_r from dose * time-release profile; dissolution, K_d(i) = 3D(C_S - C_L)/(rho r^2), where rho = particle density and r = initial particle radius. Advanced...
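
    Rendered as equations, one plausible reading of the rate constants in the fragment above is given below. The diffusion coefficient D in the dissolution expression is an assumption, since that symbol did not survive the text extraction.

      \[
      K_t(i) = \frac{1}{\text{transit time}(i)},
      \qquad
      K_d(i) = \frac{3\,D\,(C_S - C_L)}{\rho\, r^{2}},
      \]

    with C_S the solubility, C_L the local luminal concentration, \rho the particle density, and r the initial particle radius.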

  18. Advanced simulation noise model for modern fighter aircraft

    NASA Astrophysics Data System (ADS)

    Ikelheimer, Bruce

    2005-09-01

    NoiseMap currently represents the state of the art for military airfield noise analysis. While this model is sufficient for the current fleet of aircraft, it has limits in its capability to model the new generation of fighter aircraft like the JSF and the F-22. These aircraft's high-powered engines produce noise with significant nonlinear content. Combining this with their ability to vector the thrust means they have noise characteristics that are outside of the basic modeling assumptions of the currently available noise models. Wyle Laboratories, Penn State University, and University of Alabama are in the process of developing a new noise propagation model for the Strategic Environmental Research and Development Program. Source characterization will be through complete spheres (or hemispheres if there is not sufficient data) for each aircraft state (including thrust vector angles). Fixed and rotor wing aircraft will be included. Broadband, narrowband, and pure tone propagation will be included. The model will account for complex terrain and weather effects, as well as the effects of nonlinear propagation. It will be a complete model capable of handling a range of noise sources from small subsonic general aviation aircraft to the latest fighter aircraft like the JSF.

  19. ADVANCED UTILITY SIMULATION MODEL, DESCRIPTION OF THE NATIONAL LOOP (VERSION 3.0)

    EPA Science Inventory

    The report is one of 11 in a series describing the initial development of the Advanced Utility Simulation Model (AUSM) by the Universities Research Group on Energy (URGE) and its continued development by the Science Applications International Corporation (SAIC) research team. The...

  20. Development of a VOR/DME model for an advanced concepts simulator

    NASA Technical Reports Server (NTRS)

    Steinmetz, G. G.; Bowles, R. L.

    1984-01-01

The report presents a definition of a VOR/DME airborne and ground systems simulation model. This description was drafted in response to a need arising in the creation of an advanced concepts simulation in which flight station designs for the 1980s era can be postulated and examined. The simulation model described herein provides a reasonable representation of VOR/DME stations in the continental United States, including area coverage by type and noise errors. The detail in which the model has been cast provides the interested researcher with a moderate-fidelity simulation tool for conducting research and evaluation of navigation algorithms. Assumptions made within the development are listed and place certain responsibilities (data bases, communication with other simulation modules, uniform round earth, etc.) upon the researcher.

  21. The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering

    NASA Technical Reports Server (NTRS)

    Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen

    2006-01-01

This paper summarizes a subset of the Advanced Modeling Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered to "identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains five sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-Based Robotics Manufacture and Servicing Models.

  22. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    SciTech Connect

    Schultz, Peter Andrew

    2011-12-01

The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum-scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum-scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum-scale M&S destined for use in the NEAMS Waste IPSC workflow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum-scale phenomena.

  23. ADVANCED UTILITY SIMULATION MODEL, REPORT OF SENSITIVITY TESTING, CALIBRATION, AND MODEL OUTPUT COMPARISONS (VERSION 3.0) TAPE

    EPA Science Inventory

    The report is one of 11 in a series describing the initial development of the Advanced Utility Simulation Model (AUSM) by the Universities Research Group on Energy (URGE) and its continued development by the Science Applications International Corporation (SAIC) research team. The...

  24. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  25. Strategic Plan for Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    SciTech Connect

    Kimberlyn C. Mousseau

    2011-10-01

The Nuclear Energy Computational Fluid Dynamics Advanced Modeling and Simulation (NE-CAMS) system is being developed at the Idaho National Laboratory (INL) in collaboration with Bettis Laboratory, Sandia National Laboratories (SNL), Argonne National Laboratory (ANL), Utah State University (USU), and other interested parties with the objective of developing and implementing a comprehensive and readily accessible data and information management system for computational fluid dynamics (CFD) verification and validation (V&V) in support of nuclear energy systems design and safety analysis. The two key objectives of the NE-CAMS effort are to identify, collect, assess, store and maintain high-resolution, high-quality experimental data and related expert knowledge (metadata) for use in CFD V&V assessments specific to the nuclear energy field, and to establish a working relationship with the U.S. Nuclear Regulatory Commission (NRC) to develop a CFD V&V database, including benchmark cases, that addresses and supports the associated NRC regulations and policies on the use of CFD analysis. In particular, the NE-CAMS system will support the Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program, which aims to develop and deploy advanced modeling and simulation methods and computational tools for reliable numerical simulation of nuclear reactor systems for design and safety analysis. Primary NE-CAMS elements: There are four primary elements of the NE-CAMS knowledge base designed to support computer modeling and simulation in the nuclear energy arena, as listed below. Element 1: The database will contain experimental data that can be used for CFD validation that is relevant to nuclear reactor and plant processes, particularly those important to the nuclear industry and the NRC. Element 2: Qualification standards for data evaluation and classification will be incorporated and applied such that validation data sets will result in well-defined, well-characterized data. Element 3: Standards will be established for the design and operation of experiments for the generation of new validation data sets to be submitted to NE-CAMS that address the completeness and characterization of the dataset. Element 4: Standards will be developed for performing verification and validation (V&V) to establish confidence levels in CFD analyses of nuclear reactor processes, such that these analyses will be acceptable to and recognized by both CFD experts and the NRC.

  26. Simulating carbon exchange using a regional atmospheric model coupled to an advanced land-surface model

    NASA Astrophysics Data System (ADS)

    Ter Maat, H. W.; Hutjes, R. W. A.

    2008-10-01

A large mismatch exists between our understanding and quantification of ecosystem-atmosphere exchange of carbon dioxide at local and continental scales. This paper will focus on the carbon exchange on the regional scale to address the following question: What are the main controlling factors determining atmospheric carbon dioxide content at a regional scale? We use the Regional Atmospheric Modelling System (RAMS), coupled with a land surface scheme simulating carbon, heat and momentum fluxes (SWAPS-C), and also including submodels for urban and marine fluxes, which in principle include the main controlling mechanisms and capture the relevant dynamics of the system. To validate the model, observations are used which were taken during an intensive observational campaign in the central Netherlands in summer 2002. These included flux-site observations, vertical profiles at tall towers and spatial fluxes of various variables taken by aircraft. The coupled regional model (RAMS-SWAPS-C) generally reproduces the observations well. The validation of the model demonstrates that surface fluxes of heat, water and CO2 are reasonably well simulated. The comparison against aircraft data shows that the regional meteorology is captured by the model. Comparing spatially explicit simulated and observed fluxes, we conclude that the model generally underestimates latent heat fluxes relative to the observations, which exhibit large standard deviations for all flights. Sensitivity experiments demonstrated the relevance of urban emissions of carbon dioxide for the carbon balance in this particular region. The same tests also show the relation between uncertainties in surface fluxes and those in atmospheric concentrations.

  27. Strategic Plan for Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    SciTech Connect

    Rich Johnson; Kimberlyn C. Mousseau; Hyung Lee

    2011-09-01

    NE-KAMS knowledge base will assist computational analysts, physics model developers, experimentalists, nuclear reactor designers, and federal regulators by: (1) Establishing accepted standards, requirements and best practices for V&V and UQ of computational models and simulations, (2) Establishing accepted standards and procedures for qualifying and classifying experimental and numerical benchmark data, (3) Providing readily accessible databases for nuclear energy related experimental and numerical benchmark data that can be used in V&V assessments and computational methods development, (4) Providing a searchable knowledge base of information, documents and data on V&V and UQ, and (5) Providing web-enabled applications, tools and utilities for V&V and UQ activities, data assessment and processing, and information and data searches. From its inception, NE-KAMS will directly support nuclear energy research, development and demonstration programs within the U.S. Department of Energy (DOE), including the Consortium for Advanced Simulation of Light Water Reactors (CASL), the Nuclear Energy Advanced Modeling and Simulation (NEAMS), the Light Water Reactor Sustainability (LWRS), the Small Modular Reactors (SMR), and the Next Generation Nuclear Power Plant (NGNP) programs. These programs all involve computational modeling and simulation (M&S) of nuclear reactor systems, components and processes, and it is envisioned that NE-KAMS will help to coordinate and facilitate collaboration and sharing of resources and expertise for V&V and UQ across these programs. In addition, from the outset, NE-KAMS will support the use of computational M&S in the nuclear industry by developing guidelines and recommended practices aimed at quantifying the uncertainty and assessing the applicability of existing analysis models and methods. The NE-KAMS effort will initially focus on supporting the use of computational fluid dynamics (CFD) and thermal hydraulics (T/H) analysis for M&S of nuclear reactor systems, components and processes, and will later expand to include materials, fuel system performance and other areas of M&S as time and funding allow.

  28. Propulsion Simulations Using Advanced Turbulence Models with the Unstructured Grid CFD Tool, TetrUSS

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Frink, Neal T.; Deere, Karen A.; Pandya, Mohangna J.

    2004-01-01

A computational investigation has been completed to assess the capability of TetrUSS for exhaust nozzle flows. Three configurations were chosen for this study: (1) an axisymmetric supersonic jet, (2) a transonic axisymmetric boattail with solid sting operated at different Reynolds and Mach numbers, and (3) an isolated non-axisymmetric nacelle with a supersonic cruise nozzle. These configurations were chosen because existing experimental data provided a means for measuring the ability of TetrUSS to simulate complex nozzle flows. The main objective of this paper is to validate the implementation of advanced two-equation turbulence models in the unstructured-grid CFD code USM3D for propulsion flow cases. USM3D is the flow solver of the TetrUSS system. Three different turbulence models, namely, Menter Shear Stress Transport (SST), basic k-epsilon, and Spalart-Allmaras (SA), are used in the present study. The results are generally in agreement with other implementations of these models in structured-grid CFD codes. Results indicate that USM3D provides accurate simulations for complex aerodynamic configurations with propulsion integration.

  29. Making NEAR work: cooperative modeling and simulation with an advanced guidance and control system

    NASA Astrophysics Data System (ADS)

    Heyler, Gene A.; Harch, Ann P.

    2002-01-01

The Near Earth Asteroid Rendezvous spacecraft performed a 1-year orbital mission around asteroid 433 Eros until 12 February 2001. The mission consisted of daily science data collection events and occasional orbit correction maneuvers, and culminated with a controlled descent and soft landing on the asteroid's surface. These events all required meticulously planned, simulated, and executed spacecraft pointing scenarios. An advanced guidance and control system and a high-fidelity visual planning tool were critical for these operations, as was the close interaction among the imaging scientists, the guidance and control engineers, and the navigation team. These teams used detailed truth models of both the spacecraft and asteroid environments. Of particular interest was the controlled descent to Eros' surface, which consisted of pointing and thrusting events simultaneous with image collection. The unexpected survival of the spacecraft on landing permitted the collection of telemetry data pertaining to the final resting attitude.

  30. Advanced Wellbore Thermal Simulator

    Energy Science and Technology Software Center (ESTSC)

    1992-03-04

GEOTEMP2, which is based on the earlier GEOTEMP program, is a wellbore thermal simulator designed for geothermal well drilling and production applications. The code treats natural and forced convection and conduction within the wellbore and heat conduction within the surrounding rock matrix. A variety of well operations can be modeled including injection, production, forward and reverse circulation with gas or liquid, gas or liquid drilling, and two-phase steam injection and production. Well completion with several different casing sizes and cement intervals can be modeled. The code allows variables, such as flow rate, to change with time enabling a realistic treatment of well operations. Provision is made in the flow equations to allow the flow areas of the tubing to vary with depth in the wellbore. Multiple liquids can exist in GEOTEMP2 simulations. Liquid interfaces are tracked through the tubing and annulus as one liquid displaces another. GEOTEMP2, however, does not attempt to simulate displacement of liquids with a gas or two-phase steam or vice versa. This means that it is not possible to simulate an operation where the type of drilling fluid changes, e.g. mud going to air. GEOTEMP2 was designed primarily for use in predicting the behavior of geothermal wells, but it is flexible enough to handle many typical drilling, production, and injection problems in the oil industry as well. However, GEOTEMP2 does not allow the modeling of gas-filled annuli in production or injection problems. In gas or mist drilling, no radiation losses are included in the energy balance. No attempt is made to model flow in the formation. Average execution time is 50 CP seconds on a CDC CYBER170. This edition of GEOTEMP2 is designated as Version 2.0 by the contributors.

  31. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    SciTech Connect

    Durlofsky, Louis J.; Aziz, Khalid

    2001-08-23

    Research results for the second year of this project on the development of improved modeling techniques for non-conventional (e.g., horizontal, deviated or multilateral) wells were presented. The overall program entails the development of enhanced well modeling and general simulation capabilities. A general formulation for black-oil and compositional reservoir simulation was presented.

  32. An architecture and model for cognitive engineering simulation analysis - Application to advanced aviation automation

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Smith, Barry R.

    1993-01-01

    The process of designing crew stations for large-scale, complex automated systems is made difficult because of the flexibility of roles that the crew can assume, and by the rapid rate at which system designs become fixed. Modern cockpit automation frequently involves multiple layers of control and display technology in which human operators must exercise equipment in augmented, supervisory, and fully automated control modes. In this context, we maintain that effective human-centered design is dependent on adequate models of human/system performance in which representations of the equipment, the human operator(s), and the mission tasks are available to designers for manipulation and modification. The joint Army-NASA Aircrew/Aircraft Integration (A3I) Program, with its attendant Man-machine Integration Design and Analysis System (MIDAS), was initiated to meet this challenge. MIDAS provides designers with a test bed for analyzing human-system integration in an environment in which both cognitive human function and 'intelligent' machine function are described in similar terms. This distributed object-oriented simulation system, its architecture and assumptions, and our experiences from its application in advanced aviation crew stations are described.

  33. An open software framework for advancement of x-ray optics simulation and modeling

    NASA Astrophysics Data System (ADS)

    Bruhwiler, David L.; Chubar, Oleg; Nagler, Robert; Krzywinski, Jacek; Boehnlein, Amber

    2014-09-01

    Accurate physical-optics based simulation of emission, transport and use in experiments of fully- and partially-coherent X-ray radiation is essential for both designers and users of experiments at state-of-the-art light sources: low-emittance storage rings, energy-recovery linacs and free-electron lasers. To be useful for different applications, the simulations must include accurate physical models for the processes of emission, for the structures of X-ray optical elements, interaction of the radiation with samples, and propagation of scattered X-rays to a detector. Based on the "Synchrotron Radiation Workshop" (SRW) open source computer code, we are developing a simulation framework, including a graphical user interface, web interface for client-server simulations, data format for wave-optics based representation of partially-coherent X-ray radiation, and a dictionary for universal description of optical elements. Also, we are evaluating formats for sample and experimental data representation for different types of experiments and processing. The simulation framework will facilitate start-to-end simulations by different computer codes complementary to SRW, for example GENESIS and FAST codes for simulating self-amplified spontaneous emission, SHADOW and McXtrace geometrical ray-tracing codes, as well as codes for simulation of interaction of radiation with matter and data processing in experiments exploiting coherence of radiation. The development of the new framework is building on components developed for the Python-based RadTrack software, which is designed for loose coupling of multiple electron and radiation codes to enable sophisticated workflows. We are exploring opportunities for collaboration with teams pursuing similar developments at European Synchrotron Radiation Facility and the European XFEL.

  34. Advancing Nucleosynthesis in Core-Collapse Supernovae Models Using 2D CHIMERA Simulations

    NASA Astrophysics Data System (ADS)

    Harris, J. A.; Hix, W. R.; Chertkow, M. A.; Bruenn, S. W.; Lentz, E. J.; Messer, O. B.; Mezzacappa, A.; Blondin, J. M.; Marronetti, P.; Yakunin, K.

    2014-01-01

The deaths of massive stars as core-collapse supernovae (CCSN) serve as a crucial link in understanding galactic chemical evolution since the birth of the universe via the Big Bang. We investigate CCSN in polar axisymmetric simulations using the multidimensional radiation hydrodynamics code CHIMERA. Computational costs have traditionally constrained the evolution of the nuclear composition in CCSN models to, at best, a 14-species α-network. However, the limited capacity of the α-network to accurately evolve detailed composition, the neutronization and the nuclear energy generation rate has fettered the ability of prior CCSN simulations to accurately reproduce the chemical abundances and energy distributions as known from observations. These deficits can be partially ameliorated by "post-processing" with a more realistic network. Lagrangian tracer particles placed throughout the star record the temporal evolution of the initial simulation and enable the extension of the nuclear network evolution by incorporating larger systems in post-processing nucleosynthesis calculations. We present post-processing results of the four ab initio axisymmetric CCSN 2D models of Bruenn et al. (2013) evolved with the smaller α-network, and initiated from stellar-metallicity, non-rotating progenitors of mass 12, 15, 20, and 25 M☉ from Woosley & Heger (2007). As a test of the limitations of post-processing, we provide preliminary results from an ongoing simulation of the 15 M☉ model evolved with a realistic 150-species nuclear reaction network in situ. With more accurate energy generation rates and an improved determination of the thermodynamic trajectories of the tracer particles, we can better unravel the complicated multidimensional "mass-cut" in CCSN simulations and probe for less energetically significant nuclear processes like the νp-process and the r-process, which require still larger networks.
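
    The post-processing strategy described above amounts to re-integrating a reaction network along each tracer particle's recorded temperature and density history. The Python sketch below is a hypothetical, heavily simplified illustration with a made-up two-species "network" and a fabricated trajectory; it is not the CHIMERA workflow or a real 150-species network.

      # Integrate a toy 2-species network along a recorded tracer trajectory T(t), rho(t).
      import numpy as np
      from scipy.integrate import solve_ivp

      # Fabricated tracer history: time [s], temperature [GK], density [g/cm^3]
      t_hist   = np.linspace(0.0, 1.0, 200)
      T_hist   = 8.0 * np.exp(-3.0 * t_hist) + 1.0
      rho_hist = 1e7 * np.exp(-5.0 * t_hist) + 1e5

      def interp(t, arr):
          return np.interp(t, t_hist, arr)

      def rhs(t, Y):
          """Toy rate equations: species A converts to B at a T,rho-dependent rate."""
          T, rho = interp(t, T_hist), interp(t, rho_hist)
          rate = 1e-8 * rho * T**4            # made-up rate law, illustrative only
          dYA = -rate * Y[0]
          return [dYA, -dYA]

      sol = solve_ivp(rhs, (t_hist[0], t_hist[-1]), [1.0, 0.0], method="BDF")
      print("final mass fractions:", sol.y[:, -1])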

  35. A Damage Model for the Simulation of Delamination in Advanced Composites under Variable-Mode Loading

    NASA Technical Reports Server (NTRS)

    Turon, A.; Camanho, P. P.; Costa, J.; Davila, C. G.

    2006-01-01

    A thermodynamically consistent damage model is proposed for the simulation of progressive delamination in composite materials under variable-mode ratio. The model is formulated in the context of Damage Mechanics. A novel constitutive equation is developed to model the initiation and propagation of delamination. A delamination initiation criterion is proposed to assure that the formulation can account for changes in the loading mode in a thermodynamically consistent way. The formulation accounts for crack closure effects to avoid interfacial penetration of two adjacent layers after complete decohesion. The model is implemented in a finite element formulation, and the numerical predictions are compared with experimental results obtained in both composite test specimens and structural components.

  36. Numerical Simulation of Earth Directed CMEs with an Advanced Two-Temperature Coronal Model (Invited)

    NASA Astrophysics Data System (ADS)

    Manchester, W. B.; van der Holst, B.; Frazin, R. A.; Vasquez, A. M.; Toth, G.; Gombosi, T. I.

    2010-12-01

We present progress on modeling Earth-directed CMEs, including the December 12, 2008 CME and the May 13, 2005 campaign event, from initiation to heliospheric propagation. Our earlier work on the 2005 event followed the CME to the orbit of Saturn employing the coronal model of Cohen et al. (2007), which relies on a spatially varying adiabatic index (gamma) to produce the bimodal solar wind. This model was able to reproduce several features of the observed event, but suffered from artifacts of the artificial thermodynamics. We will examine results of a recent simulation performed with a new two-temperature solar corona model developed at the University of Michigan. This model employs heat conduction for both ion and electron species, a constant adiabatic index (gamma = 5/3), and includes Alfven waves to drive the solar wind. The model includes SOHO/MDI magnetogram data to calculate the coronal field, and also uses SOHO/EIT observations to specify the density and temperature at the coronal boundary by the Differential Emission Measure Tomography (DEMT) method. The Wang-Sheeley-Arge empirical model is used to determine the Alfven wave pressure necessary to produce the observed solar wind speeds. We find that the new model is much better able to reproduce the solar wind densities, and also correctly captures the compression at the CME-driven shock due to the fixed adiabatic index.

  37. Advanced Vadose Zone Simulations Using TOUGH

    SciTech Connect

    Finsterle, S.; Doughty, C.; Kowalsky, M.B.; Moridis, G.J.; Pan,L.; Xu, T.; Zhang, Y.; Pruess, K.

    2007-02-01

The vadose zone can be characterized as a complex subsurface system in which intricate physical and biogeochemical processes occur in response to a variety of natural forcings and human activities. This makes it difficult to describe, understand, and predict the behavior of this specific subsurface system. The TOUGH nonisothermal multiphase flow simulators are well-suited to perform advanced vadose zone studies. The conceptual models underlying the TOUGH simulators are capable of representing features specific to the vadose zone, and of addressing a variety of coupled phenomena. Moreover, the simulators are integrated into software tools that enable advanced data analysis, optimization, and system-level modeling. We discuss fundamental and computational challenges in simulating vadose zone processes, review recent advances in modeling such systems, and demonstrate some capabilities of the TOUGH suite of codes using illustrative examples.

  38. Advancing predictive models for particulate formation in turbulent flames via massively parallel direct numerical simulations

    PubMed Central

    Bisetti, Fabrizio; Attili, Antonio; Pitsch, Heinz

    2014-01-01

    Combustion of fossil fuels is likely to continue for the near future due to the growing trends in energy consumption worldwide. The increase in efficiency and the reduction of pollutant emissions from combustion devices are pivotal to achieving meaningful levels of carbon abatement as part of the ongoing climate change efforts. Computational fluid dynamics featuring adequate combustion models will play an increasingly important role in the design of more efficient and cleaner industrial burners, internal combustion engines, and combustors for stationary power generation and aircraft propulsion. Today, turbulent combustion modelling is hindered severely by the lack of data that are accurate and sufficiently complete to assess and remedy model deficiencies effectively. In particular, the formation of pollutants is a complex, nonlinear and multi-scale process characterized by the interaction of molecular and turbulent mixing with a multitude of chemical reactions with disparate time scales. The use of direct numerical simulation (DNS) featuring a state of the art description of the underlying chemistry and physical processes has contributed greatly to combustion model development in recent years. In this paper, the analysis of the intricate evolution of soot formation in turbulent flames demonstrates how DNS databases are used to illuminate relevant physico-chemical mechanisms and to identify modelling needs. PMID:25024412

  39. An advanced object-based software framework for complex ecosystem modeling and simulation

    SciTech Connect

    Sydelko, P. J.; Dolph, J. E.; Majerus, K. A.; Taxon, T. N.

    2000-06-29

    Military land managers and decision makers face an ever increasing challenge to balance maximum flexibility for the mission with a diverse set of multiple land use, social, political, and economic goals. In addition, these goals encompass environmental requirements for maintaining ecosystem health and sustainability over the long term. Spatiotemporal modeling and simulation in support of adaptive ecosystem management can be best accomplished through a dynamic, integrated, and flexible approach that incorporates scientific and technological components into a comprehensive ecosystem modeling framework. The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) integrates ecological models and decision support techniques through a geographic information system (GIS)-based backbone. Recently, an object-oriented (OO) architectural framework was developed for IDLAMS (OO-IDLAMS). This OO-IDLAMS Prototype was built upon and leverages from the Dynamic Information Architecture System (DIAS) developed by Argonne National Laboratory. DIAS is an object-based architectural framework that affords a more integrated, dynamic, and flexible approach to comprehensive ecosystem modeling than was possible with the GIS-based integration approach of the original IDLAMS. The flexibility, dynamics, and interoperability demonstrated through this case study of an object-oriented approach have the potential to provide key technology solutions for many of the military's multiple-use goals and needs for integrated natural resource planning and ecosystem management.

  40. Advances in the simulation of toroidal gyro Landau fluid model turbulence

    SciTech Connect

    Waltz, R.E.; Kerbel, G.D.; Milovich, J.; Hammett, G.W.

    1994-12-01

The gyro-Landau fluid (GLF) model equations for toroidal geometry have recently been applied to the study of ion temperature gradient (ITG) mode turbulence using the 3D nonlinear ballooning mode representation (BMR). The present paper extends this work by treating some unresolved issues concerning ITG turbulence with adiabatic electrons. Although eddies are highly elongated in the radial direction, long-time radial correlation lengths are short and comparable to poloidal lengths. Although transport at vanishing shear is not particularly large, transport at reversed global shear is significantly less. Electrostatic transport at moderate shear is not much affected by inclusion of local shear and average favorable curvature. Transport is suppressed when the critical E×B rotational shear is comparable to the maximum linear growth rate, with only a weak dependence on magnetic shear. Self-consistent turbulent transport of toroidal momentum can result in a transport bifurcation at sufficiently large r/(Rq). However, the main thrust of the new formulation in the paper deals with advances in the development of finite-beta GLF models with trapped electrons and BMR numerical methods for treating the fast parallel field motion of the untrapped electrons.
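
    The shear-suppression statement above is often summarized as a simple quench criterion. Written out (as an interpretation of that sentence, not a formula quoted from the paper):

      \[
      \gamma_{E\times B} \;\gtrsim\; \gamma_{\mathrm{max}}^{\mathrm{lin}}
      \;\;\Longrightarrow\;\; \text{ITG-driven transport is quenched},
      \]

    where \gamma_{E\times B} is the E×B rotational shearing rate and \gamma_{\mathrm{max}}^{\mathrm{lin}} is the maximum linear growth rate, with only a weak dependence on magnetic shear.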

  41. Numerical Simulations of Optical Turbulence Using an Advanced Atmospheric Prediction Model: Implications for Adaptive Optics Design

    NASA Astrophysics Data System (ADS)

    Alliss, R.

    2014-09-01

    Optical turbulence (OT) acts to distort light in the atmosphere, degrading imagery from astronomical telescopes and reducing the data quality of optical imaging and communication links. Some of the degradation due to turbulence can be corrected by adaptive optics. However, the severity of optical turbulence, and thus the amount of correction required, is largely dependent upon the turbulence at the location of interest. Therefore, it is vital to understand the climatology of optical turbulence at such locations. In many cases, it is impractical and expensive to setup instrumentation to characterize the climatology of OT, so numerical simulations become a less expensive and convenient alternative. The strength of OT is characterized by the refractive index structure function Cn2, which in turn is used to calculate atmospheric seeing parameters. While attempts have been made to characterize Cn2 using empirical models, Cn2 can be calculated more directly from Numerical Weather Prediction (NWP) simulations using pressure, temperature, thermal stability, vertical wind shear, turbulent Prandtl number, and turbulence kinetic energy (TKE). In this work we use the Weather Research and Forecast (WRF) NWP model to generate Cn2 climatologies in the planetary boundary layer and free atmosphere, allowing for both point-to-point and ground-to-space seeing estimates of the Fried Coherence length (ro) and other seeing parameters. Simulations are performed using a multi-node linux cluster using the Intel chip architecture. The WRF model is configured to run at 1km horizontal resolution and centered on the Mauna Loa Observatory (MLO) of the Big Island. The vertical resolution varies from 25 meters in the boundary layer to 500 meters in the stratosphere. The model top is 20 km. The Mellor-Yamada-Janjic (MYJ) TKE scheme has been modified to diagnose the turbulent Prandtl number as a function of the Richardson number, following observations by Kondo and others. This modification deweights the contribution of the buoyancy term in the equation for TKE by reducing the ratio of the eddy diffusivity of heat to momentum. This is necessary particularly in the stably stratified free atmosphere where turbulence occurs in thin layers not typically resolvable by the model. The modified MYJ scheme increases the probability and strength of TKE in thermally stable conditions thereby increasing the probability of optical turbulence. Over twelve months of simulations have been generated. Results indicate realistic values of the Fried Coherence Length (ro) are obtained when compared with observations from a Differential Image Motion Monitor (DIMM) instrument. Seeing is worse during day than at night with large ros observed just after sunset and just before sunrise. Three dimensional maps indicate that the vast lava fields, which characterize the Big Island, have a large impact on turbulence generation with a large dependence on elevation. Results from this study are being used to make design decisions for adaptive optics systems. Detailed results of this study will be presented at the conference.
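
    To make the connection between modeled Cn2 profiles and seeing parameters concrete, the sketch below evaluates the standard plane-wave expression for the Fried coherence length, r0 = [0.423 k^2 sec(zeta) ∫ Cn2(h) dh]^(-3/5), for an arbitrary example profile. The profile values and wavelength are illustrative assumptions, not data from this study.

      # Fried coherence length from a Cn2 profile; only the formula is standard,
      # the example profile below is made up.
      import numpy as np

      wavelength = 500e-9                      # observing wavelength [m]
      zenith_deg = 0.0                         # zenith angle [degrees]
      k = 2.0 * np.pi / wavelength

      h = np.linspace(0.0, 20e3, 401)          # altitude grid [m]
      cn2 = 5e-16 * np.exp(-h / 1500.0)        # illustrative Cn2(h) profile [m^-2/3]

      # Trapezoidal integral of Cn2 over altitude
      integral = np.sum(0.5 * (cn2[1:] + cn2[:-1]) * np.diff(h))
      r0 = (0.423 * k**2 * integral / np.cos(np.radians(zenith_deg))) ** (-3.0 / 5.0)
      print(f"r0 = {100 * r0:.1f} cm at {1e9 * wavelength:.0f} nm")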

  42. Cartographic support for advanced distributed simulation

    NASA Astrophysics Data System (ADS)

    Lukes, George E.

    1995-07-01

    Emerging requirements for diverse spatial data bases to support a broad spectrum of modeling and simulation activities can profit from advances in the state-of-the-art for automated cartographic data base generation and maintenance. This paper provides an introduction to Advanced Distributed Simulation (ADS) and the associated requirements for Synthetic Environments to support Synthetic Theaters of War. It provides a technical rationale for exploring applications of image understanding technology to automated cartography in support of ADS and related programs benefiting from automated analysis of mapping, Earth resources and reconnaissance imagery.

  43. High Level Requirements for the Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    SciTech Connect

    Rich Johnson; Hyung Lee; Kimberlyn C. Mousseau

    2011-09-01

    The US Department of Energy, Office of Nuclear Energy (DOE-NE), has been tasked with the important mission of ensuring that nuclear energy remains a compelling and viable energy source in the U.S. The motivations behind this mission include cost-effectively meeting the expected increases in the power needs of the country, reducing carbon emissions and reducing dependence on foreign energy sources. In the near term, to ensure that nuclear power remains a key element of U.S. energy strategy and portfolio, the DOE-NE will be working with the nuclear industry to support safe and efficient operations of existing nuclear power plants. In the long term, to meet the increasing energy needs of the U.S., the DOE-NE will be investing in research and development (R&D) and working in concert with the nuclear industry to build and deploy new, safer and more efficient nuclear power plants. The safe and efficient operations of existing nuclear power plants and designing, licensing and deploying new reactor designs, however, will require focused R&D programs as well as the extensive use and leveraging of advanced modeling and simulation (M&S). M&S will play a key role in ensuring safe and efficient operations of existing and new nuclear reactors. The DOE-NE has been actively developing and promoting the use of advanced M&S in reactor design and analysis through its R&D programs, e.g., the Nuclear Energy Advanced Modeling and Simulation (NEAMS) and Consortium for Advanced Simulation of Light Water Reactors (CASL) programs. Also, nuclear reactor vendors are already using CFD and CSM, for design, analysis, and licensing. However, these M&S tools cannot be used with confidence for nuclear reactor applications unless accompanied and supported by verification and validation (V&V) and uncertainty quantification (UQ) processes and procedures which provide quantitative measures of uncertainty for specific applications. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base that will provide technical services and resources for V&V and UQ of M&S in nuclear energy sciences and engineering. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear reactor design, analysis and licensing. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and the public and will help ensure the safe, economical and reliable operation of existing and future nuclear reactors. From its inception, NE-KAMS will directly support nuclear energy research, development and demonstration programs within the U.S. Department of Energy (DOE), including the CASL, NEAMS, Light Water Reactor Sustainability (LWRS), Small Modular Reactors (SMR), and Next Generation Nuclear Power Plant (NGNP) programs. These programs all involve M&S of nuclear reactor systems, components and processes, and it is envisioned that NE-KAMS will help to coordinate and facilitate collaboration and sharing of resources and expertise for V&V and UQ across these programs.

  4. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    SciTech Connect

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  5. Advanced concepts flight simulation facility.

    PubMed

    Chappell, S L; Sexton, G A

    1986-12-01

    The cockpit environment is changing rapidly. New technology allows airborne computerised information, flight automation and data transfer with the ground. By 1995, not only will the pilot's task have changed, but also the tools for doing that task. To provide knowledge and direction for these changes, the National Aeronautics and Space Administration (NASA) and the Lockheed-Georgia Company have completed three identical Advanced Concepts Flight Simulation Facilities. Many advanced features have been incorporated into the simulators - e.g., cathode ray tube (CRT) displays of flight and systems information operated via touch-screen or voice, print-outs of clearances, cockpit traffic displays, current databases containing navigational charts, weather and flight plan information, and fuel-efficient autopilot control from take-off to touchdown. More importantly, this cockpit is a versatile test bed for studying displays, controls, procedures and crew management in a full-mission context. The facility also has an air traffic control simulation, with radio and data communications, and an outside visual scene with variable weather conditions. These provide a veridical flight environment to accurately evaluate advanced concepts in flight stations. PMID:15676591

  6. Science based integrated approach to advanced nuclear fuel development - integrated multi-scale multi-physics hierarchical modeling and simulation framework Part III: cladding

    SciTech Connect

    Tome, Carlos N; Caro, J A; Lebensohn, R A; Unal, Cetin; Arsenlis, A; Marian, J; Pasamehmetoglu, K

    2010-01-01

    Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model the nuclear fuel systems to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of the advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed in each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.

  7. Leveraging data analytics, patterning simulations and metrology models to enhance CD metrology accuracy for advanced IC nodes

    NASA Astrophysics Data System (ADS)

    Rana, Narender; Zhang, Yunlin; Kagalwala, Taher; Hu, Lin; Bailey, Todd

    2014-04-01

    Integrated Circuit (IC) technology is changing in multiple ways: from 193i to EUV exposure, from planar to non-planar device architectures, and from single-exposure lithography to multiple-exposure and DSA patterning. Critical dimension (CD) control requirements are becoming more stringent and more exhaustive: CDs and process windows are shrinking, three-sigma CD control of < 2 nm is required in complex geometries, and metrology uncertainty of < 0.2 nm is required to achieve the target CD control for advanced IC nodes (e.g., the 14 nm, 10 nm and 7 nm nodes). There are fundamental capability and accuracy limits in all the metrology techniques that are detrimental to the success of advanced IC nodes. Reference or physical CD metrology is provided by CD-AFM and TEM, while workhorse metrology is provided by CD-SEM, scatterometry, and Model-Based Infrared Reflectometry (MBIR). Precision alone is not sufficient moving forward, and no single technique is sufficient to ensure the required accuracy of patterning. The accuracy of CD-AFM is ~1 nm, and precision in TEM is poor due to limited statistics. CD-SEM, scatterometry and MBIR need to be calibrated by reference measurements to ensure the accuracy of patterned CDs and patterning models. There is a dire need for measurements with < 0.5 nm accuracy, and the industry currently does not have that capability with inline measurements. Aware of the capability gaps of the various metrology techniques, we have employed data processing techniques and predictive data analytics, along with patterning simulation and metrology models and data integration techniques, in selected applications to demonstrate the potential and practicality of such an approach to enhance CD metrology accuracy. Data from multiple metrology techniques have been analyzed in multiple ways to extract information with associated uncertainties and integrated to extract useful and more accurate CD and profile information for the structures. This paper presents the optimization of scatterometry and MBIR model calibration and the feasibility of extrapolating not only in design and process space but also from one process step to a previous process step. A well-calibrated scatterometry or patterning simulation model can be used to accurately extrapolate and interpolate in the design and process space for lithography patterning, where AFM is not capable of accurately measuring sub-40 nm trenches. The uncertainty associated with extrapolation can be large and needs to be minimized. We have used measurements from CD-SEM and CD-AFM, along with the patterning and scatterometry simulation models, to estimate the uncertainty associated with extrapolation and methods to reduce it. For the first time, we report the application of machine learning (artificial neural networks) to the systematic resist shrinkage phenomenon to accurately predict the pre-shrink CD based on supervised learning using CD-AFM data. The study lays out various basic concepts, approaches and protocols for multiple-source data processing and integration in a hybrid metrology approach. Impacts of this study include more accurate metrology, better patterning models and better process controls for advanced IC nodes.
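
    The machine-learning step described above can be illustrated with a small supervised-regression sketch. The example below is a minimal, hypothetical illustration and not the authors' implementation: it assumes paired post-shrink CD-SEM measurements and reference CD-AFM values are available, and fits a small neural network to predict the pre-shrink CD. All data and parameter choices here are synthetic stand-ins.

    ```python
    # Minimal sketch (hypothetical data): predict pre-shrink CD from CD-SEM
    # measurements of the shrunk resist feature, trained against CD-AFM references.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: true (pre-shrink) CDs and a nonlinear, dose-dependent
    # shrinkage observed by CD-SEM.  Real inputs would be paired CD-SEM / CD-AFM data.
    cd_true = rng.uniform(20.0, 60.0, size=500)              # nm, CD-AFM reference
    dose = rng.uniform(0.8, 1.2, size=500)                   # normalized SEM dose
    cd_sem = cd_true - (2.0 + 1.5 * dose) * np.sqrt(cd_true / 40.0) + rng.normal(0, 0.2, 500)

    X = np.column_stack([cd_sem, dose])
    X_tr, X_te, y_tr, y_te = train_test_split(X, cd_true, test_size=0.2, random_state=0)

    model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
    model.fit(X_tr, y_tr)

    rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
    print(f"hold-out RMSE of predicted pre-shrink CD: {rmse:.2f} nm")
    ```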

  8. Challenge problem and milestones for : Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC).

    SciTech Connect

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe, Jr.

    2010-09-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  9. ADVANCED URBANIZED METEOROLOGICAL MODELING AND AIR QUALITY SIMULATIONS WITH CMAQ AT NEIGHBORHOOD SCALES

    EPA Science Inventory

    We present results from a study testing the new boundary layer parameterization method, the canopy drag approach (DA) which is designed to explicitly simulate the effects of buildings, street and tree canopies on the dynamic, thermodynamic structure and dispersion fields in urban...

  10. To Be Presented at the Advanced Simulation Technology Symposium (ASTC), Washington DC, April 2004. Building Simulation Modeling Environments Using Systems Theory and Software

    E-print Network

    Building Simulation Modeling Environments Using Systems Theory and Software Architecture Principles. Hessam S. Sarjoughian and Ranjit K. Singh, Arizona Center for Integrative Modeling and Simulation. Abstract: Use of simulations for design of systems requires assurance that the underlying modeling theories...

  11. Advancements in multi scale modeling: Adaptive resolution simulations and related issues

    NASA Astrophysics Data System (ADS)

    Guenza, Marina G.

    2015-09-01

    Adaptive resolution methods are becoming increasingly important in the study of complex systems by multiscale modeling. In this paper we present a brief overview of the method and highlight some questions that, in our opinion, are relevant for the future development of the method and, more generally, of the field of multiscale modeling.

  12. Software Framework for Advanced Power Plant Simulations

    SciTech Connect

    John Widmann; Sorin Munteanu; Aseem Jain; Pankaj Gupta; Mark Moales; Erik Ferguson; Lewis Collins; David Sloan; Woodrow Fiveland; Yi-dong Lang; Larry Biegler; Michael Locke; Simon Lingard; Jay Yun

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
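
    As a sketch of the reduced-order-model (ROM) idea described above, and not the actual APECS ROM machinery, the snippet below fits a cheap surrogate to a handful of hypothetical CFD design points so the response can be evaluated directly inside a flowsheet loop. The gasifier variable names and values are illustrative assumptions.

    ```python
    # Minimal sketch of a reduced-order model (ROM): fit a cheap surrogate to a few
    # expensive CFD results so a process flowsheet can call it directly.
    # The gasifier response below is hypothetical, not APECS output.
    import numpy as np

    # Hypothetical CFD "training" runs: oxygen-to-fuel ratio vs. cold-gas efficiency.
    o2_ratio = np.array([0.60, 0.70, 0.80, 0.90, 1.00])
    cold_gas_eff = np.array([0.68, 0.74, 0.78, 0.76, 0.71])   # from detailed CFD

    # Train the ROM (here a simple quadratic fit); APECS supports richer forms.
    rom = np.poly1d(np.polyfit(o2_ratio, cold_gas_eff, deg=2))

    # The flowsheet can now evaluate the ROM cheaply instead of re-running CFD.
    for x in (0.65, 0.75, 0.85):
        print(f"O2 ratio {x:.2f} -> predicted efficiency {rom(x):.3f}")
    ```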

  13. Cavity control system advanced modeling and simulations for TESLA linear accelerator and free electron laser

    NASA Astrophysics Data System (ADS)

    Czarski, Tomasz; Romaniuk, Ryszard S.; Pozniak, Krzysztof T.; Simrock, Stefan

    2004-07-01

    The cavity control system for the TESLA -- TeV-Energy Superconducting Linear Accelerator -- project is introduced. An elementary analysis of the cavity resonator at the radio frequency (RF) and low-level frequency, with signal and power considerations, is presented. Digital signal processing is proposed for detection of the field vector. An electromechanical model accounting for Lorentz force detuning is applied to analyze the basic features of the system performance. For multiple cavities driven by one klystron, control of the vector sum of the cavity fields is considered. A Simulink model implementation is developed to explore feedback and feed-forward system operation, and some experimental results for signals and power considerations are presented.
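
    The vector-sum idea for several cavities driven by one klystron can be sketched as follows. The detunings, gains, set point, and feedback gain below are illustrative assumptions, not TESLA parameters, and the loop is a bare proportional correction rather than the full control system described above.

    ```python
    # Minimal sketch of vector-sum control: represent each cavity field as a complex
    # phasor, sum them, and apply a proportional correction to the common drive.
    # All numbers are illustrative; they are not TESLA operating parameters.
    import numpy as np

    setpoint = 25.0e6 * np.exp(1j * 0.0)                # desired vector sum (phase 0)
    detune_phase = np.deg2rad([3.0, -5.0, 1.5, -2.0])   # per-cavity detuning phases
    gains = np.array([1.00, 0.97, 1.02, 0.99])          # per-cavity field gains

    drive = setpoint / 4.0        # initial common klystron drive per cavity
    kp = 0.5                      # proportional feedback gain

    for step in range(20):
        fields = gains * drive * np.exp(1j * detune_phase)   # individual cavity phasors
        error = setpoint - fields.sum()                      # vector-sum error
        drive += kp * error / 4.0                            # correct the common drive

    fields = gains * drive * np.exp(1j * detune_phase)
    print(f"residual vector-sum error magnitude: {abs(setpoint - fields.sum()):.3e}")
    ```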

  14. WARPM Framework for advanced plasma model simulations on many-core architectures

    NASA Astrophysics Data System (ADS)

    Reddell, Noah; Shumlak, Uri

    2012-10-01

    A new framework WARPM designed for many-core computing architectures such as GPU is presented. The framework supports both multi-fluid and continuum kinetic plasma models. We provide exemplary physics results including whistler wave propagation, and show performance gains. For good performance on many-core architectures, code design should minimize data movement. The algorithms developed are thus both local and explicit. Fluid and continuum kinetic models on structured grids also benefit from predictable data access patterns as opposed to PIC models. The resulting framework is a hybrid combination of MPI for communication between nodes, threads for task parallelism on each node, and OpenCL parallel numerical method implementation across hundreds of cores per node. The framework manages data movement, sub-domain sequencing, and I/O intelligently such that memory bandwidth bottlenecks can be significantly hidden. Use of OpenCL and our method for sequencing computation naturally allows for heterogeneous computation utilizing both CPU and GPU on a node. A new dynamic OpenCL code assembly scheme allows support for many different models, numerical methods, and geometries; a specific combination of these is chosen at runtime then used to generate a single compiled kernel.
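
    The "local and explicit" design point described above can be illustrated with a structured-grid update in which each cell touches only its immediate neighbor, the data-access pattern that maps well onto many-core hardware. The snippet below is a generic first-order upwind advection sweep in NumPy, not WARPM's OpenCL kernels; grid size, speed, and step count are arbitrary.

    ```python
    # Generic sketch of a local, explicit structured-grid update (first-order upwind
    # advection).  This is a NumPy illustration, not WARPM's OpenCL kernels.
    import numpy as np

    nx, dx, c = 200, 1.0, 1.0                           # cells, cell size, advection speed
    dt = 0.5 * dx / c                                   # CFL-limited explicit time step
    u = np.exp(-0.01 * (np.arange(nx) - 50.0) ** 2)     # initial Gaussian pulse

    for step in range(100):
        flux = c * u                                    # upwind flux per cell (c > 0)
        u[1:] -= dt / dx * (flux[1:] - flux[:-1])       # local update: cell i uses only i-1

    print("pulse peak now near cell", int(np.argmax(u)))   # drifted ~ c*t/dx = 50 cells
    ```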

  15. Lightweighting Automotive Materials for Increased Fuel Efficiency and Delivering Advanced Modeling and Simulation Capabilities to U.S. Manufacturers

    SciTech Connect

    Hale, Steve

    2013-09-11

    The National Center for Manufacturing Sciences (NCMS) worked with the U.S. Department of Energy (DOE), National Energy Technology Laboratory (NETL), to bring together research and development (R&D) collaborations to develop and accelerate the knowledge base and infrastructure for lightweighting materials and manufacturing processes for use in structural applications in the automotive sector. The purpose/importance of this DOE program: • 2016 CAFE standards. • Automotive industry technology that adopts lightweighting material concepts in the manufacture of production vehicles. • Development and manufacture of advanced research tools for modeling and simulation (M&S) applications to reduce manufacturing and material costs. • U.S. competitiveness that will help drive the development and manufacture of the next generation of materials. NCMS established a focused portfolio of applied R&D projects utilizing lightweighting materials for manufacture into automotive structures and components. Areas targeted in this program: • Functionality of new lightweighting materials to meet present safety requirements. • Manufacturability using new lightweighting materials. • Cost reduction for the development and use of new lightweighting materials. The automotive industry’s future continuously evolves through innovation, and lightweight materials are key to achieving a new era of lighter, more efficient vehicles. Lightweight materials are among the technical advances needed to achieve fuel/energy efficiency and reduce carbon dioxide (CO2) emissions: • Establish design criteria methodology to identify the best materials for lightweighting. • Employ state-of-the-art design tools for optimum material development for specific applications. • Match new manufacturing technology to production volume. • Address new process variability with new production-ready processes.

  16. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    SciTech Connect

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex networked systems, and (4) design, situational awareness and control of complex networks. The program elements consist of a group of Complex Networked Systems Research Institutes (CNSRI), tightly coupled to an associated individual-investigator-based Complex Networked Systems Basic Research (CNSBR) program. The CNSRI's will be principally located at the DOE National Laboratories and are responsible for identifying research priorities, developing and maintaining a networked systems modeling and simulation software infrastructure, operating summer schools, workshops and conferences and coordinating with the CNSBR individual investigators. The CNSBR individual investigator projects will focus on specific challenges for networked systems. Relevancy of CNSBR research to DOE needs will be assured through the strong coupling provided between the CNSBR grants and the CNSRI's.

  17. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
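
    As an illustration of the cluster-update idea generalized by this work, the sketch below implements a basic Swendsen-Wang sweep for the 2D ferromagnetic Ising model, a textbook version rather than the LANL code: bonds between aligned neighbors are activated with probability 1 - exp(-2*beta*J), clusters of the activated-bond graph are found with union-find, and each cluster is flipped with probability 1/2. Lattice size, temperature, and sweep count are arbitrary choices for the demonstration.

    ```python
    # Minimal sketch of a Swendsen-Wang sweep for the 2D Ising model on an L x L
    # periodic lattice.  Textbook illustration only, not the LANL implementation.
    import numpy as np

    def find(parent, i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    def union(parent, a, b):
        ra, rb = find(parent, a), find(parent, b)
        if ra != rb:
            parent[rb] = ra

    def swendsen_wang_sweep(spins, beta, J=1.0, rng=None):
        rng = rng or np.random.default_rng()
        L = spins.shape[0]
        parent = np.arange(L * L)
        p_bond = 1.0 - np.exp(-2.0 * beta * J)
        # Activate bonds between aligned nearest neighbours with probability p_bond.
        for x in range(L):
            for y in range(L):
                i = x * L + y
                for dx, dy in ((1, 0), (0, 1)):            # right and down neighbours
                    xn, yn = (x + dx) % L, (y + dy) % L
                    if spins[x, y] == spins[xn, yn] and rng.random() < p_bond:
                        union(parent, i, xn * L + yn)
        # Flip each cluster with probability 1/2.
        flip = {}
        new_spins = spins.copy()
        for x in range(L):
            for y in range(L):
                root = find(parent, x * L + y)
                if root not in flip:
                    flip[root] = rng.random() < 0.5
                if flip[root]:
                    new_spins[x, y] *= -1
        return new_spins

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        spins = rng.choice([-1, 1], size=(16, 16))
        beta = 0.44                                       # near the 2D Ising critical point
        for _ in range(100):
            spins = swendsen_wang_sweep(spins, beta, rng=rng)
        print("magnetization per spin:", spins.mean())
    ```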

  18. High-fidelity Simulation of Jet Noise from Rectangular Nozzles . [Large Eddy Simulation (LES) Model for Noise Reduction in Advanced Jet Engines and Automobiles

    NASA Technical Reports Server (NTRS)

    Sinha, Neeraj

    2014-01-01

    This Phase II project validated a state-of-the-art LES model, coupled with a Ffowcs Williams-Hawkings (FW-H) far-field acoustic solver, to support the development of advanced engine concepts. These concepts include innovative flow control strategies to attenuate jet noise emissions. The end-to-end LES/ FW-H noise prediction model was demonstrated and validated by applying it to rectangular nozzle designs with a high aspect ratio. The model also was validated against acoustic and flow-field data from a realistic jet-pylon experiment, thereby significantly advancing the state of the art for LES.

  19. Patient-Specific Geometry Modeling and Mesh Generation for Simulating Obstructive Sleep Apnea Syndrome Cases by Maxillomandibular Advancement.

    PubMed

    Ito, Yasushi; Cheng, Gary C; Shih, Alan M; Koomullil, Roy P; Soni, Bharat K; Sittitavornwong, Somsak; Waite, Peter D

    2011-05-01

    The objective of this paper is the reconstruction of upper airway geometric models as hybrid meshes from clinically used Computed Tomography (CT) data sets in order to understand the dynamics and behaviors of the pre- and postoperative upper airway systems of Obstructive Sleep Apnea Syndrome (OSAS) patients by viscous Computational Fluid Dynamics (CFD) simulations. The selection criteria for OSAS cases studied are discussed because two reasonable pre- and postoperative upper airway models for CFD simulations may not be created for every case without a special protocol for CT scanning. The geometry extraction and manipulation methods are presented with technical barriers that must be overcome so that they can be used along with computational simulation software as a daily clinical evaluation tool. Eight cases are presented in this paper, and each case consists of pre- and postoperative configurations. The results of computational simulations of two cases are included in this paper as demonstration. PMID:21625395

  1. Development of an Advanced Simulator to Model Mobility Control and Geomechanics during CO{sub 2} Floods

    SciTech Connect

    Delshad, Mojdeh; Wheeler, Mary; Sepehrnoori, Kamy; Pope, Gary

    2013-12-31

    The simulator is an isothermal, three-dimensional, four-phase, compositional, equation-of-state (EOS) simulator. We have named the simulator UTDOE-CO2; it is capable of simulating various recovery processes (i.e., primary recovery, secondary waterflooding, and miscible and immiscible gas flooding). We include both the Peng-Robinson EOS and the Redlich-Kwong EOS models. A Gibbs stability test is also included in the model to perform a phase identification test to consistently label each phase for subsequent property calculations such as relative permeability, viscosity, density, interfacial tension, and capillary pressure. Our time-step strategy is based on an IMPEC-type method (implicit pressure and explicit concentration). The gridblock pressure is solved first using the explicit dating of saturation-dependent terms. Subsequently, the material balance equations are solved explicitly for the total concentration of each component. The physical dispersion term is also included in the governing equations. The simulator includes (1) several foam models for gas mobility control, (2) compositional relative permeability models with a hysteresis option, (3) corner-point grids and several efficient solvers, (4) a geomechanics module to compute the stress field resulting from CO{sub 2} injection/production, and (5) output in the format of the commercial visualization software S3graf from Science-soft Ltd. for user-friendly visualization of the simulation results. All tasks are completed, and the simulator was fully tested and delivered to the DOE office, including a user’s guide, several input files, and the executable for Windows PCs. We have published several SPE papers, presented several posters, and one MS thesis (V. Pudugramam, 2013) was completed as a result of this DOE-funded project.
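
    The IMPEC time-stepping idea described above, an implicit pressure solve followed by an explicit concentration update, can be sketched in one spatial dimension. The snippet below is a schematic illustration under strong simplifying assumptions (single phase, incompressible, unit mobility, dimensionless pressures); it is not UTDOE-CO2 and none of the values correspond to a real case.

    ```python
    # Schematic 1-D IMPEC step: solve pressure implicitly on a grid, then update a
    # component concentration explicitly with upwind fluxes.  Greatly simplified
    # (single phase, incompressible, unit mobility, dimensionless units).
    import numpy as np

    nx, dx, dt = 50, 1.0, 0.2
    phi = 0.2                                   # porosity
    p_inj, p_prod = 2.0, 1.0                    # boundary pressures (dimensionless)
    conc = np.zeros(nx)                         # injected-component concentration
    conc_inj = 1.0

    for step in range(200):
        # Implicit pressure: tridiagonal system for d2p/dx2 = 0 with Dirichlet BCs.
        A = np.zeros((nx, nx))
        b = np.zeros(nx)
        for i in range(nx):
            if i in (0, nx - 1):
                A[i, i] = 1.0
                b[i] = p_inj if i == 0 else p_prod
            else:
                A[i, i - 1] = A[i, i + 1] = 1.0
                A[i, i] = -2.0
        p = np.linalg.solve(A, b)

        # Explicit concentration update with upwind fluxes (velocity ~ -dp/dx).
        vel = -(p[1:] - p[:-1]) / dx                       # face velocities (unit mobility)
        flux = np.where(vel > 0, conc[:-1], conc[1:]) * vel
        conc[1:-1] += dt / (phi * dx) * (flux[:-1] - flux[1:])
        conc[0] = conc_inj                                  # injection boundary

    print("concentration front:", np.round(conc[:10], 2))
    ```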

  2. Presented by CASL: The Consortium for Advanced Simulation

    E-print Network

    CASL: The Consortium for Advanced Simulation of Light Water Reactors -- A DOE Energy Innovation Hub for Modeling and Simulation of Nuclear Reactors. Presented by Doug Kothe, Director, CASL (www.casl.gov). Presentation slides; the recoverable content covers common types of light water reactors (LWRs), including the BWR, and the outlook for increasing U.S. nuclear energy.

  3. Advancing the LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group

    2013-01-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions (including weather and seeing), as well as additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focussed on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.
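
    The post-processing described above, querying the recorded observation history to compute merit functions and metrics, can be sketched with a small SQL query. The schema below (a table named "observations" with columns band, mjd, and seeing) is a hypothetical stand-in rather than the actual Operations Simulator database, and a few fake rows are inserted so the example runs end to end.

    ```python
    # Minimal sketch: compute a simple per-band cadence metric from a table of
    # simulated observations.  The schema and data are hypothetical stand-ins
    # for an Operations Simulator output database.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE observations (band TEXT, mjd REAL, seeing REAL)")
    conn.executemany(
        "INSERT INTO observations VALUES (?, ?, ?)",
        [("r", 59000.1, 0.82), ("r", 59003.2, 0.75), ("g", 59001.4, 0.91),
         ("g", 59010.0, 0.68), ("r", 59012.7, 0.70)],
    )

    query = """
        SELECT band, COUNT(*) AS n_visits, AVG(seeing) AS mean_seeing,
               MAX(mjd) - MIN(mjd) AS span_days
        FROM observations GROUP BY band ORDER BY band
    """
    for band, n, seeing, span in conn.execute(query):
        print(f"{band}: {n} visits, mean seeing {seeing:.2f} arcsec over {span:.1f} days")
    conn.close()
    ```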

  4. An advanced fuel cell simulator 

    E-print Network

    Acharya, Prabha Ramchandra

    2005-11-01

    , they have a very low environmental impact. The fuel cell system consists of several subsystems requiring a lot of effort from engineers in diverse areas. Fuel cell simulators can provide a convenient and economic alternative for testing the electrical...

  5. Simulation model of the F/A-18 high angle-of-attack research vehicle utilized for the design of advanced control laws

    NASA Technical Reports Server (NTRS)

    Strickland, Mark E.; Bundick, W. Thomas; Messina, Michael D.; Hoffler, Keith D.; Carzoo, Susan W.; Yeager, Jessie C.; Beissner, Fred L., Jr.

    1996-01-01

    The 'f18harv' six degree-of-freedom nonlinear batch simulation used to support research in advanced control laws and flight dynamics issues as part of NASA's High Alpha Technology Program is described in this report. This simulation models an F/A-18 airplane modified to incorporate a multi-axis thrust-vectoring system for augmented pitch and yaw control power and actuated forebody strakes for enhanced aerodynamic yaw control power. The modified configuration is known as the High Alpha Research Vehicle (HARV). The 'f18harv' simulation was an outgrowth of the 'f18bas' simulation which modeled the basic F/A-18 with a preliminary version of a thrust-vectoring system designed for the HARV. The preliminary version consisted of two thrust-vectoring vanes per engine nozzle compared with the three vanes per engine actually employed on the F/A-18 HARV. The modeled flight envelope is extensive in that the aerodynamic database covers an angle-of-attack range of -10 degrees to +90 degrees, sideslip range of -20 degrees to +20 degrees, a Mach Number range between 0.0 and 2.0, and an altitude range between 0 and 60,000 feet.

  6. Defining Research and Development Directions for Modeling and Simulation of

    E-print Network

    Defining Research and Development Directions for Modeling and Simulation of Complex, Interdependent Systems. The National Infrastructure Simulation and Analysis Center, or NISAC, provides advanced modeling and simulation of such systems, drawing on concepts from the Complex Systems literature that cross disciplines.

  7. Advances in atomic oxygen simulation

    NASA Technical Reports Server (NTRS)

    Froechtenigt, Joseph F.; Bareiss, Lyle E.

    1990-01-01

    Atomic oxygen (AO) present in the atmosphere at orbital altitudes of 200 to 700 km has been shown to degrade various exposed materials on Shuttle flights. The relative velocity of the AO with the spacecraft, together with the AO density, combine to yield an environment consisting of a 5 eV beam energy with a flux of 10(exp 14) to 10(exp 15) oxygen atoms/sq cm/s. An AO ion beam apparatus that produces flux levels and energy similar to that encountered by spacecraft in low Earth orbit (LEO) has been in existence since 1987. Test data was obtained from the interaction of the AO ion beam with materials used in space applications (carbon, silver, kapton) and with several special coatings of interest deposited on various surfaces. The ultimate design goal of the AO beam simulation device is to produce neutral AO at sufficient flux levels to replicate on-orbit conditions. A newly acquired mass spectrometer with energy discrimination has allowed 5 eV neutral oxygen atoms to be separated and detected from the background of thermal oxygen atoms of approx 0.2 eV. Neutralization of the AO ion beam at 5 eV was shown at the Martin Marietta AO facility.
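
    The quoted 5 eV beam energy and 10(exp 14) to 10(exp 15) flux follow directly from the orbital ram velocity and the ambient AO density; the short calculation below reproduces them. The number density used is a representative low-Earth-orbit value chosen for illustration, not a figure taken from the article.

    ```python
    # Back-of-the-envelope check of the quoted ram energy and flux of atomic oxygen
    # in LEO.  The number density is a representative value for ~400 km altitude,
    # chosen for illustration.
    m_O = 16.0 * 1.660539e-27        # mass of atomic oxygen (kg)
    v = 7.8e3                        # orbital (ram) velocity (m/s)
    eV = 1.602177e-19                # joules per electron-volt

    energy_eV = 0.5 * m_O * v**2 / eV
    n = 1.0e14                       # AO number density (m^-3), ~1e8 cm^-3
    flux = n * v * 1.0e-4            # atoms per cm^2 per second

    print(f"ram energy ~ {energy_eV:.1f} eV")          # ~5 eV
    print(f"ram flux   ~ {flux:.1e} atoms/cm^2/s")     # ~1e14
    ```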

  8. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    14 Aeronautics and Space; Advanced Simulation, Appendix H to Part 121. This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program: For an operator to conduct Level C or...

  9. Computational Models of Human Performance: Validation of Memory and Procedural Representation in Advanced Air/Ground Simulation

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Labacqz, J. Victor (Technical Monitor)

    1997-01-01

    The Man-Machine Interaction Design and Analysis System (MIDAS), developed under a joint U.S. Army and NASA cooperative effort, is intended to assist designers of complex human/automation systems in successfully incorporating human performance capabilities and limitations into decision and action support systems. MIDAS is a computational representation of multiple human operators, selected perceptual, cognitive, and physical functions of those operators, and the physical/functional representation of the equipment with which they operate. MIDAS has been used as an integrated predictive framework for the investigation of human/machine systems, particularly in situations with high demands on the operators. We have extended the human performance models to include representation of both human operators and intelligent aiding systems in flight management and air traffic service. The focus of this development is to predict human performance in response to an aiding system developed to identify aircraft conflicts and to assist in the shared authority for resolution. The demands of this application require representation of many intelligent agents sharing world-models, coordinating action/intention, and cooperatively scheduling goals and action in a somewhat unpredictable world of operations. In recent applications to airborne systems development, MIDAS has demonstrated an ability to predict flight crew decision-making and procedural behavior when interacting with automated flight management systems and Air Traffic Control. In this paper, we describe two enhancements to MIDAS. The first involves the addition of working memory in the form of an articulatory buffer for verbal communication protocols and a visuo-spatial buffer for communications via digital datalink. The second enhancement is a representation of multiple operators working as a team. This enhanced model was used to predict the performance of human flight crews and their level of compliance with commercial aviation communication procedures. We show how the data produced by MIDAS compare with flight crew performance data from full mission simulations. Finally, we discuss the use of these features to study communication issues connected with aircraft-based separation assurance.

  10. Dynamic Simulations of Advanced Fuel Cycles

    SciTech Connect

    Steven J. Piet; Brent W. Dixon; Jacob J. Jacobson; Gretchen E. Matthern; David E. Shropshire

    2011-03-01

    Years of performing dynamic simulations of advanced nuclear fuel cycle options provide insights into how they could work and how one might transition from the current once-through fuel cycle. This paper summarizes those insights from the context of the 2005 objectives and goals of the U.S. Advanced Fuel Cycle Initiative (AFCI). Our intent is not to compare options, assess options versus those objectives and goals, nor recommend changes to those objectives and goals. Rather, we organize what we have learned from dynamic simulations in the context of the AFCI objectives for waste management, proliferation resistance, uranium utilization, and economics. Thus, we do not merely describe “lessons learned” from dynamic simulations but attempt to answer the “so what” question by using this context. The analyses have been performed using the Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics (VISION). We observe that the 2005 objectives and goals do not address many of the inherently dynamic discriminators among advanced fuel cycle options and transitions thereof.

  11. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertors (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.
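
    The phasor analysis mentioned above relies on the fact that, for a sufficiently sinusoidal steady state, each force or voltage can be reduced to a complex amplitude at the operating frequency. The snippet below shows one common way to extract such a phasor from a sampled waveform by synchronous demodulation; it is a generic sketch with made-up frequency, amplitude, and phase values, not the Sage/MATLAB GUI implementation.

    ```python
    # Generic sketch: extract the complex phasor (amplitude and phase) of a sampled
    # sinusoidal signal at a known operating frequency by synchronous demodulation.
    # Values are illustrative; this is not the Sage/MATLAB tool described above.
    import numpy as np

    f0 = 80.0                                  # operating frequency (Hz)
    fs = 10000.0                               # sample rate (Hz)
    t = np.arange(0, 0.5, 1.0 / fs)            # an integer number of cycles

    # Example signal: amplitude 3.2 and phase 25 degrees.
    x = 3.2 * np.cos(2 * np.pi * f0 * t + np.deg2rad(25.0))

    # Project onto the complex exponential at f0 and average over whole cycles.
    phasor = 2.0 * np.mean(x * np.exp(-1j * 2 * np.pi * f0 * t))

    print(f"amplitude: {abs(phasor):.3f}")                       # ~3.2
    print(f"phase:     {np.degrees(np.angle(phasor)):.1f} deg")  # ~25
    ```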

  12. Computational Aerodynamic Simulations of an 840 ft/sec Tip Speed Advanced Ducted Propulsor Fan System Model for Acoustic Methods Assessment and Development

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2014-01-01

    Computational aerodynamic simulations of an 840 ft/sec tip speed, Advanced Ducted Propulsor fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center, resulting in quality, detailed aerodynamic and acoustic measurement data. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating conditions simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, excluding a long core duct section downstream of the core inlet guide vane. As a result, only fan rotational speed and system bypass ratio, set by specifying static pressure downstream of the core inlet guide vane row, were adjusted in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measurement values. The computed blade row flow fields for all five fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the computed flow fields reveals no excessive boundary layer separations or related secondary-flow problems. A few spanwise comparisons between computational and measurement data in the bypass duct show that they are in good agreement, thus providing a partial validation of the computational results.

  13. Hybrid and electric advanced vehicle systems (heavy) simulation

    NASA Technical Reports Server (NTRS)

    Hammond, R. A.; Mcgehee, R. K.

    1981-01-01

    A computer program to simulate hybrid and electric advanced vehicle systems (HEAVY) is described. It is intended for use early in the design process: concept evaluation, alternative comparison, preliminary design, control and management strategy development, component sizing, and sensitivity studies. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train. The user defines the system to be simulated using a library of predefined component models that may be connected to represent a wide variety of propulsion systems. The development of three models are discussed as examples.

  14. Chemical Kinetic Modeling of Advanced Transportation Fuels

    SciTech Connect

    PItz, W J; Westbrook, C K; Herbinet, O

    2009-01-20

    Development of detailed chemical kinetic models for advanced petroleum-based and nonpetroleum based fuels is a difficult challenge because of the hundreds to thousands of different components in these fuels and because some of these fuels contain components that have not been considered in the past. It is important to develop detailed chemical kinetic models for these fuels since the models can be put into engine simulation codes used for optimizing engine design for maximum efficiency and minimal pollutant emissions. For example, these chemistry-enabled engine codes can be used to optimize combustion chamber shape and fuel injection timing. They also allow insight into how the composition of advanced petroleum-based and non-petroleum based fuels affect engine performance characteristics. Additionally, chemical kinetic models can be used separately to interpret important in-cylinder experimental data and gain insight into advanced engine combustion processes such as HCCI and lean burn engines. The objectives are: (1) Develop detailed chemical kinetic reaction models for components of advanced petroleum-based and non-petroleum based fuels. These fuels models include components from vegetable-oil-derived biodiesel, oil-sand derived fuel, alcohol fuels and other advanced bio-based and alternative fuels. (2) Develop detailed chemical kinetic reaction models for mixtures of non-petroleum and petroleum-based components to represent real fuels and lead to efficient reduced combustion models needed for engine modeling codes. (3) Characterize the role of fuel composition on efficiency and pollutant emissions from practical automotive engines.
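
    A typical use of such a detailed kinetic model, computing an ignition delay in a constant-volume reactor, can be sketched with Cantera. The example below uses the bundled GRI-Mech 3.0 methane mechanism purely as a stand-in, since the biodiesel and alternative-fuel mechanisms discussed above are distributed separately; the initial temperature, pressure, and mixture are illustrative choices.

    ```python
    # Sketch: constant-volume ignition-delay calculation with a detailed kinetic
    # mechanism using Cantera.  GRI-Mech 3.0 (methane) is used here only as a
    # stand-in for the larger surrogate-fuel mechanisms discussed in the abstract.
    import numpy as np
    import cantera as ct

    gas = ct.Solution("gri30.yaml")
    gas.TPX = 1400.0, 20.0 * ct.one_atm, "CH4:1, O2:2, N2:7.52"   # stoichiometric

    reactor = ct.IdealGasReactor(gas)
    sim = ct.ReactorNet([reactor])

    times, temps = [], []
    t = 0.0
    while t < 0.01:                      # integrate up to 10 ms
        t = sim.step()
        times.append(t)
        temps.append(reactor.T)

    times, temps = np.array(times), np.array(temps)
    # Define the ignition delay as the time of maximum rate of temperature rise.
    ignition_delay = times[np.argmax(np.gradient(temps, times))]
    print(f"ignition delay: {ignition_delay * 1e3:.3f} ms")
    ```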

  15. Advanced Civil Transport Simulator Cockpit View

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Advanced Civil Transport Simulator (ACTS) is a futuristic aircraft cockpit simulator designed to provide full-mission capabilities for researching issues that will affect future transport aircraft flight stations and crews. The objective is to heighten the pilot's situation awareness through improved information availability and ease of interpretation in order to reduce the possibility of misinterpreted data. The simulator's five 13-inch cathode ray tubes are designed to display flight information in a logical, easy-to-see format. Two color flat-panel Control Display Units with touch-sensitive screens provide monitoring and modification of aircraft parameters, flight plans, flight computers, and aircraft position. Three collimated visual display units have been installed to provide out-the-window scenes via the Computer Generated Image system. The major research objectives are to examine needs for transfer of information to and from the flight crew; study the use of advanced controls and displays for all-weather flying; explore ideas for using computers to help the crew in decision making; and study visual scanning and reach behavior under different conditions with various levels of automation and flight deck arrangements.

  16. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

    The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.

  17. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    SciTech Connect

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow and transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most of the existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. Accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast, robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
    Based on the gap analysis results, we have made the following recommendations for code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
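
    The performance-assessment workflow described above, in which hundreds to thousands of model runs propagate parameter uncertainty to a predicted performance measure, can be sketched generically as follows. The "repository model" here is a trivial placeholder function with made-up parameters and distributions; it is not any of the codes named in the gap analysis.

    ```python
    # Generic sketch of a sampling-based performance assessment: draw uncertain
    # parameters, run a placeholder repository model for each sample, and report
    # the distribution of the performance measure.  The model function below is a
    # trivial stand-in, not any of the codes discussed in the gap analysis.
    import numpy as np

    rng = np.random.default_rng(42)
    n_realizations = 1000

    def repository_model(dissolution_rate, sorption_coefficient):
        """Placeholder performance measure: peak release scales with the waste-form
        dissolution rate and is attenuated by sorption along the transport path."""
        return dissolution_rate * np.exp(-5.0 * sorption_coefficient)

    # Sample the uncertain inputs (log-uniform dissolution rate, uniform sorption).
    dissolution = 10.0 ** rng.uniform(-6, -4, n_realizations)   # g/m^2/yr
    kd = rng.uniform(0.1, 1.0, n_realizations)

    peak_release = repository_model(dissolution, kd)

    print(f"median peak release:     {np.median(peak_release):.2e}")
    print(f"95th percentile release: {np.percentile(peak_release, 95):.2e}")
    ```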

  18. The Consortium for Advanced Simulation of Light Water Reactors

    SciTech Connect

    Ronaldo Szilard; Hongbin Zhang; Doug Kothe; Paul Turinsky

    2011-10-01

    The Consortium for Advanced Simulation of Light Water Reactors (CASL) is a DOE Energy Innovation Hub for modeling and simulation of nuclear reactors. It brings together an exceptionally capable team from national labs, industry and academia that will apply existing modeling and simulation capabilities and develop advanced capabilities to create a usable environment for predictive simulation of light water reactors (LWRs). This environment, designated as the Virtual Environment for Reactor Applications (VERA), will incorporate science-based models, state-of-the-art numerical methods, modern computational science and engineering practices, and uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs). It will couple state-of-the-art fuel performance, neutronics, thermal-hydraulics (T-H), and structural models with existing tools for systems and safety analysis and will be designed for implementation on both today's leadership-class computers and the advanced architecture platforms now under development by the DOE. CASL focuses on a set of challenge problems such as CRUD induced power shift and localized corrosion, grid-to-rod fretting fuel failures, pellet clad interaction, fuel assembly distortion, etc. that encompass the key phenomena limiting the performance of PWRs. It is expected that much of the capability developed will be applicable to other types of reactors. CASL's mission is to develop and apply modeling and simulation capabilities to address three critical areas of performance for nuclear power plants: (1) reduce capital and operating costs per unit energy by enabling power uprates and plant lifetime extension, (2) reduce nuclear waste volume generated by enabling higher fuel burnup, and (3) enhance nuclear safety by enabling high-fidelity predictive capability for component performance.

  19. Impact of the Assimilation of Hyperspectral Infrared Profiles on Advanced Weather and Research Model Simulations of a Non-Convective Wind Event

    NASA Technical Reports Server (NTRS)

    Berndt, Emily B.; Zavodsky, Bradley T; Jedlovec, Gary J.; Elmer, Nicholas J.

    2013-01-01

    Non-convective wind events commonly occur with passing extratropical cyclones and have significant societal and economic impacts. Since non-convective winds often occur in the absence of specific phenomena such as a thunderstorm, tornado, or hurricane, the public are less likely to heed high wind warnings and continue daily activities. Thus non-convective wind events result in as many fatalities as straight line thunderstorm winds. One physical explanation for non-convective winds includes tropopause folds. Improved model representation of stratospheric air and associated non-convective wind events could improve non-convective wind forecasts and associated warnings. In recent years, satellite data assimilation has improved skill in forecasting extratropical cyclones; however errors still remain in forecasting the position and strength of extratropical cyclones as well as the tropopause folding process. The goal of this study is to determine the impact of assimilating satellite temperature and moisture retrieved profiles from hyperspectral infrared (IR) sounders (i.e. Atmospheric Infrared Sounder (AIRS), Cross-track Infrared and Microwave Sounding Suite (CrIMSS), and Infrared Atmospheric Sounding Interferometer (IASI)) on the model representation of the tropopause fold and an associated high wind event that impacted the Northeast United States on 09 February 2013. Model simulations using the Advanced Research Weather Research and Forecasting Model (ARW) were conducted on a 12-km grid with cycled data assimilation mimicking the operational North American Model (NAM). The results from the satellite assimilation run are compared to a control experiment (without hyperspectral IR retrievals), North American Regional Reanalysis (NARR) reanalysis, and Rapid Refresh analyses.

  20. Impact of the Assimilation of Hyperspectral Infrared Retrieved Profiles on Advanced Weather and Research Model Simulations of a Non-Convective Wind Event

    NASA Technical Reports Server (NTRS)

    Berndt, E. B.; Zavodsky, B. T.; Jedlovec, G. J.

    2014-01-01

    Non-convective wind events commonly occur with passing extratropical cyclones and have significant societal and economic impacts. Since non-convective winds often occur in the absence of specific phenomena such as a thunderstorm, tornado, or hurricane, the public are less likely to heed high wind warnings and continue daily activities. Thus non-convective wind events result in as many fatalities as straight line thunderstorm winds. One physical explanation for non-convective winds includes tropopause folds. Improved model representation of stratospheric air and associated non-convective wind events could improve non-convective wind forecasts and associated warnings. In recent years, satellite data assimilation has improved skill in forecasting extratropical cyclones; however errors still remain in forecasting the position and strength of extratropical cyclones as well as the tropopause folding process. The goal of this study is to determine the impact of assimilating satellite temperature and moisture retrieved profiles from hyperspectral infrared (IR) sounders (i.e. Atmospheric Infrared Sounder (AIRS), Cross-track Infrared and Microwave Sounding Suite (CrIMSS), and Infrared Atmospheric Sounding Interferometer (IASI)) on the model representation of the tropopause fold and an associated high wind event that impacted the Northeast United States on 09 February 2013. Model simulations using the Advanced Research Weather Research and Forecasting Model (ARW) were conducted on a 12-km grid with cycled data assimilation mimicking the operational North American Model (NAM). The results from the satellite assimilation run are compared to a control experiment (without hyperspectral IR retrievals), Modern Era-Retrospective Analysis for Research and Applications (MERRA) reanalysis, and Rapid Refresh analyses.

  1. The Impact of the Assimilation of Hyperspectral Infrared Retrieved Profiles on Advanced Weather and Research Model Simulations of a Non-Convective Wind Event

    NASA Technical Reports Server (NTRS)

    Berndt, Emily; Zavodsky, Bradley; Jedlovec, Gary; Elmer, Nicholas

    2013-01-01

    Non-convective wind events commonly occur with passing extratropical cyclones and have significant societal and economic impacts. Since non-convective winds often occur in the absence of specific phenomena such as a thunderstorm, tornado, or hurricane, the public is less likely to heed high wind warnings and more likely to continue daily activities. Thus, non-convective wind events result in as many fatalities as straight-line thunderstorm winds. One physical explanation for non-convective winds includes tropopause folds. Improved model representation of stratospheric air and associated non-convective wind events could improve non-convective wind forecasts and associated warnings. In recent years, satellite data assimilation has improved skill in forecasting extratropical cyclones; however, errors still remain in forecasting the position and strength of extratropical cyclones as well as the tropopause folding process. The goal of this study is to determine the impact of assimilating satellite temperature and moisture retrieved profiles from hyperspectral infrared (IR) sounders (i.e., Atmospheric Infrared Sounder (AIRS), Cross-track Infrared and Microwave Sounding Suite (CrIMSS), and Infrared Atmospheric Sounding Interferometer (IASI)) on the model representation of the tropopause fold and an associated high wind event that impacted the Northeast United States on 09 February 2013. Model simulations using the Advanced Research Weather Research and Forecasting Model (ARW) were conducted on a 12-km grid with cycled data assimilation mimicking the operational North American Model (NAM). The results from the satellite assimilation run are compared to a control experiment (without hyperspectral IR retrievals), the Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalysis, and Rapid Refresh analyses.

  2. Impact of the Assimilation of Hyperspectral Infrared Retrieved Profiles on Advanced Weather and Research Model Simulations of a Non-Convective Wind Event

    NASA Technical Reports Server (NTRS)

    Berndt, E. B.; Zavodsky, B. T.; Folmer, M. J.; Jedlovec, G. J.

    2014-01-01

    Non-convective wind events commonly occur with passing extratropical cyclones and have significant societal and economic impacts. Since non-convective winds often occur in the absence of specific phenomena such as a thunderstorm, tornado, or hurricane, the public is less likely to heed high wind warnings and more likely to continue daily activities. Thus, non-convective wind events result in as many fatalities as straight-line thunderstorm winds. One physical explanation for non-convective winds includes tropopause folds. Improved model representation of stratospheric air and associated non-convective wind events could improve non-convective wind forecasts and associated warnings. In recent years, satellite data assimilation has improved skill in forecasting extratropical cyclones; however, errors still remain in forecasting the position and strength of extratropical cyclones as well as the tropopause folding process. The goal of this study is to determine the impact of assimilating satellite temperature and moisture retrieved profiles from hyperspectral infrared (IR) sounders (i.e., Atmospheric Infrared Sounder (AIRS), Cross-track Infrared and Microwave Sounding Suite (CrIMSS), and Infrared Atmospheric Sounding Interferometer (IASI)) on the model representation of the tropopause fold and an associated high wind event that impacted the Northeast United States on 09 February 2013. Model simulations using the Advanced Research Weather Research and Forecasting Model (ARW) were conducted on a 12-km grid with cycled data assimilation mimicking the operational North American Model (NAM). The results from the satellite assimilation run are compared to a control experiment (without hyperspectral IR retrievals), the 32-km North American Regional Reanalysis (NARR) interpolated to a 12-km grid, and 13-km Rapid Refresh analyses.

  3. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    SciTech Connect

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert; McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  4. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  5. Aviation Safety Simulation Model

    NASA Technical Reports Server (NTRS)

    Houser, Scott; Yackovetsky, Robert (Technical Monitor)

    2001-01-01

    The Aviation Safety Simulation Model is a software tool that enables users to configure a terrain, a flight path, and an aircraft and simulate the aircraft's flight along the path. The simulation monitors the aircraft's proximity to terrain obstructions, and reports when the aircraft violates accepted minimum distances from an obstruction. This model design facilitates future enhancements to address other flight safety issues, particularly air and runway traffic scenarios. This report shows the user how to build a simulation scenario and run it. It also explains the model's output.

  6. Modelling the response of a Himalayan watershed to climate change: new insights from linking high resolution in-situ data and remote sensing with an advanced simulation model

    NASA Astrophysics Data System (ADS)

    Ragettli, S.; Pellicciotti, F.; Immerzeel, W.

    2014-12-01

    In high-elevation watersheds of the Himalayan region, the correct representation of the internal states and process dynamics in glacio-hydrological models often cannot be verified due to missing in-situ measurements. The aim of this study is to provide a fundamental understanding of the hydrology of a Himalayan watershed through the systematic integration of in-situ data in a glacio-hydrological model. We use ground data from the upper Langtang valley in Nepal, combined with high-resolution satellite data, to understand specific processes and to test newly developed model components. We apply a new model for ablation under debris that takes into account the varying effect of debris thickness on melt rates. A novel approach is tested to reconstruct spatial fields of debris thickness through a combination of energy balance modelling, UAV-derived geodetic mass balance and statistical techniques. The systematic integration of in-situ data for model calibration enables the application of a state-of-the-art model with many parameters to simulate glacier evolution and catchment runoff in spite of the lack of continuous long-term historical records. It allows drawing conclusions on the importance of processes that have been suggested as being relevant but never quantified before. The simulations show that 8.7% of total water inputs originate from sub-debris ice melt and 4.5% from melted avalanched snow. These components can be locally much more important, since the spatial variability of processes within the valley is high. The model is then used to simulate the response of the catchment to climate change. We show that climate warming leads to an increase in future ice melt and a peak in glacier runoff by mid-century. The increase in total ice melt is due to higher melt rates and to the additional melt contribution from large areas that are currently located above the equilibrium line altitude. Catchment runoff will not drop below current levels throughout the 21st century due to precipitation increases. Debris-covered glacier area will disappear at a slower pace than non-debris-covered area. Still, due to the relative climate insensitivity of melt rates below thick debris, the contribution of sub-debris ice melt to runoff will not exceed 10% at any time.
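
    As a rough illustration of how a debris-thickness-dependent melt model of the kind described above can be parameterized, the sketch below scales a simple temperature-index melt rate by a hyperbolic debris-reduction factor. The degree-day factor, the characteristic debris thickness, and the temperatures are illustrative assumptions, not values from the study.

```python
# Minimal sketch: temperature-index melt with a hyperbolic reduction for debris cover.
# All parameter values below are illustrative assumptions, not taken from the study.

def melt_rate_mm_per_day(air_temp_c, debris_thickness_m,
                         degree_day_factor=7.0,   # mm w.e. per degC per day (assumed)
                         h_crit=0.05):            # debris thickness that halves melt (m, assumed)
    """Melt rate under debris: clean-ice melt scaled by h_crit / (h_crit + h_debris)."""
    clean_ice_melt = max(air_temp_c, 0.0) * degree_day_factor
    reduction = h_crit / (h_crit + max(debris_thickness_m, 0.0))
    return clean_ice_melt * reduction

if __name__ == "__main__":
    for h in (0.0, 0.05, 0.2, 0.5, 1.0):
        print(f"debris {h:4.2f} m -> {melt_rate_mm_per_day(5.0, h):5.1f} mm w.e./day")
```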

  7. Recent advances in superconducting-mixer simulations

    NASA Technical Reports Server (NTRS)

    Withington, S.; Kennedy, P. R.

    1992-01-01

    Over the last few years, considerable progress has been made in the development of techniques for fabricating high-quality superconducting circuits, and this success, together with major advances in the theoretical understanding of quantum detection and mixing at millimeter and submillimeter wavelengths, has made the development of CAD techniques for superconducting nonlinear circuits an important new enterprise. For example, arrays of quasioptical mixers are now being manufactured, where the antennas, matching networks, filters and superconducting tunnel junctions are all fabricated by depositing niobium and a variety of oxides on a single quartz substrate. There are no adjustable tuning elements on these integrated circuits, and therefore, one must be able to predict their electrical behavior precisely. This requirement, together with a general interest in the generic behavior of devices such as direct detectors and harmonic mixers, has led us to develop a range of CAD tools for simulating the large-signal, small-signal, and noise behavior of superconducting tunnel junction circuits.

  8. Advanced Chemistry Basins Model

    SciTech Connect

    Blanco, Mario; Cathles, Lawrence; Manhardt, Paul; Meulbroek, Peter; Tang, Yongchun

    2003-02-13

    The objective of this project is to: (1) Develop a database of additional and better maturity indicators for paleo-heat flow calibration; (2) Develop maturation models capable of predicting the chemical composition of hydrocarbons produced by a specific kerogen as a function of maturity, heating rate, etc.; assemble a compositional kinetic database of representative kerogens; (3) Develop a four-phase equation-of-state flash model that can define the physical properties (viscosity, density, etc.) of the products of kerogen maturation, and phase transitions that occur along secondary migration pathways; (4) Build a conventional basin model and incorporate new maturity indicators and databases in a user-friendly way; (5) Develop an algorithm which combines the volume change and viscosities of the compositional maturation model to predict the chemistry of the hydrocarbons that will be expelled from the kerogen to the secondary migration pathways; (6) Develop an algorithm that predicts the flow of hydrocarbons along secondary migration pathways, accounts for mixing of miscible hydrocarbon components along the pathway, and calculates the phase fractionation that will occur as the hydrocarbons move upward down the geothermal and fluid pressure gradients in the basin; and (7) Integrate the above components into a functional model implemented on a PC or low cost workstation.
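
    A greatly simplified two-phase analogue of the multi-phase flash calculation mentioned in objective (3) is sketched below: the Rachford-Rice equation is solved for the vapor fraction by bisection. The feed composition and equilibrium K-values are invented for illustration and are not from the project.

```python
# Minimal two-phase (vapor/liquid) flash sketch via the Rachford-Rice equation,
# solved by bisection. A simplified analogue of the multi-phase EOS flash mentioned
# above; compositions and K-values are invented for illustration.

def rachford_rice(z, K, tol=1e-10, iters=200):
    """Return vapor fraction V solving sum_i z_i (K_i - 1) / (1 + V (K_i - 1)) = 0."""
    def g(V):
        return sum(zi * (Ki - 1.0) / (1.0 + V * (Ki - 1.0)) for zi, Ki in zip(z, K))
    lo, hi = 0.0, 1.0
    if g(lo) < 0.0:   # feed is all liquid
        return 0.0
    if g(hi) > 0.0:   # feed is all vapor
        return 1.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    z = [0.5, 0.3, 0.2]   # feed mole fractions (assumed)
    K = [3.0, 1.2, 0.1]   # equilibrium ratios (assumed)
    V = rachford_rice(z, K)
    x = [zi / (1.0 + V * (Ki - 1.0)) for zi, Ki in zip(z, K)]   # liquid composition
    y = [Ki * xi for Ki, xi in zip(K, x)]                       # vapor composition
    print(f"vapor fraction V = {V:.4f}")
    print("liquid x:", [round(v, 4) for v in x])
    print("vapor  y:", [round(v, 4) for v in y])
```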

  9. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    SciTech Connect

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and the public and will help ensure the safe, economical and reliable operation of existing and future nuclear reactors.

  10. Pilot evaluation of an advanced hingeless rotor XV-15 simulation

    NASA Technical Reports Server (NTRS)

    Mcveigh, M. A.

    1977-01-01

    A piloted simulation of an advanced hingeless rotor XV-15 tilt-rotor aircraft was carried out. The evaluation was made by a pilot from NASA-Ames who had previous experience flying a simulation of the current gimballed rotor NASA/Army XV-15. It was pointed out that some modifications to the force feel system were needed in order to provide rapid force trimming during rapid maneuvers. Some additional tailoring of the SCAS system was required to achieve good nap-of-the-earth performance. Overall pilot opinion on the hingeless rotor XV-15 tilt rotor was favorable. A brief discussion of the mathematical models and the simulator configuration is presented. The maneuvers and pilot comments are given along with some engineering comments.

  11. Advanced simulation of intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Tentner, A.

    1996-11-01

    A large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) has been developed which is capable of running on parallel computers and distributed (networked) computer systems. The simulator currently models instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of this approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scalable to take advantage of emerging massively parallel processor (MPP) systems.

  12. Interoperable Technologies for Advanced Petascale Simulations

    SciTech Connect

    Li, Xiaolin

    2013-01-14

    Our final report on the accomplishments of ITAPS at Stony Brook during the period covered by the research award includes component service, interface service and applications. On the component service, we have designed and implemented a robust functionality for the Lagrangian tracking of dynamic interfaces. We have migrated the hyperbolic, parabolic and elliptic solvers from stage-wise second order toward global second order schemes. We have implemented high order coupling between interface propagation and interior PDE solvers. On the interface service, we have constructed the FronTier application programmer's interface (API) and its manual page using doxygen. We installed the FronTier functional interface to conform with the ITAPS specifications, especially the iMesh and iMeshP interfaces. On applications, we have implemented deposition and dissolution models with flow and implemented the two-reactant model for a more realistic precipitation at the pore level and its coupling with the Darcy-level model. We have continued our support for the study of fluid mixing problems in inertial confinement fusion. We have continued our support for the MHD model and its application to plasma liner implosion in fusion confinement. We have simulated a step in the reprocessing and separation of spent fuels from nuclear power plant fuel rods. We have implemented fluid-structure interaction for 3D windmill and parachute simulations. We have continued our collaboration with PNNL, BNL, LANL, ORNL, and other SciDAC institutions.

  13. Theory Modeling and Simulation

    SciTech Connect

    Shlachter, Jack

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  14. Modeling Molecular Dynamics from Simulations

    SciTech Connect

    Hinrichs, Nina Singhal

    2009-01-28

    Many important processes in biology occur at the molecular scale. A detailed understanding of these processes can lead to significant advances in the medical and life sciences. For example, many diseases are caused by protein aggregation or misfolding. One approach to studying these systems is to use physically-based computational simulations to model the interactions and movement of the molecules. While molecular simulations are computationally expensive, it is now possible to simulate many independent molecular dynamics trajectories in a parallel fashion by using super- or distributed- computing methods such as Folding@Home or Blue Gene. The analysis of these large, high-dimensional data sets presents new computational challenges. In this seminar, I will discuss a novel approach to analyzing large ensembles of molecular dynamics trajectories to generate a compact model of the dynamics. This model groups conformations into discrete states and describes the dynamics as Markovian, or history-independent, transitions between the states. I will discuss why the Markovian state model (MSM) is suitable for macromolecular dynamics, and how it can be used to answer many interesting and relevant questions about the molecular system. I will also discuss many of the computational and statistical challenges in building such a model, such as how to appropriately cluster conformations, determine the statistical reliability, and efficiently design new simulations.
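
    A minimal sketch of the Markov-state-model construction idea described above is shown below: trajectories are assumed to have already been clustered into discrete state labels, so the sketch simply counts lagged transitions and row-normalizes them into a transition matrix. The trajectories, state count, and lag time are illustrative placeholders.

```python
import numpy as np

# Minimal Markov state model (MSM) sketch: given trajectories already clustered into
# discrete state labels, count transitions at a chosen lag time and row-normalize to
# obtain a transition probability matrix. State labels and lag are illustrative.

def build_msm(trajectories, n_states, lag=1):
    counts = np.zeros((n_states, n_states))
    for traj in trajectories:
        for i, j in zip(traj[:-lag], traj[lag:]):
            counts[i, j] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0   # avoid division by zero for unvisited states
    return counts / row_sums

if __name__ == "__main__":
    # Two short, hypothetical state-label trajectories over 3 states.
    trajs = [[0, 0, 1, 1, 2, 2, 1, 0], [2, 2, 1, 0, 0, 1, 2, 2]]
    T = build_msm(trajs, n_states=3, lag=1)
    print("Transition matrix:\n", np.round(T, 3))
    # Stationary distribution from the leading left eigenvector of T.
    w, v = np.linalg.eig(T.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    print("Stationary distribution:", np.round(pi / pi.sum(), 3))
```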

  15. Advanced computational simulation of transient, multiphase combustion

    SciTech Connect

    Hosangadi, A.; Sinha, N.; Dash, S.M.

    1995-12-31

    In recent activities involving combustion chamber simulation of next-generation guns (Liquid Propellant Gun, Electrothermal-Chemical Gun, Ram Accelerator), a three-dimensional upwind/implicit Navier-Stokes code, CRAFT, has been extended to analyze very complex multiphase combustion problems. The extensions have included: (1) gas/bulk liquid interaction modeling; (2) droplet combustion of liquid propellants; and (3) fluidized bed combustion of solid (balled) propellants. The computational framework utilizes Riemann-based Roe/TVD upwind numerics with strongly coupled, fully-implicit numerics (all equations and source terms are strongly coupled) which permits the accurate analysis of complex transient processes including combustion instabilities. A large eddy simulation (LES) framework is used to represent the turbulence with simplified subgrid stress (SGS) modeling. While problem-specific details of next-generation gun models are restricted (details have been presented at JANNAF Combustion meetings), the basic methodology is not and is being extended to several non-DoD arenas. This paper will present an overview of the new methodology developed and will describe fundamental studies of varied transient, multiphase combusting flows. Details of the physics will be emphasized including large eddy turbulent behavior in complex multiphase environments. Analyses performed have involved detailed spectral processing in situations where chamber acoustics strongly interact with combustion processes.
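
    As a minimal, scalar illustration of the upwind principle behind Roe-type schemes mentioned above, the sketch below applies the first-order Roe/upwind flux to the linear advection equation. It is not the coupled multiphase CRAFT methodology; the grid size, wave speed, and time step are illustrative assumptions.

```python
import numpy as np

# First-order upwind (Roe-type) flux sketch for scalar linear advection u_t + a u_x = 0.
# For a linear equation the Roe flux reduces to the classical upwind flux:
#   F(uL, uR) = 0.5*a*(uL + uR) - 0.5*|a|*(uR - uL).
# This is only a scalar illustration of the upwind principle, not the CRAFT code.

def roe_flux(uL, uR, a):
    return 0.5 * a * (uL + uR) - 0.5 * abs(a) * (uR - uL)

def advect(u, a, dx, dt, steps):
    u = u.copy()
    for _ in range(steps):
        uL = np.roll(u, 1)              # periodic boundary
        f_left = roe_flux(uL, u, a)     # flux at interface i-1/2
        f_right = np.roll(f_left, -1)   # flux at interface i+1/2
        u -= dt / dx * (f_right - f_left)
    return u

if __name__ == "__main__":
    n, a = 200, 1.0
    dx = 1.0 / n
    dt = 0.5 * dx / abs(a)              # CFL number 0.5
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    u0 = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)   # square pulse
    u1 = advect(u0, a, dx, dt, steps=100)
    print("mass before/after:", u0.sum() * dx, u1.sum() * dx)   # conserved by construction
```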

  16. ADVANCED MIXING MODELS

    SciTech Connect

    Lee, S.; Dimenna, R.; Tamburello, D.

    2011-02-14

    The process of recovering and processing the High Level Waste (HLW) in storage tanks at the Savannah River Site (SRS) typically requires mixing the contents of the tank with one to four mixers (pumps) located within the tank. The typical criteria to establish a mixed condition in a tank are based on the number of pumps in operation and the time duration of operation. To ensure that a mixed condition is achieved, operating times are typically set conservatively long. This approach results in high operational costs because of the long mixing times and high maintenance and repair costs for the same reason. A significant reduction in both of these costs might be realized by reducing the required mixing time based on calculating a reliable indicator of mixing with a suitably validated computer code. The focus of the present work is to establish mixing criteria applicable to miscible fluids, with an ultimate goal of addressing waste processing in HLW tanks at SRS and quantifying the mixing time required to suspend sludge particles with the submersible jet pump. A single-phase computational fluid dynamics (CFD) approach was taken for the analysis of jet flow patterns with an emphasis on the velocity decay and the turbulent flow evolution for the far-field region from the pump. Literature results for a turbulent jet flow are reviewed, since the decay of the axial jet velocity and the evolution of the jet flow patterns are important phenomena affecting sludge suspension and mixing operations. The work described in this report suggests a basis for further development of the theory leading to the identified mixing indicators, with benchmark analyses demonstrating their consistency with widely accepted correlations. Although the indicators are somewhat generic in nature, they are applied to Savannah River Site (SRS) waste tanks to provide a better, physically based estimate of the required mixing time. Waste storage tanks at SRS contain settled sludge which varies in height from zero to 10 ft. The sludge has been characterized and modeled as micron-sized solids, typically 1 to 5 microns, at weight fractions as high as 20 to 30 wt%, specific gravities to 1.4, and viscosities up to 64 cp during motion. The sludge is suspended and mixed through the use of submersible slurry jet pumps. To suspend settled sludge, water is added to the tank as a slurry medium and stirred with the jet pump. Although there is considerable technical literature on mixing and solid suspension in agitated tanks, very little literature has been published on jet mixing in a large-scale tank. One of the main objectives in the waste processing is to provide feed of a uniform slurry composition at a certain weight percentage (e.g., typically ~13 wt% at SRS) over an extended period of time. In preparation of the sludge for slurrying, several important questions have been raised with regard to sludge suspension and mixing of the solid suspension in the bulk of the tank: (1) How much time is required to prepare a slurry with a uniform solid composition? (2) How long will it take to suspend and mix the sludge for uniform composition in any particular waste tank? (3) What are good mixing indicators to answer the questions concerning sludge mixing stated above in a general fashion applicable to any waste tank/slurry pump geometry and fluid/sludge combination?

  17. ADVANCED POWER PLANT MODELING WITH APPLICATIONS TO THE ADVANCED BOILING

    E-print Network

    Mitchell, John E.

    ADVANCED POWER PLANT MODELING WITH APPLICATIONS TO THE ADVANCED BOILING WATER REACTOR AND THE HEAT ... (table-of-contents excerpt; listed sections include 2. Advanced Boiling Water Reactor - General Description, 2.1 Modifications, 3.1 Temperature Wave with Lateral Heat Transfer, and 3.2 One ...)

  18. ADVANCED MIXING MODELS

    SciTech Connect

    Lee, S.; Dimenna, R.; Tamburello, D.

    2008-11-13

    The process of recovering the waste in storage tanks at the Savannah River Site (SRS) typically requires mixing the contents of the tank with one to four dual-nozzle jet mixers located within the tank. The typical criteria to establish a mixed condition in a tank are based on the number of pumps in operation and the time duration of operation. To ensure that a mixed condition is achieved, operating times are set conservatively long. This approach results in high operational costs because of the long mixing times and high maintenance and repair costs for the same reason. A significant reduction in both of these costs might be realized by reducing the required mixing time based on calculating a reliable indicator of mixing with a suitably validated computer code. The work described in this report establishes the basis for further development of the theory leading to the identified mixing indicators, the benchmark analyses demonstrating their consistency with widely accepted correlations, and the application of those indicators to SRS waste tanks to provide a better, physically based estimate of the required mixing time. Waste storage tanks at SRS contain settled sludge which varies in height from zero to 10 ft. The sludge has been characterized and modeled as micron-sized solids, typically 1 to 5 microns, at weight fractions as high as 20 to 30 wt%, specific gravities to 1.4, and viscosities up to 64 cp during motion. The sludge is suspended and mixed through the use of submersible slurry jet pumps. To suspend settled sludge, water is added to the tank as a slurry medium and stirred with the jet pump. Although there is considerable technical literature on mixing and solid suspension in agitated tanks, very little literature has been published on jet mixing in a large-scale tank. If shorter mixing times can be shown to support Defense Waste Processing Facility (DWPF) or other feed requirements, longer pump lifetimes can be achieved with associated operational cost and schedule savings. The focus of the present work is to establish mixing criteria associated with the waste processing at SRS and to quantify the mixing time required to suspend sludge particles with the submersible jet pump. Literature results for a turbulent jet flow are reviewed briefly, since the decay of the axial jet velocity and the evolution of the jet flow patterns are important phenomena affecting sludge suspension and mixing operations. One of the main objectives in the waste processing is to provide the DWPF a uniform slurry composition at a certain weight percentage (typically ~13 wt%) over an extended period of time. In preparation of the sludge for slurrying to DWPF, several important questions have been raised with regard to sludge suspension and mixing of the solid suspension in the bulk of the tank: (1) How much time is required to prepare a slurry with a uniform solid composition for DWPF? (2) How long will it take to suspend and mix the sludge for uniform composition in any particular waste tank? (3) What are good mixing indicators to answer the questions concerning sludge mixing stated above in a general fashion applicable to any waste tank/slurry pump geometry and fluid/sludge combination? Grenville and Tilton (1996) investigated the mixing process by giving a pulse of tracer (electrolyte) through the submersible jet nozzle and by monitoring the conductivity at three locations within the cylindrical tank. They proposed that the mixing process was controlled by the turbulent kinetic energy dissipation rate in the region far away from the jet entrance. They took the energy dissipation rates in the regions remote from the nozzle to be proportional to jet velocity and jet diameter at that location. The reduction in the jet velocity was taken to be proportional to the nozzle velocity and distance from the nozzle. Based on their analysis, a correlation was proposed. The proposed correlation was shown to be valid over a wide range of Reynolds numbers (50,000 to 300,000) with a relative standard deviation of ±11.83%. An improved correlat
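
    A minimal sketch of the far-field centerline-velocity decay relation referenced above, U(x) ~ C * U0 * d0 / x, is given below. The decay constant and the nozzle numbers are generic illustrative values from the turbulent-jet literature, not SRS pump design data.

```python
# Minimal sketch of far-field turbulent-jet centerline velocity decay, U(x) ~ C * U0 * d0 / x,
# as referenced in the discussion above. The decay constant and the nozzle values are
# generic illustrative assumptions, not SRS design data.

def centerline_velocity(x_m, u0_m_s, d0_m, decay_const=6.0):
    """Centerline velocity at axial distance x from the nozzle (far-field approximation)."""
    core_length = decay_const * d0_m          # rough extent of the potential core
    if x_m <= core_length:
        return u0_m_s                         # within the potential core, ~nozzle velocity
    return decay_const * u0_m_s * d0_m / x_m

if __name__ == "__main__":
    u0, d0 = 20.0, 0.04                       # nozzle velocity (m/s) and diameter (m), assumed
    for x in (0.1, 0.5, 1.0, 2.0, 5.0, 10.0):
        print(f"x = {x:5.1f} m  ->  U = {centerline_velocity(x, u0, d0):6.3f} m/s")
```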

  19. AGRICULTURAL SIMULATION MODEL (AGSIM)

    EPA Science Inventory

    AGSIM is a large-scale econometric simulation model of regional crop and national livestock production in the United States. The model was initially developed to analyze the aggregate economic impacts of a wide variety of issues facing agriculture, such as technological change, pest...

  20. Naval threat countermeasure simulator and the IR_CRUISE_missiles models for the generation of infrared (IR) videos of maritime targets and background for input into advanced imaging IR seekers

    NASA Astrophysics Data System (ADS)

    Taczak, Thomas M.; Dries, John W.; Gover, Robert E.; Snapp, Mary Ann; Williams, Elmer F.; Cahill, Colin P.

    2002-07-01

    A new hardware-in-the-loop modeling technique was developed at the US Naval Research Laboratory (NRL) for the evaluation of IR countermeasures against advanced IR imaging anti-ship cruise missiles. The research efforts involved the creation of tools to generate accurate IR imagery and synthesize video to inject into real-world threat simulators. A validation study was conducted to verify the accuracy and limitations of the techniques that were developed.

  1. Interoperable mesh and geometry tools for advanced petascale simulations

    NASA Astrophysics Data System (ADS)

    Diachin, L.; Bauer, A.; Fix, B.; Kraftcheck, J.; Jansen, K.; Luo, X.; Miller, M.; Ollivier-Gooch, C.; Shephard, M. S.; Tautges, T.; Trease, H.

    2007-07-01

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications.

  2. Interoperable mesh and geometry tools for advanced petascale simulations

    SciTech Connect

    Diachin, L; Bauer, A; Fix, B; Kraftcheck, J; Jansen, K; Luo, X; Miller, M; Ollivier-Gooch, C; Shephard, M; Tautges, T; Trease, H

    2007-07-04

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications.

  3. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-04-15

    The recent Nevada Earthquake (M=6) produced an extraordinary set of crustal guided waves. In this study, we examine the three-component data at all the USArray stations in terms of how well existing models perform in predicting the various phases, Rayleigh waves, Love waves, and Pnl waves. To establish the source parameters, we applied the Cut and Paste Code up to a distance of 5° for an average local crustal model, which produced a normal mechanism (strike=35°, dip=41°, rake=-85°) at a depth of 9 km and Mw=5.9. Assuming this mechanism, we generated synthetics at all distances for a number of 1D and 3D models. The Pnl observations fit the synthetics for the simple models well, both in timing (VPn=7.9 km/s) and in waveform, out to a distance of about 5°. Beyond this distance a great deal of complexity can be seen to the northwest, apparently caused by shallow subducted slab material. These paths require considerable crustal thinning and higher P-velocities. Small delays and advances outline the various tectonic provinces to the south (Colorado Plateau, etc.) with velocities compatible with those reported by Song et al. (1996). Five-second Rayleigh waves (Airy Phase) can be observed throughout the whole array and show a great deal of variation (up to 30 s). In general, the Love waves are better behaved than the Rayleigh waves. We are presently adding higher frequencies to the source description by including source complexity. Preliminary inversions suggest rupture to the northeast with a shallow asperity. We are also inverting the aftershocks to extend the frequencies to 2 Hz and beyond following the calibration method outlined in Tan and Helmberger (2007). This will allow accurate directivity measurements for events with magnitude larger than 3.5. Thus, we will address the energy decay with distance as a function of frequency band for the various source types.
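
    A back-of-the-envelope sketch of the Pn timing check implied by the quoted apparent velocity (VPn = 7.9 km/s) is given below: predicted arrival time is simply epicentral distance divided by that velocity. It ignores crustal legs and source depth, so it is only a first-order consistency check, not the waveform-fitting procedure of the study.

```python
# Back-of-the-envelope Pn timing sketch using the apparent velocity quoted above
# (VPn = 7.9 km/s). Crustal legs and source depth are ignored, so this is only a
# first-order consistency check on arrival times.

KM_PER_DEG = 111.19   # approximate km per degree of epicentral distance

def pn_arrival_s(distance_deg, vpn_km_s=7.9):
    return distance_deg * KM_PER_DEG / vpn_km_s

if __name__ == "__main__":
    for d in (1.0, 2.0, 3.0, 4.0, 5.0):
        print(f"{d:3.1f} deg  ->  ~{pn_arrival_s(d):6.1f} s after origin")
```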

  4. A Virtual Engineering Framework for Simulating Advanced Power System

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Stanislav Borodai

    2008-06-18

    This report describes the work performed to provide NETL with VE-Suite-based Virtual Engineering software and enhanced equipment models to support NETL's Advanced Process Engineering Co-simulation (APECS) framework for advanced power generation systems. Enhancements to the software framework facilitated an important link between APECS and the virtual engineering capabilities provided by VE-Suite (e.g., equipment and process visualization, information assimilation). Model enhancements focused on improving predictions for the performance of entrained flow coal gasifiers and important auxiliary equipment (e.g., Air Separation Units) used in coal gasification systems. In addition, a Reduced Order Model generation tool and software to provide a coupling between APECS/AspenPlus and the GE GateCycle simulation system were developed. CAPE-Open model interfaces were employed where needed. The improved simulation capability is demonstrated on selected test problems. As part of the project an Advisory Panel was formed to provide guidance on the issues on which to focus the work effort. The Advisory Panel included experts from industry and academia in gasification, CO2 capture, and process simulation, as well as representatives from technology developers and the electric utility industry. To optimize the benefit to NETL, REI coordinated its efforts with NETL and NETL-funded projects at Iowa State University, Carnegie Mellon University and ANSYS/Fluent, Inc. The improved simulation capabilities incorporated into APECS will enable researchers and engineers to better understand the interactions of different equipment components, identify weaknesses and processes needing improvement and thereby allow more efficient, less expensive plants to be developed and brought on-line faster and in a more cost-effective manner. These enhancements to APECS represent an important step toward having a fully integrated environment for performing plant simulation and engineering. Furthermore, with little effort the modeling capabilities described in this report can be extended to support other DOE programs, such as ultra super critical boiler development, oxy-combustion boiler development or modifications to existing plants to include CO2 capture and sequestration.

  5. Advanced Virtual Reality Simulations in Aerospace Education and Research

    NASA Astrophysics Data System (ADS)

    Plotnikova, L.; Trivailo, P.

    2002-01-01

    Recent research developments in Aerospace Engineering at RMIT University have demonstrated great potential for using Virtual Reality simulations as a very effective tool in advanced structures and dynamics applications. They have also been extremely successful in the teaching of various undergraduate and postgraduate courses for presenting complex concepts in structural and dynamics designs. Characteristic examples are related to classical orbital mechanics, spacecraft attitude and structural dynamics. Advanced simulations, reflecting current research by the authors, are mainly related to the implementation of various non-linear dynamic techniques, including using Kane's equations to study dynamics of space tethered satellite systems and the Co-rotational Finite Element method to study reconfigurable robotic systems undergoing large rotations and large translations. The current article will describe the numerical implementation of the modern methods of dynamics, and will concentrate on the post-processing stage of the dynamic simulations. Numerous examples of Virtual Reality stand-alone animations designed by the authors will be discussed in detail. The striking feature of the developed technology is the use of standard mathematical packages, like MATLAB, as a post-processing tool to generate Virtual Reality Modelling Language files with brilliant interactive, graphics and audio effects. These stand-alone demonstration files can be run under Netscape or Microsoft Explorer and do not require MATLAB. Use of this technology enables scientists to easily share their results with colleagues using the Internet, contributing to flexible learning development at schools and universities.

  6. Precision Casting via Advanced Simulation and Manufacturing

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A two-year program was conducted to develop and commercially implement selected casting manufacturing technologies to enable significant reductions in the costs of castings, increase the complexity and dimensional accuracy of castings, and reduce the development times for delivery of high quality castings. The industry-led R&D project was cost-shared with NASA's Aerospace Industry Technology Program (AITP). The Rocketdyne Division of Boeing North American, Inc. served as the team lead with participation from Lockheed Martin, Ford Motor Company, Howmet Corporation, PCC Airfoils, General Electric, UES, Inc., University of Alabama, Auburn University, Robinson, Inc., Aracor, and NASA-LeRC. The technical effort was organized into four distinct tasks. The accomplishments are reported herein. Task 1.0 developed advanced simulation technology for core molding. Ford headed up this task. On this program, a specialized core machine was designed and built. Task 2.0 focused on intelligent process control for precision core molding. Howmet led this effort. The primary focus of these experimental efforts was to characterize the process parameters that have a strong impact on dimensional control issues of injection molded cores during their fabrication. Task 3.0 developed and applied rapid prototyping to produce near net shape castings. Rocketdyne was responsible for this task. CAD files were generated using reverse engineering, rapid prototype patterns were fabricated using SLS and SLA, and castings produced and evaluated. Task 4.0 was aimed at developing technology transfer. Rocketdyne coordinated this task. Casting related technology, explored and evaluated in the first three tasks of this program, was implemented into manufacturing processes.

  7. New Developments in the Simulation of Advanced Accelerator Concepts

    SciTech Connect

    Paul, K.; Cary, J.R.; Cowan, B.; Bruhwiler, D.L.; Geddes, C.G.R.; Mullowney, P.J.; Messmer, P.; Esarey, E.; Cormier-Michel, E.; Leemans, W.P.; Vay, J.-L.

    2008-09-10

    Improved computational methods are essential to the diverse and rapidly developing field of advanced accelerator concepts. We present an overview of some computational algorithms for laser-plasma concepts and high-brightness photocathode electron sources. In particular, we discuss algorithms for reduced laser-plasma models that can be orders of magnitude faster than their higher-fidelity counterparts, as well as important on-going efforts to include relevant additional physics that has been previously neglected. As an example of the former, we present 2D laser wakefield accelerator simulations in an optimal Lorentz frame, demonstrating >10 GeV energy gain of externally injected electrons over a 2 m interaction length, showing good agreement with predictions from scaled simulations and theory, with a speedup factor of ~2,000 as compared to standard particle-in-cell.
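
    The order-of-magnitude speedup quoted above can be related to the frame boost using a scaling reported in the general boosted-frame literature, roughly S ~ (1 + beta)^2 * gamma^2. The sketch below simply evaluates that scaling; the scaling law and the boost values are assumptions for illustration, not statements of how these particular simulations were configured.

```python
import math

# Rough estimate of the computational speedup from simulating a laser-plasma stage in a
# Lorentz-boosted frame. The scaling S ~ (1 + beta)^2 * gamma^2 is an assumption taken
# from the general boosted-frame literature, not from this paper; it only illustrates how
# a speedup of order 10^3 can follow from a modest boost factor.

def boosted_frame_speedup(gamma_boost):
    beta = math.sqrt(1.0 - 1.0 / gamma_boost**2)
    return (1.0 + beta) ** 2 * gamma_boost**2

if __name__ == "__main__":
    for g in (2, 5, 10, 20, 25):
        print(f"gamma_boost = {g:3d}  ->  speedup ~ {boosted_frame_speedup(g):8.0f}x")
```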

  8. New Developments in the Simulation of Advanced Accelerator Concepts

    SciTech Connect

    Bruhwiler, David L.; Cary, John R.; Cowan, Benjamin M.; Paul, Kevin; Mullowney, Paul J.; Messmer, Peter; Geddes, Cameron G. R.; Esarey, Eric; Cormier-Michel, Estelle; Leemans, Wim; Vay, Jean-Luc

    2009-01-22

    Improved computational methods are essential to the diverse and rapidly developing field of advanced accelerator concepts. We present an overview of some computational algorithms for laser-plasma concepts and high-brightness photocathode electron sources. In particular, we discuss algorithms for reduced laser-plasma models that can be orders of magnitude faster than their higher-fidelity counterparts, as well as important on-going efforts to include relevant additional physics that has been previously neglected. As an example of the former, we present 2D laser wakefield accelerator simulations in an optimal Lorentz frame, demonstrating >10 GeV energy gain of externally injected electrons over a 2 m interaction length, showing good agreement with predictions from scaled simulations and theory, with a speedup factor of ~2,000 as compared to standard particle-in-cell.

  9. Parallel methods for the flight simulation model

    SciTech Connect

    Xiong, Wei Zhong; Swietlik, C.

    1994-06-01

    The Advanced Computer Applications Center (ACAC) has been involved in evaluating advanced parallel architecture computers and the applicability of these machines to computer simulation models. The advanced systems investigated include parallel machines with shared-memory and distributed architectures, consisting of an eight processor Alliant FX/8, a twenty-four processor Sequent Symmetry, Cray XMP, IBM RISC 6000 model 550, and the Intel Touchstone eight processor Gamma and 512 processor Delta machines. Since parallelizing a truly efficient application program for the parallel machine is a difficult task, the implementation for these machines in a realistic setting has been largely overlooked. The ACAC has developed considerable expertise in optimizing and parallelizing application models on a collection of advanced multiprocessor systems. One such application model is the Flight Simulation Model, which uses a set of differential equations to describe the flight characteristics of a launched missile by means of a trajectory. The Flight Simulation Model was written in the FORTRAN language with approximately 29,000 lines of source code. Depending on the number of trajectories, the computation can require several hours to a full day of CPU time on a DEC/VAX 8650 system. There is an impetus to reduce the execution time and utilize the advanced parallel architecture computing environment available. ACAC researchers developed a parallel method that allows the Flight Simulation Model to run in parallel on the multiprocessor system. For the benchmark data tested, the parallel Flight Simulation Model implemented on the Alliant FX/8 has achieved nearly linear speedup. In this paper, we describe a parallel method for the Flight Simulation Model. We believe the method presented in this paper provides a general concept for the design of parallel applications. This concept, in most cases, can be adapted to many other sequential application programs.
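
    A minimal sketch of the trajectory-level parallelism described above is shown below: each trajectory is an independent integration, so a pool of worker processes can compute them concurrently. The point-mass dynamics and launch parameters are invented placeholders, not the 29,000-line FORTRAN model.

```python
import math
from multiprocessing import Pool

# Minimal sketch of trajectory-level parallelism: each trajectory is integrated
# independently, so a process pool can run them concurrently. The drag-free point-mass
# dynamics and launch parameters below are invented placeholders, not the Flight
# Simulation Model itself.

G = 9.81  # m/s^2

def integrate_trajectory(args):
    """Forward-Euler integration of a drag-free point mass; returns range and flight time."""
    speed, angle_deg, dt = args
    vx = speed * math.cos(math.radians(angle_deg))
    vy = speed * math.sin(math.radians(angle_deg))
    x = y = t = 0.0
    while y >= 0.0:
        x += vx * dt
        y += vy * dt
        vy -= G * dt
        t += dt
    return x, t

if __name__ == "__main__":
    cases = [(300.0, angle, 0.01) for angle in range(10, 81, 5)]   # one case per trajectory
    with Pool() as pool:
        results = pool.map(integrate_trajectory, cases)            # trajectories run in parallel
    for (speed, angle, _), (rng, t) in zip(cases, results):
        print(f"angle {angle:2d} deg: range {rng / 1000.0:6.2f} km, time {t:5.1f} s")
```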

  10. Brush seal numerical simulation: Concepts and advances

    NASA Technical Reports Server (NTRS)

    Braun, M. J.; Kudriavtsev, V. V.

    1994-01-01

    The development of the brush seal is considered to be the most promising among the advanced seal types presently in use in high-speed turbomachinery. The brush is usually mounted on the stationary portions of the engine and has direct contact with the rotating element, in the process of limiting the 'unwanted' leakage flows between stages, or various engine cavities. This type of sealing technology provides high pressure drops (in comparison with conventional seals), due mainly to the high packing density (around 100 bristles/sq mm) and brush compliance with the rotor motions. In the design of modern aerospace turbomachinery leakage flows between the stages must be minimal, thus contributing to the higher efficiency of the engine. Use of the brush seal instead of the labyrinth seal reduces the leakage flow by one order of magnitude. Brush seals also have been found to enhance dynamic performance, cost less, and are lighter than labyrinth seals. Even though industrial brush seals have been successfully developed through extensive experimentation, there is no comprehensive numerical methodology for the design or prediction of their performance. The existing analytical/numerical approaches are based on bulk flow models and do not allow the investigation of the effects of brush morphology (bristle arrangement), or brushes arrangement (number of brushes, spacing between them), on the pressure drops and flow leakage. An increase in the brush seal efficiency is clearly a complex problem that is closely related to the brush geometry and arrangement, and can be solved most likely only by means of a numerically distributed model.
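
    As a minimal illustration of the kind of bulk-flow leakage estimate that the abstract contrasts with a distributed model, the sketch below treats the bristle pack as a porous layer with a Darcy-type pressure drop. The permeability, geometry, and gas properties are illustrative assumptions, not data for any actual seal design.

```python
import math

# Minimal bulk-flow sketch of brush-seal leakage: treat the bristle pack as a porous
# annular layer obeying Darcy's law, dP/dx = mu * u / k. Permeability, geometry, and
# gas properties are illustrative assumptions; real designs need distributed models.

def darcy_leakage_kg_s(dp_pa, perm_m2, thickness_m, mu_pa_s, rho_kg_m3,
                       rotor_radius_m, flow_gap_m):
    """Mass leakage through the bristle pack for a given pressure drop."""
    area = 2.0 * math.pi * rotor_radius_m * flow_gap_m    # annular flow area
    u = perm_m2 * dp_pa / (mu_pa_s * thickness_m)         # Darcy superficial velocity
    return rho_kg_m3 * u * area

if __name__ == "__main__":
    mdot = darcy_leakage_kg_s(dp_pa=1.0e5,         # pressure drop across the seal (assumed)
                              perm_m2=1.0e-13,     # effective pack permeability (assumed)
                              thickness_m=0.002,   # axial pack thickness (assumed)
                              mu_pa_s=1.8e-5,      # air viscosity
                              rho_kg_m3=3.5,       # gas density at seal conditions (assumed)
                              rotor_radius_m=0.15,
                              flow_gap_m=0.005)    # radial extent of the flow path (assumed)
    print(f"estimated leakage ~ {mdot * 1000:.2f} g/s")
```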

  11. Brush seal numerical simulation: Concepts and advances

    NASA Astrophysics Data System (ADS)

    Braun, M. J.; Kudriavtsev, V. V.

    1994-07-01

    The development of the brush seal is considered to be the most promising among the advanced seal types presently in use in high-speed turbomachinery. The brush is usually mounted on the stationary portions of the engine and has direct contact with the rotating element, in the process of limiting the 'unwanted' leakage flows between stages, or various engine cavities. This type of sealing technology provides high pressure drops (in comparison with conventional seals), due mainly to the high packing density (around 100 bristles/sq mm) and brush compliance with the rotor motions. In the design of modern aerospace turbomachinery leakage flows between the stages must be minimal, thus contributing to the higher efficiency of the engine. Use of the brush seal instead of the labyrinth seal reduces the leakage flow by one order of magnitude. Brush seals also have been found to enhance dynamic performance, cost less, and are lighter than labyrinth seals. Even though industrial brush seals have been successfully developed through extensive experimentation, there is no comprehensive numerical methodology for the design or prediction of their performance. The existing analytical/numerical approaches are based on bulk flow models and do not allow the investigation of the effects of brush morphology (bristle arrangement), or brushes arrangement (number of brushes, spacing between them), on the pressure drops and flow leakage. An increase in the brush seal efficiency is clearly a complex problem that is closely related to the brush geometry and arrangement, and can be solved most likely only by means of a numerically distributed model.

  12. Interim Service ISDN Satellite (ISIS) simulator development for advanced satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1992-01-01

    The simulation development associated with the network models of both the Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) and the Full Service ISDN Satellite (FSIS) architectures is documented. The ISIS Network Model design represents satellite systems like the Advanced Communications Technology Satellite (ACTS) orbiting switch. The FSIS architecture, the ultimate aim of this element of the Satellite Communications Applications Research (SCAR) Program, moves all control and switching functions on-board the next generation ISDN communications satellite. The technical and operational parameters for the advanced ISDN communications satellite design will be obtained from the simulation of ISIS and FSIS engineering software models for their major subsystems. Discrete event simulation experiments will be performed with these models using various traffic scenarios, design parameters, and operational procedures. The data from these simulations will be used to determine the engineering parameters for the advanced ISDN communications satellite.
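
    A minimal discrete-event simulation sketch of the kind used for such network models is given below: an event queue ordered by time processes hypothetical call arrivals and completions on a satellite switch with a fixed number of channels, and blocked calls are counted. The traffic parameters and channel count are invented placeholders, not ISIS or FSIS design values.

```python
import heapq
import random

# Minimal discrete-event simulation sketch in the spirit of the network-model experiments
# described above: ISDN call requests arrive at an on-board switch with a fixed number of
# channels, and blocked calls are counted. All parameters are invented placeholders.

def simulate_calls(n_calls=10000, channels=24, arrival_rate=10.0, mean_hold=2.0, seed=1):
    rng = random.Random(seed)
    events = []                        # (time, kind) min-heap; kind is "arrival" or "release"
    t, busy, blocked = 0.0, 0, 0
    for _ in range(n_calls):           # pre-generate Poisson call arrivals
        t += rng.expovariate(arrival_rate)
        heapq.heappush(events, (t, "arrival"))
    while events:
        now, kind = heapq.heappop(events)
        if kind == "release":
            busy -= 1
        elif busy < channels:
            busy += 1
            heapq.heappush(events, (now + rng.expovariate(1.0 / mean_hold), "release"))
        else:
            blocked += 1
    return blocked / n_calls

if __name__ == "__main__":
    print(f"blocking probability ~ {simulate_calls():.3f}")
```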

  13. Advanced radiometric and interferometric millimeter-wave scene simulations

    NASA Technical Reports Server (NTRS)

    Hauss, B. I.; Moffa, P. J.; Steele, W. G.; Agravante, H.; Davidheiser, R.; Samec, T.; Young, S. K.

    1993-01-01

    Smart munitions and weapons utilize various imaging sensors (including passive IR, active and passive millimeter-wave, and visible wavebands) to detect/identify targets at short standoff ranges and in varied terrain backgrounds. In order to design and evaluate these sensors under a variety of conditions, a high-fidelity scene simulation capability is necessary. Such a capability for passive millimeter-wave scene simulation exists at TRW. TRW's Advanced Radiometric Millimeter-Wave Scene Simulation (ARMSS) code is a rigorous, benchmarked, end-to-end passive millimeter-wave scene simulation code for interpreting millimeter-wave data, establishing scene signatures and evaluating sensor performance. In passive millimeter-wave imaging, resolution is limited due to wavelength and aperture size. Where high resolution is required, the utility of passive millimeter-wave imaging is confined to short ranges. Recent developments in interferometry have made possible high resolution applications on military platforms. Interferometry or synthetic aperture radiometry allows the creation of a high resolution image with a sparsely filled aperture. Borrowing from research work in radio astronomy, we have developed and tested at TRW scene reconstruction algorithms that allow the recovery of the scene from a relatively small number of spatial frequency components. In this paper, the TRW modeling capability is described and numerical results are presented.
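
    A minimal sketch of the sparse-aperture idea described above follows: the 2D spatial-frequency spectrum of a synthetic scene is sampled at a small random subset of points and the scene is re-formed by a zero-filled inverse FFT. This "dirty image" baseline is only an illustration; the reconstruction algorithms referenced in the record are considerably more sophisticated, and the synthetic scene is invented.

```python
import numpy as np

# Minimal sketch of scene recovery from a small number of spatial-frequency samples:
# take the 2D FFT of a synthetic scene, keep a random subset of frequency components
# (plus the DC term), zero-fill the rest, and invert. Only an illustrative baseline.

def sparse_reconstruction(scene, keep_fraction=0.15, seed=0):
    rng = np.random.default_rng(seed)
    spectrum = np.fft.fft2(scene)
    mask = rng.random(scene.shape) < keep_fraction
    mask[0, 0] = True                    # always keep the DC component
    return np.real(np.fft.ifft2(spectrum * mask))

if __name__ == "__main__":
    n = 64
    y, x = np.mgrid[0:n, 0:n]
    scene = np.zeros((n, n))
    scene[(x - 20) ** 2 + (y - 30) ** 2 < 36] = 1.0   # a bright disc "target"
    scene[45:50, 10:55] = 0.5                          # an extended "terrain" strip
    recon = sparse_reconstruction(scene)
    err = np.linalg.norm(recon - scene) / np.linalg.norm(scene)
    print(f"relative reconstruction error with 15% of spatial frequencies: {err:.2f}")
```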

  14. REFERENCES FOR SIMULATED CASES Anesthesia advanced circulatory life

    E-print Network

    Peak, Derek

    REFERENCES FOR SIMULATED CASES: Anesthesia advanced circulatory life ... (excerpt; cited items include lifeinthefastlane.com/tamara-hills-smacc-gold/; Anaesthetists' Non-Technical Skills, Flin R et al.; and "Beyond monitoring: Distributed situational awareness in anesthesia," British Journal ...)

  15. Virtual Simulator for Advanced Geotechnical Laboratory Testing

    E-print Network

    Prashant, Amit

    Virtual Simulator for Advanced Geotechnical Laboratory Testing, Dayakar Penumadu and Amit Prashant (excerpt; describes a virtual simulator developed by the authors for performing virtual geotechnical laboratory experiments, with the objective of overcoming the limitations of existing pedagogy of the geotechnical laboratory)

  16. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Advanced Simulation H Appendix H to Part... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Pt. 121, App. H Appendix H to Part 121—Advanced... ensure that all instructors and check airmen used in appendix H training and checking are...

  17. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Advanced Simulation H Appendix H to Part... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Pt. 121, App. H Appendix H to Part 121—Advanced... ensure that all instructors and check airmen used in appendix H training and checking are...

  18. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Advanced Simulation H Appendix H to Part... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Pt. 121, App. H Appendix H to Part 121—Advanced... airmen used in appendix H training and checking are highly qualified to provide the training required...

  19. Advanced Modeling of Micromirror Devices

    NASA Technical Reports Server (NTRS)

    Michalicek, M. Adrian; Sene, Darren E.; Bright, Victor M.

    1995-01-01

    The flexure-beam micromirror device (FBMD) is a phase-only, piston-style spatial light modulator demonstrating properties that can be used for phase adaptive corrective optics. This paper presents a complete study of a square FBMD, from advanced model development through final device testing and model verification. The model relates the electrical and mechanical properties of the device by equating the electrostatic force of a parallel-plate capacitor with the counteracting spring force of the device's support flexures. The capacitor solution is derived via the Schwartz-Christoffel transformation such that the final solution accounts for non-ideal electric fields. The complete model describes the behavior of any piston-style device, given its design geometry and material properties. It includes operational parameters such as drive frequency and temperature, as well as fringing effects, mirror surface deformations, and cross-talk from neighboring devices. The steps taken to develop this model can be applied to other micromirrors, such as the cantilever and torsion-beam designs, to produce an advanced model for any given device. The micromirror devices studied in this paper were commercially fabricated in a surface micromachining process. A microscope-based laser interferometer is used to test the device, in which a beam reflected from the device modulates a fixed reference beam. The mirror displacement is determined from the relative phase, which generates a continuous set of data for each selected position on the mirror surface. Plots of these data describe the localized deflection as a function of drive voltage.
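
    The core balance described above, the electrostatic force of a parallel-plate capacitor against the restoring force of the support flexures, can be sketched numerically. The snippet below uses only the idealized parallel-plate expression (the paper's model additionally handles fringing fields, surface deformation, and cross-talk), and every parameter value is hypothetical.

      import numpy as np
      from scipy.optimize import brentq

      eps0 = 8.854e-12        # vacuum permittivity, F/m
      A    = (100e-6) ** 2    # mirror plate area, m^2 (assumed)
      g    = 2e-6             # undeflected gap, m (assumed)
      k    = 5.0              # effective flexure stiffness, N/m (assumed)

      def net_force(x, V):
          """Electrostatic attraction minus linear spring force at deflection x."""
          return eps0 * A * V**2 / (2.0 * (g - x)**2) - k * x

      for V in (2.0, 4.0, 6.0):
          # For this idealized model, stable equilibria lie below the pull-in
          # deflection of one third of the gap, so bracket the root on [0, g/3].
          x_eq = brentq(net_force, 0.0, g / 3.0, args=(V,))
          print("V = %4.1f V  ->  deflection = %6.1f nm" % (V, x_eq * 1e9))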

  20. Revolutions in energy through modeling and simulation

    SciTech Connect

    Tatro, M.; Woodard, J.

    1998-08-01

    The development and application of energy technologies, for all aspects from generation to storage, have improved dramatically with the advent of advanced computational tools, particularly modeling and simulation. Modeling and simulation are not new to energy technology development, and have been used extensively ever since the first commercial computers were available. However, recent advances in computing power and access have broadened the extent of their use, and the increased fidelity (i.e., accuracy) of the models has shifted the balance point between modeling and experimentation toward greater reliance on modeling and simulation. The complex nature of energy technologies has motivated researchers to use these tools to better understand performance, reliability, and cost issues related to energy. The tools originated in sciences such as the strength of materials (nuclear reactor containment vessels); physics, heat transfer and fluid flow (oil production); chemistry, physics, and electronics (photovoltaics); and geosciences and fluid flow (oil exploration and reservoir storage). Other tools include mathematics, such as statistics, for assessing project risks. This paper describes a few advancements made possible by these tools and explores the benefits and costs of their use, particularly as they relate to the acceleration of energy technology development. The computational complexity ranges from basic spreadsheets to complex numerical simulations using hardware ranging from personal computers (PCs) to Cray computers. In all cases, the benefits of using modeling and simulation relate to lower risks, accelerated technology development, or lower-cost projects.

  1. Advanced numerical methods for the simulation of alloy solidification with high

    E-print Network

    Jimack, Peter

    Advanced numerical methods for the simulation of alloy solidification with high Lewis number. Keywords: numerical methods, alloy solidification. Results presented at the Conference on Solidification Processing, Sheffield, July 2007, demonstrate that it is possible to model and simulate alloy solidification at high Lewis number.

  2. Advanced Mirror & Modelling Technology Development

    NASA Technical Reports Server (NTRS)

    Effinger, Michael; Stahl, H. Philip; Abplanalp, Laura; Maffett, Steven; Egerman, Robert; Eng, Ron; Arnold, William; Mosier, Gary; Blaurock, Carl

    2014-01-01

    The 2020 Decadal technology survey is starting in 2018. Technology on the shelf at that time will help guide selection to future low risk and low cost missions. The Advanced Mirror Technology Development (AMTD) team has identified development priorities based on science goals and engineering requirements for Ultraviolet Optical near-Infrared (UVOIR) missions in order to contribute to the selection process. One key development identified was lightweight mirror fabrication and testing. A monolithic, stacked, deep core mirror was fused and replicated twice to achieve the desired radius of curvature. It was subsequently successfully polished and tested. A recently awarded second phase to the AMTD project will develop larger mirrors to demonstrate the lateral scaling of the deep core mirror technology. Another key development was rapid modeling for the mirror. One model focused on generating optical and structural model results in minutes instead of months. Many variables could be accounted for regarding the core, face plate and back structure details. A portion of a spacecraft model was also developed. The spacecraft model incorporated direct integration to transform optical path difference (OPD) to Point Spread Function (PSF) and PSF to modulation transfer function (MTF). The second phase to the project will take the results of the rapid mirror modeler and integrate them into the rapid spacecraft modeler.
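
    The optical transform chain mentioned above, optical path difference to PSF and PSF to MTF, follows directly from Fourier optics. The sketch below is a generic illustration (not the AMTD rapid modeler); the aperture, wavelength, and toy defocus map are assumptions.

      import numpy as np

      N = 256
      y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
      r = np.hypot(x, y)
      pupil = (r <= 1.0).astype(float)                 # unit circular aperture

      wavelength = 500e-9                              # m (assumed)
      opd = 50e-9 * (2 * r**2 - 1) * pupil             # toy defocus-like OPD, 50 nm amplitude

      field = pupil * np.exp(2j * np.pi * opd / wavelength)
      psf = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
      psf /= psf.sum()

      psf_ideal = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
      psf_ideal /= psf_ideal.sum()

      mtf = np.abs(np.fft.fft2(np.fft.ifftshift(psf)))   # MTF = |Fourier transform of the PSF|
      mtf /= mtf[0, 0]

      print("Strehl ratio estimate: %.3f" % (psf.max() / psf_ideal.max()))
      print("MTF at zero frequency: %.3f" % mtf[0, 0])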

  3. Multi-physics nuclear reactor simulator for advanced nuclear engineering education

    SciTech Connect

    Yamamoto, A.

    2012-07-01

    A multi-physics nuclear reactor simulator, intended for use in advanced nuclear engineering education, is being introduced at Nagoya University. The simulator consists of a 'macroscopic' physics simulator and a 'microscopic' physics simulator. The former performs real-time simulation of a whole nuclear power plant. The latter is responsible for more detailed numerical simulations based on sophisticated and precise numerical models, while taking into account the plant conditions obtained in the macroscopic physics simulator. Steady-state and kinetics core analyses, fuel mechanical analysis, fluid dynamics analysis, and sub-channel analysis can be carried out in the microscopic physics simulator. Simulation calculations are carried out through a dedicated graphical user interface, and the simulation results, i.e., the spatial and temporal behaviors of major plant parameters, are graphically shown. The simulator will provide a bridge between the 'theories' studied with textbooks and the 'physical behaviors' of actual nuclear power plants. (authors)

  4. Hybrid and Electric Advanced Vehicle Systems Simulation

    NASA Technical Reports Server (NTRS)

    Beach, R. F.; Hammond, R. A.; Mcgehee, R. K.

    1985-01-01

    Predefined components are connected to represent a wide variety of propulsion systems. The Hybrid and Electric Advanced Vehicle System (HEAVY) computer program is a flexible tool for evaluating the performance and cost of electric and hybrid vehicle propulsion systems. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train.

  5. Recent Advances in Binary Black Hole Merger Simulations

    NASA Technical Reports Server (NTRS)

    Barker, John

    2006-01-01

    Recent advances in numerical simulation techniques have led to dramatic progress in understanding binary black hole merger radiation. I present recent results from simulations performed at Goddard, focusing on the gravitational radiation waveforms and the application of these results to gravitational wave observations.

  6. UTILITY OF MECHANISTIC MODELS FOR DIRECTING ADVANCED SEPARATIONS RESEARCH & DEVELOPMENT ACTIVITIES: Electrochemically Modulated Separation Example

    SciTech Connect

    Schwantes, Jon M.

    2009-06-01

    The objective for this work was to demonstrate the utility of mechanistic computer models designed to simulate actinide behavior for use in efficiently and effectively directing advanced laboratory R&D activities associated with developing advanced separations methods.

  7. Graphical simulation environments for modelling and simulation of integrative physiology.

    PubMed

    Mangourova, Violeta; Ringwood, John; Van Vliet, Bruce

    2011-06-01

    Guyton's original integrative physiology model was a milestone in integrative physiology, combining significant physiological knowledge with an engineering perspective to develop a computational diagrammatic model. It is still used in research and teaching, with a small number of variants on the model also in circulation. However, though new research has added significantly to the knowledge represented by Guyton's model, and significant advances have been made in computing and simulation software, an accepted common platform to integrate this new knowledge has not emerged. This paper discusses the issues in the selection of a suitable platform, together with a number of current possibilities, and suggests a graphical computing environment for modelling and simulation. By way of example, a validated version of Guyton's 1992 model, implemented in the ubiquitous Simulink environment, is presented which provides a hierarchical representation amenable to extension and suitable for teaching and research uses. It is designed to appeal to the biomedical engineer and physiologist alike. PMID:20576310

  8. Unraveling the hydrology of a Himalayan catchment through integration of high resolution in situ data and remote sensing with an advanced simulation model

    NASA Astrophysics Data System (ADS)

    Ragettli, S.; Pellicciotti, F.; Immerzeel, W. W.; Miles, E. S.; Petersen, L.; Heynen, M.; Shea, J. M.; Stumm, D.; Joshi, S.; Shrestha, A.

    2015-04-01

    The hydrology of high-elevation watersheds of the Hindu Kush-Himalaya region (HKH) is poorly known. The correct representation of internal states and process dynamics in glacio-hydrological models can often not be verified due to missing in situ measurements. We use a new set of detailed ground data from the upper Langtang valley in Nepal to systematically guide a state-of-the-art glacio-hydrological model through a parameter-assignment process with the aim of understanding the hydrology of the catchment and the contribution of snow and ice processes to runoff. Fourteen parameters are directly calculated on the basis of local data, and 13 parameters are calibrated against 5 different datasets of in situ or remote sensing data. Spatial fields of debris thickness are reconstructed through a novel approach that employs data from an Unmanned Aerial Vehicle (UAV), energy balance modeling and statistical techniques. The model is validated against measured catchment runoff (Nash-Sutcliffe efficiency 0.87) and modeled snow cover is compared to Landsat snow cover. The advanced representation of processes allowed the role played by avalanching in runoff to be assessed for the first time for a Himalayan catchment (5% of annual water inputs to the hydrological system are due to snow redistribution) and the hydrological significance of sub-debris ice melt to be quantified (9% of annual water inputs). Snowmelt is the most important contributor to total runoff during the hydrological year 2012/2013 (representing 40% of all sources), followed by rainfall (34%) and ice melt (26%). A sensitivity analysis is used to assess the efficiency of the monitoring network and identify the timing and location of field measurements that constrain model uncertainty. The methodology to set up a glacio-hydrological model in high-elevation regions presented in this study can be regarded as a benchmark for modelers in the HKH seeking to evaluate their calibration approach and experimental setup and thus to reduce predictive model uncertainty.
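
    The Nash-Sutcliffe efficiency quoted above (0.87) is a standard skill score for simulated versus observed runoff: NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2), equal to 1 for a perfect match and below 0 when the model performs worse than the observed mean. A minimal sketch with hypothetical runoff values:

      import numpy as np

      def nash_sutcliffe(observed, simulated):
          """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
          observed = np.asarray(observed, dtype=float)
          simulated = np.asarray(simulated, dtype=float)
          return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

      # Hypothetical daily runoff series (m^3/s), purely to exercise the function.
      obs = np.array([5.0, 7.2, 9.8, 14.1, 11.0, 8.3, 6.5])
      sim = np.array([5.4, 6.9, 10.5, 13.2, 11.8, 7.9, 6.1])
      print("NSE = %.3f" % nash_sutcliffe(obs, sim))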

  9. State of the Art Assessment of Simulation in Advanced Materials Development

    NASA Technical Reports Server (NTRS)

    Wise, Kristopher E.

    2008-01-01

    Advances in both the underlying theory and in the practical implementation of molecular modeling techniques have increased their value in the advanced materials development process. The objective is to accelerate the maturation of emerging materials by tightly integrating modeling with the other critical processes: synthesis, processing, and characterization. The aims of this report are to summarize the state of the art of existing modeling tools and to highlight a number of areas in which additional development is required. In an effort to maintain focus and limit length, this survey is restricted to classical simulation techniques including molecular dynamics and Monte Carlo simulations.

  10. Advanced modeling of prompt fission neutrons

    SciTech Connect

    Talou, Patrick

    2009-01-01

    Theoretical and numerical studies of prompt fission neutrons are presented. The main results of the Los Alamos model often used in nuclear data evaluation work are reviewed briefly, and a preliminary assessment of uncertainties associated with the evaluated prompt fission neutron spectrum for n (0.5 MeV) + 239Pu is discussed. Advanced modeling of prompt fission neutrons is done by Monte Carlo simulations of the evaporation process of the excited primary fission fragments. The successive emissions of neutrons are followed in the statistical formalism framework, and detailed information, beyond average quantities, can be inferred. This approach is applied to the following reactions: 252Cf (sf), n_th + 239Pu, n (0.5 MeV) + 235U, and 236Pu (sf). A discussion on the merits and present limitations of this approach concludes this presentation.
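
    A heavily simplified sketch of the Monte Carlo evaporation picture described above follows: neutrons are emitted sequentially from an excited fragment, each carrying away the separation energy plus a kinetic energy drawn from a Weisskopf-type spectrum, until the remaining excitation can no longer release a neutron. The temperature, separation energy, and initial excitation distribution below are placeholders, not evaluated nuclear data, and the real calculations track far more detail.

      import numpy as np

      rng = np.random.default_rng(42)
      T, S_n = 1.0, 6.0            # MeV: nuclear temperature and neutron separation energy (assumed)

      def emitted_neutrons(excitation):
          """Emit neutrons one by one until the fragment can no longer afford one."""
          energies = []
          while excitation > S_n:
              eps = rng.gamma(2.0, T)          # spectrum ~ eps * exp(-eps / T)
              if eps > excitation - S_n:       # remainder goes to gamma emission instead
                  break
              excitation -= S_n + eps
              energies.append(eps)
          return energies

      # Toy distribution of initial fragment excitation energies (MeV).
      multiplicities = [len(emitted_neutrons(rng.normal(20.0, 3.0))) for _ in range(20000)]
      print("toy mean prompt neutron multiplicity: %.2f" % np.mean(multiplicities))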

  11. Molecular dynamics simulations: advances and applications

    PubMed Central

    Hospital, Adam; Goñi, Josep Ramon; Orozco, Modesto; Gelpí, Josep L

    2015-01-01

    Molecular dynamics simulations have evolved into a mature technique that can be used effectively to understand macromolecular structure-to-function relationships. Present simulation times are close to biologically relevant ones. The information gathered about the dynamic properties of macromolecules is rich enough to shift the usual paradigm of structural bioinformatics from studying single structures to analyzing conformational ensembles. Here, we describe the foundations of molecular dynamics and the improvements made in the direction of obtaining such ensembles. Specific application of the technique to three main issues (allosteric regulation, docking, and structure refinement) is discussed. PMID:26604800
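
    The propagation loop at the heart of the molecular dynamics technique reviewed above is short enough to show directly. The sketch below integrates two Lennard-Jones particles with the velocity Verlet scheme in reduced units; it is a pedagogical toy with invented starting conditions, not a production MD engine.

      import numpy as np

      def lj_force(r_vec):
          """Force on particle i from j for the 12-6 potential (epsilon = sigma = 1)."""
          r2 = np.dot(r_vec, r_vec)
          inv6 = 1.0 / r2 ** 3
          return 24.0 * (2.0 * inv6 ** 2 - inv6) / r2 * r_vec

      dt, steps = 0.001, 2000
      pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])   # start beyond the potential minimum
      vel = np.zeros_like(pos)
      signs = np.array([[1.0], [-1.0]])                    # Newton's third law
      f = lj_force(pos[0] - pos[1]) * signs

      for _ in range(steps):
          vel += 0.5 * dt * f                  # half kick (unit masses)
          pos += dt * vel                      # drift
          f = lj_force(pos[0] - pos[1]) * signs
          vel += 0.5 * dt * f                  # half kick

      print("final separation: %.3f (LJ minimum is near 1.122)" % np.linalg.norm(pos[0] - pos[1]))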

  12. Advanced Monte Carlo Methods: Direct Simulation

    E-print Network

    Mascagni, Michael

    Simulation of the lifetime of comets: a long-period comet is described as a sequence of elliptic orbits with the Sun at one focus of the orbit ellipse; the energy of a comet is inversely proportional to the length of the semi-major axis of the ellipse. Most of the time the comet moves at a great distance from the Sun.
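
    A classic textbook version of this direct simulation (hedged; formulations differ in detail) treats the comet's reciprocal semi-major axis as a random walk: each perihelion passage adds a small perturbation from planetary encounters, the period of the current ellipse scales as the -3/2 power of that energy-like variable by Kepler's third law, and the comet is lost once the orbit becomes unbound. All numerical values below are arbitrary, and a passage cap is applied only to keep the sketch fast.

      import random

      random.seed(0)

      def comet_lifetime(z0=1.0, sigma=0.25, max_passages=10000):
          """Sum of orbital periods until the comet escapes (z <= 0) or we truncate."""
          z, lifetime = z0, 0.0
          for _ in range(max_passages):
              if z <= 1e-9:                  # unbound (or numerically so): comet is lost
                  break
              lifetime += z ** -1.5          # period of the current elliptic orbit
              z += random.gauss(0.0, sigma)  # perturbation at perihelion passage
          return lifetime

      samples = sorted(comet_lifetime() for _ in range(1000))
      print("median lifetime (arbitrary units): %.1f" % samples[len(samples) // 2])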

  13. An Advanced Scattered Moonlight Model

    NASA Astrophysics Data System (ADS)

    Jones, Amy; Noll, Stefan; Kausch, Wolfgang; Kimeswenger, Stefan

    2014-06-01

    In the current era of precision astronomy, a complete sky background model is crucial, especially as the telescopes become even larger in the next decade. Such a model is needed for planning observations as well as understanding and correcting the data for the sky background. We have developed a sky model for this purpose, and it is the most complete and universal sky model that we know of to date (Noll et al. 2012). It covers a wide range of wavelengths from 0.3 to 30 microns up to a resolution of 1,000,000 and is instrument independent. Currently it is optimized for the telescopes at Cerro Paranal and the future site Cerro Armazones in Chile. Its original purpose was to improve the ESO (European Southern Observatory) ETC (Exposure Time Calculator) used for predicting exposure times of observations with a given signal-to-noise ratio for a set of conditions, as part of the Austrian accession to ESO. Improving the ETC allows for better scheduling and telescope efficiency, and our new sky model has already been implemented by ESO. The brightest natural source of optical light at night is the Moon, and it is the major contributor to the astronomical sky background. We have an improved scattered moonlight model (Jones et al. 2013), where all of the components are computed with physical processes or observational data with fewer empirical parametrizations. This model is spectroscopic from 0.3 to 2.5 microns and was studied with a FORS1 (Patat et al. 2008) and dedicated X-Shooter data set. To our knowledge, this is the first spectroscopic model extending into the infrared. It includes fully 3-D single and double scattering calculations with extrapolations to higher orders (Noll et al. 2012), a complex treatment for aerosol scattering (Jones et al. 2013), and a lunar fit based on the ROLO survey (Kieffer & Stone 2005). In addition to its original astronomical purpose, since the model is more physical, we can use the scattered moonlight to probe the properties of the atmosphere, such as the distribution of aerosols. We present the current status of the advanced scattered moonlight model as well as its performance in the optical and near-infrared.

  14. Advanced modeling of high intensity accelerators

    SciTech Connect

    Ryne, R.D.; Habib, S.; Wangler, T.P.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The goals of this project were three-fold: (1) to develop a new capability, based on high performance (parallel) computers, to perform large scale simulations of high intensity accelerators; (2) to apply this capability to modeling high intensity accelerators under design at LANL; and (3) to use this new capability to improve the understanding of the physics of intense charged-particle beams, especially in regard to the issue of beam halo formation. All of these goals were met. In particular, the authors introduced split-operator methods as a powerful and efficient means to simulate intense beams in the presence of rapidly varying accelerating and focusing fields. They then applied these methods to develop scalable, parallel beam dynamics codes for modeling intense beams in linacs, and in the process they implemented a new three-dimensional space charge algorithm. They also used the codes to study a number of beam dynamics issues related to the Accelerator Production of Tritium (APT) project, and in the process performed the largest simulations to date for any accelerator design project. Finally, they used the new modeling capability to provide direction and validation to beam physics studies, helping to identify beam mismatch as a major source of halo formation in high intensity accelerators. This LDRD project ultimately benefited not only LANL but also the US accelerator community since, by promoting expertise in high performance computing and advancing the state-of-the-art in accelerator simulation, its accomplishments helped lead to approval of a new DOE Grand Challenge in Computational Accelerator Physics.
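
    The split-operator idea credited above can be sketched in a few lines: the single-particle map through one step is split into a free drift and an instantaneous kick that lumps the external focusing and a crude space-charge term. This is a hedged 1D toy with linear forces and invented constants, not the LANL parallel codes or their 3D space-charge solver.

      import numpy as np

      rng = np.random.default_rng(7)
      n, ds = 10_000, 0.01                  # macroparticles, step length along the beamline (m)
      k_focus, k_sc = 4.0, 0.5              # external focusing and toy linear space-charge strengths

      x = rng.normal(0.0, 1e-3, n)          # transverse positions (m)
      xp = rng.normal(0.0, 1e-3, n)         # transverse slopes (rad)

      def drift_kick_drift(x, xp):
          x = x + 0.5 * ds * xp                                    # half drift
          xp = xp + ds * (-k_focus * x + k_sc * (x - x.mean()))    # kick: focusing plus toy defocusing
          x = x + 0.5 * ds * xp                                    # half drift
          return x, xp

      for _ in range(1000):                 # advance 10 m of beamline
          x, xp = drift_kick_drift(x, xp)
      print("rms beam size after 10 m: %.3e m" % x.std())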

  15. Using CONFIG for Simulation of Operation of Water Recovery Subsystems for Advanced Control Software Evaluation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Flores, Luis; Fleming, Land; Throop, Daiv

    2002-01-01

    A hybrid discrete/continuous simulation tool, CONFIG, has been developed to support evaluation of the operability of life support systems. CONFIG simulates operations scenarios in which flows and pressures change continuously while system reconfigurations occur as discrete events. In simulations, intelligent control software can interact dynamically with hardware system models. CONFIG simulations have been used to evaluate control software and intelligent agents for automating life support system operations. A CONFIG model of an advanced biological water recovery system has been developed to interact with intelligent control software that is being used in a water system test at NASA Johnson Space Center.

  16. Measurement and modeling of advanced coal conversion processes

    SciTech Connect

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G.; Smoot, L.D.; Brewster, B.S. Brigham Young Univ., Provo, UT )

    1991-01-01

    The overall objective of this program is the development of predictive capability for the design, scale-up, simulation, control, and feedstock evaluation in advanced coal conversion devices. This program will merge significant advances made in measuring and quantitatively describing the mechanisms of coal conversion behavior into comprehensive computer codes for mechanistic modeling of entrained-bed gasification. Additional capabilities in predicting pollutant formation will be implemented and the technology will be expanded to fixed-bed reactors.

  17. Advances in Sun-Earth Connection Modeling

    NASA Astrophysics Data System (ADS)

    Ganguli, S. B.; Gavrishchaka, V. V.

    2003-06-01

    Space weather forecasting is a focus of a multidisciplinary research effort motivated by a sensitive dependence of many modern technologies on geospace conditions. Adequate understanding of the physics of the Sun-Earth connection and associated multi-scale magnetospheric and ionospheric processes is an essential part of this effort. Modern physical simulation models, such as multimoment multifluid models with effective coupling from small-scale kinetic processes, can provide valuable insight into the role of various physical mechanisms operating during geomagnetic storm/substorm activity. However, due to necessary simplifying assumptions, physical models are still not well suited for accurate real-time forecasting. A complementary approach includes data-driven models capable of efficient processing of multi-scale spatio-temporal data. However, the majority of advanced nonlinear algorithms, including neural networks (NN), can encounter a set of problems known as the curse of dimensionality when applied to high-dimensional data. Forecasting of rare/extreme events such as large geomagnetic storms/substorms is of the greatest practical importance but is also very challenging for many existing models. A very promising algorithm that combines the power of the best nonlinear techniques with tolerance to high-dimensional and incomplete data is the support vector machine (SVM). We have summarized the advantages of the SVM and described a hybrid model based on SVM and extreme value theory (EVT) for rare-event forecasting. Results of the SVM application to substorm forecasting and future directions are discussed.
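
    As a generic illustration of the SVM part of the story, the sketch below trains a kernel SVM on a synthetic, strongly imbalanced dataset and up-weights the rare class, which is one standard way to bias a classifier toward rare events. It uses scikit-learn and invented data; it is not the hybrid SVM/EVT model and does not use real geomagnetic indices.

      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC
      from sklearn.metrics import classification_report

      # Synthetic two-class problem where the positive ("event") class is ~3% of samples.
      X, y = make_classification(n_samples=4000, n_features=20, weights=[0.97, 0.03],
                                 class_sep=1.5, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

      clf = SVC(kernel="rbf", class_weight="balanced")   # up-weight the rare class
      clf.fit(X_tr, y_tr)
      print(classification_report(y_te, clf.predict(X_te), digits=3))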

  18. An advanced terrain modeler for an autonomous planetary rover

    NASA Technical Reports Server (NTRS)

    Hunter, E. L.

    1980-01-01

    A roving vehicle capable of autonomously exploring the surface of an alien world is under development, and an advanced terrain modeler that characterizes the possible paths of the rover as hazardous or safe is presented. This advanced terrain modeler has several improvements over the Troiani modeler, including a crosspath analysis, better determination of hazards on slopes, and methods for dealing with missing returns at the extremities of the sensor field. The results from a package of programs that simulate the roving vehicle are then examined and compared to results from the Troiani modeler.

  19. A survey of Existing V&V, UQ and M&S Data and Knowledge Bases in Support of the Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS)

    SciTech Connect

    Hyung Lee; Rich Johnson, Ph.D.; Kimberlyn C. Moussesau

    2011-12-01

    The Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Oak Ridge National Laboratory, Utah State University and others. The objective of this consortium is to establish a comprehensive knowledge base to provide Verification and Validation (V&V) and Uncertainty Quantification (UQ) and other resources for advanced modeling and simulation (M&S) in nuclear reactor design and analysis. NE-KAMS will become a valuable resource for the nuclear industry, the national laboratories, the U.S. NRC and the public to help ensure the safe operation of existing and future nuclear reactors. A survey and evaluation of the state-of-the-art of existing V&V and M&S databases, including the Department of Energy and commercial databases, has been performed to ensure that the NE-KAMS effort will not be duplicating existing resources and capabilities and to assess the scope of the effort required to develop and implement NE-KAMS. The survey and evaluation have indeed highlighted the unique set of value-added functionality and services that NE-KAMS will provide to its users. Additionally, the survey has helped develop a better understanding of the architecture and functionality of these data and knowledge bases that can be used to leverage the development of NE-KAMS.

  20. Advanced Tsunami Numerical Simulations and Energy Considerations by use of 3D-2D Coupled Models: The October 11, 1918, Mona Passage Tsunami

    NASA Astrophysics Data System (ADS)

    López-Venegas, Alberto M.; Horrillo, Juan; Pampell-Manis, Alyssa; Huérfano, Victor; Mercado, Aurelio

    2015-06-01

    The most recent tsunami observed along the coast of the island of Puerto Rico occurred on October 11, 1918, after a magnitude 7.2 earthquake in the Mona Passage. The earthquake was responsible for initiating a tsunami that mostly affected the northwestern coast of the island. Runup values from a post-tsunami survey indicated the waves reached up to 6 m. A controversy regarding the source of the tsunami has resulted in several numerical simulations involving either fault rupture or a submarine landslide as the most probable cause of the tsunami. Here we follow up on previous simulations of the tsunami from a submarine landslide source off the western coast of Puerto Rico as initiated by the earthquake. Improvements on our previous study include: (1) higher-resolution bathymetry; (2) a 3D-2D coupled numerical model specifically developed for the tsunami; (3) use of the non-hydrostatic numerical model NEOWAVE (non-hydrostatic evolution of ocean WAVE) featuring two-way nesting capabilities; and (4) comprehensive energy analysis to determine the time of full tsunami wave development. A three-dimensional Navier-Stokes solution with multiple interfaces for two fluids (water and landslide) was used to determine the initial wave characteristics generated by the submarine landslide. Use of NEOWAVE enabled us to solve for coastal inundation, wave propagation, and detailed runup. Our results were in agreement with previous work in which a submarine landslide is favored as the most probable source of the tsunami, and improvement in the resolution of the bathymetry yielded inundation of the coastal areas that compares well with values from a post-tsunami survey. Our unique energy analysis indicates that most of the wave energy is isolated in the wave generation region, particularly at depths near the landslide, and once the initial wave propagates from the generation region its energy begins to stabilize.

  1. Micromechanical modeling of advanced materials

    SciTech Connect

    Silling, S.A.; Taylor, P.A.; Wise, J.L.; Furnish, M.D.

    1994-04-01

    Funded as a laboratory-directed research and development (LDRD) project, the work reported here focuses on the development of a computational methodology to determine the dynamic response of heterogeneous solids on the basis of their composition and microstructural morphology. Using the solid dynamics wavecode CTH, material response is simulated on a scale sufficiently fine to explicitly represent the material's microstructure. Conducting "numerical experiments" on this scale, the authors explore the influence that the microstructure exerts on the material's overall response. These results are used in the development of constitutive models that take into account the effects of microstructure without explicit representation of its features. Applying this methodology to a glass-reinforced plastic (GRP) composite, the authors examined the influence of various aspects of the composite's microstructure on its response in a loading regime typical of impact and penetration. As a prerequisite to the microscale modeling effort, they conducted extensive materials testing on the constituents, S-2 glass and epoxy resin (UF-3283), obtaining the first Hugoniot and spall data for these materials. The results of this work are used in the development of constitutive models for GRP materials in transient-dynamics computer wavecodes.

  2. Electricity Portfolio Simulation Model

    Energy Science and Technology Software Center (ESTSC)

    2005-09-01

    Stakeholders often have competing interests when selecting or planning new power plants. The purpose of developing this preliminary Electricity Portfolio Simulation Model (EPSim) is to provide a first-cut, dynamic methodology and approach to this problem, one that can subsequently be refined and validated and that may help energy planners, policy makers, and energy students better understand the tradeoffs associated with competing electricity portfolios. EPSim allows the user to explore competing electricity portfolios annually from 2002 to 2025 in terms of five different criteria: cost, environmental impacts, energy dependence, health and safety, and sustainability. Four additional criteria (infrastructure vulnerability, service limitations, policy needs, and science and technology needs) may be added in future versions of the model. Using an analytic hierarchy process (AHP) approach, users or groups of users apply weights to each of the criteria. The default energy assumptions of the model mimic the Department of Energy's (DOE) electricity portfolio to 2025 (EIA, 2005). At any time, the user can compare alternative portfolios to this reference case portfolio.
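
    The AHP weighting step mentioned above reduces, in its simplest form, to extracting the principal eigenvector of a pairwise comparison matrix and normalizing it. The sketch below uses the five EPSim criteria with invented comparison values; they are not the model's defaults.

      import numpy as np

      criteria = ["cost", "environmental impacts", "energy dependence",
                  "health and safety", "sustainability"]

      # Hypothetical pairwise comparison matrix (Saaty 1-9 scale, reciprocal by construction).
      A = np.array([
          [1.0, 3.0, 5.0, 3.0, 5.0],
          [1/3, 1.0, 3.0, 1.0, 3.0],
          [1/5, 1/3, 1.0, 1/3, 1.0],
          [1/3, 1.0, 3.0, 1.0, 3.0],
          [1/5, 1/3, 1.0, 1/3, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      weights = np.abs(principal) / np.abs(principal).sum()   # normalize the principal eigenvector

      for name, w in zip(criteria, weights):
          print("%-22s %.3f" % (name, w))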

  3. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts, and tool maturity. This paper presents a brief introduction to the functionality available in ACTS.

  4. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-06-17

    Earthquake source parameters underpin several aspects of nuclear explosion monitoring. Such aspects are: calibration of moment magnitudes (including coda magnitudes) and magnitude and distance amplitude corrections (MDAC); source depths; discrimination by isotropic moment tensor components; and waveform modeling for structure (including waveform tomography). This project seeks to improve methods for and broaden the applicability of estimating source parameters from broadband waveforms using the Cut-and-Paste (CAP) methodology. The CAP method uses a library of Green’s functions for a one-dimensional (1D, depth-varying) seismic velocity model. The method separates the main arrivals of the regional waveform into 5 windows: Pnl (vertical and radial components), Rayleigh (vertical and radial components) and Love (transverse component). Source parameters are estimated by a grid search over strike, dip, rake, and depth, and the seismic moment (or equivalently the moment magnitude, MW) is adjusted to fit the amplitudes. Key to the CAP method is allowing the synthetic seismograms to shift in time relative to the data in order to account for path-propagation errors (delays) in the 1D seismic velocity model used to compute the Green’s functions. The CAP method has been shown to improve estimates of source parameters, especially when delay and amplitude biases are calibrated using high signal-to-noise data from moderate earthquakes, CAP+.
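
    One ingredient of the CAP idea, letting each synthetic segment slide in time before the fit is scored, can be illustrated with toy waveforms. The sketch below grids over a trial source delay, finds the best lag by cross-correlation, fits an amplitude, and keeps the smallest misfit; the Ricker wavelets and all numbers are invented stand-ins for real data and Green's functions, not the CAP implementation.

      import numpy as np

      rng = np.random.default_rng(3)
      dt = 0.05
      t = np.arange(0.0, 20.0, dt)

      def ricker(t, t0, f=0.8):
          a = (np.pi * f * (t - t0)) ** 2
          return (1.0 - 2.0 * a) * np.exp(-a)

      data = 2.0 * ricker(t, 8.7) + 0.05 * rng.standard_normal(t.size)   # "observed" segment

      best = None
      for t0 in np.arange(7.0, 11.0, 0.5):            # coarse grid over trial source delays
          synth = ricker(t, t0)
          lag = np.argmax(np.correlate(data, synth, mode="full")) - (t.size - 1)
          shifted = np.roll(synth, lag)               # apply the best-fitting time shift
          amp = np.dot(data, shifted) / np.dot(shifted, shifted)
          misfit = np.linalg.norm(data - amp * shifted)
          if best is None or misfit < best[0]:
              best = (misfit, t0, lag * dt, amp)

      print("trial delay %.1f s, extra shift %+.2f s, amplitude %.2f" % best[1:])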

  5. Advanced Simulation Capability for Environmental Management (ASCEM) Phase II Demonstration

    SciTech Connect

    Freshley, M.; Hubbard, S.; Flach, G.; Freedman, V.; Agarwal, D.; Andre, B.; Bott, Y.; Chen, X.; Davis, J.; Faybishenko, B.; Gorton, I.; Murray, C.; Moulton, D.; Meyer, J.; Rockhold, M.; Shoshani, A.; Steefel, C.; Wainwright, H.; Waichler, S.

    2012-09-28

    In 2009, the National Academies of Science (NAS) reviewed and validated the U.S. Department of Energy Office of Environmental Management (EM) Technology Program in its publication, Advice on the Department of Energy’s Cleanup Technology Roadmap: Gaps and Bridges. The NAS report outlined prioritization needs for the Groundwater and Soil Remediation Roadmap, concluded that contaminant behavior in the subsurface is poorly understood, and recommended further research in this area as a high priority. To address this NAS concern, the EM Office of Site Restoration began supporting the development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific approach that uses an integration of toolsets for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM modeling toolset is modular and open source. It is divided into three thrust areas: Multi-Process High Performance Computing (HPC), Platform and Integrated Toolsets, and Site Applications. The ASCEM toolsets will facilitate integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. During fiscal year 2012, the ASCEM project continued to make significant progress in capabilities development. Capability development occurred in both the Platform and Integrated Toolsets and Multi-Process HPC Simulator areas. The new Platform and Integrated Toolsets capabilities provide the user an interface and the tools necessary for end-to-end model development that includes conceptual model definition, data management for model input, model calibration and uncertainty analysis, and model output processing including visualization. The new HPC Simulator capabilities target increased functionality of process model representations, toolsets for interaction with the Platform, and model confidence testing and verification for quality assurance. The Platform and HPC capabilities are being tested and evaluated for EM applications through a suite of demonstrations being conducted by the Site Applications Thrust. In 2010, the Phase I Demonstration focused on testing initial ASCEM capabilities. The Phase II Demonstration, completed in September 2012, focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site Deep Vadose Zone (BC Cribs) served as an application site for an end-to-end demonstration of ASCEM capabilities on a site with relatively sparse data, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations included in this Phase II report included addressing attenuation-based remedies at the Savannah River Site F-Area, to exercise linked ASCEM components under data-dense and complex geochemical conditions, and conducting detailed simulations of a representative waste tank. This report includes descriptive examples developed by the Hanford Site Deep Vadose Zone, the SRS F-Area Attenuation-Based Remedies for the Subsurface, and the Waste Tank Performance Assessment working groups. The integrated Phase II Demonstration provides test cases to accompany distribution of the initial user release (Version 1.0) of the ASCEM software tools to a limited set of users in 2013. These test cases will be expanded with each new release, leading up to the release of a version that is qualified for regulatory applications in the 2015 time frame.

  6. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Advanced Simulation H Appendix H to Part 121 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIR CARRIERS AND OPERATORS FOR COMPENSATION OR HIRE: CERTIFICATION AND OPERATIONS OPERATING REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS...

  7. Advanced Visualization Technology for Terascale Particle Accelerator Simulations

    E-print Network

    Ma, Kwan-Liu

    Advanced Visualization Technology for Terascale Particle Accelerator Simulations, Kwan-Liu Ma et al. Keywords: high-performance computing, particle accelerators, perception, point-based rendering, scientific visualization, field lines. Particle accelerators have helped enable some of the most remarkable discoveries of the 20th century.

  8. PIXE simulation: Models, methods and technologies

    SciTech Connect

    Batic, M.; Pia, M. G.; Saracco, P.; Weidenspointner, G.

    2013-04-19

    The simulation of PIXE (Particle Induced X-ray Emission) is discussed in the context of general-purpose Monte Carlo systems for particle transport. Dedicated PIXE codes are mainly concerned with the application of the technique to elemental analysis, but they lack the capability of dealing with complex experimental configurations. General-purpose Monte Carlo codes provide powerful tools to model the experimental environment in great detail, but so far they have provided limited functionality for PIXE simulation. This paper reviews recent developments that have endowed the Geant4 simulation toolkit with advanced capabilities for PIXE simulation, and related efforts for quantitative validation of cross sections and other physical parameters relevant to PIXE simulation.

  9. PIXE simulation: Models, methods and technologies

    NASA Astrophysics Data System (ADS)

    Batic, M.; Pia, M. G.; Saracco, P.; Weidenspointner, G.

    2013-04-01

    The simulation of PIXE (Particle Induced X-ray Emission) is discussed in the context of general-purpose Monte Carlo systems for particle transport. Dedicated PIXE codes are mainly concerned with the application of the technique to elemental analysis, but they lack the capability of dealing with complex experimental configurations. General-purpose Monte Carlo codes provide powerful tools to model the experimental environment in great detail, but so far they have provided limited functionality for PIXE simulation. This paper reviews recent developments that have endowed the Geant4 simulation toolkit with advanced capabilities for PIXE simulation, and related efforts for quantitative validation of cross sections and other physical parameters relevant to PIXE simulation.

  10. Analytical and simulator study of advanced transport

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Rickard, W. W.

    1982-01-01

    An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft in final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied covering a range of handling qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.

  11. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-10-17

    This quarter, we have focused on several tasks: (1) Building a high-quality catalog of earthquake source parameters for the Middle East and East Asia. In East Asia, we computed source parameters using the CAP method for a set of events studied by Herrman et al. (MRR, 2006) using a complete waveform technique. Results indicated excellent agreement with the moment magnitudes in the range 3.5-5.5. Below magnitude 3.5 the scatter increases. For events with more than 2-3 observations at different azimuths, we found good agreement of focal mechanisms. Depths were generally consistent, although differences of up to 10 km were found. These results suggest that CAP modeling provides estimates of source parameters at least as reliable as complete waveform modeling techniques. However, East Asia and the Yellow Sea Korean Paraplatform (YSKP) region studied are relatively laterally homogeneous and may not benefit from the CAP method’s flexibility to shift waveform segments to account for path-dependent model errors. A more challenging region to study is the Middle East, where strong variations in sedimentary basins, crustal thickness, and crustal and mantle seismic velocities greatly impact regional wave propagation. We applied the CAP method to a set of events in and around Iran and found good agreement between estimated focal mechanisms and those reported by the Global Centroid Moment Tensor (CMT) catalog. We found a possible bias in the moment magnitudes that may be due to the thick low-velocity crust in the Iranian Plateau. (2) Testing Methods on a Lifetime Regional Data Set. In particular, the recent 2/21/08 Nevada Event and Aftershock Sequence occurred in the middle of USArray, producing over a thousand records per event. The tectonic setting is quite similar to Central Iran and thus provides an excellent testbed for CAP+ at ranges out to 10°, including extensive observations of crustal thinning and thickening and various Pnl complexities. Broadband modeling in 1D, 2D, and 3D will be presented. (3) Shallow Crustal Structure and Sparse Network Source Inversions for Southern California. We conducted a detailed test of a recently developed technique, CAPloc, in recovering source parameters including location and depth based on tomographic maps. We tested two-station solutions against 160 well-determined events, which worked well except for paths crossing deep basins and along mountain ridges.

  12. Interim Service ISDN Satellite (ISIS) network model for advanced satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.; Hager, E. Paul

    1991-01-01

    The Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) Network Model for Advanced Satellite Designs and Experiments describes a model suitable for discrete event simulations. A top-down model design uses the Advanced Communications Technology Satellite (ACTS) as its basis. The ISDN modeling abstractions are added to permit the determination of performance for the NASA Satellite Communications Research (SCAR) Program.

  13. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, D; Tromp, J; Rodgers, A

    2007-07-16

    Comprehensive test ban monitoring in terms of location and discrimination has progressed significantly in recent years. However, the characterization of sources and the estimation of low yields remain a particular challenge. As the recent Korean shot demonstrated, we can probably expect to have a small set of teleseismic, far-regional and high-frequency regional data to analyze in estimating the yield of an event. Since stacking helps to bring signals out of the noise, it becomes useful to conduct comparable analyses on neighboring events, earthquakes in this case. If these auxiliary events have accurate moments and source descriptions, we have a means of directly comparing effective source strengths. Although we will rely on modeling codes, 1D, 2D, and 3D, we will also apply a broadband calibration procedure that uses longer-period (P > 5 s) waveform data to calibrate short-period (P between 0.5 and 2 Hz) and high-frequency (P between 2 and 10 Hz) path-specific station corrections from well-known regional sources. We have expanded our basic Cut-and-Paste (CAP) methodology to include not only timing shifts but also amplitude (f) corrections at recording sites. The name of this method was derived from source inversions that allow timing shifts between 'waveform segments' (or cutting the seismogram up and re-assembling) to correct for crustal variation. For convenience, we will refer to these f-dependent refinements as CAP+ for (SP) and CAP++ for still higher frequency. These methods allow the retrieval of source parameters using only P-waveforms where radiation patterns are obvious, as demonstrated in this report, and are well suited for explosion P-wave data. The method is easily extended to all distances because it uses Green's functions, although there may be some changes required in t* to adjust for offsets between local vs. teleseismic distances. In short, we use a mixture of model-dependent and empirical corrections to tackle the path effects. Although we rely on the large TriNet array as a testbed for refining methods, we will present some preliminary results on Korea and Iran.

  14. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  15. Nuclear rocket simulation system for the development of advanced control

    SciTech Connect

    Walter, P.B.; Edwards, R.M.

    1994-12-31

    An integrated control and health monitoring architecture is being developed for the Pratt and Whitney XNR2000, a fast-spectrum, CERMET-fueled nuclear rocket. To permit testing and module development of this control architecture, a dynamic simulation system for modeling nuclear rockets has been developed. This paper describes the simulation system and the resulting model of the rocket.

  16. Recent Advances on Agent Based Tsunami Evacuation Simulation: Case Studies at Indonesia, Thailand, Japan

    E-print Network

    Recent Advances on Agent-Based Tsunami Evacuation Simulation: Case Studies at Indonesia, Thailand, Japan. Through tsunami numerical models, tsunami researchers have worked on understanding the physics of tsunami events. Efforts continue to comprehend not only the natural phenomena but also the complex social behavior involved in evacuation.

  17. Evaluating uncertainty in simulation models

    SciTech Connect

    McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.

    1998-12-01

    The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not now exist. However, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.
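
    The variance-based importance measures mentioned above can be estimated directly from Monte Carlo samples: the first-order importance of an input is the fraction of output variance explained by its conditional mean, Var(E[y | x_i]) / Var(y). The sketch below estimates it by binning on each input for a purely hypothetical three-input model; it is an illustration of the idea, not the authors' method.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000
      x1, x2, x3 = rng.uniform(-1.0, 1.0, (3, n))
      y = 4.0 * x1 + 1.0 * x2 + 0.2 * x1 * x3 + rng.normal(0.0, 0.5, n)   # stand-in "simulation"

      def first_order_index(x, y, bins=50):
          """Estimate Var(E[y|x]) / Var(y) by conditioning on quantile bins of x."""
          edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
          which = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
          cond_means = np.array([y[which == b].mean() for b in range(bins)])
          counts = np.bincount(which, minlength=bins)
          return np.average((cond_means - y.mean()) ** 2, weights=counts) / y.var()

      for name, x in (("x1", x1), ("x2", x2), ("x3", x3)):
          print("%s importance: %.3f" % (name, first_order_index(x, y)))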

  18. NEAMS FPL M2 Milestone Report: Development of a UO2 Grain Size Model using Multiscale Modeling and Simulation

    SciTech Connect

    Michael R Tonks; Yongfeng Zhang; Xianming Bai

    2014-06-01

    This report summarizes development work funded by the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program's Fuels Product Line (FPL) to develop a mechanistic model for the average grain size in UO2 fuel. The model is developed using a multiscale modeling and simulation approach involving atomistic simulations, as well as mesoscale simulations using INL's MARMOT code.
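
    For orientation, the kind of engineering-scale correlation such a mechanistic model ultimately feeds is often written as a parabolic grain-growth law with an Arrhenius rate constant, D^2 - D0^2 = k0 exp(-Q/RT) t. The sketch below evaluates that textbook form with placeholder constants; it is not the NEAMS/MARMOT-derived model and the parameter values are not fitted to UO2.

      import numpy as np

      R = 8.314                 # gas constant, J/(mol K)
      k0, Q = 5.0e-9, 2.7e5     # pre-exponential (m^2/s) and activation energy (J/mol), placeholders

      def grain_size(d0, T, t_seconds):
          """Parabolic growth law: D(t) = sqrt(D0^2 + k(T) * t)."""
          k = k0 * np.exp(-Q / (R * T))
          return np.sqrt(d0 ** 2 + k * t_seconds)

      d0 = 10e-6                                   # 10 micron initial grain size (assumed)
      one_year = 3600.0 * 24 * 365
      for T in (1500.0, 1800.0, 2100.0):           # temperatures in K (assumed)
          print("T = %4.0f K -> grain size after one year ~ %5.1f microns"
                % (T, grain_size(d0, T, one_year) * 1e6))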

  19. Lessons Learned From Dynamic Simulations of Advanced Fuel Cycles

    SciTech Connect

    Steven J. Piet; Brent W. Dixon; Jacob J. Jacobson; Gretchen E. Matthern; David E. Shropshire

    2009-04-01

    Years of performing dynamic simulations of advanced nuclear fuel cycle options provide insights into how they could work and how one might transition from the current once-through fuel cycle. This paper summarizes those insights from the context of the 2005 objectives and goals of the Advanced Fuel Cycle Initiative (AFCI). Our intent is not to compare options, assess options versus those objectives and goals, nor recommend changes to those objectives and goals. Rather, we organize what we have learned from dynamic simulations in the context of the AFCI objectives for waste management, proliferation resistance, uranium utilization, and economics. Thus, we do not merely describe “lessons learned” from dynamic simulations but attempt to answer the “so what” question by using this context. The analyses have been performed using the Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics (VISION). We observe that the 2005 objectives and goals do not address many of the inherently dynamic discriminators among advanced fuel cycle options and transitions thereof.

  20. Transactive Modeling and Simulation Capabilities

    E-print Network

    Transactive Modeling and Simulation Capabilities, NIST Transactive Energy Challenge Preparatory. GridLAB-D is an open-source tool that unifies models of the key elements of a smart grid (power systems, loads, and markets), simulates control systems and end-use behavior (seconds to hours), and captures seasonal effects (days to years).

  1. National Research Council Dialogue to Assess Progress on NASA's Advanced Modeling, Simulation and Analysis Capability and Systems Engineering Capability Roadmap Development

    NASA Technical Reports Server (NTRS)

    Aikins, Jan

    2005-01-01

    Contents include the following: General Background and Introduction of Capability Roadmaps. Agency Objective. Strategic Planning Transformation. Advanced Planning Organizational Roles. Public Involvement in Strategic Planning. Strategic Roadmaps and Schedule. Capability Roadmaps and Schedule. Purpose of NRC Review. Capability Roadmap Development (Progress to Date).

  2. Requirements for advanced simulation of nuclear reactor and chemical separation plants.

    SciTech Connect

    Palmiotti, G.; Cahalan, J.; Pfeiffer, P.; Sofu, T.; Taiwo, T.; Wei,T.; Yacout, A.; Yang, W.; Siegel, A.; Insepov, Z.; Anitescu, M.; Hovland,P.; Pereira, C.; Regalbuto, M.; Copple, J.; Willamson, M.

    2006-12-11

    This report presents requirements for advanced simulation of nuclear reactor and chemical processing plants that are of interest to the Global Nuclear Energy Partnership (GNEP) initiative. Justification for advanced simulation and some examples of grand challenges that will benefit from it are provided. An integrated software tool whose main components are, whenever possible, based on first principles is proposed as a possible future approach for dealing with the complex problems linked to the simulation of nuclear reactor and chemical processing plants. The main benefits associated with a better integrated simulation have been identified as: a reduction of design margins, a decrease in the number of experiments in support of the design process, a shortening of the developmental design cycle, and a better understanding of the physical phenomena and the related underlying fundamental processes. For each component of the proposed integrated software tool, background information, functional requirements, current tools and approaches, and proposed future approaches have been provided. Whenever possible, current uncertainties have been quoted and existing limitations have been presented. Desired target accuracies, with associated benefits to the different aspects of the nuclear reactor and chemical processing plants, were also given. In many cases the possible gains associated with a better simulation have been identified, quantified, and translated into economic benefits.

  3. Advanced Modeling of Thermal Plasmas for Industrial Applications

    NASA Astrophysics Data System (ADS)

    Colombo, Vittorio; Ghedini, Emanuele

    2006-10-01

    Modeling results are presented for different industrial thermal plasma sources using a customized version of the commercial code FLUENT capable of 2D and 3D transient simulation with advanced CFD models that take into account turbulence effects using different approaches (Reynolds Stress Model and Large Eddy Simulation), transport of species, and radiation (Discrete Ordinate Model with interaction between radiation and solid surfaces). Simulation results are presented in order to show the capabilities of this modeling tool, which is very useful for the design of a wide range of atmospheric-pressure thermal plasma devices and related assisted processes, such as: ICPTs with injection of powders for spheroidization, DC twin-torch transferred arc plasma systems for waste treatment, DC non-transferred arc torches for plasma spraying, and DC transferred arc torches for high-quality plasma cutting.

  4. Genome Reshuffling for Advanced Intercross Permutation (GRAIP): Simulation and permutation for advanced intercross population analysis

    SciTech Connect

    Pierce, Jeremy; Broman, Karl; Lu, Lu; Chesler, Elissa J; Zhou, Guomin; Airey, David; Birmingham, Amanda; Williams, Robert

    2008-04-01

    Background: Advanced intercross lines (AIL) are segregating populations created using a multi-generation breeding protocol for fine mapping quantitative trait loci (QTL) in mice and other organisms. Applying QTL mapping methods for intercross and backcross populations, often followed by naïve permutation of individuals and phenotypes, does not account for the effect of AIL family structure in which final generations have been expanded, and leads to inappropriately low significance thresholds. The critical problem with naïve mapping approaches in AIL populations is that the individual is not an exchangeable unit. Methodology/Principal Findings: The effect of family structure has immediate implications for optimal AIL creation (many crosses, few animals per cross, and population expansion before the final generation), and we discuss these and the utility of AIL populations for QTL fine mapping. We also describe Genome Reshuffling for Advanced Intercross Permutation (GRAIP), a method for analyzing AIL data that accounts for family structure. GRAIP permutes a more interchangeable unit in the final-generation crosses - the parental genome - and simulates regeneration of a permuted AIL population based on exchanged parental identities. GRAIP determines appropriate genome-wide significance thresholds and locus-specific P-values for AILs and other populations with similar family structures. We contrast GRAIP with naïve permutation using a large, densely genotyped mouse AIL population (1333 individuals from 32 crosses). A naïve permutation using coat color as a model phenotype demonstrates high false-positive locus identification and uncertain significance levels, which are corrected using GRAIP. GRAIP also detects an established hippocampus weight locus and a new locus, Hipp9a. Conclusions and Significance: GRAIP determines appropriate genome-wide significance thresholds and locus-specific P-values for AILs and other populations with similar family structures. The effect of family structure has immediate implications for optimal AIL creation, and we discuss these and the utility of AIL populations.
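
    The abstract's central point - that the cross, not the individual, is the exchangeable unit - can be illustrated with a short, purely schematic sketch. The Python below is not the GRAIP software: it omits the regeneration of offspring genotypes from permuted parental genomes and simply shuffles cross identities on toy data to show how a family-aware permutation yields a genome-wide threshold from the null distribution of the maximum association statistic. All arrays and sizes are illustrative assumptions.

      # Family-aware permutation sketch (toy data, not the GRAIP code).
      import numpy as np

      rng = np.random.default_rng(0)
      n_crosses, n_per_cross, n_loci = 32, 40, 500
      cross_id = np.repeat(np.arange(n_crosses), n_per_cross)
      geno = rng.integers(0, 3, size=(n_crosses * n_per_cross, n_loci))  # allele dosages 0/1/2
      pheno = rng.normal(size=n_crosses * n_per_cross)

      def max_stat(genotypes, phenotype):
          # largest squared phenotype-locus correlation across the genome
          g = (genotypes - genotypes.mean(0)) / (genotypes.std(0) + 1e-12)
          p = (phenotype - phenotype.mean()) / phenotype.std()
          r = g.T @ p / len(p)
          return np.max(r ** 2)

      observed = max_stat(geno, pheno)
      null = []
      for _ in range(200):                      # permutation replicates
          perm = rng.permutation(n_crosses)     # shuffle cross (parental) identities, not individuals
          # reassign each cross's phenotypes en bloc to a permuted cross's genotypes
          new_pheno = np.concatenate([pheno[cross_id == c] for c in perm])
          null.append(max_stat(geno, new_pheno))
      threshold = np.quantile(null, 0.95)       # genome-wide 5% threshold
      print(observed, threshold)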

  5. Modeling and Simulation of Fluid Mixing Laser Experiments and Supernova

    SciTech Connect

    Glimm, James

    2008-06-24

    The three year plan for this project is to develop novel theories and advanced simulation methods leading to a systematic understanding of turbulent mixing. A primary focus is the comparison of simulation models (both Direct Numerical Simulation and subgrid averaged models) to experiments. The comprehension and reduction of experimental and simulation data are central goals of this proposal. We will model 2D and 3D perturbations of planar interfaces. We will compare these tests with models derived from averaged equations (our own and those of others). As a second focus, we will develop physics based subgrid simulation models of diffusion across an interface, with physical but no numerical mass diffusion. We will conduct analytic studies of mix, in support of these objectives. Advanced issues, including multiple layers and reshock, will be considered.

  6. Advanced Fuel Performance: Modeling and Simulation of Waterside Corrosion in Zirconium Alloys

    E-print Network

    Motta, Arthur T.

    ...waterside corrosion (defined here as uniform corrosion by coolant water, as opposed to inner-diameter stress corrosion cracking or to localized forms of corrosion such as nodular corrosion) originates from...

  7. Simulating DNLS models

    E-print Network

    Mulansky, Mario

    2013-01-01

    We present different techniques to numerically solve the equations of motion for the widely studied Discrete Nonlinear Schroedinger equation (DNLS). Being a Hamiltonian system, the DNLS requires symplectic routines for an efficient numerical treatment. Here, we introduce different such schemes in detail and compare their performance and accuracy by extensive numerical simulations.
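
    One common member of the family of symplectic schemes referred to above is a second-order split-step (Strang) integrator, in which the on-site nonlinear rotation and the linear hopping term are each advanced by their exact flows, so the composed map is symplectic. The sketch below assumes a periodic lattice and a particular sign convention for the DNLS; it illustrates the technique and is not the routines benchmarked in the paper.

      # Second-order split-step integrator for the DNLS
      #   i d(psi_n)/dt = -(psi_{n+1} + psi_{n-1}) - beta * |psi_n|^2 * psi_n
      # on a periodic lattice (sign/normalization convention is an assumption).
      import numpy as np

      N, beta, dt, steps = 256, 1.0, 0.01, 10_000
      psi = np.zeros(N, dtype=complex)
      psi[N // 2] = 1.0                               # single-site excitation
      k = 2.0 * np.pi * np.fft.fftfreq(N)             # lattice wavenumbers
      lin_phase = np.exp(1j * 2.0 * np.cos(k) * dt)   # exact hopping step in Fourier space

      def nl_half(p):
          # exact on-site nonlinear rotation over half a time step (|psi_n| is constant)
          return p * np.exp(1j * beta * np.abs(p) ** 2 * dt / 2)

      for _ in range(steps):
          psi = nl_half(psi)
          psi = np.fft.ifft(lin_phase * np.fft.fft(psi))
          psi = nl_half(psi)

      # the norm is conserved to machine precision by construction;
      # energy drift is the practical accuracy check for such schemes
      print(np.sum(np.abs(psi) ** 2))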

  8. Simulation modeling of estuarine ecosystems

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1980-01-01

    A simulation model has been developed of the Galveston Bay, Texas, ecosystem. Secondary productivity, measured by harvestable species (such as shrimp and fish), is evaluated in terms of man-related and controllable factors, such as the quantity and quality of inlet freshwater and pollutants. This simulation model used information from an existing physical parameters model as well as pertinent biological measurements obtained by conventional sampling techniques. Predicted results from the model compared favorably with those from comparable investigations. In addition, this paper discusses remotely sensed and conventional measurements in the framework of prospective models that may be used to study estuarine processes and ecosystem productivity.

  9. Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Hines, Glenn D.; Paschall, Stephen, II; Cohanim, Babak E.; Fill, Thomas; Johnson, Michael C.; Bishop, Robert H.; DeMars, Kyle J.; Sostaric, Ronald r.; Johnson, Andrew E.

    2008-01-01

    Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining design and integration capability and system performance of the lunar descent and landing system and environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. The POST2 simulation provides a six degree-of-freedom capability necessary to test, design and operate a descent and landing system for successful lunar landing. This paper presents advances in the development and model-implementation of the POST2 simulation, as well as preliminary system performance analysis, used for the testing and evaluation of ALHAT project system models.

  10. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to the rapid improvement of climate models. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes; that fully utilizes the hardware and software capabilities of new computer architectures; that probes the limits of climate predictability; and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  11. Modeling and Simulation at NASA

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2009-01-01

    This slide presentation covers two topics. The first reviews the use of modeling and simulation (M&S), particularly as it relates to the Constellation program and discrete event simulation (DES). DES is defined as process and system analysis, through time-based and resource-constrained probabilistic simulation models, that provides insight into operational system performance. The DES shows that the cycle for a launch, from manufacturing and assembly to launch and recovery, is about 45 days and that approximately 4 launches per year are practicable. The second topic reviews a NASA Standard for Modeling and Simulation. The Columbia Accident Investigation Board made recommendations related to models and simulations. Ideas inherent in the new standard include documentation of M&S activities, assessment of credibility, and reporting to decision makers, which should include the analysis of the results, a statement of the uncertainty in the results, and the credibility of the results. Verification and validation (V&V) of models and the different types of models and simulations are also discussed.

  12. Modeling and Simulation of Fluid Mixing Laser Experiments and Supernova

    SciTech Connect

    James Glimm

    2009-06-04

    The three year plan for this project was to develop novel theories and advanced simulation methods leading to a systematic understanding of turbulent mixing. A primary focus is the comparison of simulation models (Direct Numerical Simulation (DNS), Large Eddy Simulations (LES), full two fluid simulations and subgrid averaged models) to experiments. The comprehension and reduction of experimental and simulation data are central goals of this proposal. We model 2D and 3D perturbations of planar or circular interfaces. We compare these tests with models derived from averaged equations (our own and those of others). As a second focus, we develop physics based subgrid simulation models of diffusion across an interface, with physical but no numerical mass diffusion. Multiple layers and reshock are considered here.

  13. Safety Assessment of Advanced Imaging Sequences II: Simulations.

    PubMed

    Jensen, Jo Arendt

    2016-01-01

    An automatic approach for simulating the emitted pressure, intensity, and mechanical index (MI) of advanced ultrasound imaging sequences is presented. It is based on a linear simulation of pressure fields using Field II, and it is hypothesized that linear simulation can attain the accuracy needed for predicting MI and [Formula: see text] as required by the FDA. The method is applied to four different imaging schemes and compared to measurements conducted using the SARUS experimental scanner. The sequences include focused emissions with an F-number of 2 using 64 elements, which generate highly nonlinear fields. The simulation time is between 0.67 and 2.8 ms per emission and imaging point, making it possible to simulate even complex emission sequences in less than 1 s for a single spatial position. The linear simulations yield a relative accuracy on MI between -12.1% and 52.3% and for [Formula: see text] between -38.6% and 62.6%, when using the impulse response of the probe estimated from an independent measurement. The accuracy is increased to between -22% and 24.5% for MI and between -33.2% and 27.0% for [Formula: see text], when using the pressure response measured at a single point to scale the simulation. The spatial distributions of MI and I(ta.3) closely match those of the measurement, and simulations can, therefore, be used to select the region for measuring the intensities, resulting in a significant reduction in measurement time. The approach can also validate emission sequences by showing the symmetry of emitted pressure fields, focal position, and intensity distribution. PMID:26571524
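
    For orientation, the regulatory quantities mentioned above can be computed directly from a simulated (and derated) pressure waveform: the mechanical index is the derated peak rarefactional pressure in MPa divided by the square root of the centre frequency in MHz, and the temporal-average intensity follows from the pulse intensity integral and the pulse repetition frequency. The sketch below uses a toy waveform rather than a Field II output, and all parameter values are assumptions.

      # MI and temporal-average intensity from an already-derated pressure trace.
      import numpy as np

      fs = 100e6                 # sampling rate [Hz]
      fc = 5e6                   # transducer centre frequency [Hz]
      prf = 5e3                  # pulse repetition frequency [Hz]
      rho, c = 1000.0, 1540.0    # tissue density [kg/m^3] and sound speed [m/s]

      t = np.arange(0, 2e-6, 1 / fs)
      p = 2e6 * np.sin(2 * np.pi * fc * t) * np.hanning(t.size)   # toy derated pulse [Pa]

      p_r = -p.min()                          # peak rarefactional pressure [Pa]
      mi = (p_r / 1e6) / np.sqrt(fc / 1e6)    # MI uses MPa and MHz

      pii = np.trapz(p ** 2, t) / (rho * c)   # pulse intensity integral [J/m^2]
      i_ta = pii * prf                        # temporal-average intensity [W/m^2]
      print(mi, i_ta * 1e-4)                  # intensity reported in W/cm^2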

  14. An Advanced Scattered Moonlight Model

    NASA Astrophysics Data System (ADS)

    Jones, Amy M.; Noll, Stefan; Kausch, Wolfgang; Szyszka, Cezary; Kimeswenger, Stefan

    2014-06-01

    Correcting and predicting the flux coming from the background sky is a crucial aspect of observational astronomy. We have developed a sky background model for this purpose, and it is the most complete and universal sky model that we know of to date. The largest natural source of light at night in the optical is the Moon, and it is a major contributor to the astronomical sky background. An improved spectroscopic scattered moonlight model, which is applicable from 0.3 to 2.5 µm, has been developed and studied with a set of FORS1 spectra and a dedicated X-shooter dataset. To our knowledge, this is the first spectroscopic model extending into the infrared and it has been tested for many lunar phases and geometries of the Moon and target observations.

  15. Simulation of critical IC fabrication processes using advanced physical and numerical methods

    NASA Astrophysics Data System (ADS)

    Juengling, W.; Pichler, P.; Selberherr, S.; Guerrero, E.; Poetzl, H. W.

    1985-02-01

    Critical steps of IC fabrication are simulated by one- and two-dimensional computer programs using advanced physical models. These codes deal with an arbitrary number of physical quantities such as concentrations of dopants, vacancies, interstitials and clusters, the electrostatic potential, and so on. Furthermore, they easily permit the exchange or variation of the physical models under consideration. As typical applications, phenomena of coupled diffusion in one and two dimensions and dynamic arsenic clustering are investigated. The differences between the zero space-charge approximation and the solution of the exact Poisson equation are studied using examples of As-B diffusion with various doping concentrations at different temperatures. A dynamic cluster model developed for the simulation of thermally annealed As implantations is compared to measured data from laser annealing experiments. A short outline of the mathematical and numerical problems is given to show the amount of sophistication necessary for up-to-date process simulation.
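
    As a point of reference for the diffusion phenomena discussed above, the numerical core of such process simulators is a discretized diffusion equation. The toy sketch below solves single-species diffusion with a constant diffusivity by explicit finite differences; it deliberately omits the coupled species, concentration-dependent diffusivities, clustering, and Poisson coupling that the paper's codes include, and every parameter value is an assumption.

      # Toy 1D dopant diffusion (Fick's second law) by explicit finite differences.
      import numpy as np

      nx, dx = 400, 2e-9             # grid points, spacing [m]
      D = 1e-18                      # constant diffusivity [m^2/s]
      dt = 0.4 * dx * dx / D         # respects the explicit stability limit D*dt/dx^2 <= 0.5
      steps = int(1800.0 / dt)       # simulate a 30-minute anneal

      C = np.zeros(nx)
      C[:20] = 1e26                  # shallow implanted layer [atoms/m^3]

      for _ in range(steps):
          lap = np.empty_like(C)
          lap[1:-1] = C[2:] - 2 * C[1:-1] + C[:-2]
          lap[0] = C[1] - C[0]       # zero-flux (reflecting) boundaries
          lap[-1] = C[-2] - C[-1]
          C += D * dt / dx**2 * lap

      print(C[:5])                   # near-surface concentration after the anneal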

  16. Controls on advance of tidewater glaciers: results from numerical modeling applied to Columbia Glacier

    E-print Network

    Nick, F. M.; van der Veen, Cornelis J.; Oerlemans, J.

    2007-07-11

    A one-dimensional numerical ice flow model is used to study the advance of a tidewater glacier into deep water. Starting with ice-free conditions, the model simulates glacier growth at higher elevations followed by advance on land to the head...

  17. Simulation of an advanced small aperture track system

    NASA Astrophysics Data System (ADS)

    Williams, Tommy J.; Crockett, Gregg A.; Brunson, Richard L.; Beatty, Brad; Zahirniak, Daniel R.; Deuto, Bernard G.

    2001-08-01

    Simulation development for EO systems has progressed to new levels with the advent of COTS software tools such as Matlab/Simulink. These tools allow rapid reuse of simulation library routines. We have applied these tools to newly emerging Acquisition, Tracking, and Pointing (ATP) systems using many routines developed through a legacy of High Energy Laser programs such as the Airborne Laser, Space Based Laser, Tactical High Energy Laser, and Air Force Research Laboratory projects associated with the Starfire Optical Range. The simulation architecture allows ease in testing various track algorithms under simulated scenes with the ability to rapidly vary system hardware parameters such as the track sensor and track loop control systems. The atmospheric turbulence environment and associated optical distortion are simulated to high fidelity through the application of an atmospheric phase screen model to produce scintillation of the laser illuminator uplink. The particular ATP system simulated is a small, transportable system for tracking satellites in a daytime environment; it projects a low-power laser and receives laser returns from retro-reflector-equipped satellites. The primary application of the ATP system (and therefore the simulation) is the determination of the illuminator beam profile, jitter, and scintillation of the low-power laser at the satellite. The ATP system will serve as a test bed for satellite tracking against a high background during daytime. Of particular interest in this simulation is the ability to emulate the hardware mode logic within the simulation to test and refine system states and mode-change decisions. Additionally, the simulation allows data from hardware system tests to be imported into Matlab to drive the simulation or to be easily compared with simulation results.

  18. Simulation Models for Improved Water Heating Systems

    E-print Network

    Lutz, Jim

    2014-01-01

    Keywords: water heating systems, energy efficiency, simulation modeling, validation. The report reviews simulation models for water heating equipment and hot water distribution systems and recommends strategies for better modeling those systems.

  19. At the Biological Modeling and Simulation Frontier

    E-print Network

    2009-01-01

    Modeling and simulation (M&S) of biological systems, integrating discrete event and continuous complex dynamic systems.

  20. The Advanced Gamma-ray Imaging System (AGIS): Simulation studies

    SciTech Connect

    Maier, G.; Buckley, J.; Bugaev, V.; Fegan, S.; Funk, S.; Konopelko, A.; Vassiliev, V.V.; /UCLA

    2011-06-14

    The Advanced Gamma-ray Imaging System (AGIS) is a next-generation ground-based gamma-ray observatory being planned in the U.S. The anticipated sensitivity of AGIS is about one order of magnitude better than the sensitivity of current observatories, allowing it to measure gamma-ray emission from a large number of Galactic and extra-galactic sources. We present here results of simulation studies of various possible designs for AGIS. The primary characteristics of the array performance - collecting area, angular resolution, background rejection, and sensitivity - are discussed.

  1. The Advanced Gamma-ray Imaging System (AGIS) - Simulation Studies

    SciTech Connect

    Maier, G.; Buckley, J.; Bugaev, V.; Fegan, S.; Vassiliev, V. V.; Funk, S.; Konopelko, A.

    2008-12-24

    The Advanced Gamma-ray Imaging System (AGIS) is a US-led concept for a next-generation instrument in ground-based very-high-energy gamma-ray astronomy. The most important design requirement for AGIS is a sensitivity about 10 times greater than that of current observatories such as VERITAS, H.E.S.S., or MAGIC. We present results of simulation studies of various possible designs for AGIS. The primary characteristics of the array performance - collecting area, angular resolution, background rejection, and sensitivity - are discussed.

  2. Genome Reshuffling for Advanced Intercross Permutation (GRAIP): Simulation and permutation for advanced intercross population analysis

    SciTech Connect

    Pierce, Jeremy; Broman, Karl; Chesler, Elissa J; Zhou, Guomin; Airey, David; Birmingham, Amanda; Williams, Robert

    2008-01-01

    Abstract Background: Advanced intercross lines (AIL) are segregating populations created using a multi-generation breeding protocol for fine mapping complex traits in mice and other organisms. Applying quantitative trait locus (QTL) mapping methods for intercross and backcross populations, often followed by naïve permutation of individuals and phenotypes, does not account for the effect of family structure in AIL populations in which final generations have been expanded, and leads to inappropriately low significance thresholds. The critical problem with a naïve mapping approach in such AIL populations is that the individual is not an exchangeable unit given the family structure. Methodology/Principal Findings: The effect of family structure has immediate implications for optimal AIL creation (many crosses, few animals per cross, and population expansion before the final generation), and we discuss these and the utility of AIL populations for QTL fine mapping. We also describe Genome Reshuffling for Advanced Intercross Permutation (GRAIP), a method for analyzing AIL data that accounts for family structure. GRAIP permutes a more interchangeable unit in the final-generation crosses - the parental genome - and simulates regeneration of a permuted AIL population based on exchanged parental identities. GRAIP determines appropriate genome-wide significance thresholds and locus-specific P-values for AILs and other populations with similar family structures. We contrast GRAIP with naïve permutation using a large, densely genotyped mouse AIL population (1333 individuals from 32 crosses). A naïve permutation using coat color as a model phenotype demonstrates high false-positive locus identification and uncertain significance levels in our AIL population, which are corrected by use of GRAIP. We also show that GRAIP detects an established hippocampus weight locus and a new locus, Hipp9a. Conclusions and Significance: GRAIP determines appropriate genome-wide significance thresholds and locus-specific P-values for AILs and other populations with similar family structures. The effect of family structure has immediate implications for optimal AIL creation (many crosses, few animals per cross, and population expansion before the final generation), and we discuss these and the utility of AIL populations.

  3. Model Standards Advance the Profession

    ERIC Educational Resources Information Center

    Journal of Staff Development, 2011

    2011-01-01

    Leadership by teachers is essential to serving the needs of students, schools, and the teaching profession. To that end, the Teacher Leadership Exploratory Consortium has developed Teacher Leader Model Standards to codify, promote, and support teacher leadership as a vehicle to transform schools for the needs of the 21st century. The Teacher…

  4. Recent advances of strong-strong beam-beam simulation

    NASA Astrophysics Data System (ADS)

    Qiang, Ji; Furman, Miguel A.; Ryne, Robert D.; Fischer, Wolfram; Ohmi, Kazuhito

    2006-03-01

    In this paper, we report on recent advances in strong-strong beam-beam simulation. Numerical methods used in the calculation of the beam-beam forces are reviewed. A new computational method to solve the Poisson equation on a nonuniform grid is presented. This method reduces the computational cost by half compared with the standard FFT-based method on a uniform grid. It also appears to be more accurate than the standard method for a colliding beam with a low transverse aspect ratio. In applications, we present the study of coherent modes with multi-bunch, multi-collision beam-beam interactions at RHIC. We also present the strong-strong simulation of the luminosity evolution at KEKB with and without a finite crossing angle.
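
    The "standard FFT-based method on a uniform grid" used as the baseline above can be sketched in a few lines: deposit the beam charge on a grid, solve the Poisson equation spectrally, and take gradients of the potential to obtain the beam-beam kick fields. The sketch below assumes periodic boundary conditions for brevity (production beam-beam codes use open-boundary Green-function convolution) and an illustrative Gaussian beam; grid sizes and parameters are assumptions.

      # Uniform-grid FFT Poisson solve for a 2D beam charge density.
      import numpy as np

      eps0 = 8.854e-12
      n, L = 128, 1e-2                        # grid points per side, box size [m]
      x = (np.arange(n) - n / 2) * L / n
      X, Y = np.meshgrid(x, x, indexing="ij")
      sigx, sigy = 1e-3, 0.5e-3               # low-aspect-ratio Gaussian beam
      rho = np.exp(-0.5 * ((X / sigx) ** 2 + (Y / sigy) ** 2))

      k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
      KX, KY = np.meshgrid(k, k, indexing="ij")
      k2 = KX ** 2 + KY ** 2
      k2[0, 0] = 1.0                          # avoid division by zero for the mean mode

      # nabla^2 phi = -rho/eps0  ->  phi_hat = rho_hat / (eps0 * k^2)
      phi_hat = np.fft.fft2(rho / eps0) / k2
      phi_hat[0, 0] = 0.0                     # fix the arbitrary potential offset
      phi = np.real(np.fft.ifft2(phi_hat))

      Ex = -np.gradient(phi, L / n, axis=0)   # beam-beam kick fields follow from -grad(phi)
      print(phi.max(), Ex.max())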

  5. Recent advances of strong-strong beam-beam simulation

    SciTech Connect

    Qiang, Ji; Furman, Miguel A.; Ryne, Robert D.; Fischer, Wolfram; Ohmi, Kazuhito

    2004-09-15

    In this paper, we report on recent advances in strong-strong beam-beam simulation. Numerical methods used in the calculation of the beam-beam forces are reviewed. A new computational method to solve the Poisson equation on a nonuniform grid is presented. This method reduces the computational cost by half compared with the standard FFT-based method on a uniform grid. It is also more accurate than the standard method for a colliding beam with a low transverse aspect ratio. In applications, we present the study of coherent modes with multi-bunch, multi-collision beam-beam interactions at RHIC. We also present the strong-strong simulation of the luminosity evolution at KEKB with and without a finite crossing angle.

  6. Advanced simulations of optical transition and diffraction radiation

    NASA Astrophysics Data System (ADS)

    Aumeyr, T.; Billing, M. G.; Bobb, L. M.; Bolzon, B.; Bravin, E.; Karataev, P.; Kruchinin, K.; Lefevre, T.; Mazzoni, S.

    2015-04-01

    Charged particle beam diagnostics is a key task in modern and future accelerator installations. The diagnostic tools are practically the "eyes" of the operators. The precision and resolution of the diagnostic equipment are crucial to define the performance of the accelerator. Transition and diffraction radiation (TR and DR) are widely used for electron beam parameter monitoring. However, the precision and resolution of those devices are determined by how well the production, transport and detection of these radiation types are understood. This paper reports on simulations of TR and DR spatial-spectral characteristics using the physical optics propagation (POP) mode of the Zemax advanced optics simulation software. A good consistency with theory is demonstrated. Also, realistic optical system alignment issues are discussed.

  7. I. Introduction Simulation Modeling

    E-print Network

    Tesfatsion, Leigh

    ...scale system studies such as environmental systems [29] or nuclear fuel waste management programs [7] ... different types of models. Several guidelines have already been made public by or for different agencies.

  8. Recent advances in modeling stellar interiors (u)

    SciTech Connect

    Guzik, Joyce Ann

    2010-01-01

    Advances in stellar interior modeling are being driven by new data from large-scale surveys and high-precision photometric and spectroscopic observations. Here we focus on single stars in normal evolutionary phases; we will not discuss the many advances in modeling star formation, interacting binaries, supernovae, or neutron stars. We review briefly: (1) updates to input physics of stellar models; (2) progress in two and three-dimensional evolution and hydrodynamic models; (3) insights from oscillation data used to infer stellar interior structure and validate model predictions (asteroseismology). We close by highlighting a few outstanding problems, e.g., the driving mechanisms for hybrid γ Dor/δ Sct star pulsations, the cause of giant eruptions seen in luminous blue variables such as η Car and P Cyg, and the solar abundance problem.

  9. Recent advances in modeling stellar interiors

    NASA Astrophysics Data System (ADS)

    Guzik, Joyce Ann

    2011-11-01

    Advances in stellar interior modeling are being driven by new data from large-scale surveys and high-precision photometric and spectroscopic observations. Here we focus on single stars in normal evolutionary phases; we will not discuss the many advances in modeling star formation, interacting binaries, supernovae, or neutron stars. We review briefly: (1) updates to input physics of stellar models; (2) progress in two and three-dimensional evolution and hydrodynamic models; (3) insights from oscillation data used to infer stellar interior structure and validate model predictions (asteroseismology). We close by highlighting a few outstanding problems, e.g., the driving mechanisms for hybrid γ Dor/δ Sct star pulsations, the cause of giant eruptions seen in luminous blue variables such as η Car and P Cyg, and the solar abundance problem.

  10. Graphics simulation and training aids for advanced teleoperation

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Schenker, Paul S.; Bejczy, Antal K.

    1993-01-01

    Graphics displays can be of significant aid in accomplishing a teleoperation task throughout all three phases of off-line task analysis and planning, operator training, and online operation. In the first phase, graphics displays provide substantial aid to investigate work cell layout, motion planning with collision detection and with possible redundancy resolution, and planning for camera views. In the second phase, graphics displays can serve as very useful tools for introductory training of operators before training them on actual hardware. In the third phase, graphics displays can be used for previewing planned motions and monitoring actual motions in any desired viewing angle, or, when communication time delay prevails, for providing predictive graphics overlay on the actual camera view of the remote site to show the non-time-delayed consequences of commanded motions in real time. This paper addresses potential space applications of graphics displays in all three operational phases of advanced teleoperation. Possible applications are illustrated with techniques developed and demonstrated in the Advanced Teleoperation Laboratory at JPL. The examples described include task analysis and planning of a simulated Solar Maximum Satellite Repair task, a novel force-reflecting teleoperation simulator for operator training, and preview and predictive displays for on-line operations.

  11. Co-Simulation for Advanced Process Design and Optimization

    SciTech Connect

    Stephen E. Zitney

    2009-01-01

    Meeting the increasing demand for clean, affordable, and secure energy is arguably the most important challenge facing the world today. Fossil fuels can play a central role in a portfolio of carbon-neutral energy options provided CO{sub 2} emissions can be dramatically reduced by capturing CO{sub 2} and storing it safely and effectively. The fossil energy industry faces the challenge of meeting aggressive design goals for next-generation power plants with CCS. Process designs will involve large, highly integrated, multipurpose systems with advanced equipment items featuring complex geometries and multiphysics. APECS is enabling software that facilitates effective integration, solution, and analysis of high-fidelity process/equipment (CFD) co-simulations. APECS helps to optimize fluid flow and related phenomena that impact overall power plant performance. APECS offers many advanced capabilities, including ROMs, design optimization, parallel execution, stochastic analysis, and virtual plant co-simulations. NETL and its collaborative R&D partners are using APECS to reduce the time, cost, and technical risk of developing high-efficiency, zero-emission power plants with CCS.

  12. The Impact of the Assimilation of Hyperspectral Infrared Retrieved Profiles on Advanced Weather and Research Model Simulations of a Non-Convective Wind Event

    NASA Technical Reports Server (NTRS)

    Brendt, Emily; Zavodsky, Bradley; Jedlovec, Gary; Elmer, Nicholas

    2014-01-01

    Tropopause folds are identified by warm, dry, high-potential vorticity, ozone-rich air and are one explanation for damaging non-convective wind events. Could improved model representation of stratospheric air and associated tropopause folding improve non-convective wind forecasts and high wind warnings? The goal of this study is to assess the impact of assimilating Hyperspectral Infrared (IR) profiles on forecasting stratospheric air, tropopause folds, and associated non-convective winds: (1) AIRS: Atmospheric Infrared Sounder (2) IASI: Infrared Atmospheric Sounding Interferometer (3) CrIMSS: Cross-track Infrared and Microwave Sounding Suite

  13. Applying Model Abstraction Techniques to the Advanced Low Altitude Radar Model (ALARM)

    NASA Astrophysics Data System (ADS)

    Plotz, Gary; Dibble, Serena

    2002-10-01

    Modeling of real systems relies on the arduous task of describing the physical phenomena in terms of mathematical models, which often require excessive amounts of computation time when used in simulations. In the last few years there has been a growing acceptance of model abstraction, whose emphasis rests on the development of more manageable models. Abstraction refers to the intelligent capture of the essence of the behavior of a model, without all the details. In the past, model abstraction techniques have been applied to complex models, such as the Advanced Low Altitude Radar Model (ALARM), to simplify analysis. The scope of this effort is to apply model abstraction techniques to ALARM, a DoD prototype radar model for simulating the volume detection capability of low-flying targets within a digitally simulated environment. Due to the complexity of these models, it is difficult to capture and assess the relationship between the model parameters and the performance of the simulation. Under this effort, ALARM parameters were modified and/or deleted and the impact on the simulation run time assessed. In addition, several meta-models were developed and used to assess the impact of ALARM parameters on the simulation run time. This report establishes a baseline for ALARM from which additional meta-models can be compared and analyzed.
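
    The meta-models mentioned above are, in essence, inexpensive surrogates fitted to observed run times. A generic response-surface version of that idea is sketched below on synthetic data; ALARM's actual parameters, timings, and meta-model forms are not reproduced here, and every value is an illustrative assumption.

      # Quadratic response-surface meta-model fitted to synthetic run-time data.
      import numpy as np

      rng = np.random.default_rng(1)
      n_runs = 60
      params = rng.uniform(0.0, 1.0, size=(n_runs, 3))     # three normalized model parameters
      runtime = (5.0 + 12.0 * params[:, 0] ** 2 + 3.0 * params[:, 1]
                 + 0.5 * rng.normal(size=n_runs))          # synthetic timings [s]

      def design(p):
          # basis: constant, linear, squared, and pairwise interaction terms
          cols = [np.ones(len(p))]
          cols += [p[:, i] for i in range(3)]
          cols += [p[:, i] ** 2 for i in range(3)]
          cols += [p[:, i] * p[:, j] for i in range(3) for j in range(i + 1, 3)]
          return np.column_stack(cols)

      coef, *_ = np.linalg.lstsq(design(params), runtime, rcond=None)
      new_point = np.array([[0.8, 0.2, 0.5]])
      print(design(new_point) @ coef)          # predicted run time for an untried setting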

  14. Applying model abstraction techniques to the Advanced Low Altitude Radar Model (ALARM)

    NASA Astrophysics Data System (ADS)

    Plotz, Gary A.; Dibble, Serena

    2002-07-01

    Modeling of real systems relies on the arduous task of describing the physical phenomena in terms of mathematical models, which often require excessive amounts of computation time when used in simulations. In the last few years there has been a growing acceptance of model abstraction, whose emphasis rests on the development of more manageable models. Abstraction refers to the intelligent capture of the essence of the behavior of a model, without all the details. In the past, model abstraction techniques have been applied to complex models, such as the Advanced Low Altitude Radar Model (ALARM), to simplify analysis. The scope of this effort is to apply model abstraction techniques to ALARM, a DoD prototype radar model for simulating the volume detection capability of low-flying targets within a digitally simulated environment. Due to the complexity of these models, it is difficult to capture and assess the relationship between the model parameters and the performance of the simulation. Under this effort, ALARM parameters were modified and/or deleted and the impact on the simulation run time assessed. In addition, several meta-models were developed and used to assess the impact of ALARM parameters on the simulation run time. This paper establishes a baseline for ALARM from which additional meta-models can be compared and analyzed.

  15. Advances in Modeling Exploding Bridgewire Initiation

    SciTech Connect

    Hrousis, C A; Christensen, J S

    2010-03-10

    There is great interest in applying magnetohydrodynamic (MHD) simulation techniques to the designs of electrical high explosive (HE) initiators, for the purpose of better understanding a design's sensitivities, optimizing its performance, and/or predicting its useful lifetime. Two MHD-capable LLNL codes, CALE and ALE3D, are being used to simulate the process of ohmic heating, vaporization, and plasma formation in exploding bridgewires (EBW). Initiation of the HE is simulated using Ignition & Growth reactive flow models. 1-D, 2-D and 3-D models have been constructed and studied. The models provide some intuitive explanation of the initiation process and are useful for evaluating the potential impact of identified aging mechanisms (such as the growth of intermetallic compounds or powder sintering). The end product of this work is a simulation capability for evaluating margin in proposed, modified or aged initiation system designs.

  16. Stochastic models: theory and simulation.

    SciTech Connect

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
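
    One of the simplest sample-generation algorithms in this setting is worth spelling out: for a Gaussian process with a prescribed covariance, independent realizations can be drawn from a Cholesky factor of the covariance matrix. The sketch below assumes an exponential covariance with illustrative parameters; it is a generic textbook method, not necessarily one of the specific algorithms given in the report.

      # Cholesky-based sampling of a stationary Gaussian process.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      t = np.linspace(0.0, 10.0, n)
      corr_len, sigma = 1.0, 2.0
      # exponential covariance C(t_i, t_j) = sigma^2 * exp(-|t_i - t_j| / corr_len)
      C = sigma ** 2 * np.exp(-np.abs(t[:, None] - t[None, :]) / corr_len)
      L = np.linalg.cholesky(C + 1e-10 * np.eye(n))   # small jitter for numerical stability

      samples = L @ rng.standard_normal((n, 5))       # five independent realizations
      print(samples.shape, samples.std())             # each column is one sample path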

  17. VTI Driving Simulator: Mathematical Model of a Four-wheeled Vehicle for Simulation in Real Time. VTI Rapport 267A.

    ERIC Educational Resources Information Center

    Nordmark, Staffan

    1984-01-01

    This report contains a theoretical model for describing the motion of a passenger car. The simulation program based on this model is used in conjunction with an advanced driving simulator and run in real time. The mathematical model is complete in the sense that the dynamics of the engine, transmission and steering system is described in some…

  18. At the Biological Modeling and Simulation Frontier

    E-print Network

    2009-01-01

    Modeling and simulation (M&S) of biological systems, integrating discrete event and continuous complex dynamic systems.

  19. Modeling & Simulation Data Analysis and Modeling & Simulation for the

    E-print Network

    ...the code is much easier to use, manipulate, and modify than codes based on programming languages, and can be applied to the problem at hand. Data analysis and modeling and simulation provide powerful tools to help accomplish work that scales from bench-top chemical kinetic and thermodynamic experiments to pilot- and plant-sized programs.

  20. Surrogate Model Development for Fuels for Advanced Combustion Engines

    SciTech Connect

    Anand, Krishnasamy; Ra, youngchul; Reitz, Rolf; Bunting, Bruce G

    2011-01-01

    The fuels used in internal-combustion engines are complex mixtures of a multitude of different types of hydrocarbon species. Attempting numerical simulations of combustion of real fuels with all of the hydrocarbon species included is highly unrealistic. Thus, a surrogate model approach is generally adopted, which involves choosing a few representative hydrocarbon species whose overall behavior mimics the characteristics of the target fuel. The present study proposes surrogate models for the nine fuels for advanced combustion engines (FACE) that have been developed for studying low-emission, high-efficiency advanced diesel engine concepts. The surrogate compositions for the fuels are arrived at by simulating their distillation profiles to within a maximum absolute error of 4% using a discrete multi-component (DMC) fuel model that has been incorporated in the multi-dimensional computational fluid dynamics (CFD) code, KIVA-ERC-CHEMKIN. The simulated surrogate compositions cover the range and measured concentrations of the various hydrocarbon classes present in the fuels. The fidelity of the surrogate fuel models is judged on the basis of matching their specific gravity, lower heating value, hydrogen/carbon (H/C) ratio, cetane number, and cetane index with the measured data for all nine FACE fuels.
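
    The surrogate-formulation step described above can be caricatured as a constrained fit of component fractions to a target distillation curve. The sketch below replaces the discrete multi-component model and KIVA-ERC-CHEMKIN with a deliberately simplified linear blend of smoothed pure-component recovery curves; the component boiling points, the target curve, and the fitting choice (non-negative least squares) are all illustrative assumptions, and the maximum absolute error reported at the end mirrors, only schematically, the 4% criterion quoted above.

      # Simplified surrogate-composition fit against a target distillation curve.
      import numpy as np
      from scipy.optimize import nnls

      T = np.linspace(300.0, 650.0, 200)                  # temperature grid [K]
      boil_pts = np.array([372.0, 447.0, 489.0, 560.0])   # hypothetical surrogate component boiling points [K]

      def component_curve(tb, width=15.0):
          # fraction of a pure component recovered below temperature T (smoothed step)
          return 1.0 / (1.0 + np.exp(-(T - tb) / width))

      A = np.column_stack([component_curve(tb) for tb in boil_pts])
      # hypothetical target distillation curve for the real fuel
      target = 0.5 * component_curve(430.0, 60.0) + 0.5 * component_curve(540.0, 40.0)

      frac, _ = nnls(A, target)                        # non-negative component fractions
      composition = frac / frac.sum()                  # report as a normalized composition
      max_abs_err = np.max(np.abs(A @ frac - target))  # worst-case mismatch to the target curve
      print(composition, max_abs_err)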

  1. Software Partitioning Schemes for Advanced Simulation Computer Systems. Final Report.

    ERIC Educational Resources Information Center

    Clymer, S. J.

    Conducted to design software partitioning techniques for use by the Air Force to partition a large flight simulator program for optimal execution on alternative configurations, this study resulted in a mathematical model which defines characteristics for an optimal partition, and a manually demonstrated partitioning algorithm design which…

  2. Algorithmic implementations of domain decomposition methods for the diffraction simulation of advanced photomasks

    NASA Astrophysics Data System (ADS)

    Adam, Konstantinos; Neureuther, Andrew R.

    2002-07-01

    The domain decomposition method developed in [1] is examined in more detail. This method enables rapid computer simulation of advanced photomask (alt. PSM, masks with OPC) scattering and transmission properties. Compared to 3D computer simulation, speed-up factors of approximately 400, and up to approximately 200,000 when using the look-up table approach, are possible. Combined with the spatial frequency properties of projection printing systems, it facilitates accurate computer simulation of the projected image (normalized mean square error of a typical image is only a fraction of 1%). Some esoteric accuracy issues of the method are addressed and the way to handle arbitrary, Manhattan-type mask layouts is presented. The method is shown to be valid for off-axis incidence. The cross-talk model developed in [1] is used in 3D mask simulations (2D layouts).
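
    The essence of the domain decomposition idea - synthesizing the near field of an arbitrary Manhattan layout from pre-computed isolated-edge fields - can be shown in one dimension. In the sketch below the rigorously simulated edge field is replaced by a smooth analytic stand-in, and an opening is assembled as the rising-edge field plus the falling-edge field minus the fully clear background; everything here is an illustrative assumption rather than the method of [1].

      # Toy 1D edge-field superposition for mask near-field synthesis.
      import numpy as np

      x = np.linspace(-2.0, 2.0, 2001)     # lateral coordinate [um]

      def edge_field(xs):
          # stand-in for a rigorously simulated rising-edge near field (dark -> clear)
          step = 0.5 * (1.0 + np.tanh(xs / 0.05))
          ringing = 0.05j * np.exp(-(xs / 0.1) ** 2)   # crude complex edge perturbation
          return step + ringing

      def opening(x1, x2):
          # clear opening [x1, x2] on a dark background:
          # rising edge at x1 + falling edge at x2 - fully clear background (= 1)
          return edge_field(x - x1) + edge_field(x2 - x) - 1.0

      field = opening(-0.4, 0.4)
      print(np.abs(field).max())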

  3. Combustion modeling in advanced gas turbine systems

    SciTech Connect

    Smoot, L.D.; Hedman, P.O.; Fletcher, T.H.; Brewster, B.S.; Kramer, S.K.

    1995-12-31

    The goal of DOE's Advanced Turbine Systems program is to develop and commercialize ultra-high efficiency, environmentally superior, cost-competitive gas turbine systems for base-load applications in utility, independent power producer, and industrial markets. The primary objective of the program here is to develop a comprehensive combustion model for advanced gas turbine combustion systems using natural gas (coal gasification or biomass fuels). The efforts included code evaluation (PCGC-3), coherent anti-Stokes Raman spectroscopy, laser Doppler anemometry, and laser-induced fluorescence.

  4. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    SciTech Connect

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  5. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    SciTech Connect

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that was very successful in delivering an initial capability to one that is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools. ASC must continue to meet three objectives: (1) Robust Tools--Develop robust models, codes, and computational techniques to support stockpile needs such as refurbishments, SFIs, LEPs, annual assessments, and evolving future requirements. (2) Prediction through Simulation--Deliver validated physics and engineering tools to enable simulations of nuclear-weapons performances in a variety of operational environments and physical regimes and to enable risk-informed decisions about the performance, safety, and reliability of the stockpile. (3) Balanced Operational Infrastructure--Implement a balanced computing platform acquisition strategy and operational infrastructure to meet Directed Stockpile Work (DSW) and SSP needs for capacity and high-end simulation capabilities.

  6. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    SciTech Connect

    McCoy, M; Kusnezov, D; Bikkel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that was very successful in delivering an initial capability to one that is integrated and focused on requirements driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools. ASC must continue to meet three objectives: Objective 1. Robust Tools--Develop robust models, codes, and computational techniques to support stockpile needs such as refurbishments, SFIs, LEPs, annual assessments, and evolving future requirements. Objective 2. Prediction through Simulation--Deliver validated physics and engineering tools to enable simulations of nuclear-weapons performances in a variety of operational environments and physical regimes and to enable risk informed decisions about the performance, safety, and reliability of the stockpile. Objective 3. Balanced Operational Infrastructure--Implement a balanced computing platform acquisition strategy and operational infrastructure to meet Directed Stockpile Work (DSW) and SSP needs for capacity and high-end simulation capabilities.

  7. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that was very successful in delivering an initial capability to one that is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools. ASC must continue to meet three objectives: (1) Robust Tools - Develop robust models, codes, and computational techniques to support stockpile needs such as refurbishments, SFIs, LEPs, annual assessments, and evolving future requirements; (2) Prediction through Simulation - Deliver validated physics and engineering tools to enable simulations of nuclear weapons performance in a variety of operational environments and physical regimes and to enable risk-informed decisions about the performance, safety, and reliability of the stockpile; and (3) Balanced Operational Infrastructure - Implement a balanced computing platform acquisition strategy and operational infrastructure to meet Directed Stockpile Work (DSW) and SSP needs for capacity and high-end simulation capabilities.

  8. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that was very successful in delivering an initial capability to one that is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools. ASC must continue to meet three objectives: Objective 1 - Robust Tools. Develop robust models, codes, and computational techniques to support stockpile needs such as refurbishments, SFIs, LEPs, annual assessments, and evolving future requirements. Objective 2 - Prediction through Simulation. Deliver validated physics and engineering tools to enable simulations of nuclear weapons performance in a variety of operational environments and physical regimes and to enable risk-informed decisions about the performance, safety, and reliability of the stockpile. Objective 3 - Balanced Operational Infrastructure. Implement a balanced computing platform acquisition strategy and operational infrastructure to meet Directed Stockpile Work (DSW) and SSP needs for capacity and high-end simulation capabilities.

  9. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that was very successful in delivering an initial capability to one that is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools. ASC must continue to meet three objectives: (1) Robust Tools - Develop robust models, codes, and computational techniques to support stockpile needs such as refurbishments, SFIs, LEPs, annual assessments, and evolving future requirements; (2) Prediction through Simulation - Deliver validated physics and engineering tools to enable simulations of nuclear weapons performance in a variety of operational environments and physical regimes and to enable risk-informed decisions about the performance, safety, and reliability of the stockpile; and (3) Balanced Operational Infrastructure - Implement a balanced computing platform acquisition strategy and operational infrastructure to meet Directed Stockpile Work (DSW) and SSP needs for capacity and high-end simulation capabilities.

  10. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that was very successful in delivering an initial capability to one that is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools. ASC must continue to meet three objectives: Objective 1. Robust Tools--Develop robust models, codes, and computational techniques to support stockpile needs such as refurbishments, SFIs, LEPs, annual assessments, and evolving future requirements. Objective 2. Prediction through Simulation--Deliver validated physics and engineering tools to enable simulations of nuclear weapons performance in a variety of operational environments and physical regimes and to enable risk-informed decisions about the performance, safety, and reliability of the stockpile. Objective 3. Balanced Operational Infrastructure--Implement a balanced computing platform acquisition strategy and operational infrastructure to meet Directed Stockpile Work (DSW) and SSP needs for capacity and high-end simulation capabilities.

  11. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that was very successful in delivering an initial capability to one that is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools. ASC must continue to meet three objectives: Objective 1 Robust Tools--Develop robust models, codes, and computational techniques to support stockpile needs such as refurbishments, SFIs, LEPs, annual assessments, and evolving future requirements. Objective 2 Prediction through Simulation--Deliver validated physics and engineering tools to enable simulations of nuclear weapons performance in a variety of operational environments and physical regimes and to enable risk-informed decisions about the performance, safety, and reliability of the stockpile. Objective 3 Balanced Operational Infrastructure--Implement a balanced computing platform acquisition strategy and operational infrastructure to meet Directed Stockpile Work (DSW) and SSP needs for capacity and high-end simulation capabilities.

  12. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that was very successful in delivering an initial capability to one that is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools. ASC must continue to meet three objectives: Objective 1. Robust Tools--Develop robust models, codes, and computational techniques to support stockpile needs such as refurbishments, SFIs, LEPs, annual assessments, and evolving future requirements. Objective 2--Prediction through Simulation. Deliver validated physics and engineering tools to enable simulations of nuclear-weapons performances in a variety of operational environments and physical regimes and to enable risk-informed decisions about the performance, safety, and reliability of the stockpile. Objective 3--Balanced Operational Infrastructure. Implement a balanced computing platform acquisition strategy and operational infrastructure to meet Directed Stockpile Work (DSW) and SSP needs for capacity and high-end simulation capabilities.

  13. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that was very successful in delivering an initial capability to one that is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools. ASC must continue to meet three objectives: Objective 1. Robust Tools--Develop robust models, codes, and computational techniques to support stockpile needs such as refurbishments, SFIs, LEPs, annual assessments, and evolving future requirements. Objective 2--Prediction through Simulation. Deliver validated physics and engineering tools to enable simulations of nuclear-weapons performances in a variety of operational environments and physical regimes and to enable risk-informed decisions about the performance, safety, and reliability of the stockpile. Objective 3. Balanced Operational Infrastructure--Implement a balanced computing platform acquisition strategy and operational infrastructure to meet Directed Stockpile Work (DSW) and SSP needs for capacity and high-end simulation capabilities.

  14. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. Two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
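
    As a minimal sketch of the automatic-programming idea (not AMPS itself; the specification format, station names, and the choice of SimPy as the generated target library are all invented for illustration), a generator can take a declarative problem specification and emit executable simulation source code:

      # Hypothetical problem specification for a two-station manufacturing line.
      SPEC = {
          "name": "two_station_line",
          "stations": [
              {"name": "drill", "service_time": 4.0},
              {"name": "paint", "service_time": 6.0},
          ],
          "interarrival_time": 5.0,
          "run_length": 480.0,
      }

      def generate(spec):
          """Emit a complete SimPy simulation script from the specification."""
          lines = ["import random, simpy", "", "def job(env, line):"]
          for st in spec["stations"]:
              lines += [
                  f"    with line['{st['name']}'].request() as req:",
                  "        yield req",
                  f"        yield env.timeout(random.expovariate(1.0 / {st['service_time']}))",
              ]
          lines += [
              "",
              "def source(env, line):",
              "    while True:",
              f"        yield env.timeout(random.expovariate(1.0 / {spec['interarrival_time']}))",
              "        env.process(job(env, line))",
              "",
              "env = simpy.Environment()",
              f"line = {{s: simpy.Resource(env, capacity=1) for s in {[st['name'] for st in spec['stations']]}}}",
              "env.process(source(env, line))",
              f"env.run(until={spec['run_length']})",
          ]
          return "\n".join(lines)

      print(generate(SPEC))   # the printed script is itself a runnable simulation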

  15. Acoustic test and analyses of three advanced turboprop models

    NASA Technical Reports Server (NTRS)

    Brooks, B. M.; Metzger, F. B.

    1980-01-01

    Results of acoustic tests of three 62.2 cm (24.5 inch) diameter models of the prop-fan (a small diameter, highly loaded, multi-bladed, variable pitch advanced turboprop) are presented. Results show that there is little difference in the noise produced by unswept and slightly swept designs. However, the model designed for noise reduction produces substantially less noise at test conditions simulating 0.8 Mach number cruise speed or at conditions simulating takeoff and landing. In the near field at cruise conditions and in the far field at takeoff and landing conditions, the acoustically designed model is quieter than the unswept or slightly swept designs, by 5 dB in the far field. Correlation between noise measurements and theoretical predictions, as well as comparisons between measured and predicted acoustic pressure pulses generated by the prop-fan blades, are discussed. The general characteristics of the pulses are predicted. Shadowgraph measurements were obtained which show the location of bow and trailing waves.

  16. Advanced visualization technology for terascale particle accelerator simulations

    SciTech Connect

    Ma, K-L; Schussman, G.; Wilson, B.; Ko, K.; Qiang, J.; Ryne, R.

    2002-11-16

    This paper presents two new hardware-assisted rendering techniques developed for interactive visualization of the terascale data generated from numerical modeling of next generation accelerator designs. The first technique, based on a hybrid rendering approach, makes possible interactive exploration of large-scale particle data from particle beam dynamics modeling. The second technique, based on a compact texture-enhanced representation, exploits the advanced features of commodity graphics cards to achieve perceptually effective visualization of the very dense and complex electromagnetic fields produced from the modeling of reflection and transmission properties of open structures in an accelerator design. Because of the collaborative nature of the overall accelerator modeling project, the visualization technology developed is for both desktop and remote visualization settings. We have tested the techniques using both time-varying particle data sets containing up to one billion particles per time step and electromagnetic field data sets with millions of mesh elements.

  17. Maturity Model for Advancing Smart Grid Interoperability

    SciTech Connect

    Knight, Mark; Widergren, Steven E.; Mater, J.; Montgomery, Austin

    2013-10-28

    Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.

  18. Modeling and Simulation of Biochemical Processes Using Stochastic Hybrid Systems: The

    E-print Network

    Koutsoukos, Xenofon D.

    As biochemical research advances, there is an increasing need to model and simulate more complicated systems in order to better understand them. Stochastic hybrid systems provide one framework for modeling and simulating such biochemical processes.

  19. ADVANCED ELECTRIC AND MAGNETIC MATERIAL MODELS FOR FDTD ELECTROMAGNETIC CODES

    SciTech Connect

    Poole, B R; Nelson, S D; Langdon, S

    2005-05-05

    The modeling of dielectric and magnetic materials in the time domain is required for pulse power applications, pulsed induction accelerators, and advanced transmission lines. For example, most induction accelerator modules require the use of magnetic materials to provide adequate Volt-sec during the acceleration pulse. These models require hysteresis and saturation to simulate the saturation wavefront in a multipulse environment. In high voltage transmission line applications such as shock or soliton lines, the dielectric operates in a highly nonlinear regime, which requires nonlinear models. Simple 1-D models are developed for fast parameterization of transmission line structures. In the case of nonlinear dielectrics, a simple analytic model describing the permittivity in terms of electric field is used in a 3-D finite difference time domain code (FDTD). In the case of magnetic materials, both rate-independent and rate-dependent Hodgdon magnetic material models have been implemented into 3-D FDTD codes and 1-D codes.
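
    A minimal sketch of how a field-dependent permittivity can be folded into a standard 1-D Yee/FDTD update loop; the saturable form of eps(E), the saturation field, and the grid parameters are assumptions for illustration, not the models described in the paper:

      import numpy as np

      # 1-D FDTD with a hypothetical saturable permittivity eps(E):
      # eps relaxes from eps0*eps_r_lin at low field toward eps0 at high field.
      nx, nt = 400, 1000
      c0, eps0, mu0 = 3.0e8, 8.854e-12, 4.0e-7 * np.pi
      dx = 1.0e-3
      dt = 0.5 * dx / c0                      # Courant-stable time step
      eps_r_lin, E_sat = 4.0, 1.0e5           # assumed material parameters

      E = np.zeros(nx)                        # electric field on integer nodes
      H = np.zeros(nx - 1)                    # magnetic field on half nodes

      for n in range(nt):
          # magnetic field half-step update
          H += dt / (mu0 * dx) * (E[1:] - E[:-1])
          # permittivity evaluated from the current local electric field
          eps = eps0 * (1.0 + (eps_r_lin - 1.0) / (1.0 + (E[1:-1] / E_sat) ** 2))
          # electric field update using the local nonlinear permittivity
          E[1:-1] += dt / (eps * dx) * (H[1:] - H[:-1])
          # injected Gaussian pulse, large enough to drive the nonlinearity
          E[50] += 2.0e5 * np.exp(-((n - 60) / 20.0) ** 2)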

  20. A Standard Kinematic Model for Flight Simulation at NASA Ames

    NASA Technical Reports Server (NTRS)

    Mcfarland, R. E.

    1975-01-01

    A standard kinematic model for aircraft simulation exists at NASA-Ames on a variety of computer systems, one of which is used to control the Flight Simulator for Advanced Aircraft (FSAA). The derivation of the kinematic model is given and various mathematical relationships are presented as a guide. These include descriptions of standardized simulation subsystems such as the atmospheric turbulence model and the generalized six-degrees-of-freedom trim routine, as well as an introduction to the emulative batch-processing system which enables this facility to optimize its real-time environment.
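
    As an illustration of the kind of mathematical relationship such a kinematic model contains, the standard textbook transformation from body-axis angular rates p, q, r to Euler-angle rates is (a generic result, not quoted from the report):

      \dot{\phi}   = p + (q \sin\phi + r \cos\phi)\tan\theta
      \dot{\theta} = q \cos\phi - r \sin\phi
      \dot{\psi}   = (q \sin\phi + r \cos\phi)/\cos\theta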

  1. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  2. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    NASA Astrophysics Data System (ADS)

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-09-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  3. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    PubMed Central

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  4. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  5. Theory, modeling and simulation: Annual report 1993

    SciTech Connect

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  6. A Simulation and Modeling Framework for Space Situational Awareness

    SciTech Connect

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
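
    As one small building block of such a framework, a fixed-step two-body orbit propagator might look like the sketch below; the initial orbit, step size, and the neglect of perturbations are illustrative simplifications, not details taken from the paper:

      import numpy as np

      MU = 3.986004418e14                 # Earth's gravitational parameter, m^3/s^2

      def deriv(state):
          """Two-body dynamics: d/dt [r, v] = [v, -mu r / |r|^3]."""
          r, v = state[:3], state[3:]
          return np.hstack([v, -MU * r / np.linalg.norm(r) ** 3])

      def rk4_step(state, dt):
          k1 = deriv(state)
          k2 = deriv(state + 0.5 * dt * k1)
          k3 = deriv(state + 0.5 * dt * k2)
          k4 = deriv(state + dt * k3)
          return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

      # circular low Earth orbit at roughly 700 km altitude
      r0 = 6371.0e3 + 700.0e3
      state = np.array([r0, 0.0, 0.0, 0.0, np.sqrt(MU / r0), 0.0])
      for _ in range(540):                # 540 steps of 10 s, about one period
          state = rk4_step(state, 10.0)
      print("altitude after ~90 min: %.1f km" % ((np.linalg.norm(state[:3]) - 6371.0e3) / 1e3))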

  7. Modeling and Simulation for Safeguards

    SciTech Connect

    Swinhoe, Martyn T.

    2012-07-26

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R&D and introduce you to (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) calculating amounts of material (plant modeling); (2) calculating signatures of nuclear material, etc. (source terms); and (3) modeling detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and amount of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can determine the measurement accuracy required to achieve a certain performance.
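
    A minimal sketch of the material-balance bookkeeping behind the "expected MUF" mentioned above (standard safeguards accountancy; the inventory values and measurement uncertainties are invented for illustration):

      import math

      # Material Unaccounted For over one balance period (illustrative kg values)
      begin_inv, additions, removals, end_inv = 100.0, 40.0, 35.0, 104.6
      sigmas = [0.5, 0.3, 0.3, 0.5]       # 1-sigma measurement uncertainties

      muf = (begin_inv + additions) - (removals + end_inv)
      sigma_muf = math.sqrt(sum(s ** 2 for s in sigmas))   # independent terms

      print(f"MUF = {muf:.2f} kg, sigma_MUF = {sigma_muf:.2f} kg")
      print("investigate" if abs(muf) > 3 * sigma_muf else "within expected measurement uncertainty")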

  8. Multiscale Stochastic Simulation and Modeling

    SciTech Connect

    James Glimm; Xiaolin Li

    2006-01-10

    Acceleration-driven instabilities of fluid mixing layers include the classical cases of Rayleigh-Taylor instability, driven by a steady acceleration, and Richtmyer-Meshkov instability, driven by an impulsive acceleration. Our program starts with high resolution methods of numerical simulation of two (or more) distinct fluids, continues with analytic analysis of these solutions, and the derivation of averaged equations. A striking achievement has been the systematic agreement we obtained between simulation and experiment by using a high resolution numerical method and improved physical modeling, with surface tension. Our study is accompanied by analysis using stochastic modeling and averaged equations for the multiphase problem. We have quantified the error and uncertainty using statistical modeling methods.
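
    For context, the classical linear growth rate of the Rayleigh-Taylor instability for a perturbation of wavenumber k under acceleration g is the textbook result (not a result of the report itself)

      \sigma = \sqrt{A k g}, \qquad A = \frac{\rho_h - \rho_l}{\rho_h + \rho_l},

    where A is the Atwood number of the heavy and light fluids; surface tension and viscosity reduce the growth of short-wavelength (large k) modes.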

  9. Using advanced technology repositories to accelerate simulation scenario development

    NASA Astrophysics Data System (ADS)

    Trias, Eric; Mathias, Karl S.

    2002-07-01

    One of the more difficult problems facing an analyst wishing to use a simulation is the task of collecting data and transforming it into a correctly formatted scenario. Raw data is often available from a variety of sources: multi-spectral force deployment (MSFD) documents, the electronic warfare integrated reprogramming database (EWIRDB), free text documents such as intelligence reports, pre-existing simulation scenarios, and scenarios taken from other simulations. The task of transforming this data into a usable scenario involves searching for the relevant information, followed by a manual transformation of the original format to the correct simulation format. This problem can be greatly alleviated by using a combination of three technologies: automatic parser generation, repository architectures using extensible markup language (XML), and information retrieval (IR) techniques. Automatic parser generation tools like JavaCC can automatically generate source code capable of reading data sources such as old Joint Integrated Mission Model (JIMM) or Suppressor input files. For simulations that regularly add scenario keywords to support changing needs, this can greatly reduce redevelopment time and cost for supporting tools. The objects parsed by this source can then be encapsulated in XML and stored into a repository. Using information retrieval techniques, objects can then be queried from the repository and transformed into the appropriate format for use in a scenario.
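
    A toy illustration of the parse-then-encapsulate-in-XML step; the record format, field names, and element names here are invented for illustration, and real MSFD/EWIRDB/JIMM records are far richer:

      import xml.etree.ElementTree as ET

      RAW = "EMITTER name=SAM_RADAR freq_mhz=3100 lat=36.1 lon=-115.2"

      def parse_record(line):
          """Split a keyword record into its type and key=value fields."""
          kind, *fields = line.split()
          return kind, dict(f.split("=", 1) for f in fields)

      def to_xml(kind, attrs):
          """Encapsulate the parsed object as XML for storage in a repository."""
          elem = ET.Element("object", {"type": kind})
          for key, value in attrs.items():
              ET.SubElement(elem, key).text = value
          return ET.tostring(elem, encoding="unicode")

      kind, attrs = parse_record(RAW)
      print(to_xml(kind, attrs))   # ready to be stored, indexed, and queried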

  10. Recent modelling advances for ultrasonic TOFD inspections

    SciTech Connect

    Darmon, Michel; Ferrand, Adrien; Dorval, Vincent; Chatillon, Sylvain; Lonné, Sébastien

    2015-03-31

    The ultrasonic TOFD (Time of Flight Diffraction) technique is commonly used to detect and characterize disoriented cracks using their edge diffraction echoes. An overview of the models integrated in the CIVA software platform and devoted to TOFD simulation is presented. CIVA can predict diffraction echoes from complex 3D flaws using a PTD (Physical Theory of Diffraction) based model. Other dedicated developments have been added to simulate lateral waves in 3D on planar entry surfaces and in 2D on irregular surfaces by a ray approach. Calibration echoes from Side Drilled Holes (SDHs), specimen echoes and shadowing effects from flaws can also be modelled. Some examples of theoretical validation of the models are presented. In addition, experimental validations have been performed both on planar blocks containing calibration holes and various notches and also on a specimen with an irregular entry surface, and allow conclusions to be drawn on the validity of all the developed models.

  11. CAPE-OPEN Integration for Advanced Process Engineering Co-Simulation

    SciTech Connect

    Zitney, S.E.

    2006-11-01

    This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, system analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.

  12. An advanced leakage scheme for neutrino treatment in astrophysical simulations

    E-print Network

    Perego, Albino; Käppeli, Roger

    2015-01-01

    We present an Advanced Spectral Leakage (ASL) scheme to model neutrinos in the context of core-collapse supernovae and compact binary mergers. Based on previous gray leakage schemes, the ASL scheme computes the neutrino cooling rates by interpolating local production and diffusion rates (relevant in optically thin and thick regimes, respectively), separately for discretized values of the neutrino energy. Trapped neutrino components are also modeled, based on equilibrium and timescale arguments. The better accuracy achieved by the spectral treatment allows a more reliable computation of neutrino heating rates in optically thin conditions. The scheme has been calibrated and tested against Boltzmann transport in the context of Newtonian spherically symmetric models of core-collapse supernovae. ASL shows very good qualitative and partial quantitative agreement for key quantities from collapse to a few hundred milliseconds after core bounce. We have proved the adaptability and flexibility of our ASL scheme.
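
    Schematically, leakage schemes of this family combine a local production rate R_prod(E) and a diffusion rate R_diff(E) into one effective loss rate per energy bin; a common smooth interpolation (shown only for orientation, not necessarily the exact ASL expression) is

      \frac{1}{R_{\mathrm{eff}}(E)} = \frac{1}{R_{\mathrm{prod}}(E)} + \frac{1}{R_{\mathrm{diff}}(E)},

    which reduces to the production rate where the material is optically thin (diffusion is effectively instantaneous) and to the diffusion rate where it is optically thick.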

  13. The ModelAssembler Community Modeling Environment (MA-CME): Expanded Access to Advanced Seismic Computation

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Larsen, S.

    2006-12-01

    We introduce MA-CME, an open-source environment for all PCs, Macs, and workstations that configures advanced seismic modeling runs. It is intended for use by seismologists, engineers, and students. The environment combines geologic and geotechnical data sets with gridding, modeling, and output specifications into portal packs for execution on standalone workstations, clusters, and mega-facilities such as Sun Grid. A tutorial interface helps the user scale the grid to the facilities available, from small test runs to efforts requiring major resources. The input geologic data are kept in open, editable forms to promote the creation of models for new areas, the regional extension of existing grids, and the detailing of critical features within current models. MA-CME currently drives computations with the E3D and the open-source E3D/CODE3 advanced simulation platforms; additional platforms will be added. The ability of MA-CME to configure computations at a range of scales and model complexity is intended to promote wide use of advanced seismic modeling. Wide community use may lead to breakthrough insights into how geology controls earthquake ground motion. Advanced seismic modeling platforms, coupled with increasing availability of faster clusters, have rapidly improved the realism of such deterministic simulations. Yet the number of people able to configure and successfully run simulations through complex geology has not grown. Ground-motion simulations have been published only for a few scenarios in a limited number of urban areas. MA-CME has been used to configure simulations to 2-Hz frequency for the Reno and Las Vegas, Nevada; Grenoble, France; and Wellington, New Zealand regions including multiple basins, detailed geotechnical maps, and attenuation. The package is freely available on the web. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  14. Assessment of Molecular Modeling & Simulation

    SciTech Connect

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  15. A review on recent advances in the numerical simulation for coalbed-methane-recovery process

    SciTech Connect

    Wei, X.R.; Wang, G.X.; Massarotto, P.; Golding, S.D.; Rudolph, V.

    2007-12-15

    The recent advances in numerical simulation for primary coalbed methane (CBM) recovery and enhanced coalbed-methane recovery (ECBMR) processes are reviewed, primarily focusing on the progress that has occurred since the late 1980s. Two major issues regarding the numerical modeling will be discussed in this review: first, multicomponent gas transport in in-situ bulk coal and, second, changes of coal properties during methane (CH{sub 4}) production. For the former issues, a detailed review of more recent advances in modeling gas and water transport within a coal matrix is presented. Further, various factors influencing gas diffusion through the coal matrix will be highlighted as well, such as pore structure, concentration and pressure, and water effects. An ongoing bottleneck for evaluating total mass transport rate is developing a reasonable representation of multiscale pore space that considers coal type and rank. Moreover, few efforts have been concerned with modeling water-flow behavior in the coal matrix and its effects on CH{sub 4} production and on the exchange of carbon dioxide (CO{sub 2}) and CH{sub 4}. As for the second issue, theoretical coupled fluid-flow and geomechanical models have been proposed to describe the evolution of pore structure during CH{sub 4} production, instead of traditional empirical equations. However, there is currently no effective coupled model for engineering applications. Finally, perspectives on developing suitable simulation models for CBM production and for predicting CO{sub 2}-sequestration ECBMR are suggested.

  16. A simulation study of crew performance in operating an advanced transport aircraft in an automated terminal area environment

    NASA Technical Reports Server (NTRS)

    Houck, J. A.

    1983-01-01

    A simulation study assessing crew performance in operating an advanced transport aircraft in an automated terminal area environment is described. The study required linking the Langley Advanced Transport Operating Systems Aft Flight Deck Simulator with the Terminal Area Air Traffic Model Simulation. This provided the realism of an air traffic control (ATC) environment, with audio controller instructions for the flight crews, and the capability of inserting a live aircraft into the terminal area model to interact with computer-generated aircraft. Crew performance using the advanced displays and two separate control systems (automatic and manual) in flying area navigation routes in the automated ATC environment was assessed. Although the crews did not perform as well using the manual control system, their performances were within acceptable operational limits with little increase in workload. The crews favored using the manual control system and felt they were more alert and aware of their environment when using it.

  17. Weapons Activities/ Advanced Simulation and Computing Campaign FY 2011 Congressional Budget

    E-print Network

    The Advanced Simulation and Computing (ASC) Campaign supports weapons assessment and certification requirements, including weapon codes, weapons science, and computing, in place of underground nuclear testing to determine weapon behavior. As such, ASC simulations are central to our national security.

  18. VISION: Verifiable Fuel Cycle Simulation Model

    SciTech Connect

    Jacob J. Jacobson; Abdellatif M. Yacout; Gretchen E. Matthern; Steven J. Piet; David E. Shropshire

    2009-04-01

    The nuclear fuel cycle is a very complex system that includes considerable dynamic complexity as well as detail complexity. In the nuclear power realm, there are experts and considerable research and development in nuclear fuel development, separations technology, reactor physics and waste management. What is lacking is an overall understanding of the entire nuclear fuel cycle and how the deployment of new fuel cycle technologies affects the overall performance of the fuel cycle. The Advanced Fuel Cycle Initiative’s systems analysis group is developing a dynamic simulation model, VISION, to capture the relationships, timing and delays in and among the fuel cycle components to help develop an understanding of how the overall fuel cycle works and can transition as technologies are changed. This paper is an overview of the philosophy and development strategy behind VISION. The paper includes some descriptions of the model and some examples of how to use VISION.

  19. VISION: Verifiable Fuel Cycle Simulation Model

    SciTech Connect

    Jacob Jacobson; A. M. Yacout; Gretchen Matthern; Steven Piet; David Shropshire; Tyler Schweitzer

    2010-11-01

    The nuclear fuel cycle consists of a set of complex components that work together in unison. In order to support the nuclear renaissance, it is necessary to understand the impacts of changes and timing of events in any part of the fuel cycle system. The Advanced Fuel Cycle Initiative’s systems analysis group is developing a dynamic simulation model, VISION, to capture the relationships, timing, and changes in and among the fuel cycle components to help develop an understanding of how the overall fuel cycle works. This paper is an overview of the philosophy and development strategy behind VISION. The paper includes some descriptions of the model components and some examples of how to use VISION.

  20. Sunspot Modeling: From Simplified Models to Radiative MHD Simulations

    NASA Astrophysics Data System (ADS)

    Rempel, Matthias; Schlichenmaier, Rolf

    2011-09-01

    We review our current understanding of sunspots from the scales of their fine structure to their large scale (global) structure, including the processes of their formation and decay. Recently, sunspot models have undergone a dramatic change. In the past, several aspects of sunspot structure have been addressed by static MHD models with parametrized energy transport. Models of sunspot fine structure have relied heavily on strong assumptions about flow and field geometry (e.g., flux-tubes, "gaps", convective rolls), which were motivated in part by the observed filamentary structure of penumbrae or the necessity of explaining the substantial energy transport required to maintain the penumbral brightness. However, none of these models could self-consistently explain all aspects of penumbral structure (energy transport, filamentation, Evershed flow). In recent years, 3D radiative MHD simulations have advanced dramatically to the point at which models of complete sunspots with sufficient resolution to capture sunspot fine structure are feasible. Here, overturning convection is the central element responsible for energy transport, for the filamentation leading to fine structure, and for the driving of strong outflows. On the larger scale these models are also in the process of addressing the subsurface structure of sunspots as well as sunspot formation. With this shift in modeling capabilities and the recent advances in high resolution observations, future research will be guided by comparing observation and theory.

  1. Simulating spin models on GPU

    E-print Network

    Weigel, Martin

    2011-01-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.
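
    The data-parallel structure that makes such simulations map well onto GPUs can be sketched on the CPU with a checkerboard (red-black) Metropolis update for the 2-D Ising model: sites of one sublattice do not interact with each other, so they can all be updated at once. The NumPy version below (lattice size and temperature are illustrative) mimics the per-site parallelism a GPU kernel would exploit:

      import numpy as np

      L, beta = 64, 0.4                       # lattice size, inverse temperature
      rng = np.random.default_rng(0)
      spins = rng.choice([-1, 1], size=(L, L))

      ii, jj = np.indices((L, L))
      masks = [(ii + jj) % 2 == c for c in (0, 1)]   # the two sublattices

      def sweep(spins):
          for mask in masks:
              # sum of the four nearest neighbours with periodic boundaries
              nb = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
                    np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
              dE = 2.0 * spins * nb           # energy cost of flipping each spin
              flip = mask & (rng.random((L, L)) < np.exp(-beta * dE))
              spins = np.where(flip, -spins, spins)
          return spins

      for _ in range(200):
          spins = sweep(spins)
      print("magnetization per spin:", spins.mean())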

  2. Simulating spin models on GPU

    NASA Astrophysics Data System (ADS)

    Weigel, Martin

    2011-09-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.

  3. Simulating spin models on GPU

    E-print Network

    Martin Weigel

    2011-06-07

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.

  4. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

    SciTech Connect

    Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

    2012-07-31

    This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual-loop chemical looping process models and dynamic simulation software tools; development and testing of several advanced control concepts and applications for chemical looping transport control; and investigation of several sensor concepts, with two feasible sensor candidates established and recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

  5. Simulation for supporting scale-up of a fluidized bed reactor for advanced water oxidation.

    PubMed

    Tisa, Farhana; Raman, Abdul Aziz Abdul; Daud, Wan Mohd Ashri Wan

    2014-01-01

    Simulation of fluidized bed reactor (FBR) was accomplished for treating wastewater using Fenton reaction, which is an advanced oxidation process (AOP). The simulation was performed to determine characteristics of FBR performance, concentration profile of the contaminants, and various prominent hydrodynamic properties (e.g., Reynolds number, velocity, and pressure) in the reactor. Simulation was implemented for 2.8 L working volume using hydrodynamic correlations, continuous equation, and simplified kinetic information for phenols degradation as a model. The simulation shows that, by using Fe(3+) and Fe(2+) mixtures as catalyst, TOC degradation up to 45% was achieved for contaminant range of 40-90 mg/L within 60 min. The concentration profiles and hydrodynamic characteristics were also generated. A subsequent scale-up study was also conducted using similitude method. The analysis shows that up to 10 L working volume, the models developed are applicable. The study proves that, using appropriate modeling and simulation, data can be predicted for designing and operating FBR for wastewater treatment. PMID:25309949
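
    For orientation, the reported 45% TOC removal within 60 min corresponds, under an assumed pseudo-first-order rate law (an assumption made here only for illustration; the paper uses its own simplified kinetics), to an apparent rate constant of roughly

      \mathrm{TOC}(t) = \mathrm{TOC}_0\, e^{-k_{\mathrm{app}} t}, \qquad k_{\mathrm{app}} = -\frac{\ln(1 - 0.45)}{60\ \mathrm{min}} \approx 1.0 \times 10^{-2}\ \mathrm{min}^{-1}.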

  6. Simulation for Supporting Scale-Up of a Fluidized Bed Reactor for Advanced Water Oxidation

    PubMed Central

    Abdul Raman, Abdul Aziz; Daud, Wan Mohd Ashri Wan

    2014-01-01

    Simulation of fluidized bed reactor (FBR) was accomplished for treating wastewater using Fenton reaction, which is an advanced oxidation process (AOP). The simulation was performed to determine characteristics of FBR performance, concentration profile of the contaminants, and various prominent hydrodynamic properties (e.g., Reynolds number, velocity, and pressure) in the reactor. Simulation was implemented for 2.8 L working volume using hydrodynamic correlations, continuous equation, and simplified kinetic information for phenols degradation as a model. The simulation shows that, by using Fe3+ and Fe2+ mixtures as catalyst, TOC degradation up to 45% was achieved for contaminant range of 40–90 mg/L within 60 min. The concentration profiles and hydrodynamic characteristics were also generated. A subsequent scale-up study was also conducted using similitude method. The analysis shows that up to 10 L working volume, the models developed are applicable. The study proves that, using appropriate modeling and simulation, data can be predicted for designing and operating FBR for wastewater treatment. PMID:25309949

  7. Rule-based simulation models

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Seraphine, Kathleen M.

    1991-01-01

    Procedural modeling systems, rule based modeling systems, and a method for converting a procedural model to a rule based model are described. Simulation models are used to represent real time engineering systems. A real time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling system languages are based on FORTRAN or some other procedural language. Therefore, they must be enhanced with a reaction capability. Rule based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule based system can be generated by a knowledge acquisition tool or a source level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule based system. Neural models can provide the high-capacity data manipulation required by the most complex real time models.
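
    A toy reactive "knowledge network" illustrating the idea of decomposing model equations into elementary calculations that re-fire whenever an input changes; the node design and the thrust/acceleration example are invented for illustration, not the paper's system:

      class Node:
          def __init__(self, func=None, *inputs):
              self.func, self.inputs, self.dependents = func, list(inputs), []
              for node in inputs:
                  node.dependents.append(self)
              self.value = None

          def set(self, value):               # pure input node
              self.value = value
              self._fire()

          def _fire(self):                    # propagate to dependent calculations
              for dep in self.dependents:
                  vals = [n.value for n in dep.inputs]
                  if any(v is None for v in vals):
                      continue                # wait until every input is known
                  dep.value = dep.func(*vals)
                  dep._fire()

      # model: thrust = mass_flow * exhaust_velocity; accel = thrust / mass
      mass_flow, v_e, mass = Node(), Node(), Node()
      thrust = Node(lambda m, v: m * v, mass_flow, v_e)
      accel = Node(lambda f, m: f / m, thrust, mass)

      mass.set(1000.0); v_e.set(2500.0); mass_flow.set(4.0)
      print("acceleration:", accel.value)     # 10.0, computed reactively
      mass_flow.set(8.0)                      # changing an input re-fires the rules
      print("acceleration:", accel.value)     # 20.0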

  8. A numerical investigation on the efficiency of range extending systems using Advanced Vehicle Simulator

    NASA Astrophysics Data System (ADS)

    Varnhagen, Scott; Same, Adam; Remillard, Jesse; Park, Jae Wan

    2011-03-01

    Series plug-in hybrid electric vehicles of varying engine configuration and battery capacity are modeled using Advanced Vehicle Simulator (ADVISOR). The performance of these vehicles is analyzed on the basis of energy consumption and greenhouse gas emissions on the tank-to-wheel and well-to-wheel paths. Both city and highway driving conditions are considered during the simulation. When simulated on the well-to-wheel path, it is shown that the range extender with a Wankel rotary engine consumes less energy and emits fewer greenhouse gases compared to the other systems with reciprocating engines during many driving cycles. The rotary engine has a higher power-to-weight ratio and lower noise, vibration and harshness compared to conventional reciprocating engines, although it performs less efficiently. The benefits of a Wankel engine make it an attractive option for use as a range extender in a plug-in hybrid electric vehicle.

  9. Measurement and modeling of advanced coal conversion processes

    SciTech Connect

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. ); Smoot, L.D.; Brewster, B.S. )

    1992-01-01

    The overall objective of this program is the development of predictive capability for the design, scale-up, simulation, control, and feedstock evaluation in advanced coal conversion devices. This technology is important to reduce the technical and economic risks inherent in utilizing coal, a feedstock whose variable and often unexpected behavior presents a significant challenge. This program will merge significant advances made at Advanced Fuel Research, Inc. (AFR) in measuring and quantitatively describing the mechanisms in coal conversion behavior, with technology being developed at Brigham Young University (BYU) in comprehensive computer codes for mechanistic modeling of entrained-bed gasification. Additional capabilities in predicting pollutant formation will be implemented and the technology will be expanded to fixed-bed reactors. The foundation to describe coal-specific conversion behavior is AFR's Functional Group (FG) and Devolatilization, Vaporization and Crosslinking (DVC) models, developed under previous and on-going METC sponsored programs. These models have demonstrated the capability to describe the time dependent evolution of individual gas species, and the amount and characteristics of tar and char. The combined FG-DVC model will be integrated with BYU's comprehensive two-dimensional reactor model, PCGC-2, which is currently the most widely used reactor simulation for combustion or gasification. The program includes: (i) validation of the submodels by comparison with laboratory data obtained in this program, (ii) extensive validation of the modified comprehensive code by comparison of predicted results with data from bench-scale and process scale investigations of gasification, mild gasification and combustion of coal or coal-derived products in heat engines, and (iii) development of well-documented, user-friendly software applicable to a "workstation" environment.

  10. OXYGEN UTILIZATION IN ACTIVATED SLUDGE PLANTS: SIMULATION AND MODEL CALIBRATION

    EPA Science Inventory

    The objective of the research described in the report is to apply recent advances in activated sludge process modeling to the simulation of oxygen utilization rates in full scale activated sludge treatment plants. This is accomplished by calibrating the International Association ...

  11. Advances in aerothermal modeling for TMT

    NASA Astrophysics Data System (ADS)

    Vogiatzis, Konstantinos

    2008-07-01

    The performance requirements of the Thirty Meter Telescope (TMT) dictate, among others, a thorough understanding of the flow field inside and around the observatory. Mirror and dome seeing as well as dynamic wind loading on the optics, telescope structure and enclosure constitute significant sources of image degradation. A summary of the current status of Computational Fluid Dynamics (CFD) simulations for TMT is presented, with special attention given to the choice of thermal boundary conditions. Detailed simulations of the mirror support assemblies determine the direction of heat flow from important heat sources and provide feedback to the design. They also provide estimates of the heat transfer coefficients for the solid thermal models. A transient radiation model has also been developed for the enclosure and telescope surfaces in order to estimate the heat flux exchange with the air volume. It also provides estimates of the effective emissivity for the solid thermal models. Finally, a complete model of the observatory on a candidate summit is used to calculate air velocity, pressure and temperature for a matrix of given telescope orientations and enclosure configurations. Calculated wind velocity spectra above M1 and around M2 as well as the wind force on the enclosure are used as inputs in the TMT integrated dynamic model. The temperature and flux output of the aforementioned thermal models are used as input surface boundary conditions in the CFD model. Generated records of temperature variations inside the air volume of the optical paths are fed into the TMT thermal seeing model.

  12. Development of Fuzzy Logic and Neural Network Control and Advanced Emissions Modeling for Parallel Hybrid Vehicles

    SciTech Connect

    Rajagopalan, A.; Washington, G.; Rizzoni, G.; Guezennec, Y.

    2003-12-01

    This report describes the development of new control strategies and models for Hybrid Electric Vehicles (HEV) by the Ohio State University. The report presents results from models created in NREL's ADvanced VehIcle SimulatOR (ADVISOR 3.2), and results of a scalable IC engine model, based on the Willans line technique, implemented in ADVISOR 3.2.

  13. Advanced Small Modular Reactor Economics Model Development

    SciTech Connect

    Harrison, Thomas J.

    2014-10-01

    The US Department of Energy Office of Nuclear Energy's Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels, and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation and controls and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, this report describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo-based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo-based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses or the use of add-ons such as Crystal Ball. An alternative method uses propagation of error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires the use of simplifying assumptions. These assumptions do not necessarily call the analytical results into question. In fact, the analysis shows that the propagation of error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses generally quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital cost uncertainty is the main driver, can be shown to be an accurate generalization for the current state of reactor cost analysis. However, the detailed analysis on a component-by-component basis helps to demonstrate which components would benefit most from research and development to decrease the uncertainty, as well as which components would benefit from research and development to decrease the absolute cost.
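
    The uncertainty discussion can be made concrete with a small comparison of the two approaches. The component names and cost figures below are hypothetical and are not taken from the ORNL model; the sketch only shows that, for a sum of independent cost estimates, linear propagation of error and Monte Carlo sampling give essentially the same total uncertainty.

```python
# Sketch: Monte Carlo sampling vs. propagation of error for the uncertainty of
# a total cost assembled from independent component estimates. All names and
# numbers are hypothetical, not data from the ORNL economic model.
import numpy as np

rng = np.random.default_rng(0)

# (mean, standard deviation) of each component cost, $M
components = {
    "reactor_module":   (600.0, 90.0),
    "turbine_island":   (250.0, 30.0),
    "balance_of_plant": (180.0, 25.0),
    "owner_costs":      (120.0, 20.0),
}
means  = np.array([m for m, s in components.values()])
sigmas = np.array([s for m, s in components.values()])

# Propagation of error: variances of independent terms add.
sigma_prop = np.sqrt(np.sum(sigmas ** 2))

# Monte Carlo: sample each component (normal, for simplicity) and sum.
totals = rng.normal(means, sigmas, size=(100_000, len(means))).sum(axis=1)

print(f"total mean cost         : {means.sum():7.1f} $M")
print(f"sigma, error propagation: {sigma_prop:7.1f} $M")
print(f"sigma, Monte Carlo      : {totals.std():7.1f} $M")  # nearly identical
```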

  14. Advanced thermal energy management: A thermal test bed and heat pipe simulation

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.

    1986-01-01

    Work initiated on a common-module thermal test simulation was continued, and a second project on heat pipe simulation was begun. The test bed, constructed from surplus Skylab equipment, was modeled and solved for various thermal load and flow conditions. Low thermal load caused the radiator fluid, Coolanol 25, to thicken as its temperature dropped; this problem was avoided by using a regenerator heat exchanger. Other possible solutions modeled include a radiator heater and shunting heat from the central thermal bus to the radiator. Also, module air temperature can become excessive with high avionics load. A second project concerning advanced heat pipe concepts was initiated. A program was written which calculates fluid physical properties, liquid and vapor pressure in the evaporator and condenser, fluid flow rates, and thermal flux. The program is directed to evaluating newer heat pipe wicks and geometries, especially water in an artery surrounded by six vapor channels. Effects of temperature, groove and slot dimensions, and wick properties are reported.
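
    As a rough illustration of the kind of calculation such a heat pipe program performs, the sketch below estimates the capillary-limited heat transport of a water heat pipe from the textbook balance between capillary pressure and the viscous liquid pressure drop in the wick. It neglects vapor and gravity pressure drops, and all property and geometry values are illustrative rather than taken from the NASA study.

```python
# Rough capillary-limit estimate for a water heat pipe: the wick can supply
# liquid only while the capillary pressure 2*sigma/r_c exceeds the Darcy
# pressure drop of the returning liquid. Vapor and gravity losses neglected;
# properties are approximate (water near 100 C), geometry is made up.

sigma = 0.059        # surface tension, N/m
rho_l = 958.0        # liquid density, kg/m^3
mu_l  = 2.8e-4       # liquid viscosity, Pa*s
h_fg  = 2.257e6      # latent heat of vaporization, J/kg

r_c   = 50e-6        # effective capillary (pore) radius, m
K     = 1.5e-10      # wick permeability, m^2
A_w   = 1.0e-5       # wick cross-sectional area, m^2
L_eff = 0.30         # effective liquid transport length, m

dp_cap = 2.0 * sigma / r_c                          # max capillary pressure, Pa
m_dot  = dp_cap * rho_l * K * A_w / (mu_l * L_eff)  # liquid flow at the limit, kg/s
q_max  = m_dot * h_fg                               # capillary-limited heat transport, W

print(f"capillary pressure : {dp_cap:8.1f} Pa")
print(f"max heat transport : {q_max:8.1f} W")
```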

  15. A Flight Software Development and Simulation Framework for Advanced Space Systems

    E-print Network

    A Flight Software Development and Simulation Framework for Advanced Space Systems, by John P. Enright. Thesis submitted to the Department ... for the degree of Doctor of Philosophy at the Massachusetts Institute of Technology.

  16. Electricity Generation Cost Simulation Model

    Energy Science and Technology Software Center (ESTSC)

    2003-04-25

    The Electricity Generation Cost Simulation Model (GENSIM) is a user-friendly, high-level dynamic simulation model that calculates electricity production costs for a variety of electricity generation technologies, including: pulverized coal, gas combustion turbine, gas combined cycle, nuclear, solar (PV and thermal), and wind. The model allows the user to quickly conduct sensitivity analysis on key variables, including: capital, O&M, and fuel costs; interest rates; construction time; heat rates; and capacity factors. The model also includes consideration of a wide range of externality costs and pollution control options for carbon dioxide, nitrogen oxides, sulfur dioxide, and mercury. Two different data sets are included in the model; one from the U.S. Department of Energy (DOE) and the other from Platt's Research Group. Likely users of this model include executives and staff in the Congress, the Administration and private industry (power plant builders, industrial electricity users and electric utilities). The model seeks to improve understanding of the economic viability of various generating technologies and their emission trade-offs. The base case results using the DOE data indicate that, in the absence of externality costs or renewable tax credits, pulverized coal and gas combined cycle plants are the least cost alternatives at 3.7 and 3.5 cents/kWh, respectively. A complete sensitivity analysis on fuel, capital, and construction time shows that these results for coal and gas are much more sensitive to assumptions about fuel prices than they are to capital costs or construction times. The results also show that making nuclear competitive with coal or gas requires significant reductions in capital costs, to the $1000/kW level, if no other changes are made. For renewables, the results indicate that wind is now competitive with the nuclear option and is only competitive with coal and gas for grid-connected applications if one includes the federal production tax credit of 1.8 cents/kWh.
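
    The sketch below shows the style of busbar-cost calculation that screening models of this kind perform, using the standard levelized-cost decomposition into capital, O&M, and fuel components. The input values are illustrative only; they are not the DOE or Platts data sets referenced in this record.

```python
# Minimal levelized-cost-of-electricity (busbar cost) calculation in the style
# of a screening model; all input numbers are illustrative, not GENSIM data.

def lcoe_cents_per_kwh(capital_per_kw, fixed_charge_rate, capacity_factor,
                       fixed_om_per_kw_yr, variable_om_cents,
                       heat_rate_btu_per_kwh, fuel_per_mmbtu):
    """Return generation cost in cents/kWh for one technology."""
    kwh_per_kw_yr = 8760.0 * capacity_factor
    capital = capital_per_kw * fixed_charge_rate / kwh_per_kw_yr * 100.0  # cents/kWh
    fixed_om = fixed_om_per_kw_yr / kwh_per_kw_yr * 100.0                 # cents/kWh
    fuel = heat_rate_btu_per_kwh / 1.0e6 * fuel_per_mmbtu * 100.0         # cents/kWh
    return capital + fixed_om + variable_om_cents + fuel

# Illustrative gas combined-cycle plant: base fuel price vs. doubled fuel price
base      = lcoe_cents_per_kwh(550.0, 0.11, 0.85, 12.0, 0.2, 7000.0, 3.5)
high_fuel = lcoe_cents_per_kwh(550.0, 0.11, 0.85, 12.0, 0.2, 7000.0, 7.0)
print(f"base fuel price : {base:.2f} cents/kWh")
print(f"doubled fuel    : {high_fuel:.2f} cents/kWh")  # fuel cost dominates the change
```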

  17. Thermochemical modelling of advanced CANDU reactor fuel

    NASA Astrophysics Data System (ADS)

    Corcoran, Emily Catherine

    2009-04-01

    With an aging fleet of nuclear generating facilities, the imperative to limit the use of non-renewable fossil fuels and the inevitable need for additional electricity to power Canada's economy, a renaissance in the use of nuclear technology in Canada is at hand. The experience and knowledge of over 40 years of CANDU research, development and operation in Ontario and elsewhere has been applied to a new generation of CANDU, the Advanced CANDU Reactor (ACR). Improved fuel design allows for an extended burnup, which is a significant improvement, enhancing the safety and the economics of the ACR. The use of a Burnable Neutron Absorber (BNA) material and Low Enriched Uranium (LEU) fuel has created a need to understand better these novel materials and fuel types. This thesis documents work to advance the scientific and technological knowledge of the ACR fuel design with respect to thermodynamic phase stability and fuel oxidation modelling. For the BNA material, a new BNA model is created based on the fundamental first principles of Gibbs energy minimization applied to material phase stability. For LEU fuel, the methodology used for the BNA model is applied to the oxidation of irradiated fuel. The pertinent knowledge base for uranium, oxygen and the major fission products is reviewed, updated and integrated to create a model that is applicable to current and future CANDU fuel designs. As part of this thesis, X-Ray Diffraction (XRD) and Coulometric Titration (CT) experiments are compared to the BNA and LEU models, respectively. From the analysis of the CT results, a number of improvements are proposed to enhance the LEU model and provide confidence in its application to ACR fuel. A number of applications for the potential use of these models are proposed and discussed. Keywords: CANDU Fuel, Gibbs Energy Minimization, Low Enriched Uranium (LEU) Fuel, Burnable Neutron Absorber (BNA) Material, Coulometric Titration, X-Ray Diffraction
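
    As a generic illustration of the Gibbs energy minimization principle underlying such models, the sketch below finds the equilibrium amounts of two species in an ideal solution by minimizing the total Gibbs energy subject to a mole balance. The species and Gibbs energy values are invented for the example; this is not the BNA or LEU model developed in the thesis.

```python
# Generic Gibbs energy minimization for an ideal two-species mixture:
# minimize G = sum_i n_i*(g0_i + R*T*ln x_i) subject to a fixed total mole
# balance. Species and g0 values are illustrative only.
import numpy as np
from scipy.optimize import minimize

R, T = 8.314, 1000.0                      # J/(mol K), K
g0 = np.array([-50_000.0, -40_000.0])     # standard molar Gibbs energies, J/mol

def gibbs(n):
    n = np.clip(n, 1e-12, None)           # keep the logarithms defined
    x = n / n.sum()                       # mole fractions
    return float(np.sum(n * (g0 + R * T * np.log(x))))

constraints = [{"type": "eq", "fun": lambda n: n.sum() - 1.0}]  # mole balance
result = minimize(gibbs, x0=np.array([0.5, 0.5]), method="SLSQP",
                  bounds=[(0.0, None)] * 2, constraints=constraints)
print("equilibrium amounts (mol):", np.round(result.x, 3))
```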

  18. Using an advanced vehicle simulator (ADVISOR) to guide hybrid vehicle propulsion system development

    SciTech Connect

    Wipke, K.B.; Cuddy, M.R.

    1996-08-01

    An advanced vehicle simulator model called ADVISOR has been developed at the National Renewable Energy Laboratory to allow system-level analysis and trade-off studies of advanced vehicles. Because of ADVISOR's fast execution speed and the open programming environment of MATLAB/Simulink, the simulator is ideally suited for doing parametric studies to map out the design space of potential high fuel economy vehicles (3X) consistent with the goals of the Partnership for a New Generation of Vehicles (PNGV). Five separate vehicle configurations have been modeled including 3 lightweight vehicles (parallel, series, and conventional drivetrains) along with 2 vehicles with 1996 vehicle weights (parallel and conventional drivetrains). The sensitivity of each vehicle's fuel economy to critical vehicle parameters is then examined and regions of interest for the vehicles mapped out through parametric studies. Using the simulation results for these vehicles, the effect of hybridization is isolated and analyzed and the trade-offs between series and parallel designs are illustrated.

  19. An efficient time advancing strategy for energy-preserving simulations

    NASA Astrophysics Data System (ADS)

    Capuano, F.; Coppola, G.; de Luca, L.

    2015-08-01

    Energy-conserving numerical methods are widely employed within the broad area of convection-dominated systems. Semi-discrete conservation of energy is usually obtained by adopting the so-called skew-symmetric splitting of the non-linear convective term, defined as a suitable average of the divergence and advective forms. Although generally allowing global conservation of kinetic energy, it has the drawback of being roughly twice as expensive as standard divergence or advective forms alone. In this paper, a general theoretical framework has been developed to derive an efficient time-advancement strategy in the context of explicit Runge-Kutta schemes. The novel technique retains the conservation properties of skew-symmetric-based discretizations at a reduced computational cost. It is found that optimal energy conservation can be achieved by properly constructed Runge-Kutta methods in which only divergence and advective forms for the convective term are used. As a consequence, a considerable improvement in computational efficiency over existing practices is achieved. The overall procedure has proved to be able to produce new schemes with a specified order of accuracy on both solution and energy. The effectiveness of the method as well as the asymptotic behavior of the schemes is demonstrated by numerical simulation of Burgers' equation.
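
    The role of the convective-term formulation can be seen in a few lines for the one-dimensional Burgers equation: with periodic central differences, a weighted combination of the divergence and advective forms conserves the discrete kinetic energy exactly, while either form alone does not (for the quadratic Burgers nonlinearity the energy-conserving weights differ from the half-and-half average used for Navier-Stokes). This toy demonstration only illustrates the property the paper exploits; it does not reproduce the paper's Runge-Kutta construction.

```python
# Discrete energy balance of three forms of the Burgers convective term on a
# periodic grid with central differences. Only the "split" form makes the
# semi-discrete kinetic energy derivative vanish to machine round-off.
import numpy as np

N = 64
x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
h = x[1] - x[0]
u = 1.0 + np.sin(x) + 0.5 * np.sin(2.0 * x)

def ddx(f):
    """Periodic second-order central difference."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * h)

rhs_div   = -0.5 * ddx(u * u)                  # divergence form
rhs_adv   = -u * ddx(u)                        # advective form
rhs_split = -(ddx(u * u) + u * ddx(u)) / 3.0   # energy-conserving split for Burgers

for name, rhs in [("divergence", rhs_div), ("advective", rhs_adv), ("split", rhs_split)]:
    dE_dt = np.sum(u * rhs) * h                # d/dt of E = 0.5*sum(u**2)*h
    print(f"{name:10s}  dE/dt = {dE_dt: .3e}")
# Expected: nonzero values for the first two forms, round-off (~1e-16) for the split form.
```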

  20. Mission simulation as an approach to develop requirements for automation in advanced life support systems

    NASA Astrophysics Data System (ADS)

    Erickson, J. D.; Eckelkamp, R. E.; Barta, D. J.; Dragg, J.

    This paper examines mission simulation as an approach to develop requirements for automation and robotics for Advanced Life Support Systems (ALSS). The focus is on requirements and applications for command and control, control and monitoring, situation assessment and response, diagnosis and recovery, adaptive planning and scheduling, and other automation applications in addition to mechanized equipment and robotics applications to reduce the excessive human labor requirements to operate and maintain an ALSS. Based on principles of systems engineering, an approach is proposed to assess requirements for automation and robotics using mission simulation tools. First, the story of a simulated mission is defined in terms of processes with attendant types of resources needed, including options for use of automation and robotic systems. Next, systems dynamics models are used in simulation to reveal the implications for selected resource allocation schemes in terms of resources required to complete operational tasks. The simulations not only help establish ALSS design criteria, but also may offer guidance to ALSS research efforts by identifying gaps in knowledge about procedures and/or biophysical processes. Simulations of a planned one-year mission with 4 crewmembers in a Human Rated Test Facility are presented as an approach to evaluation of mission feasibility and definition of automation and robotics requirements.

  1. SEMI Modeling and Simulation Roadmap

    SciTech Connect

    Hermina, W.L.

    2000-10-02

    With the exponential growth in the power of computing hardware and software, modeling and simulation is becoming a key enabler for the rapid design of reliable Microsystems. One vision of the future microsystem design process would include the following primary software capabilities: (1) The development of 3D part design, through standard CAD packages, with automatic design rule checks that guarantee the manufacturability and performance of the microsystem. (2) Automatic mesh generation, for 3D parts as manufactured, that permits computational simulation of the process steps, and the performance and reliability analysis for the final microsystem. (3) Computer generated 2D layouts for process steps that utilize detailed process models to generate the layout and process parameter recipe required to achieve the desired 3D part. (4) Science-based computational tools that can simulate the process physics, and the coupled thermal, fluid, structural, solid mechanics, electromagnetic and material response governing the performance and reliability of the microsystem. (5) Visualization software that permits the rapid visualization of 3D parts including cross-sectional maps, performance and reliability analysis results, and process simulation results. In addition to these desired software capabilities, a desired computing infrastructure would include massively parallel computers that enable rapid high-fidelity analysis, coupled with networked compute servers that permit computing at a distance. We now discuss the individual computational components that are required to achieve this vision. There are three primary areas of focus: design capabilities, science-based capabilities and computing infrastructure. Within each of these areas, there are several key capability requirements.

  2. RIGOROUS MODELING AND SIMULATION OF MECHATRONIC SYSTEMS

    E-print Network

    Taylor, James H.

    RIGOROUS MODELING AND SIMULATION OF MECHATRONIC SYSTEMS. James H. Taylor, Professor Emeritus, Systems ..., Canada E3B 5A3 (e-mail: jtaylor@unb.ca). Key words: mechatronic systems, modeling, simulation, numerical ... Abstract: A brief overview of modeling and simulation (m & s) ... mentioned above for mechatronic systems.

  3. Advances in the simulation and automated measurement of well-sorted granular material: 1. Simulation

    USGS Publications Warehouse

    Daniel Buscombe; Rubin, David M.

    2012-01-01

    In this, the first of a pair of papers which address the simulation and automated measurement of well-sorted natural granular material, a method is presented for simulation of two-phase (solid, void) assemblages of discrete non-cohesive particles. The purpose is to have a flexible, yet computationally and theoretically simple, suite of tools with well constrained and well known statistical properties, in order to simulate realistic granular material as a discrete element model with realistic size and shape distributions, for a variety of purposes. The stochastic modeling framework is based on three-dimensional tessellations with variable degrees of order in particle-packing arrangement. Examples of sediments with a variety of particle size distributions and spatial variability in grain size are presented. The relationship between particle shape and porosity conforms to published data. The immediate application is testing new algorithms for automated measurements of particle properties (mean and standard deviation of particle sizes, and apparent porosity) from images of natural sediment, as detailed in the second of this pair of papers. The model could also prove useful for simulating specific depositional structures found in natural sediments, the result of physical alterations to packing and grain fabric, using discrete particle flow models. While the principal focus here is on naturally occurring sediment and sedimentary rock, the methods presented might also be useful for simulations of similar granular or cellular material encountered in engineering, industrial and life sciences.

  4. A Social Diffusion Model with an Application on Election Simulation

    PubMed Central

    Wang, Fu-Min; Hung, San-Chuan; Kung, Perng-Hwa; Lin, Shou-De

    2014-01-01

    Issues about opinion diffusion have been studied for decades. To date, however, there has been no empirical approach to modeling the interflow and formation of crowd opinion in elections, for two reasons. First, unlike the spread of information or flu, individuals have their intrinsic attitudes toward election candidates in advance. Second, opinions are generally assumed to be single values in most diffusion models; in this case, however, an opinion should represent preference toward multiple candidates. Existing models thus may not intuitively interpret such a scenario. This work designs a diffusion model capable of managing the aforementioned scenario. To demonstrate the usefulness of our model, we simulate the diffusion on a network built from a publicly available bibliography dataset. We compare the proposed model with other well-known models such as independent cascade. It turns out that our model consistently outperforms the other models. We additionally investigate electoral issues with our model simulator. PMID:24995351
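
    For reference, the independent cascade baseline mentioned above can be simulated in a few lines; the toy graph and activation probabilities below are invented for illustration and have no connection to the bibliography network used in the paper.

```python
# Toy independent-cascade simulation: each newly activated node gets a single
# chance to activate each of its neighbors. Graph and probabilities are made up.
import random

graph = {                      # node -> list of (neighbor, activation probability)
    "a": [("b", 0.6), ("c", 0.3)],
    "b": [("d", 0.5)],
    "c": [("d", 0.4), ("e", 0.7)],
    "d": [("e", 0.2)],
    "e": [],
}

def independent_cascade(graph, seeds, rng):
    """Return the set of nodes activated in one cascade realization."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for node in frontier:
            for neighbor, p in graph[node]:
                if neighbor not in active and rng.random() < p:
                    active.add(neighbor)
                    nxt.append(neighbor)
        frontier = nxt
    return active

rng = random.Random(42)
sizes = [len(independent_cascade(graph, {"a"}, rng)) for _ in range(10_000)]
print("expected cascade size from seed 'a':", sum(sizes) / len(sizes))
```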

  5. Simulated annealing model of acupuncture

    NASA Astrophysics Data System (ADS)

    Shang, Charles; Szu, Harold

    2015-05-01

    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can cause perturbation of a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) compared to perturbation at non-singular points (placebo control points). This difference diminishes as the number of perturbed points increases due to the wider distribution of the limited self-organizing activity. This model explains the following facts from systematic reviews of acupuncture trials: 1. Properly chosen single acupoint treatment for a certain disorder can lead to highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old, frail and have multiple disorders at the same time, as the number of local optima or comorbidities increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated to the disease chronicity, severity and patient's age. This is the first biological-physical model of acupuncture which can predict and guide clinical acupuncture research.
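
    The annealing analogy invoked by the model is the standard optimization procedure sketched below: candidate perturbations are always accepted when they improve the objective and are accepted with a Boltzmann probability otherwise, so a higher starting "temperature" corresponds to stronger perturbation and wider exploration. The objective function and cooling schedule are arbitrary illustrations, not part of the acupuncture model itself.

```python
# Generic simulated annealing on a multimodal 1-D objective. A larger starting
# temperature lets the search accept more uphill moves and escape local minima.
import math
import random

def objective(x):
    return 0.1 * x * x + math.sin(3.0 * x) + 1.5 * math.cos(5.0 * x)

def simulated_annealing(t_start, t_end=1e-3, cooling=0.995, seed=0):
    rng = random.Random(seed)
    x = rng.uniform(-10.0, 10.0)
    best_x, best_f = x, objective(x)
    t = t_start
    while t > t_end:
        candidate = x + rng.gauss(0.0, 1.0)
        delta = objective(candidate) - objective(x)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if delta < 0.0 or rng.random() < math.exp(-delta / t):
            x = candidate
            if objective(x) < best_f:
                best_x, best_f = x, objective(x)
        t *= cooling
    return best_x, best_f

for t0 in (0.1, 5.0):
    x, f = simulated_annealing(t0)
    print(f"T0 = {t0:4.1f}  ->  best f = {f:.3f} at x = {x:.3f}")
```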

  6. Advanced co-simulation for computer-aided process design and optimization of fossil energy systems with carbon capture

    SciTech Connect

    Zitney, S.

    2009-01-01

    In this paper, we describe recent progress toward developing an Advanced Process Engineering Co-Simulator (APECS) for use in computer-aided design and optimization of fossil energy systems with carbon capture. The APECS system combines process simulation with multiphysics-based equipment simulations, such as those based on computational fluid dynamics. These co-simulation capabilities enable design engineers to optimize overall process performance with respect to complex thermal and fluid flow phenomena arising in key plant equipment items. This paper also highlights ongoing co-simulation R&D activities in areas such as reduced order modeling, knowledge management, stochastic analysis and optimization, and virtual plant co-simulation. Continued progress in co-simulation technology, through improved integration, solution, deployment, and analysis, will have profound positive impacts on the design and optimization of high-efficiency, near-zero emission fossil energy systems.

  7. Operations planning simulation: Model study

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The use of simulation modeling for the identification of system sensitivities to internal and external forces and variables is discussed. The technique provides a means of exploring alternate system procedures and processes, so that these alternatives may be considered on a mutually comparative basis, permitting the selection of a mode or modes of operation which have potential advantages to the system user and the operator. These advantages are measured in terms of system efficiency: (1) the ability to meet specific schedules for operations, mission or mission readiness requirements, or performance standards, and (2) the ability to accomplish the objectives within cost-effective limits.

  8. Crashworthiness analysis using advanced material models in DYNA3D

    SciTech Connect

    Logan, R.W.; Burger, M.J.; McMichael, L.D.; Parkinson, R.D.

    1993-10-22

    As part of an electric vehicle consortium, LLNL and Kaiser Aluminum are conducting experimental and numerical studies on crashworthy aluminum spaceframe designs. They have jointly explored the effect of heat treat on crush behavior and duplicated the experimental behavior with finite-element simulations. The major technical contributions to the state of the art in numerical simulation arise from the development and use of advanced material model descriptions for LLNL's DYNA3D code. Constitutive model enhancements in both flow and failure have been employed for conventional materials such as low-carbon steels, and also for lighter weight materials such as aluminum and fiber composites being considered for future vehicles. The constitutive model enhancements are developed as extensions from LLNL's work in anisotropic flow and multiaxial failure modeling. Analysis quality as a function of level of simplification of material behavior and mesh is explored, as well as the penalty in computation cost that must be paid for using more complex models and meshes. The lightweight material modeling technology is being used at the vehicle component level to explore the safety implications of small neighborhood electric vehicles manufactured almost exclusively from these materials.

  9. A National Strategy for Advancing Climate Modeling

    SciTech Connect

    Dunlea, Edward; Elfring, Chris

    2012-12-04

    Climate models are the foundation for understanding and projecting climate and climate-related changes and are thus critical tools for supporting climate-related decision making. This study developed a holistic strategy for improving the nation's capability to accurately simulate climate and related Earth system changes on decadal to centennial timescales. The committee's report is a high level analysis, providing a strategic framework to guide progress in the nation's climate modeling enterprise over the next 10-20 years. This study was supported by DOE, NSF, NASA, NOAA, and the intelligence community.

  10. Detailed simulation of morphodynamics: 1. Hydrodynamic model

    NASA Astrophysics Data System (ADS)

    Nabi, M.; de Vriend, H. J.; Mosselman, E.; Sloff, C. J.; Shimizu, Y.

    2012-12-01

    We present a three-dimensional high-resolution hydrodynamic model for unsteady incompressible flow over an evolving bed topography. This is achieved by using a multilevel Cartesian grid technique that allows the grid to be refined in high-gradient regions and in the vicinity of the river bed. The grid can be locally refined and adapted to the bed geometry, managing the Cartesian grid cells and faces using a hierarchical tree data approach. A ghost-cell immersed-boundary technique is applied to cells intersecting the bed topography. The governing equations have been discretized using a finite-volume method on a staggered grid, conserving second-order accuracy in time and space. The solution advances in time using the fractional step approach. Large-eddy simulation is used as turbulence closure. We validate the model against several experiments and other results from literature. Model results for Stokes flow around a cylinder in the vicinity of a moving wall agree well with Wannier's analytical solution. At higher Reynolds numbers, computed trailing bubble length, separation angle, and drag coefficient compare favorably with experimental and previous computational results. Results for the flow over two- and three-dimensional dunes agree well with published data, including a fair reproduction of recirculation zones, horse-shoe structures, and boiling effects. This shows that the model is suitable for being used as a hydrodynamic submodel in the high-resolution modeling of sediment transport and formation and evolution of subaqueous ripples and dunes.

  11. Advanced practice nursing and conceptual models of nursing.

    PubMed

    Fawcett, Jacqueline; Newman, Diana M L; McAllister, Margaret

    2004-04-01

    This column focuses on advanced practice nursing. A definition and central competency of advanced practice are given and four roles assumed by advanced practice nurses are identified. Questions related primarily to the advanced practice role of nurse practitioner are raised. Two nurse scholars who teach and practice discuss their experiences as advanced practice nurses, with an emphasis on the importance of using a conceptual model of nursing as a guide for their practice. PMID:15090089

  12. AFDM: An Advanced Fluid-Dynamics Model

    SciTech Connect

    Wilhelm, D.

    1990-09-01

    This volume describes the Advanced Fluid-Dynamics Model (AFDM) for topologies, flow regimes, and interfacial areas. The objective of these models is to provide values for the interfacial areas between all components existing in a computational cell. The interfacial areas are then used to evaluate the mass, energy, and momentum transfer between the components. A new approach has been undertaken in the development of a model to convect the interfacial areas of the discontinuous velocity fields in the three-velocity-field environment of AFDM. These interfacial areas are called convectible surface areas. The continuous and discontinuous components are chosen using volume fraction and levitation criteria. This establishes so-called topologies for which the convectible surface areas can be determined. These areas are functions of space and time. Solid particulates that are limited to being discontinuous within the bulk fluid are assumed to have a constant size. The convectible surface areas are subdivided to model contacts between two discontinuous components or discontinuous components and the structure. The models have been written for the flow inside of large pools. Therefore, the structure is tracked only as a boundary to the fluid volume without having a direct influence on velocity or volume fraction distribution by means of flow regimes or boundary layer models. 17 refs., 7 tabs., 18 figs.

  13. Advanced Concepts for Underwater Acoustic Channel Modeling

    NASA Astrophysics Data System (ADS)

    Etter, P. C.; Haas, C. H.; Ramani, D. V.

    2014-12-01

    This paper examines nearshore underwater-acoustic channel modeling concepts and compares channel-state information requirements against existing modeling capabilities. This process defines a subset of candidate acoustic models suitable for simulating signal propagation in underwater communications. Underwater-acoustic communications find many practical applications in coastal oceanography, and networking is the enabling technology for these applications. Such networks can be formed by establishing two-way acoustic links between autonomous underwater vehicles and moored oceanographic sensors. These networks can be connected to a surface unit for further data transfer to ships, satellites, or shore stations via a radio-frequency link. This configuration establishes an interactive environment in which researchers can extract real-time data from multiple, but distant, underwater instruments. After evaluating the obtained data, control messages can be sent back to individual instruments to adapt the networks to changing situations. Underwater networks can also be used to increase the operating ranges of autonomous underwater vehicles by hopping the control and data messages through networks that cover large areas. A model of the ocean medium between acoustic sources and receivers is called a channel model. In an oceanic channel, characteristics of the acoustic signals change as they travel from transmitters to receivers. These characteristics depend upon the acoustic frequency, the distances between sources and receivers, the paths followed by the signals, and the prevailing ocean environment in the vicinity of the paths. Properties of the received signals can be derived from those of the transmitted signals using these channel models. This study concludes that ray-theory models are best suited to the simulation of acoustic signal propagation in oceanic channels and identifies 33 such models that are eligible candidates.

  14. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics model of the ARED as part of its suite of biomechanics models for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV and C) assessment of the analyses of the model in accordance with NASA-STD-7009 'Standards for Models and Simulations'.

  15. Modeling of Accelerator Components for Advanced Radiography

    NASA Astrophysics Data System (ADS)

    Poole, B. R.

    1998-04-01

    Advanced flash radiography machines require multiple temporal pulses over several lines of sight. A single long pulse linear induction accelerator is used to produce an electron beam pulse that is injected into a precision beam kicker and septum to produce the required pulse sequence. For high current relativistic electron beams the effect of beam induced steering due to the associated wakefields needs to be understood to temporally control the transport of the beam through the kicker and septum regions. Detailed 3-D time domain electromagnetic modeling of the beam kicker and septum are used to determine the dipole and quadrupole wake impedances associated with these components. An analytic model of the beam induced fields in the beam kicker are used to determine the self-consistent beam induced steering. Comparisons with experiments will be provided.

  16. Prospects for Advanced RF Theory and Modeling

    SciTech Connect

    Batchelor, D.B.

    1999-04-12

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need for RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed.

  17. Prospects for advanced RF theory and modeling

    NASA Astrophysics Data System (ADS)

    Batchelor, D. B.

    1999-09-01

    This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need for RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed.

  18. A modular BLSS simulation model

    NASA Technical Reports Server (NTRS)

    Rummel, John D.; Volk, Tyler

    1987-01-01

    A bioregenerative life support system (BLSS) for extraterrestrial use will be faced with coordination problems more acute than those in any ecosystem found on Earth. A related problem in BLSS design is providing an interface between the various life support processors, one that will allow for their coordination while still allowing for system expansion. A modular model is presented of a BLSS that interfaces system processors only with the material storage reservoirs, allowing those reservoirs to act as the principal buffers in the system and thus minimizing difficulties with processor coordination. The modular nature of the model allows independent development of the detailed submodels that exist within the model framework. Using this model, BLSS dynamics were investigated under normal conditions and under various failure modes. Partial and complete failures of various components, such as the waste processors or the plants themselves, drive transient responses in the model system, allowing the examination of the effectiveness of the system reservoirs as buffers. The results from simulations help to determine control strategies and BLSS design requirements. An evolved version could be used as an interactive control aid in a future BLSS.

  19. Ubiquitin: molecular modeling and simulations.

    PubMed

    Ganoth, Assaf; Tsfadia, Yossi; Wiener, Reuven

    2013-11-01

    The synthesis and destruction of proteins are imperative for maintaining their cellular homeostasis. In the 1970s, Aaron Ciechanover, Avram Hershko, and Irwin Rose discovered that certain proteins are tagged by ubiquitin before degradation, a discovery that awarded them the 2004 Nobel Prize in Chemistry. Compelling data gathered during the last several decades show that ubiquitin plays a vital role not only in protein degradation but also in many cellular functions including DNA repair processes, cell cycle regulation, cell growth, immune system functionality, hormone-mediated signaling in plants, vesicular trafficking pathways, regulation of histone modification and viral budding. Due to the involvement of ubiquitin in such a large number of diverse cellular processes, flaws and impairments in the ubiquitin system were found to be linked to cancer, neurodegenerative diseases, genetic disorders, and immunological disorders. Hence, deciphering the dynamics and complexity of the ubiquitin system is of significant importance. In addition to experimental techniques, computational methodologies have been gaining increasing influence in protein research and are used to uncover the structure, stability, folding, mechanism of action and interactions of proteins. Notably, molecular modeling and molecular dynamics simulations have become powerful tools that bridge the gap between structure and function while providing dynamic insights and illustrating essential mechanistic characteristics. In this study, we present an overview of molecular modeling and simulations of ubiquitin and the ubiquitin system, evaluate the status of the field, and offer our perspective on future progress in this area of research. PMID:24113788

  20. Advanced Numerical Model for Irradiated Concrete

    SciTech Connect

    Giorla, Alain B.

    2015-03-01

    In this report, we establish a numerical model for concrete exposed to irradiation to address these three critical points. The model accounts for creep in the cement paste and its coupling with damage, temperature and relative humidity. The shift in failure mode with the loading rate is also properly represented. The numerical model for creep has been validated and calibrated against different experiments in the literature [Wittmann, 1970, Le Roy, 1995]. Results from a simplified model are shown to showcase the ability of numerical homogenization to simulate irradiation effects in concrete. In future works, the complete model will be applied to the analysis of the irradiation experiments of Elleuch et al. [1972] and Kelly et al. [1969]. This requires a careful examination of the experimental environmental conditions as in both cases certain critical information are missing, including the relative humidity history. A sensitivity analysis will be conducted to provide lower and upper bounds of the concrete expansion under irradiation, and check if the scatter in the simulated results matches the one found in experiments. The numerical and experimental results will be compared in terms of expansion and loss of mechanical stiffness and strength. Both effects should be captured accordingly by the model to validate it. Once the model has been validated on these two experiments, it can be applied to simulate concrete from nuclear power plants. To do so, the materials used in these concrete must be as well characterized as possible. The main parameters required are the mechanical properties of each constituent in the concrete (aggregates, cement paste), namely the elastic modulus, the creep properties, the tensile and compressive strength, the thermal expansion coefficient, and the drying shrinkage. These can be either measured experimentally, estimated from the initial composition in the case of cement paste, or back-calculated from mechanical tests on concrete. If some are unknown, a sensitivity analysis must be carried out to provide lower and upper bounds of the material behaviour. Finally, the model can be used as a basis to formulate a macroscopic material model for concrete subject to irradiation, which later can be used in structural analyses to estimate the structural impact of irradiation on nuclear power plants.

  1. Advancing an Information Model for Environmental Observations

    NASA Astrophysics Data System (ADS)

    Horsburgh, J. S.; Aufdenkampe, A. K.; Hooper, R. P.; Lehnert, K. A.; Schreuders, K.; Tarboton, D. G.; Valentine, D. W.; Zaslavsky, I.

    2011-12-01

    Observational data are fundamental to hydrology and water resources, and the way they are organized, described, and shared either enables or inhibits the analyses that can be performed using the data. The CUAHSI Hydrologic Information System (HIS) project is developing cyberinfrastructure to support hydrologic science by enabling better access to hydrologic data. HIS is composed of three major components. HydroServer is a software stack for publishing time series of hydrologic observations on the Internet as well as geospatial data using standards-based web feature, map, and coverage services. HydroCatalog is a centralized facility that catalogs the data contents of individual HydroServers and enables search across them. HydroDesktop is a client application that interacts with both HydroServer and HydroCatalog to discover, download, visualize, and analyze hydrologic observations published on one or more HydroServers. All three components of HIS are founded upon an information model for hydrologic observations at stationary points that specifies the entities, relationships, constraints, rules, and semantics of the observational data and that supports its data services. Within this information model, observations are described with ancillary information (metadata) about the observations to allow them to be unambiguously interpreted and used, and to provide traceable heritage from raw measurements to useable information. Physical implementations of this information model include the Observations Data Model (ODM) for storing hydrologic observations, Water Markup Language (WaterML) for encoding observations for transmittal over the Internet, the HydroCatalog metadata catalog database, and the HydroDesktop data cache database. The CUAHSI HIS and this information model have now been in use for several years, and have been deployed across many different academic institutions as well as across several national agency data repositories. Additionally, components of the HIS have been modified to support data management for the Critical Zone Observatories (CZOs). This paper will present limitations of the existing information model used by the CUAHSI HIS that have been uncovered through its deployment and use, as well as new advances to the information model, including: better representation of both in situ observations from field sensors and observations derived from environmental samples, extensibility in attributes used to describe observations, and observation provenance. These advances have been developed by the HIS team and the broader scientific community and will enable the information model to accommodate and better describe wider classes of environmental observations and to better meet the needs of the hydrologic science and CZO communities.
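
    The core idea of attaching interpretive metadata to every observation can be illustrated with a few simplified record types; the classes and the sample site below are invented for illustration and are not the actual ODM or WaterML schemas.

```python
# Simplified illustration of an "observation plus metadata" record structure;
# the classes and example values are hypothetical, not the ODM/WaterML schemas.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Site:
    code: str
    name: str
    latitude: float
    longitude: float

@dataclass
class Variable:
    code: str
    name: str
    units: str
    sample_medium: str            # e.g. "surface water"

@dataclass
class DataValue:
    site: Site
    variable: Variable
    timestamp: datetime
    value: float
    method: str = "unknown"       # provenance metadata travels with the value
    quality_flag: str = ""

site = Site("SITE-001", "Example Creek at Gauge Road", 41.74, -111.83)
variable = Variable("Q", "Discharge", "m^3/s", "surface water")
obs = DataValue(site, variable, datetime(2011, 6, 1, 12, 0), 3.2,
                method="stage-discharge rating")
print(f"{obs.variable.name} at {obs.site.name}: {obs.value} {obs.variable.units}")
```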

  2. Efficient Multi-Dimensional Simulation of Quantum Confinement Effects in Advanced MOS Devices

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A.; Rafferty, Conor S.; Ancona, Mario G.; Yu, Zhi-Ping

    2000-01-01

    We investigate the density-gradient (DG) transport model for efficient multi-dimensional simulation of quantum confinement effects in advanced MOS devices. The formulation of the DG model is described as a quantum correction to the classical drift-diffusion model. Quantum confinement effects are shown to be significant in sub-100nm MOSFETs. In thin-oxide MOS capacitors, quantum effects may reduce gate capacitance by 25% or more. As a result, the inclusion or quantum effects in simulations dramatically improves the match between C-V simulations and measurements for oxide thickness down to 2 nm. Significant quantum corrections also occur in the I-V characteristics of short-channel (30 to 100 nm) n-MOSFETs, with current drive reduced by up to 70%. This effect is shown to result from reduced inversion charge due to quantum confinement of electrons in the channel. Also, subthreshold slope is degraded by 15 to 20 mV/decade with the inclusion of quantum effects via the density-gradient model, and short channel effects (in particular, drain-induced barrier lowering) are noticeably increased.
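
    For orientation, the density-gradient correction is commonly summarized in the literature in a form like the one below, in which the classical potential seen by the carriers is augmented by a term built from second derivatives of the square root of the carrier density. The sign convention and the numerical factor (often written with an adjustable parameter in place of 12) vary between formulations, and the exact notation used in the paper may differ.

```latex
% Generic textbook statement of the density-gradient quantum correction
% (sign conventions and the factor of 12 vary across the literature):
\[
  \psi_{\mathrm{eff}} \;=\; \psi \;+\; 2\,b_n\,\frac{\nabla^{2}\sqrt{n}}{\sqrt{n}},
  \qquad
  b_n \;=\; \frac{\hbar^{2}}{12\,q\,m_n^{*}} ,
\]
% so that regions where the carrier density varies rapidly (for example the
% inversion layer next to a thin gate oxide) acquire an extra potential
% contribution, which is what reduces the simulated gate capacitance and
% drive current relative to the classical drift-diffusion result.
```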

  3. Efficient Multi-Dimensional Simulation of Quantum Confinement Effects in Advanced MOS Devices

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A.; Ancona, Mario G.; Rafferty, Conor S.; Yu, Zhiping

    2000-01-01

    We investigate the density-gradient (DG) transport model for efficient multi-dimensional simulation of quantum confinement effects in advanced MOS devices. The formulation of the DG model is described as a quantum correction to the classical drift-diffusion model. Quantum confinement effects are shown to be significant in sub-100nm MOSFETs. In thin-oxide MOS capacitors, quantum effects may reduce gate capacitance by 25% or more. As a result, the inclusion of quantum effects in simulations dramatically improves the match between C-V simulations and measurements for oxide thickness down to 2 nm. Significant quantum corrections also occur in the I-V characteristics of short-channel (30 to 100 nm) n-MOSFETs, with current drive reduced by up to 70%. This effect is shown to result from reduced inversion charge due to quantum confinement of electrons in the channel. Also, subthreshold slope is degraded by 15 to 20 mV/decade with the inclusion of quantum effects via the density-gradient model, and short channel effects (in particular, drain-induced barrier lowering) are noticeably increased.

  4. Advances in Homology Protein Structure Modeling

    PubMed Central

    Xiang, Zhexin

    2007-01-01

    Homology modeling plays a central role in determining protein structure in the structural genomics project. The importance of homology modeling has been steadily increasing because of the large gap that exists between the overwhelming number of available protein sequences and experimentally solved protein structures, and also, more importantly, because of the increasing reliability and accuracy of the method. In fact, a protein sequence with over 30% identity to a known structure can often be predicted with an accuracy equivalent to a low-resolution X-ray structure. The recent advances in homology modeling, especially in detecting distant homologues, aligning sequences with template structures, modeling of loops and side chains, as well as detecting errors in a model, have contributed to reliable prediction of protein structure, which was not possible even several years ago. The ongoing efforts in solving protein structures, which can be time-consuming and often difficult, will continue to spur the development of a host of new computational methods that can fill in the gap and further contribute to understanding the relationship between protein structure and function. PMID:16787261

  5. Advanced Simulation of Coupled Earthquake and Tsunami Events (ASCETE) - Simulation Techniques for Realistic Tsunami Process Studies

    NASA Astrophysics Data System (ADS)

    Behrens, Joern; Bader, Michael; Breuer, Alexander N.; van Dinther, Ylona; Gabriel, Alice-A.; Galvez Barron, Percy E.; Rahnema, Kaveh; Vater, Stefan; Wollherr, Stephanie

    2015-04-01

    At the end of phase 1 of the ASCETE project, a simulation framework for coupled physics-based rupture generation with tsunami propagation and inundation is available. Adaptive mesh tsunami propagation and inundation by discontinuous Galerkin Runge-Kutta methods allows for accurate and conservative inundation schemes. Combined with a tree-based refinement strategy to highly optimize the code for high-performance computing architectures, a modeling tool for high fidelity tsunami simulations has been constructed. Validation results demonstrate the capacity of the software. Rupture simulation is performed by an unstructured tetrahedral discontinuous Galerkin ADER discretization, which allows for accurate representation of complex geometries. The implemented code was nominated for, and selected as a finalist for, the Gordon Bell award in high-performance computing. Highly realistic rupture events can be simulated with this modeling tool. The coupling of rupture-induced wave activity and displacement with hydrodynamic equations still poses a major problem due to diverging time and spatial scales. Some insight from the ASCETE set-up could be gained, and the presentation will focus on the coupled behavior of the simulation system. Finally, an outlook to phase 2 of the ASCETE project will be given, in which further development of detailed physical processes as well as near-realistic scenario computations are planned. ASCETE is funded by the Volkswagen Foundation.

  6. Off-gas Adsorption Model and Simulation - OSPREY

    SciTech Connect

    Veronica J Rutledge

    2013-10-01

    The absence of industrial scale nuclear fuel reprocessing in the U.S. has precluded the necessary driver for developing the advanced simulation capability now prevalent in so many other countries. Thus, it is essential to model complex series of unit operations to simulate, understand, and predict inherent transient behavior. A capability of accurately simulating the dynamic behavior of advanced fuel cycle separation processes is expected to provide substantial cost savings and many technical benefits. To support this capability, a modeling effort focused on the off-gas treatment system of a used nuclear fuel recycling facility is in progress. The off-gas separation consists of a series of scrubbers and adsorption beds to capture constituents of interest. Dynamic models are being developed to simulate each unit operation involved so each unit operation can be used as a stand-alone model and in series with multiple others. Currently, an adsorption model has been developed within Multi-physics Object Oriented Simulation Environment (MOOSE) developed at the Idaho National Laboratory (INL). Off-gas Separation and REcoverY (OSPREY) models the adsorption of offgas constituents for dispersed plug flow in a packed bed under non-isothermal and non-isobaric conditions. Inputs to the model include gas composition, sorbent and column properties, equilibrium and kinetic data, and inlet conditions. The simulation outputs component concentrations along the column length as a function of time from which breakthrough data can be obtained. The breakthrough data can be used to determine bed capacity, which in turn can be used to size columns. In addition to concentration data, the model predicts temperature along the column length as a function of time and pressure drop along the column length. A description of the OSPREY model, results from krypton adsorption modeling and plans for modeling the behavior of iodine, xenon, and tritium will be discussed.
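
    The breakthrough behavior described above can be sketched with a deliberately simplified fixed-bed model: one-dimensional plug flow, a linear isotherm, and a linear-driving-force uptake rate, integrated with upwind differences. This is only an illustration of how breakthrough curves arise from such balances; it is not the OSPREY model, and every parameter value is invented.

```python
# Simplified fixed-bed adsorption breakthrough: 1-D plug flow, linear isotherm
# q* = K*c, linear-driving-force uptake, upwind/explicit-Euler integration.
# All parameter values are illustrative, not OSPREY inputs.
import numpy as np

nz, L = 100, 0.5              # grid cells, bed length (m)
dz = L / nz
u = 0.05                      # interstitial gas velocity (m/s)
eps = 0.4                     # bed void fraction
rho_b = 700.0                 # bulk sorbent density (kg/m^3)
K = 5.0e-3                    # linear isotherm constant, q* = K*c (m^3/kg)
k_ldf = 0.05                  # linear-driving-force coefficient (1/s)
c_feed = 1.0                  # inlet gas-phase concentration (mol/m^3)

c = np.zeros(nz)              # gas-phase concentration profile
q = np.zeros(nz)              # adsorbed-phase loading profile (mol/kg)

dt = 0.4 * dz / u             # CFL-limited time step
times, outlet = [], []
for step in range(50_000):
    dqdt = k_ldf * (K * c - q)
    c_upwind = np.concatenate(([c_feed], c[:-1]))
    dcdt = -u * (c - c_upwind) / dz - (rho_b / eps) * dqdt
    c += dt * dcdt
    q += dt * dqdt
    if step % 100 == 0:
        times.append(step * dt)
        outlet.append(c[-1] / c_feed)

# breakthrough time: first sampled instant the outlet reaches 5% of the feed
t_break = next(t for t, y in zip(times, outlet) if y > 0.05)
print(f"approximate breakthrough time: {t_break:.0f} s")
```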

  7. USER'S GUIDE FOR THE ADVANCED STATISTICAL TRAJECTORY REGIONAL AIR POLLUTION (ASTRAP) MODEL

    EPA Science Inventory

    The Advanced Statistical Trajectory Regional Air Pollution (ASTRAP) model simulates long-range, long-term transport and deposition of air pollutants, primarily oxides of sulfur and nitrogen. The ASTRAP model is designed to combine ease of exercise with an appropriate detail of ph...

  8. Comparing Aerodynamic Models for Numerical Simulation of

    E-print Network

    Peraire, Jaime

    Comparing Aerodynamic Models for Numerical Simulation of Dynamics and Control of Aircraft. ... simulation of aircraft, yet other aerodynamics models exist that can provide more accurate results for certain simulations without a large increase in computational time. In this paper, several aerodynamics ...

  9. Fusion Simulation Project (Whole Tokamak Plasma Modeling)

    E-print Network

    Fusion Simulation Project (Whole Tokamak Plasma Modeling). FSP Committee and Panels. Presented ... the goal of which is to develop an improved capacity for Integrated Simulation and Optimization of Fusion Systems ... simulations of individual physical phenomena · Development of high fidelity physics models for individual ...

  10. [Research advance on lake ecosystem dynamic models].

    PubMed

    Liu, Yong; Guo, Huaicheng; Fan, Yingying; Wang, Lijing

    2005-06-01

    Starting with the role of system analysis in lake ecosystem research, this paper summarizes the main procedures and software tools for studying the dynamics of lake ecosystems. There are several main stages in modeling the dynamics of a lake ecosystem, namely problem identification, mathematical formulation, computation, validation, sensitivity analysis, calibration, and verification. In the modeling, selecting temporal and spatial scales is essential but complex. Since the 1960s, rapid progress has been made in modeling the dynamics of lake ecosystems, developing from simple zero-dimensional models to complex ecological-aquatic-hydrodynamic ones, among which exergy has been widely applied as an objective function. In this paper, LakeWeb and LEEDS (Lake Eutrophication, Effect, Dose, and Sensitivity model) were analyzed as examples. In China, the development of lake ecosystem dynamic models can be traced back to the 1980s, and most of them focused on Lake Dianchi, Lake Taihu, Lake Chaohu and Lake Donghu. Software packages such as CE-QUAL-ICM, WASP, AQUATOX, PAMOLARE and CAEDYM were developed to simulate lake ecosystem dynamics, among which CE-QUAL-ICM is more suitable for long and narrow water bodies. WASP consists of three parts, i.e., DYNHYD, EUTRO, and TOXI. AQUATOX is an ecological risk model whose parameters are mainly calibrated in the U.S.A., which has limited its further application in China. The software ECOPATH for simulating the energy flows in lakes is also described in this paper. There are still many shortcomings in lake ecosystem dynamic models, e.g., the lack of sufficient monitoring data for validation, insufficient consideration of uncertainties and the role of bacteria, and an inconsistent relationship with watershed changes. The uncertainties mainly come from the intrinsic uncertainties of the aquatic ecosystem, from modeling, from parameter selection, and also from forecasting and application. Setting up long-term monitoring and data sharing mechanisms, using interpolation to densify data, introducing objective functions, dealing with uncertainties, and constructing watershed-lake ecosystem dynamic models could be available ways for overcoming these shortcomings. PMID:16180776

  11. Advanced finite element modeling of rotor blade aeroelasticity

    NASA Technical Reports Server (NTRS)

    Straub, F. K.; Sangha, K. B.; Panda, B.

    1994-01-01

    An advanced beam finite element has been developed for modeling rotor blade dynamics and aeroelasticity. This element is part of the Element Library of the Second Generation Comprehensive Helicopter Analysis System (2GCHAS). The element allows modeling of arbitrary rotor systems, including bearingless rotors. It accounts for moderately large elastic deflections, anisotropic properties, large frame motion for maneuver simulation, and allows for variable order shape functions. The effects of gravity, mechanically applied and aerodynamic loads are included. All kinematic quantities required to compute airloads are provided. In this paper, the fundamental assumptions and derivation of the element matrices are presented. Numerical results are shown to verify the formulation and illustrate several features of the element.

  12. Advanced virtual energy simulation training and research: IGCC with CO2 capture power plant

    SciTech Connect

    Zitney, S.; Liese, E.; Mahapatra, P.; Bhattacharyya, D.; Provost, G.

    2011-01-01

    In this presentation, we highlight the deployment of a real-time dynamic simulator of an integrated gasification combined cycle (IGCC) power plant with CO{sub 2} capture at the Department of Energy's (DOE) National Energy Technology Laboratory's (NETL) Advanced Virtual Energy Simulation Training and Research (AVESTAR{sup TM}) Center. The Center was established as part of the DOE's accelerating initiative to advance new clean coal technology for power generation. IGCC systems are an attractive technology option, generating low-cost electricity by converting coal and/or other fuels into a clean synthesis gas mixture in a process that is efficient and environmentally superior to conventional power plants. The IGCC dynamic simulator builds on, and reaches beyond, conventional power plant simulators to merge, for the first time, a 'gasification with CO{sub 2} capture' process simulator with a 'combined-cycle' power simulator. Fueled with coal, petroleum coke, and/or biomass, the gasification island of the simulated IGCC plant consists of two oxygen-blown, downward-fired, entrained-flow, slagging gasifiers with radiant syngas coolers and two-stage sour shift reactors, followed by a dual-stage acid gas removal process for CO{sub 2} capture. The combined cycle island consists of two F-class gas turbines, a steam turbine, and a heat recovery steam generator with three pressure levels. The dynamic simulator can be used for normal base-load operation, as well as plant start-up and shutdown. The real-time dynamic simulator also responds satisfactorily to process disturbances, feedstock blending and switchovers, fluctuations in ambient conditions, and power demand load shedding. In addition, the full-scope simulator handles a wide range of abnormal situations, including equipment malfunctions and failures, together with changes initiated through actions from plant field operators. By providing a comprehensive IGCC operator training system, the AVESTAR Center is poised to develop a workforce well-prepared to operate and control commercial-scale gasification-based power plants capable of 90% pre-combustion CO{sub 2} capture and compression, as well as low sulfur, mercury, and NOx emissions. With additional support from the NETL-Regional University Alliance (NETL-RUA), the Center will educate and train engineering students and researchers by providing hands-on 'learning by operating' experience. The AVESTAR Center also offers unique collaborative R&D opportunities in high-fidelity dynamic modeling, advanced process control, real-time optimization, and virtual plant simulation. Objectives and goals are aimed at safe and effective management of power generation systems for optimal efficiency, while protecting the environment. To add another dimension of realism to the AVESTAR experience, NETL will introduce an immersive training system with innovative three-dimensional virtual reality technology. Wearing a stereoscopic headset or eyewear, trainees will enter an interactive virtual environment that will allow them to move freely throughout the simulated 3-D facility to study and learn various aspects of IGCC plant operation, control, and safety. Such combined operator and immersive training systems go beyond traditional simulation and include more realistic scenarios, improved communication, and collaboration among co-workers.

  13. Galaxy Alignments: Theory, Modelling & Simulations

    NASA Astrophysics Data System (ADS)

    Kiessling, Alina; Cacciato, Marcello; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D.; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L.; Rassat, Anais

    2015-11-01

    The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in the large-scale structure tend to align nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of both the shapes and angular momenta of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both N-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the field, providing a solid basis for future work.

  14. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

    The Aerospace Systems Design Laboratory at the School of Aerospace Engineering in Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. This propulsion system design environment is created with a commercially available software called iSIGHT, which is a generic computational framework, and with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and facilitate information transfer amongst the various codes. The resulting modeling and simulation (M&S) environment in conjunction with the response surface method provides the designer/decision-maker an analytical means to examine the entire design space from either a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.
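
    To illustrate the response-surface idea mentioned above, the sketch below fits a quadratic surrogate by least squares to a small grid of cycle-analysis runs and then evaluates it cheaply across the design space. The engine_cycle placeholder and the two design variables (overall pressure ratio and turbine inlet temperature) are stand-ins for illustration, not the ASDL/iSIGHT codes.

    ```python
    # Hedged sketch of a quadratic response surface fitted to a handful of
    # (notionally expensive) cycle-analysis samples.  The "engine_cycle"
    # function is a placeholder returning a notional TSFC-like metric.
    import numpy as np

    def engine_cycle(opr, tit):
        # placeholder for an expensive cycle analysis run
        return 0.6 - 0.002 * opr + 1e-5 * (opr - 30) ** 2 - 5e-5 * (tit - 1600)

    # small design of experiments: a 5 x 5 grid of runs
    opr_grid, tit_grid = np.meshgrid(np.linspace(20, 40, 5),
                                     np.linspace(1400, 1800, 5))
    X = np.column_stack([opr_grid.ravel(), tit_grid.ravel()])
    y = np.array([engine_cycle(o, t) for o, t in X])

    def basis(X):
        # quadratic response-surface basis: 1, x1, x2, x1^2, x2^2, x1*x2
        x1, x2 = X[:, 0], X[:, 1]
        return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

    coeffs, *_ = np.linalg.lstsq(basis(X), y, rcond=None)
    surrogate = lambda o, t: basis(np.array([[o, t]])) @ coeffs
    print("surrogate metric at OPR=32, TIT=1650 K:", surrogate(32, 1650)[0])
    ```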

  15. Advanced Method to Estimate Fuel Slosh Simulation Parameters

    NASA Technical Reports Server (NTRS)

    Schlee, Keith; Gangadharan, Sathya; Ristow, James; Sudermann, James; Walker, Charles; Hubert, Carl

    2005-01-01

    The nutation (wobble) of a spinning spacecraft in the presence of energy dissipation is a well-known problem in dynamics and is of particular concern for space missions. The nutation of a spacecraft spinning about its minor axis typically grows exponentially and the rate of growth is characterized by the Nutation Time Constant (NTC). For launch vehicles using spin-stabilized upper stages, fuel slosh in the spacecraft propellant tanks is usually the primary source of energy dissipation. For analytical prediction of the NTC this fuel slosh is commonly modeled using simple mechanical analogies such as pendulums or rigid rotors coupled to the spacecraft. Identifying model parameter values which adequately represent the sloshing dynamics is the most important step in obtaining an accurate NTC estimate. Analytic determination of the slosh model parameters has met with mixed success and is made even more difficult by the introduction of propellant management devices and elastomeric diaphragms. By subjecting full-sized fuel tanks with actual flight fuel loads to motion similar to that experienced in flight and measuring the forces experienced by the tanks these parameters can be determined experimentally. Currently, the identification of the model parameters is a laborious trial-and-error process in which the equations of motion for the mechanical analog are hand-derived, evaluated, and their results are compared with the experimental results. The proposed research is an effort to automate the process of identifying the parameters of the slosh model using a MATLAB/SimMechanics-based computer simulation of the experimental setup. Different parameter estimation and optimization approaches are evaluated and compared in order to arrive at a reliable and effective parameter identification process. To evaluate each parameter identification approach, a simple one-degree-of-freedom pendulum experiment is constructed and motion is induced using an electric motor. By applying the estimation approach to a simple, accurately modeled system, its effectiveness and accuracy can be evaluated. The same experimental setup can then be used with fluid-filled tanks to further evaluate the effectiveness of the process. Ultimately, the proven process can be applied to the full-sized spinning experimental setup to quickly and accurately determine the slosh model parameters for a particular spacecraft mission. Automating the parameter identification process will save time, allow more changes to be made to proposed designs, and lower the cost in the initial design stages.
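
    As a rough illustration of the automated parameter-identification idea (here transplanted from the MATLAB/SimMechanics workflow described above to Python), the sketch below fits the natural frequency and damping ratio of a linearized pendulum analog to a synthetic "measured" response by least squares. The data, bounds, and parameter values are invented for the example.

    ```python
    # Minimal sketch: identify pendulum-analog parameters (natural frequency
    # wn and damping ratio zeta) by least-squares matching of a simulated
    # response to noisy "measured" data.  All values are illustrative.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    def pendulum_response(params, t, theta0=0.1):
        wn, zeta = params
        def rhs(_, y):
            return [y[1], -2.0 * zeta * wn * y[1] - wn**2 * y[0]]
        sol = solve_ivp(rhs, (t[0], t[-1]), [theta0, 0.0], t_eval=t)
        return sol.y[0]

    np.random.seed(0)
    t = np.linspace(0, 10, 500)
    measured = pendulum_response([3.0, 0.05], t) + 0.002 * np.random.randn(t.size)

    fit = least_squares(lambda p: pendulum_response(p, t) - measured,
                        x0=[2.0, 0.1], bounds=([0.1, 0.0], [10.0, 1.0]))
    print("identified (wn, zeta):", fit.x)   # should recover roughly (3.0, 0.05)
    ```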

  16. Use of Advanced Meteorological Model Output for Coastal Ocean Modeling in Puget Sound

    SciTech Connect

    Yang, Zhaoqing; Khangaonkar, Tarang; Wang, Taiping

    2011-06-01

    It is a great challenge to specify meteorological forcing in estuarine and coastal circulation modeling using observed data because of the lack of complete datasets. As a result of this limitation, water temperature is often not simulated in estuarine and coastal modeling, with the assumption that density-induced currents are generally dominated by salinity gradients. However, in many situations, temperature gradients could be sufficiently large to influence the baroclinic motion. In this paper, we present an approach to simulate water temperature using outputs from advanced meteorological models. This modeling approach was applied to simulate annual variations of water temperatures of Puget Sound, a fjordal estuary in the Pacific Northwest of the USA. Meteorological parameters from North American Regional Reanalysis (NARR) model outputs were evaluated with comparisons to observed data at real-time meteorological stations. Model results demonstrated that NARR outputs can be used to drive coastal ocean models for realistic simulations of long-term water-temperature distributions in Puget Sound. Model results indicated that the net flux from NARR can be further improved with the additional information from real-time observations.

  17. Advancements in real-time engine simulation technology

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.

    1982-01-01

    The approaches used to develop real-time engine simulations are reviewed. Both digital and hybrid (analog and digital) techniques are discussed and specific examples of each are cited. These approaches are assessed from the standpoint of their usefulness for digital engine control development. A number of NASA-sponsored simulation research activities, aimed at exploring real-time simulation techniques, are described. These include the development of a microcomputer-based, parallel processor system for real-time engine simulation.

  18. Combining Genetic Algorithms & Simulation to Search for Failure Scenarios in System Models

    E-print Network

    Combining Genetic Algorithms & Simulation to Search for Failure Scenarios in System Models. The 5th International Conference on Advances in System Simulation, Oct. 27-Nov. 1, 2013. Project Team: Kevin Mills. A Genetic Algorithm (GA) steers a population of simulators to search for parameter combinations that lead to system

  19. Simulations of Failure via Three-Dimensional Cracking in Fuel Cladding for Advanced Nuclear Fuels

    SciTech Connect

    Lu, Hongbing; Bukkapatnam, Satish; Harimkar, Sandip; Singh, Raman; Bardenhagen, Scott

    2014-01-09

    Enhancing performance of fuel cladding and duct alloys is a key means of increasing fuel burnup. This project will address the failure of fuel cladding via three-dimensional cracking models. Researchers will develop a simulation code for the failure of the fuel cladding and validate the code through experiments. The objective is to develop an algorithm to determine the failure of fuel cladding in the form of three-dimensional cracking due to prolonged exposure under varying conditions of pressure, temperature, chemical environment, and irradiation. This project encompasses the following tasks: 1. Simulate 3D crack initiation and growth under instantaneous and/or fatigue loads using a new variant of the material point method (MPM); 2. Simulate debonding of the materials in the crack path using cohesive elements, considering normal and shear traction separation laws; 3. Determine the crack propagation path, considering damage of the materials incorporated in the cohesive elements to allow the energy release rate to be minimized; 4. Simulate the three-dimensional fatigue crack growth as a function of loading histories; 5. Verify the simulation code by comparing results to theoretical and numerical studies available in the literature; 6. Conduct experiments to observe the crack path and surface profile in unused fuel cladding and validate against simulation results; and 7. Expand the adaptive mesh refinement infrastructure parallel processing environment to allow adaptive mesh refinement at the 3D crack fronts and adaptive mesh merging in the wake of cracks. Fuel cladding is made of materials such as stainless steels and ferritic steels with added alloying elements, which increase stability and durability under irradiation. As fuel cladding is subjected to water, chemicals, fission gas, pressure, high temperatures, and irradiation while in service, understanding performance is essential. In the fast fuel used in advanced burner reactors, simulations of the nuclear fuels are critical to understand the burnup, and thus the fuel efficiency.
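
    Task 2 above refers to cohesive elements with normal and shear traction-separation laws. The sketch below shows one common choice, a bilinear normal traction-separation law, purely as an illustration; the strength and opening parameters are hypothetical and are not taken from the project.

    ```python
    # Illustrative bilinear normal traction-separation law: traction rises
    # linearly to a peak strength at delta_0, then softens linearly to zero
    # at the critical opening delta_f.  Parameter values are hypothetical.
    import numpy as np

    def bilinear_traction(delta, sigma_max=200e6, delta_0=1e-6, delta_f=1e-5):
        """Normal traction [Pa] as a function of crack opening delta [m]."""
        delta = np.asarray(delta, dtype=float)
        rising = delta <= delta_0
        softening = (delta > delta_0) & (delta < delta_f)
        t = np.zeros_like(delta)
        t[rising] = sigma_max * delta[rising] / delta_0
        t[softening] = sigma_max * (delta_f - delta[softening]) / (delta_f - delta_0)
        return t   # zero beyond delta_f (fully debonded)

    # fracture energy is the area under the curve: 0.5 * sigma_max * delta_f
    openings = np.linspace(0.0, 2e-5, 400)
    t_vals = bilinear_traction(openings)
    G_c = np.sum(0.5 * (t_vals[1:] + t_vals[:-1]) * np.diff(openings))
    print(f"cohesive fracture energy ~ {G_c:.1f} J/m^2")   # ~0.5*200e6*1e-5 = 1000
    ```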

  20. Experimental simulations of explosive loading on structural components : reinforced concrete columns with advanced composite jackets

    E-print Network

    Rodríguez-Nikl, Tonatiuh

    2006-01-01

    of a simulated blast event requires proper modelling of the ... The phenomenological hysteresis model is intended to describe the force-deflection curve of the programmer during a simulated blast

  1. Advancement of DOE's EnergyPlus Building Energy Simulation Program

    SciTech Connect

    Lixing Gu; Don Shirey; Richard Raustad; Bereket Nigusse; Chandan Sharma; Linda Lawrie; Rich Strand; Curt Pedersen; Dan Fisher; Edwin Lee; Mike Witte; Jason Glazer; Chip Barnaby

    2011-03-31

    EnergyPlus{sup TM} is a new generation computer software analysis tool that has been developed, tested, and commercialized to support DOE's Building Technologies (BT) Program in terms of whole-building, component, and systems R&D (http://www.energyplus.gov). It is also being used to support evaluation and decision making of zero energy building (ZEB) energy efficiency and supply technologies during new building design and existing building retrofits. The 5-year project was managed by the National Energy Technology Laboratory and was divided into 5 budget periods between 2006 and 2011. During the project period, 11 versions of EnergyPlus were released. This report summarizes work performed by an EnergyPlus development team led by the University of Central Florida's Florida Solar Energy Center (UCF/FSEC). The team members consist of DHL Consulting, C. O. Pedersen Associates, University of Illinois at Urbana-Champaign, Oklahoma State University, GARD Analytics, Inc., and WrightSoft Corporation. The project tasks involved new feature development, testing and validation, user support and training, and general EnergyPlus support. The team developed 146 new features during the 5-year period to advance the EnergyPlus capabilities. Annual contributions of new features are 7 in budget period 1, 19 in period 2, 36 in period 3, 41 in period 4, and 43 in period 5, respectively. The testing and validation task focused on running the test suite and publishing reports, developing new IEA test suite cases, testing and validating new source code, addressing change requests, and creating and testing installation packages. The user support and training task provided support for users and interface developers, and organized and taught workshops. The general support task involved upgrading StarTeam (team sharing) software and updating existing utility software. The project met the DOE objectives and completed all tasks successfully. Although the EnergyPlus software was enhanced significantly under this project, more enhancements are needed for further improvement to ensure that EnergyPlus is able to simulate the latest technologies and perform desired HVAC system operations for the development of next generation HVAC systems. Additional development will be performed under a new 5-year project managed by the National Renewable Energy Laboratory.

  2. An introduction to enterprise modeling and simulation

    SciTech Connect

    Ostic, J.K.; Cannon, C.E.

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  3. Advances in Swine Biomedical Model Genomics

    PubMed Central

    Lunney, Joan K.

    2007-01-01

    This review is a short update on the diversity of swine biomedical models and the importance of genomics in their continued development. The swine has been used as a major mammalian model for human studies because of the similarity in size and physiology, and in organ development and disease progression. The pig model allows for deliberately timed studies, imaging of internal vessels and organs using standard human technologies, and collection of repeated peripheral samples and, at kill, detailed mucosal tissues. The ability to use pigs from the same litter, or cloned or transgenic pigs, facilitates comparative analyses and genetic mapping. The availability of numerous well defined cell lines, representing a broad range of tissues, further facilitates testing of gene expression, drug susceptibility, etc. Thus the pig is an excellent biomedical model for humans. For genomic applications it is an asset that the pig genome has high sequence and chromosome structure homology with humans. With the swine genome sequence now well advanced there are improving genetic and proteomic tools for these comparative analyses. The review will discuss some of the genomic approaches used to probe these models. The review will highlight genomic studies of melanoma and of infectious disease resistance, discussing issues to consider in designing such studies. It will end with a short discussion of the potential for genomic approaches to develop new alternatives for control of the most economically important disease of pigs, porcine reproductive and respiratory syndrome (PRRS), and the potential for applying knowledge gained with this virus for human viral infectious disease studies. PMID:17384736

  4. Advances in swine biomedical model genomics.

    PubMed

    Lunney, Joan K

    2007-01-01

    This review is a short update on the diversity of swine biomedical models and the importance of genomics in their continued development. The swine has been used as a major mammalian model for human studies because of the similarity in size and physiology, and in organ development and disease progression. The pig model allows for deliberately timed studies, imaging of internal vessels and organs using standard human technologies, and collection of repeated peripheral samples and, at kill, detailed mucosal tissues. The ability to use pigs from the same litter, or cloned or transgenic pigs, facilitates comparative analyses and genetic mapping. The availability of numerous well defined cell lines, representing a broad range of tissues, further facilitates testing of gene expression, drug susceptibility, etc. Thus the pig is an excellent biomedical model for humans. For genomic applications it is an asset that the pig genome has high sequence and chromosome structure homology with humans. With the swine genome sequence now well advanced there are improving genetic and proteomic tools for these comparative analyses. The review will discuss some of the genomic approaches used to probe these models. The review will highlight genomic studies of melanoma and of infectious disease resistance, discussing issues to consider in designing such studies. It will end with a short discussion of the potential for genomic approaches to develop new alternatives for control of the most economically important disease of pigs, porcine reproductive and respiratory syndrome (PRRS), and the potential for applying knowledge gained with this virus for human viral infectious disease studies. PMID:17384736

  5. Modeling of advanced fossil fuel power plants

    NASA Astrophysics Data System (ADS)

    Zabihian, Farshid

    The first part of this thesis deals with greenhouse gas (GHG) emissions from fossil fuel-fired power stations. The GHG emission estimation from the fossil fuel power generation industry signifies that emissions from this industry can be significantly reduced by fuel switching and adoption of advanced power generation technologies. In the second part of the thesis, steady-state models of some of the advanced fossil fuel power generation technologies are presented. The impacts of various parameters on the solid oxide fuel cell (SOFC) overpotentials and outputs are investigated. The detailed analyses of operation of the hybrid SOFC-gas turbine (GT) cycle when fuelled with methane and syngas demonstrate that the efficiencies of the cycles with and without anode exhaust recirculation are close, but the specific power of the former is much higher. The parametric analysis of the performance of the hybrid SOFC-GT cycle indicates that increasing the system operating pressure and SOFC operating temperature and fuel utilization factor improves cycle efficiency, but the effects of increasing SOFC current density and turbine inlet temperature are not favourable. The analysis of the operation of the system when fuelled with a wide range of fuel types demonstrates that the hybrid SOFC-GT cycle efficiency can be between 59% and 75%, depending on the inlet fuel type. Then, the system performance is investigated when methane as a reference fuel is replaced with various species that can be found in the fuel, i.e., H2, CO2, CO, and N2. The results point out that the influence of various species can be significant and different for each case. The experimental and numerical analyses of a biodiesel fuelled micro gas turbine indicate that fuel switching from petrodiesel to biodiesel can influence operational parameters of the system. The modeling results of gas turbine-based power plants signify that relatively simple models can predict plant performance with acceptable accuracy. The unique feature of these models is that they are developed based on similar assumptions and run at similar conditions; therefore, their results can be compared. This work demonstrates that, although utilization of fossil fuels for power generation is inevitable, at least in the short- and mid-term future, it is possible and practical to carry out such utilization more efficiently and in an environmentally friendlier manner.

  6. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-11-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

  7. Some Specific CASL Requirements for Advanced Multiphase Flow Simulation of Light Water Reactors

    SciTech Connect

    R. A. Berry

    2010-11-01

    Because of the diversity of physical phenomena occurring in boiling, flashing, and bubble collapse, and of the length and time scales of LWR systems, it is imperative that the models have the following features: • Both vapor and liquid phases (and noncondensible phases, if present) must be treated as compressible. • Models must be mathematically and numerically well-posed. • The modeling methodology must be multi-scale. A fundamental derivation of the multiphase governing equation system, which should be used as a basis for advanced multiphase modeling in LWR coolant systems, is given in the Appendix using the ensemble averaging method. The remainder of this work focuses specifically on the compressible, well-posed, and multi-scale requirements of advanced simulation methods for these LWR coolant systems, because these are the most fundamental aspects, without which widespread advancement cannot be claimed. Because of the expense of developing multiple special-purpose codes and the inherent inability to couple information from the multiple, separate length- and time-scales, efforts within CASL should be focused toward development of multi-scale approaches to solve those multiphase flow problems relevant to LWR design and safety analysis. Efforts should be aimed at developing well-designed unified physical/mathematical and high-resolution numerical models for compressible, all-speed multiphase flows spanning: (1) Well-posed general mixture level (true multiphase) models for fast transient situations and safety analysis, (2) DNS (Direct Numerical Simulation)-like models to resolve interface level phenomena like flashing and boiling flows, and critical heat flux determination (necessarily including conjugate heat transfer), and (3) Multi-scale methods to resolve both (1) and (2) automatically, depending upon specified mesh resolution, and to couple different flow models (single-phase, multiphase with several velocities and pressures, multiphase with single velocity and pressure, etc.). A unified, multi-scale approach is advocated to extend the necessary foundations and build the capability to simultaneously solve the fluid dynamic interface problems (interface resolution) as well as multiphase mixtures (homogenization).
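
    For reference, a generic statement of the ensemble-averaged phasic mass balance underlying two-fluid models of the kind discussed above is shown below; this is a textbook form, not the specific system derived in the report's Appendix.

    ```latex
    % Ensemble-averaged mass balance for phase k in a two-fluid model;
    % \alpha_k is the phase volume fraction, \rho_k the phasic density,
    % \mathbf{u}_k the phasic velocity, and \Gamma_k the interphase mass
    % transfer (e.g., evaporation/condensation).  Generic form only.
    \frac{\partial (\alpha_k \rho_k)}{\partial t}
      + \nabla \cdot \left( \alpha_k \rho_k \mathbf{u}_k \right) = \Gamma_k ,
    \qquad \sum_k \alpha_k = 1 .
    ```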

  8. Aeroacoustic simulation for phonation modeling

    NASA Astrophysics Data System (ADS)

    Irwin, Jeffrey; Hanford, Amanda; Craven, Brent; Krane, Michael

    2011-11-01

    The phonation process occurs as air expelled from the lungs creates a pressure drop and a subsequent air flow across the larynx. The fluid-structure interaction between the turbulent air flow and oscillating vocal folds, combined with additional resonance in the oral and nasal cavities, creates much of what we hear in the human voice. As many voice-related disorders can be traced to irregular vocal tract shape or motion, it is important to understand in detail the physics involved in the phonation process. To numerically compute the physics of phonation, a solver must be able to accurately model acoustic airflow through a moving domain. The open-source CFD package OpenFOAM is currently being used to evaluate existing solvers against simple acoustic test cases, including an open-ended resonator and an expansion chamber, both of which utilize boundary conditions simulating acoustic sources as well as anechoic termination. Results of these test cases will be presented and compared with theory, and the future development of a three-dimensional vocal tract model and custom-mode acoustic solver will be discussed. Acknowledge support of NIH grant 5R01DC005642 and ARL E&F program.

  9. Modeling and simulation of spacecraft power systems

    NASA Technical Reports Server (NTRS)

    Lee, J. R.; Cho, B. H.; Kim, S. J.; Lee, F. C.

    1987-01-01

    EASY5 modeling of a complete spacecraft power processing system is presented. Component models are developed, and several system models including a solar array switching system, a partially-shunted solar array system and COBE system are simulated. The power system's modes of operation, such as shunt mode, battery-charge mode, and battery-discharge mode, are simulated for a complete orbit cycle.

  10. A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL)

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Owen, Jeffrey E.

    1988-01-01

    A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL) is presented which overcomes the traditional disadvantages of simulations executed on a digital computer. The incorporation of parallel processing allows the mapping of simulations into a digital computer to be done in the same inherently parallel manner as they are currently mapped onto an analog computer. The direct-execution format maximizes the efficiency of the executed code since the need for a high level language compiler is eliminated. Resolution is greatly increased over that which is available with an analog computer without the sacrifice in execution speed normally expected with digital computer simulations. Although this report covers all aspects of the new architecture, key emphasis is placed on the processing element configuration and the microprogramming of the ACSL constructs. The execution times for all ACSL constructs are computed using a model of a processing element based on the AMD 29000 CPU and the AMD 29027 FPU. The increase in execution speed provided by parallel processing is exemplified by comparing the derived execution times of two ACSL programs with the execution times for the same programs executed on a similar sequential architecture.

  11. Simulation-based evaluation of Advanced Traveler Information Services (ATIS)

    E-print Network

    Florian, Daniel George

    2004-01-01

    Drivers using information from an Advanced Traveler Information System (ATIS) could potentially make better travel decisions to reduce travel time and increase trip reliability, thereby benefiting both guided drivers as ...

  12. Measurement and modeling of advanced coal conversion processes. Annual report, October 1990--September 1991

    SciTech Connect

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G.; Smoot, L.D.; Brewster, B.S.

    1991-12-31

    The overall objective of this program is the development of predictive capability for the design, scale up, simulation, control and feedstock evaluation in advanced coal conversion devices. This program will merge significant advances made in measuring and quantitatively describing the mechanisms of coal conversion behavior into comprehensive computer codes for mechanistic modeling of entrained-bed gasification. Additional capabilities in predicting pollutant formation will be implemented and the technology will be expanded to fixed-bed reactors.

  13. Survey of models/simulations at RADC

    NASA Astrophysics Data System (ADS)

    Denz, M. L.

    1982-11-01

    A survey was conducted to evaluate the current state of the art and technology of model/simulation capabilities at Rome Air Development Center, Griffiss AFB, NY. This memo presents a tabulation of 28 such models/simulations. These models/simulations are being used within RADC in the development and evaluations of Command, Control, Communications and Intelligence (C3I) technology. The results of this survey are incorporated in this memo.

  14. Modeling, Simulation and Analysis of Public Key Infrastructure

    NASA Technical Reports Server (NTRS)

    Liu, Yuan-Kwei; Tuey, Richard; Ma, Paul (Technical Monitor)

    1998-01-01

    Security is an essential part of network communication. The advances in cryptography have provided solutions to many of the network security requirements. Public Key Infrastructure (PKI) is the foundation of the cryptography applications. The main objective of this research is to design a model to simulate a reliable, scalable, manageable, and high-performance public key infrastructure. We build a model to simulate the NASA public key infrastructure by using SimProcess and MatLab Software. The simulation is from top level all the way down to the computation needed for encryption, decryption, digital signature, and secure web server. The application of secure web server could be utilized in wireless communications. The results of the simulation are analyzed and confirmed by using queueing theory.
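
    As a small example of the queueing-theory cross-check mentioned above, the sketch below simulates a single cryptographic-service server as an M/M/1 queue and compares the simulated mean time in system with the analytic result W = 1/(mu - lambda). The arrival and service rates are illustrative, not NASA PKI measurements.

    ```python
    # M/M/1 sketch: exponential interarrival and service times, single FIFO
    # server (e.g., a certificate-signing service).  Rates are illustrative.
    import random

    def simulate_mm1(lam=8.0, mu=10.0, n_jobs=200_000, seed=1):
        random.seed(seed)
        clock = server_free = 0.0
        total_time_in_system = 0.0
        for _ in range(n_jobs):
            clock += random.expovariate(lam)      # next request arrives
            start = max(clock, server_free)       # wait if server is busy
            service = random.expovariate(mu)      # signing/encryption time
            server_free = start + service
            total_time_in_system += server_free - clock
        return total_time_in_system / n_jobs

    W_sim = simulate_mm1()
    W_theory = 1.0 / (10.0 - 8.0)
    print(f"simulated W = {W_sim:.3f} s, M/M/1 theory W = {W_theory:.3f} s")
    ```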

  15. Theory, Modeling, and Simulation of Semiconductor Lasers

    NASA Technical Reports Server (NTRS)

    Ning, Cun-Zheng; Saini, Subbash (Technical Monitor)

    1998-01-01

    Semiconductor lasers play very important roles in many areas of information technology. In this talk, I will first give an overview of semiconductor laser theory. This will be followed by a description of different models and their shortcomings in modeling and simulation. Our recent efforts in constructing a fully space and time resolved simulation model will then be described. Simulation results based on our model will be presented. Finally the effort towards a self-consistent and comprehensive simulation capability for the opto-electronics integrated circuits (OEICs) will be briefly reviewed.

  16. Advanced Numerical Methods and Software Approaches for Semiconductor Device Simulation

    DOE PAGESBeta

    Carey, Graham F.; Pardhanani, A. L.; Bova, S. W.

    2000-01-01

    In this article we concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of “upwind” and artificial dissipation schemes, generalization of the traditional Scharfetter-Gummel approach, Petrov-Galerkin and streamline-upwind Petrov-Galerkin (SUPG), “entropy” variables, transformations, least-squares mixed methods and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. We have included numerical examples from our recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area and we emphasize the need for further work on analysis, data structures and software to support adaptivity. Finally, we briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as those in the SANDIA National Laboratory framework SIERRA.

  17. Advanced numerical methods and software approaches for semiconductor device simulation

    SciTech Connect

    CAREY,GRAHAM F.; PARDHANANI,A.L.; BOVA,STEVEN W.

    2000-03-23

    In this article the authors concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of upwind and artificial dissipation schemes, generalization of the traditional Scharfetter-Gummel approach, Petrov-Galerkin and streamline-upwind Petrov Galerkin (SUPG), entropy variables, transformations, least-squares mixed methods and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. They have included numerical examples from the recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area and they emphasize the need for further work on analysis, data structures and software to support adaptivity. Finally, they briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as those in the SANDIA National Laboratory framework SIERRA.
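
    One of the schemes named above, the Scharfetter-Gummel discretization, reduces on a mesh edge to a flux formula built from the Bernoulli function B(x) = x/(e^x - 1). The sketch below shows that edge flux in one common sign convention (electrons only); the node values and units are illustrative and not taken from the article.

    ```python
    # Hedged illustration of the classical Scharfetter-Gummel edge flux
    # (one common sign convention).  n_i, n_j are nodal electron densities,
    # psi_i, psi_j nodal potentials, h the edge length, D the diffusivity,
    # and V_T the thermal voltage.  Values are illustrative (cm-based units).
    import numpy as np

    def bernoulli(x):
        # numerically safe Bernoulli function B(x) = x / (exp(x) - 1)
        x = np.asarray(x, dtype=float)
        small = np.abs(x) < 1e-10
        safe = np.where(small, 1.0, x)
        return np.where(small, 1.0 - x / 2.0, x / np.expm1(safe))

    def sg_electron_flux(n_i, n_j, psi_i, psi_j, h, D, V_T=0.0259):
        """Scharfetter-Gummel electron flux from node i to node j."""
        d = (psi_j - psi_i) / V_T
        return (D / h) * (n_j * bernoulli(d) - n_i * bernoulli(-d))

    # e.g., 0.1 um edge across a 0.3 V potential step
    print(sg_electron_flux(n_i=1e16, n_j=1e10, psi_i=0.0, psi_j=0.3,
                           h=1e-5, D=36.0))
    ```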

  18. Simulation of Thin-Film Damping and Thermal Mechanical Noise Spectra for Advanced Micromachined Microphone Structures

    PubMed Central

    Hall, Neal A.; Okandan, Murat; Littrell, Robert; Bicen, Baris; Degertekin, F. Levent

    2008-01-01

    In many micromachined sensors the thin (2–10 µm thick) air film between a compliant diaphragm and backplate electrode plays a dominant role in shaping both the dynamic and thermal noise characteristics of the device. Silicon microphone structures used in grating-based optical-interference microphones have recently been introduced that employ backplates with minimal area to achieve low damping and low thermal noise levels. Finite-element based modeling procedures based on 2-D discretization of the governing Reynolds equation are ideally suited for studying thin-film dynamics in such structures which utilize relatively complex backplate geometries. In this paper, the dynamic properties of both the diaphragm and thin air film are studied using a modal projection procedure in a commonly used finite element software and the results are used to simulate the dynamic frequency response of the coupled structure to internally generated electrostatic actuation pressure. The model is also extended to simulate thermal mechanical noise spectra of these advanced sensing structures. In all cases simulations are compared with measured data and show excellent agreement—demonstrating 0.8 pN/√Hz and 1.8 µPa/√Hz thermal force and thermal pressure noise levels, respectively, for the 1.5 mm diameter structures under study which have a fundamental diaphragm resonance-limited bandwidth near 20 kHz. PMID:19081811
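
    The link between simulated thin-film damping and thermal noise comes from the fluctuation-dissipation theorem: a mechanical damping coefficient b produces a thermal force noise density of sqrt(4*k_B*T*b). The back-of-envelope sketch below uses an assumed damping value chosen only to reproduce the ~pN/sqrt(Hz) scale quoted in the abstract; it is not the paper's model, and dividing by the geometric diaphragm area gives only an order-of-magnitude pressure noise (the effective acoustic area is a modal quantity).

    ```python
    # Fluctuation-dissipation estimate of thermal force and pressure noise.
    # The damping coefficient b below is an assumed illustrative value.
    import math

    k_B = 1.380649e-23          # J/K
    T = 300.0                   # K
    b = 4e-5                    # N*s/m, assumed thin-film damping coefficient
    diaphragm_diameter = 1.5e-3 # m, from the abstract

    force_noise = math.sqrt(4.0 * k_B * T * b)         # N/sqrt(Hz)
    area = math.pi * (diaphragm_diameter / 2.0) ** 2   # m^2 (geometric area)
    pressure_noise = force_noise / area                # Pa/sqrt(Hz), rough scale

    print(f"thermal force noise    ~ {force_noise * 1e12:.2f} pN/sqrt(Hz)")
    print(f"thermal pressure noise ~ {pressure_noise * 1e6:.2f} uPa/sqrt(Hz)")
    ```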

  19. Proceedings of the CASA Workshop on 3D Advanced Media In Gaming And Simulation (3AMIGAS)

    E-print Network

    Veltkamp, Remco

    Proceedings of the CASA Workshop on 3D Advanced Media In Gaming And Simulation (3AMIGAS), Amsterdam, The Netherlands, 16 June 2009. ISBN 978-90-393-5102-4. Preface: The 3AMIGAS workshop is about the creation and use of advanced 3D media in gaming

  20. RECENT ADVANCES IN MACROMOLECULAR HYDRODYNAMIC MODELING

    PubMed Central

    Aragon, Sergio R.

    2010-01-01

    The modern implementation of the boundary element method (S.R. Aragon, J. Comput. Chem. 25 (2004) 1191–1205) has ushered in unprecedented accuracy and precision for the solution of the Stokes equations of hydrodynamics with stick boundary conditions. This article begins by reviewing computations with the program BEST of smooth surface objects such as ellipsoids, the dumbbell, and cylinders that demonstrate that the numerical solution of the integral equation formulation of hydrodynamics yields very high precision and accuracy. When BEST is used for macromolecular computations, the limiting factor becomes the definition of the molecular hydrodynamic surface and the implied effective solvation of the molecular surface. Studies on 49 different proteins, ranging in molecular weight from 9 to over 400 kDa, have shown that a model using a 1.1 Å thick hydration layer describes all protein transport properties very well for the overwhelming majority of them. In addition, this data implies that the crystal structure is an excellent representation of the average solution structure for most of them. In order to investigate the origin of a handful of significant discrepancies in some multimeric proteins (over ~20% observed in the intrinsic viscosity), the technique of Molecular Dynamics simulation (MD) has been incorporated into the research program. A preliminary study of dimeric α-chymotrypsin using approximate implicit water MD is presented. In addition I describe the successful validation of modern protein force fields, ff03 and ff99SB, for the accurate computation of solution structure in explicit water simulation by comparison of trajectory ensemble average computed transport properties with experimental measurements. This work includes small proteins such as lysozyme, ribonuclease and ubiquitin using trajectories around 10 ns duration. We have also studied a 150 kDa flexible monoclonal IgG antibody, trastuzumab, with multiple independent trajectories encompassing over 320 ns of simulation. The close agreement within experimental error of the computed and measured properties allows us to conclude that MD does produce structures typical of those in solution, and that flexible molecules can be properly described using the method of ensemble averaging over a trajectory. We review similar work on the study of a transfer RNA molecule and DNA oligomers that demonstrate that within 3% a simple uniform hydration model 1.1 Å thick provides agreement with experiment for these nucleic acids. In the case of linear oligomers, the precision can be improved close to 1% by a non-uniform hydration model that hydrates mainly in the DNA grooves, in agreement with high resolution x-ray diffraction. We conclude with a vista on planned improvements for the BEST program to decrease its memory requirements and increase its speed without sacrificing accuracy. PMID:21073955

  1. Evaluating uncertainty in stochastic simulation models

    SciTech Connect

    McKay, M.D.

    1998-02-01

    This paper discusses fundamental concepts of uncertainty analysis relevant to both stochastic simulation models and deterministic models. A stochastic simulation model, called a simulation model, is a stochastic mathematical model that incorporates random numbers in the calculation of the model prediction. Queuing models are familiar simulation models in which random numbers are used for sampling interarrival and service times. Another example of simulation models is found in probabilistic risk assessments where atmospheric dispersion submodels are used to calculate movement of material. For these models, randomness comes not from the sampling of times but from the sampling of weather conditions, which are described by a frequency distribution of atmospheric variables like wind speed and direction as a function of height above ground. A common characteristic of simulation models is that single predictions, based on one interarrival time or one weather condition, for example, are not nearly as informative as the probability distribution of possible predictions induced by sampling the simulation variables like time and weather condition. The language of model analysis is often general and vague, with terms having mostly intuitive meaning. The definition and motivations for some of the commonly used terms and phrases offered in this paper lead to an analysis procedure based on prediction variance. In the following mathematical abstraction the authors present a setting for model analysis, relate practical objectives to mathematical terms, and show how two reasonable premises lead to a viable analysis strategy.
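
    A toy version of the prediction-variance analysis described above: sample the stochastic simulation variable (here an invented wind-speed distribution), run a stand-in dispersion submodel for each sample, and summarize the induced distribution of predictions. Both the model and the input distribution are fabricated for illustration.

    ```python
    # Toy prediction-variance study: repeated runs of a simple "simulation
    # model" whose prediction depends on a randomly sampled weather variable.
    import numpy as np

    rng = np.random.default_rng(0)

    def dispersion_model(wind_speed, release_rate=1.0, distance=1000.0):
        # crude stand-in for an atmospheric dispersion submodel:
        # concentration falls off with wind speed and distance
        return release_rate / (wind_speed * distance)

    # sample the stochastic simulation variable (wind speed) many times
    wind = rng.lognormal(mean=1.5, sigma=0.4, size=10_000)
    predictions = dispersion_model(wind)

    print(f"mean prediction     : {predictions.mean():.3e}")
    print(f"prediction variance : {predictions.var():.3e}")
    print(f"5th-95th percentile : {np.percentile(predictions, [5, 95])}")
    ```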

  2. Advances in modeling wave particle interactions in the radiation belts

    NASA Astrophysics Data System (ADS)

    Shprits, Yuri

    2012-07-01

    We discuss the recent advances in simulations of the inner and outer radiation belts including radial, pitch-angle, energy, and mixed diffusion, and non-linear interactions. Recently developed computer codes allow for quantification of the quasi-linear scattering due to day-side and night-side chorus waves, magneto-sonic waves, plasmaspheric hiss waves, EMIC and hiss waves in the regions of plumes, lightning-generated whistlers, and anthropogenic whistlers. Sensitivity simulations show that the knowledge of wave spectral properties and spatial distribution of waves is crucially important for reproducing long-term observations. The 3D simulations are compared to 3D reanalysis of the radiation belt fluxes that are obtained by blending the predictive model with observations from GEO, CRRES, Akebono, GPS and LANL. Recent research shows that similar processes may be responsible for acceleration and loss of energetic particles at the outer planets. Similar processes may also be important for acceleration and loss of particles on the Sun, in the solar wind, and in other corners of the Universe.
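
    For context, the quasi-linear radial diffusion equation that such radiation-belt codes typically solve is shown below in one common textbook form (not necessarily the exact system used in the simulations described above); f is the phase-space density, L* the radial coordinate, D_LL the radial diffusion coefficient, and tau a loss lifetime representing scattering into the atmosphere.

    ```latex
    % Generic quasi-linear radial diffusion equation with a loss term.
    \frac{\partial f}{\partial t}
      = L^{*2} \, \frac{\partial}{\partial L^{*}}
        \left( \frac{D_{LL}}{L^{*2}} \, \frac{\partial f}{\partial L^{*}} \right)
      - \frac{f}{\tau}
    ```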

  3. Technology evaluation, assessment, modeling, and simulation: the TEAMS capability

    NASA Astrophysics Data System (ADS)

    Holland, Orgal T.; Stiegler, Robert L.

    1998-08-01

    The United States Marine Corps' Technology Evaluation, Assessment, Modeling and Simulation (TEAMS) capability, located at the Naval Surface Warfare Center in Dahlgren Virginia, provides an environment for detailed test, evaluation, and assessment of live and simulated sensor and sensor-to-shooter systems for the joint warfare community. Frequent use of modeling and simulation allows for cost effective testing, benchmarking, and evaluation of various levels of sensors and sensor-to-shooter engagements. Interconnectivity to live, instrumented equipment operating in real battle space environments and to remote modeling and simulation facilities participating in advanced distributed simulations (ADS) exercises is available to support a wide range of situational assessment requirements. TEAMS provides a valuable resource for a variety of users. Engineers, analysts, and other technology developers can use TEAMS to evaluate, assess and analyze tactical relevant phenomenological data on tactical situations. Expeditionary warfare and USMC concept developers can use the facility to support and execute advanced warfighting experiments (AWE) to better assess operational maneuver from the sea (OMFTS) concepts, doctrines, and technology developments. Developers can use the facility to support sensor system hardware, software and algorithm development as well as combat development, acquisition, and engineering processes. Test and evaluation specialists can use the facility to plan, assess, and augment their processes. This paper presents an overview of the TEAMS capability and focuses specifically on the technical challenges associated with the integration of live sensor hardware into a synthetic environment and how those challenges are being met. Existing sensors, recent experiments and facility specifications are featured.

  4. Multiple model simulation: modelling cell division and differentiation in the

    E-print Network

    Stepney, Susan

    Multiple model simulation: modelling cell division and differentiation in the prostate. This approach to building a model of prostate cell division and differentiation allows each model layer to be designed and validated. In this paper we present the modelling and simulation of cell division

  5. Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling

    NASA Astrophysics Data System (ADS)

    Schum, William K.; Doolittle, Christina M.; Boyarko, George A.

    2006-05-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics and engineering-level modeling to mission and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DOD standards such as Distributed Interactive Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.

  6. Advances in simulating radiance signatures for dynamic air/water interfaces

    NASA Astrophysics Data System (ADS)

    Goodenough, Adam A.; Brown, Scott D.; Gerace, Aaron

    2015-05-01

    The air-water interface poses a number of problems for both collecting and simulating imagery. At the surface, the magnitude of observed radiance can change by multiple orders of magnitude at high spatiotemporal frequency due to glinting effects. In the volume, similarly high frequency focusing of photons by a dynamic wave surface significantly changes the reflected radiance of in-water objects and the scattered return of the volume itself. These phenomena are often manifest as saturated pixels and artifacts in collected imagery (often enhanced by time delays between neighboring pixels or interpolation between adjacent filters) and as noise and greater required computation times in simulated imagery. This paper describes recent advances made to the Digital Image and Remote Sensing Image Generation (DIRSIG) model to address the simulation issues to better facilitate an understanding of a multi/hyper-spectral collection. Glint effects are simulated using a dynamic height field that can be driven by wave frequency models and generates a sea state at arbitrary time scales. The volume scattering problem is handled by coupling the geometry representing the surface (facetization by the height field) with the single scattering contribution at any point in the water. The problem is constrained somewhat by assuming that contributions come from a Snell's window above the scattering point and by assuming a direct source (sun). Diffuse single scattered and multiple scattered energy contributions are handled by Monte Carlo techniques employed previously. The model is compared to existing radiative transfer codes where possible, with the objective of providing a robust model of time-dependent absolute radiance at many wavelengths.
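
    A minimal sketch of a dynamic height field of the kind mentioned above: superposed sinusoidal wave components with the deep-water dispersion relation omega = sqrt(g*k), evaluated on a grid at time t. The component amplitudes, wavelengths, and directions are arbitrary illustrative values and are not DIRSIG's wave model.

    ```python
    # Toy dynamic sea-surface height field built from a few sinusoidal
    # components with deep-water dispersion.  All component values are
    # illustrative placeholders.
    import numpy as np

    g = 9.81  # m/s^2

    def height_field(x, y, t, components):
        """components: list of (amplitude, wavelength, direction_rad, phase)."""
        z = np.zeros_like(x)
        for amp, wavelength, theta, phase in components:
            k = 2.0 * np.pi / wavelength
            omega = np.sqrt(g * k)                  # deep-water dispersion
            kx, ky = k * np.cos(theta), k * np.sin(theta)
            z += amp * np.cos(kx * x + ky * y - omega * t + phase)
        return z

    x, y = np.meshgrid(np.linspace(0, 50, 256), np.linspace(0, 50, 256))
    waves = [(0.30, 20.0, 0.0, 0.0), (0.10, 6.0, 0.5, 1.2), (0.03, 1.5, -0.8, 2.0)]
    z = height_field(x, y, t=0.0, components=waves)
    print("surface elevation range:", z.min(), "to", z.max())
    ```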

  7. ADVANCED SIMULATION CAPABILITY FOR ENVIRONMENTAL MANAGEMENT- CURRENT STATUS AND PHASE II DEMONSTRATION RESULTS

    SciTech Connect

    Seitz, R.

    2013-02-26

    The U.S. Department of Energy (USDOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Toolsets and High-Performance Computing (HPC) Multiprocess Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, toolsets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial toolsets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations, addressing attenuation-based remedies at the Savannah River Site F Area and performance assessment for a representative waste tank, illustrate integration of linked ASCEM capabilities and initial integration efforts with tools from the Cementitious Barriers Partnership.

  8. ADVANCED SIMULATION CAPABILITY FOR ENVIRONMENTAL MANAGEMENT – CURRENT STATUS AND PHASE II DEMONSTRATION RESULTS

    SciTech Connect

    Seitz, Roger; Freshley, Mark D.; Dixon, Paul; Hubbard, Susan S.; Freedman, Vicky L.; Flach, Gregory P.; Faybishenko, Boris; Gorton, Ian; Finsterle, Stefan A.; Moulton, John D.; Steefel, Carl I.; Marble, Justin

    2013-06-27

    The U.S. Department of Energy (USDOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Toolsets and High-Performance Computing (HPC) Multiprocess Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, toolsets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial toolsets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations, addressing attenuation-based remedies at the Savannah River Site F Area and performance assessment for a representative waste tank, illustrate integration of linked ASCEM capabilities and initial integration efforts with tools from the Cementitious Barriers Partnership.

  9. Steady-state Analysis Model for Advanced Fuelcycle Schemes

    Energy Science and Technology Software Center (ESTSC)

    2006-05-12

    The model was developed as part of the study, "Advanced Fuel Cycles and Waste Management", which was performed during 2003-2005 by an ad hoc expert group under the Nuclear Development Committee in the OECD/NEA. The model was designed for the efficient conduct of nuclear fuel cycle scheme cost analyses. It is simple and transparent, and offers users the capability to track the cost analysis results. All the fuel cycle schemes considered in the model are represented in a graphic format and all values related to a fuel cycle step are shown in the graphic interface, i.e., there are no hidden values embedded in the calculations. All data on the fuel cycle schemes considered in the study, including mass flows, waste generation, cost data, and other data such as activities, decay heat and neutron sources of spent fuel and high-level waste over time, are included in the model and can be displayed. The user can easily modify the values of mass flows and/or cost parameters and see the corresponding changes in the results. The model calculates: front-end fuel cycle mass flows such as requirements for enrichment and conversion services and natural uranium; mass of waste based on the waste generation parameters and the mass flow; and all costs. It performs Monte Carlo simulations by varying the values of all unit costs within their respective ranges (from lower to upper bounds).
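
    A minimal sketch of the kind of Monte Carlo cost calculation described above. The step names, cost bounds, mass flows, and uniform sampling are assumptions for illustration only and are not taken from the OECD/NEA model.

        import random

        # Hypothetical unit-cost ranges (lower, upper bounds) for a few fuel cycle steps,
        # in arbitrary $/kgHM units; the real model carries many more steps and cost items.
        unit_cost_bounds = {
            "uranium_purchase": (40.0, 120.0),
            "conversion": (5.0, 15.0),
            "enrichment": (80.0, 160.0),
            "fabrication": (200.0, 400.0),
            "disposal": (400.0, 800.0),
        }

        # Hypothetical mass flow (kgHM) associated with each step for one scheme.
        mass_flow = {
            "uranium_purchase": 9.0,
            "conversion": 9.0,
            "enrichment": 7.5,
            "fabrication": 1.0,
            "disposal": 1.0,
        }

        def sample_total_cost():
            """Draw each unit cost within its range and sum cost = unit cost * mass flow."""
            total = 0.0
            for step, (lo, hi) in unit_cost_bounds.items():
                unit_cost = random.uniform(lo, hi)
                total += unit_cost * mass_flow[step]
            return total

        random.seed(1)
        samples = sorted(sample_total_cost() for _ in range(10000))
        mean = sum(samples) / len(samples)
        print(f"mean cost  : {mean:.1f}")
        print(f"5th pctile : {samples[int(0.05 * len(samples))]:.1f}")
        print(f"95th pctile: {samples[int(0.95 * len(samples))]:.1f}")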

  10. Stress trajectory and advanced hydraulic-fracture simulations for the Eastern Gas Shales Project. Final report, April 30, 1981-July 30, 1983

    SciTech Connect

    Advani, S.H.; Lee, J.K.

    1983-01-01

    A summary review of hydraulic fracture modeling is given. Advanced hydraulic fracture model formulations and simulation, using the finite element method, are presented. The numerical examples include the determination of fracture width, height, length, and stress intensity factors with the effects of frac fluid properties, layered strata, in situ stresses, and joints. Future model extensions are also recommended. 66 references, 23 figures.
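
    As a hedged illustration of one of the quantities named above, the sketch below evaluates the textbook mode-I stress intensity factor for a through crack in an infinite plate, K_I = sigma * sqrt(pi * a). This is a closed-form special case, not the finite element formulation used in the report, and the stress and crack-length values are assumed.

        import math

        def mode_i_sif(remote_stress_pa, half_crack_length_m):
            """Mode-I stress intensity factor K_I = sigma * sqrt(pi * a) for a center
            crack in an infinite plate under remote tension (textbook closed form)."""
            return remote_stress_pa * math.sqrt(math.pi * half_crack_length_m)

        sigma = 10e6   # 10 MPa remote tensile stress (assumed)
        a = 0.5        # 0.5 m crack half-length (assumed)
        print(f"K_I = {mode_i_sif(sigma, a) / 1e6:.2f} MPa*sqrt(m)")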

  11. Modeling biological systems using Dynetica a simulator of dynamic networks

    E-print Network

    Richardson, David

    Modeling biological systems using Dynetica, a simulator of dynamic networks (June 2002). Mathematical modeling and computer simulation may deepen our understanding of complex systems. Keywords: mathematical modeling, deterministic simulation, stochastic simulation, genetic...

  12. VHDL simulation with access to transistor models

    NASA Technical Reports Server (NTRS)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  13. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed that provided a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and validate the multidimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multidimensional numerical model, which resulted in a net heat input of 240.3 W. The computational methodology resulted in a value of net heat input that was 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.
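
    A quick arithmetic check of the quoted 1.7 percent figure, using only the two net heat input values stated in the abstract:

        measured = 244.4   # W, from the validation test hardware
        simulated = 240.3  # W, from the multidimensional numerical model
        percent_diff = (measured - simulated) / measured * 100.0
        print(f"simulated is {percent_diff:.1f}% below measured")  # ~1.7%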

  14. Modeling techniques for simulating well behavior 

    E-print Network

    Rattu, Bungen Christina

    2002-01-01

    Relations to model the combined effect of wellbore storage and skin in pressure-transient tests are developed. These relations enable this effect to be modeled in any conventional reservoir simulator without the need to modify the existing program. Alternative grid...

  15. Material model validation for laser shock peening process simulation

    NASA Astrophysics Data System (ADS)

    Amarchinta, H. K.; Grandhi, R. V.; Langer, K.; Stargel, D. S.

    2009-01-01

    Advanced mechanical surface enhancement techniques have been used successfully to increase the fatigue life of metallic components. These techniques impart deep compressive residual stresses into the component to counter potentially damage-inducing tensile stresses generated under service loading. Laser shock peening (LSP) is an advanced mechanical surface enhancement technique used predominantly in the aircraft industry. To reduce costs and make the technique available on a large-scale basis for industrial applications, simulation of the LSP process is required. Accurate simulation of the LSP process is a challenging task, because the process has many parameters such as laser spot size, pressure profile and material model that must be precisely determined. This work focuses on investigating the appropriate material model that could be used in simulation and design. In the LSP process material is subjected to strain rates of 10^6 s^-1, which is very high compared with conventional strain rates. The importance of an accurate material model increases because the material behaves significantly differently at such high strain rates. This work investigates the effect of multiple nonlinear material models for representing the elastic-plastic behavior of materials. Elastic perfectly plastic, Johnson-Cook and Zerilli-Armstrong models are used, and the performance of each model is compared with available experimental results.
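
    The Johnson-Cook flow stress model named above has the standard form sigma = (A + B*eps^n) * (1 + C*ln(epsdot/epsdot0)) * (1 - T*^m). The sketch below evaluates it at an LSP-like strain rate; the coefficient values are illustrative assumptions, not the parameters calibrated in the paper.

        import math

        def johnson_cook_stress(eps_p, eps_rate, T, A, B, n, C, m,
                                eps_rate_ref=1.0, T_room=293.0, T_melt=1793.0):
            """Johnson-Cook flow stress (Pa): strain hardening, strain-rate hardening,
            and thermal softening terms multiplied together."""
            T_star = (T - T_room) / (T_melt - T_room)   # homologous temperature
            strain_term = A + B * eps_p ** n
            rate_term = 1.0 + C * math.log(eps_rate / eps_rate_ref)
            thermal_term = 1.0 - T_star ** m
            return strain_term * rate_term * thermal_term

        # Illustrative (assumed) parameters loosely typical of a structural alloy.
        sigma = johnson_cook_stress(eps_p=0.05, eps_rate=1.0e6, T=300.0,
                                    A=350e6, B=275e6, n=0.36, C=0.022, m=1.0)
        print(f"flow stress at 1e6 1/s: {sigma / 1e6:.0f} MPa")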

  16. 76 FR 68011 - Medicare Program; Advanced Payment Model

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    From the Federal Register Online via the Government Printing Office. Federal Register / Vol. 76, No. 212 / Wednesday, November 2, 2011. Medicare Program; Advanced Payment Model; Notice. AGENCY: Centers for Medicare & Medicaid Services...

  17. Large Eddy Simulations and Turbulence Modeling for Film Cooling

    NASA Technical Reports Server (NTRS)

    Acharya, Sumanta

    1999-01-01

    The objective of the research is to perform Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) for film cooling process, and to evaluate and improve advanced forms of the two equation turbulence models for turbine blade surface flow analysis. The DNS/LES were used to resolve the large eddies within the flow field near the coolant jet location. The work involved code development and applications of the codes developed to the film cooling problems. Five different codes were developed and utilized to perform this research. This report presented a summary of the development of the codes and their applications to analyze the turbulence properties at locations near coolant injection holes.

  18. Psychometric and Evidentiary Advances, Opportunities, and Challenges for Simulation-Based Assessment

    ERIC Educational Resources Information Center

    Levy, Roy

    2013-01-01

    This article characterizes the advances, opportunities, and challenges for psychometrics of simulation-based assessments through a lens that views assessment as evidentiary reasoning. Simulation-based tasks offer the prospect for student experiences that differ from traditional assessment. Such tasks may be used to support evidentiary arguments…

  19. SIMULATION MODELLING OF DEMENTIA PATIENT PATHWAYS

    E-print Network

    Oakley, Jeremy

    SIMULATION MODELLING OF DEMENTIA PATIENT PATHWAYS. Mohsen Jahangirian, Julie Eatock; School of Information Systems, Computing and Mathematics, Brunel University, London, UK. High-level model of the dementia pathway simulation, covering disease onset, disease progression, diagnosed patients, and community care.

  20. Crop Simulation Models and Decision Support Systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The first computer simulation models for agricultural systems were developed in the 1970s. These early models simulated potential production for major crops as a function of weather conditions, especially temperature and solar radiation. At a later stage, the water component was added to be able to ...

  1. Advanced Power Plant Modeling with Applications to an Advanced Boiling Water

    E-print Network

    Mitchell, John E.

    Advanced Power Plant Modeling with Applications to an Advanced Boiling Water Reactor and a Heat... Topics include heat transfer and the exact analytical solution of the governing differential equations; the computing-time advantage is not gained because of the additional...

  2. Advanced 3-Dimensional CAD Modeling of the Gear Hobbing Process V. Dimitriou 1

    E-print Network

    Aristomenis, Antoniadis

    Advanced 3-Dimensional CAD Modeling of the Gear Hobbing Process. V. Dimitriou, A. Antoniadis. For the realistic and accurate simulation of the gear hobbing process, an effective and factual approximation is directly applied in one gear gap. Each generating position formulates a three-dimensional surface path...

  3. Advanced tokamak research with integrated modeling in JT-60 Upgradea)

    NASA Astrophysics Data System (ADS)

    Hayashi, N.; JT-60 Team

    2010-05-01

    Research on the advanced tokamak (AT) has progressed with integrated modeling in JT-60 Upgrade [N. Oyama et al., Nucl. Fusion 49, 104007 (2009)]. Based on JT-60U experimental analyses and first-principles simulations, new models were developed and integrated into core, rotation, edge/pedestal, and scrape-off-layer (SOL)/divertor codes. The integrated models clarified complex and autonomous features in AT. An integrated core model was implemented to take account of an anomalous radial transport of alpha particles caused by Alfven eigenmodes. It showed the reduction in the fusion gain by the anomalous radial transport and further escape of alpha particles. The integrated rotation model showed mechanisms of rotation driven by the magnetic-field-ripple loss of fast ions and the charge separation due to fast-ion drift. An inward pinch model of high-Z impurity due to the atomic process was developed and indicated that the pinch velocity increases with the toroidal rotation. The integrated edge/pedestal model clarified causes of the collisionality dependence of energy loss due to the edge localized mode and the enhancement of energy loss by steepening a core pressure gradient just inside the pedestal top. An ideal magnetohydrodynamics stability code was developed to take account of toroidal rotation and clarified a destabilizing effect of rotation on the pedestal. The integrated SOL/divertor model clarified a mechanism of X-point multifaceted asymmetric radiation from the edge.

  4. Advanced Thermal Simulator Testing: Thermal Analysis and Test Results

    NASA Technical Reports Server (NTRS)

    Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David; Reid, Robert; Adams, Mike; Davis, Joe

    2008-01-01

    Work at the NASA Marshall Space Flight Center seeks to develop high fidelity, electrically heated thermal simulators that represent fuel elements in a nuclear reactor design to support non-nuclear testing applicable to the development of a space nuclear power or propulsion system. Comparison between the fuel pins and thermal simulators is made at the outer fuel clad surface, which corresponds to the outer sheath surface in the thermal simulator. The thermal simulators that are currently being tested correspond to a SNAP derivative reactor design that could be applied for Lunar surface power. These simulators are designed to meet the geometric and power requirements of a proposed surface power reactor design, accommodate testing of various axial power profiles, and incorporate imbedded instrumentation. This paper reports the results of thermal simulator analysis and testing in a bare element configuration, which does not incorporate active heat removal, and testing in a water-cooled calorimeter designed to mimic the heat removal that would be experienced in a reactor core.

  5. Recent advances in modeling discontinuities in anisotropic and heterogeneous materials in eddy current NDE

    SciTech Connect

    Aldrin, John C.; Sabbagh, Harold A.; Murphy, R. Kim; Sabbagh, Elias H.

    2011-06-23

    Recent advances are presented to model discontinuities in the random anisotropies that arise in certain materials, such as titanium alloys. A numerical model is developed to provide a full anisotropic representation of each crystallite in a gridded region of the material. Several simulated and experimental demonstrations are presented highlighting the effect of grain noise on eddy current measurements. Agreement between VIC-3D(c) model calculations and experimental data in titanium alloy specimens with known flaws is demonstrated.

  6. A Distributed Simulation Facility to Support Human Factors Research in Advanced Air Transportation Technology

    NASA Technical Reports Server (NTRS)

    Amonlirdviman, Keith; Farley, Todd C.; Hansman, R. John, Jr.; Ladik, John F.; Sherer, Dana Z.

    1998-01-01

    A distributed real-time simulation of the civil air traffic environment developed to support human factors research in advanced air transportation technology is presented. The distributed environment is based on a custom simulation architecture designed for simplicity and flexibility in human experiments. Standard Internet protocols are used to create the distributed environment, linking an advanced cockpit simulator, an Air Traffic Control simulator, and a pseudo-aircraft control and simulation management station. The pseudo-aircraft control station also functions as a scenario design tool for coordinating human factors experiments. This station incorporates a pseudo-pilot interface designed to reduce workload for human operators piloting multiple aircraft simultaneously in real time. The application of this distributed simulation facility to support a study of the effect of shared information (via air-ground datalink) on pilot/controller shared situation awareness and re-route negotiation is also presented.

  7. Simulation model of a twin-tail, high performance airplane

    NASA Technical Reports Server (NTRS)

    Buttrill, Carey S.; Arbuckle, P. Douglas; Hoffler, Keith D.

    1992-01-01

    The mathematical model and associated computer program to simulate a twin-tailed high performance fighter airplane (McDonnell Douglas F/A-18) are described. The simulation program is written in the Advanced Continuous Simulation Language. The simulation math model includes the nonlinear six degree-of-freedom rigid-body equations, an engine model, sensors, and first order actuators with rate and position limiting. A simplified form of the F/A-18 digital control laws (version 8.3.3) is implemented. The simulated control law includes only inner loop augmentation in the up and away flight mode. The aerodynamic forces and moments are calculated from a wind-tunnel-derived database using table look-ups with linear interpolation. The aerodynamic database has an angle-of-attack range of -10 to +90 degrees and a sideslip range of -20 to +20 degrees. The effects of elastic deformation are incorporated in a quasi-static-elastic manner. Elastic degrees of freedom are not actively simulated. In the engine model, the throttle-commanded steady-state thrust level and the dynamic response characteristics of the engine are based on airflow rate as determined from a table look-up. Afterburner dynamics are switched in at a threshold based on the engine airflow and commanded thrust.
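
    A minimal sketch of the table look-up with linear interpolation described above, here for a hypothetical lift-coefficient table over angle of attack and sideslip. The breakpoints and coefficient values are invented for illustration and are not taken from the F/A-18 database.

        import numpy as np

        # Hypothetical breakpoints spanning the ranges quoted in the abstract.
        alpha_bp = np.array([-10.0, 0.0, 10.0, 30.0, 60.0, 90.0])   # angle of attack, deg
        beta_bp  = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])        # sideslip, deg
        # Hypothetical CL values on the (alpha, beta) grid.
        cl_table = np.array([[-0.4, -0.42, -0.45, -0.42, -0.4],
                             [ 0.1,  0.12,  0.15,  0.12,  0.1],
                             [ 0.8,  0.85,  0.9,   0.85,  0.8],
                             [ 1.3,  1.35,  1.4,   1.35,  1.3],
                             [ 0.9,  0.95,  1.0,   0.95,  0.9],
                             [ 0.4,  0.45,  0.5,   0.45,  0.4]])

        def bilinear_lookup(alpha, beta):
            """Bilinear interpolation in the (alpha, beta) table, clamped to the edges."""
            i = np.clip(np.searchsorted(alpha_bp, alpha) - 1, 0, len(alpha_bp) - 2)
            j = np.clip(np.searchsorted(beta_bp, beta) - 1, 0, len(beta_bp) - 2)
            ta = (alpha - alpha_bp[i]) / (alpha_bp[i + 1] - alpha_bp[i])
            tb = (beta - beta_bp[j]) / (beta_bp[j + 1] - beta_bp[j])
            ta, tb = np.clip(ta, 0.0, 1.0), np.clip(tb, 0.0, 1.0)
            return ((1 - ta) * (1 - tb) * cl_table[i, j]
                    + ta * (1 - tb) * cl_table[i + 1, j]
                    + (1 - ta) * tb * cl_table[i, j + 1]
                    + ta * tb * cl_table[i + 1, j + 1])

        print(f"CL at alpha=5 deg, beta=3 deg: {bilinear_lookup(5.0, 3.0):.3f}")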

  8. Physically-based Modeling and Simulation of Extraocular Muscles

    PubMed Central

    Wei, Qi; Sueda, Shinjiro; Pai, Dinesh K.

    2010-01-01

    Dynamic simulation of human eye movements, with realistic physical models of extraocular muscles (EOMs), may greatly advance our understanding of the complexities of the oculomotor system and aid in treatment of visuomotor disorders. In this paper we describe the first three dimensional (3D) biomechanical model which can simulate the dynamics of ocular motility at interactive rates. We represent EOMs using “strands”, which are physical primitives that can model an EOM's complex nonlinear anatomical and physiological properties. Contact between the EOMs, the globe, and orbital structures can be explicitly modeled. Several studies were performed to assess the validity and utility of the model. EOM deformation during smooth pursuit was simulated and compared with published experimental data; the model reproduces qualitative features of the observed non-uniformity. The model is able to reproduce realistic saccadic trajectories when the lateral rectus muscle was driven by published measurements of abducens neuron discharge. Finally, acute superior oblique palsy, a pathological condition, was simulated to further evaluate the system behavior; the predicted deviation patterns agree qualitatively with experimental observations. This example also demonstrates potential clinical applications of such a model. PMID:20868704

  9. Advances in the U.S. Navy Non-hydrostatic Unified Model of the Atmosphere (NUMA): LES as a Stabilization Methodology for High-Order Spectral Elements in the Simulation of Deep Convection

    NASA Astrophysics Data System (ADS)

    Marras, Simone; Giraldo, Frank

    2015-04-01

    The prediction of extreme weather sufficiently ahead of its occurrence impacts society as a whole and coastal communities specifically (e.g. Hurricane Sandy that impacted the eastern seaboard of the U.S. in the fall of 2012). With the final goal of solving hurricanes at very high resolution and numerical accuracy, we have been developing the Non-hydrostatic Unified Model of the Atmosphere (NUMA) to solve the Euler and Navier-Stokes equations by arbitrary high-order element-based Galerkin methods on massively parallel computers. NUMA is a unified model with respect to the following criteria: (a) it is based on unified numerics in that element-based Galerkin methods allow the user to choose between continuous (spectral elements, CG) or discontinuous Galerkin (DG) methods and from a large spectrum of time integrators, (b) it is unified across scales in that it can solve flow in limited-area mode (flow in a box) or in global mode (flow on the sphere). NUMA is the dynamical core that powers the U.S. Naval Research Laboratory's next-generation global weather prediction system NEPTUNE (Navy's Environmental Prediction sysTem Utilizing the NUMA corE). Because the solution of the Euler equations by high order methods is prone to instabilities that must be damped in some way, we approach the problem of stabilization via an adaptive Large Eddy Simulation (LES) scheme meant to treat such instabilities by modeling the sub-grid scale features of the flow. The novelty of our effort lies in the extension to high order spectral elements for low Mach number stratified flows of a method that was originally designed for low order, adaptive finite elements in the high Mach number regime [1]. The Euler equations are regularized by means of a dynamically adaptive stress tensor that is proportional to the residual of the unperturbed equations. Its effect is close to none where the solution is sufficiently smooth, whereas it increases elsewhere, with a direct contribution to the stabilization of the otherwise oscillatory solution. As a first step toward the Large Eddy Simulation of a hurricane, we verify the model via a high-order and high resolution idealized simulation of deep convection on the sphere. References [1] M. Nazarov and J. Hoffman (2013) Residual-based artificial viscosity for simulation of turbulent compressible flow using adaptive finite element methods Int. J. Numer. Methods Fluids, 71:339-357
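
    A minimal one-dimensional sketch of the residual-based artificial viscosity idea cited above ([1]), applied here to the inviscid Burgers equation on a periodic grid rather than to the Euler equations or to spectral elements. The constants and the simple finite-difference discretization are assumptions chosen only to illustrate how the residual drives the added viscosity.

        import numpy as np

        # Periodic 1D grid and an initial condition that steepens toward a shock.
        n, L = 200, 2.0 * np.pi
        dx = L / n
        x = np.arange(n) * dx
        u = 1.0 + 0.5 * np.sin(x)
        dt = 0.2 * dx / np.abs(u).max()
        C_max, C_res = 0.5, 1.0   # illustrative tuning constants

        u_prev = u.copy()
        for step in range(200):
            # Centered derivatives on the periodic grid.
            ux = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)
            # Residual of the unperturbed equation, R = u_t + u*u_x, using the last step.
            residual = (u - u_prev) / dt + u * ux
            norm = np.abs(u - u.mean()).max() + 1e-12
            # Residual-based viscosity, capped by a first-order upwind-like value.
            nu = np.minimum(C_max * dx * np.abs(u),
                            C_res * dx * dx * np.abs(residual) / norm)
            uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / (dx * dx)
            u_prev = u.copy()
            u = u + dt * (-u * ux + nu * uxx)

        print(f"max residual-based viscosity after 200 steps: {nu.max():.3e}")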

  10. Developments in Atmosphere Revitalization Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Knox, James C.; Kittredge, Kenneth; Xoker, Robert F.; Cummings, Ramona; Gomez, Carlos F.

    2012-01-01

    "NASA's Advanced Exploration Systems (AES) program is pioneering new approaches for rapidly developing prototype systems, demonstrating key capabilities, and validating operational concepts for future human missions beyond Earth orbit" (NASA 2012). These forays beyond the confines of earth's gravity will place unprecedented demands on launch systems. They must not only blast out of earth's gravity well as during the Apollo moon missions, but also launch the supplies needed to sustain a crew over longer periods for exploration missions beyond earth's moon. Thus all spacecraft systems, including those for the separation of metabolic carbon dioxide and water from a crewed vehicle, must be minimized with respect to mass, power, and volume. Emphasis is also placed on system robustness both to minimize replacement parts and ensure crew safety when a quick return to earth is not possible. Current efforts are focused on improving the current state-of-the-art systems utilizing fixed beds of sorbent pellets by evaluating structured sorbents, seeking more robust pelletized sorbents, and examining alternate bed configurations to improve system efficiency and reliability. These development efforts combine testing of sub-scale systems and multi-physics computer simulations to evaluate candidate approaches, select the best performing options, and optimize the configuration of the selected approach, which is then implemented in a full-scale integrated atmosphere revitalization test. This paper describes the development of atmosphere revitalization models and simulations. A companion paper discusses the hardware design and sorbent screening and characterization effort in support of the Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project within the AES program.

  11. Photocatalytic Removal of Microcystin-LR by Advanced WO3-Based Nanoparticles under Simulated Solar Light

    PubMed Central

    Zhao, Chao; Li, Dawei; Feng, Chuanping; Zhang, Zhenya; Sugiura, Norio; Yang, Yingnan

    2015-01-01

    A series of advanced WO3-based photocatalysts including CuO/WO3, Pd/WO3, and Pt/WO3 were synthesized for the photocatalytic removal of microcystin-LR (MC-LR) under simulated solar light. In the present study, Pt/WO3 exhibited the best performance for the photocatalytic degradation of MC-LR. The MC-LR degradation can be described by a pseudo-first-order kinetic model. Chloride ion (Cl-) at an appropriate concentration could enhance the MC-LR degradation. The presence of metal cations (Cu2+ and Fe3+) improved the photocatalytic degradation of MC-LR. This study suggests that Pt/WO3 photocatalytic oxidation under solar light is a promising option for the purification of water containing MC-LR. PMID:25884038
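
    The pseudo-first-order kinetic model mentioned above is C(t) = C0 * exp(-k*t), i.e., ln(C0/C) = k*t. The sketch below fits a rate constant to made-up concentration data; the time points and concentrations are illustrative assumptions, not the measured MC-LR values.

        import numpy as np

        # Hypothetical MC-LR concentrations (ug/L) versus irradiation time (min).
        t = np.array([0.0, 10.0, 20.0, 30.0, 60.0])
        c = np.array([10.0, 7.2, 5.3, 3.9, 1.5])

        # Pseudo-first-order model: ln(C0/C) = k*t, so k is the slope of ln(C0/C) vs t.
        y = np.log(c[0] / c)
        k = np.polyfit(t, y, 1)[0]
        half_life = np.log(2.0) / k
        print(f"k = {k:.4f} 1/min, half-life = {half_life:.1f} min")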

  12. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    SciTech Connect

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  13. Stellar Outflows with New Tools: Advanced Simulations and Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Frank, A.; Poludnenko, A.; Gardiner, T. A.; Lebedev, S. V.; Drake, R. P.

    2003-01-01

    In this contribution we provide a brief overview of new numerical results describing the evolution of clumpy flows as well as new studies of magnetized winds/jets. In addition, we report on a new approach to studying these phenomena: direct laboratory experiments. Recent advances in the use of "high energy density" laboratory devices now allow researchers to produce scalable plasma flow experiments relevant to hypersonic astrophysical jets and shock-clump interactions in the context of circumstellar outflows.

  14. A mathematical representation of an advanced helicopter for piloted simulator investigations of control system and display variations

    NASA Technical Reports Server (NTRS)

    Aiken, E. W.

    1980-01-01

    A mathematical model of an advanced helicopter is described. The model is suitable for use in control/display research involving piloted simulation. The general design approach for the six degree of freedom equations of motion is to use the full set of nonlinear gravitational and inertial terms of the equations and to express the aerodynamic forces and moments as the reference values and first order terms of a Taylor series expansion about a reference trajectory defined as a function of longitudinal airspeed. Provisions for several different specific and generic flight control systems are included in the model. The logic required to drive various flight control and weapon delivery symbols on a pilot's electronic display is also provided. Finally, the model includes a simplified representation of low altitude wind and turbulence effects. This model was used in a piloted simulator investigation of the effects of control system and display variations for an attack helicopter mission.

  15. Advanced microwave forward model for the land surface data assimilation

    NASA Astrophysics Data System (ADS)

    Park, Chang-Hwan; Pause, Marion; Gayler, Sebastian; Wollschlaeger, Ute; Jackson, Thomas J.; LeDrew, Ellsworth; Behrendt, Andreas; Wulfmeyer, Volker

    2015-04-01

    From local to global scales, microwave remote-sensing techniques can provide temporally and spatially highly resolved observations of land surface properties including soil moisture and temperature as well as the state of vegetation. These variables are critical for agricultural productivity and water resource management. Furthermore, having accurate information on these variables allows us to improve the performance of numerical weather forecasts and climate prediction models. However, it is challenging to translate a measured brightness temperature into the multiple land surface properties because of the inherent inversion problem. In this study, we introduce a novel forward model for microwave remote sensing to resolve this inversion problem and to close the gap between land surface modeling and observations. It is composed of the Noah-MP land surface model as well as new models for the dielectric mixing and the radiative transfer. For developing a realistic forward operator, the land surface model must simulate soil and vegetation processes properly. The Noah-MP land surface model provides an excellent starting point because it already contains a sophisticated soil texture and land cover data set. Soil moisture transport is derived using the Richards equation in combination with a set of soil hydraulic parameters. Vegetation properties are considered using several photosynthesis models of different complexity. The energy balance is closed for the top soil and the vegetation layers. The energy flux becomes more realistic because the model includes not only the volumetric ratio of land surface properties but also their surface fraction as sub-grid scale information (semitile approach). The dielectric constant is the fundamental link for quantifying the land surface properties. Our new physically based dielectric-mixing model is superior to previous calibration-based and semi-empirical approaches. Furthermore, owing to the consideration of the oversaturated surface dielectric behaviour, a significant improvement from the new approach would be expected in monitoring surface runoff and infiltration, managing and improving irrigation systems, and mapping and predicting flood events. Finally, the novel dielectric-mixing model is able to successfully integrate the land surface model and the microwave dielectric constant. Radiative transfer is calculated for the bare soil and the vegetated components of the grid box using a two-stream radiative transfer model. These model characteristics provide all relevant information needed for a simulation of the microwave emission from the land surface with unprecedented realism. Noah-MP is coupled with the Weather Research and Forecasting (WRF) model system. Also, the novel dielectric-mixing model physically links the Noah-MP land surface properties and the microwave effective dielectric constant. Finally, with the existing radiative transfer model the advanced forward model can assimilate microwave brightness temperature into a consistent land-surface-atmosphere system. A case study will be provided to investigate how well the simulation of the forward model matches the real world. L-band microwave remote-sensing measurements over the Schäfertal region in Germany have been used for this case study.

  16. A holistic water depth simulation model for small ponds

    NASA Astrophysics Data System (ADS)

    Ali, Shakir; Ghosh, Narayan C.; Mishra, P. K.; Singh, R. K.

    2015-10-01

    Estimation of the time-varying water depth and time to empty of a pond is a prerequisite for comprehensive and coordinated planning of water resources for their effective utilization. A holistic water depth simulation (HWDS) and time to empty (TE) model for small, shallow ephemeral ponds have been derived by employing the generalized model based on the Green-Ampt equation in the basic water balance equation. The HWDS model includes time-varying rainfall, runoff, surface water evaporation, outflow and advancement of wetting front length as external inputs. The TE model includes two external inputs: surface water evaporation and advancement of wetting front length. Both models also consider the saturated hydraulic conductivity and fillable porosity of the pond's bed material as their parameters. The solution of the HWDS model involved numerical iteration in successive time intervals. The HWDS model has been successfully evaluated with 3 years of field data from two small ponds located within a watershed in a semi-arid region in western India. The HWDS model simulated the time-varying water depth in the ponds with high accuracy, as shown by the correlation coefficient (R2 ≥ 0.9765), index of agreement (d ≥ 0.9878), root mean square error (RMSE ≤ 0.20 m) and percent bias (PB ≤ 6.23%) for the pooled data sets of the measured and simulated water depth. The statistical F and t-tests also confirmed the reliability of the HWDS model at a probability level of p ≤ 0.0001. The response of the TE model showed its ability to estimate the time to empty the ponds. An additional field calibration and validation of the HWDS and TE models with observed field data in varied hydro-climatic conditions could be conducted to increase the applicability and credibility of the models.
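
    A small sketch of the goodness-of-fit statistics quoted above (RMSE, percent bias, and the index of agreement d), computed here for made-up measured and simulated water depths. The definitions are the standard ones; the data values are illustrative assumptions only.

        import numpy as np

        measured  = np.array([1.20, 1.05, 0.92, 0.80, 0.61, 0.45, 0.30])  # m, assumed
        simulated = np.array([1.18, 1.08, 0.95, 0.77, 0.58, 0.47, 0.28])  # m, assumed

        def rmse(obs, sim):
            return np.sqrt(np.mean((sim - obs) ** 2))

        def percent_bias(obs, sim):
            return 100.0 * np.sum(sim - obs) / np.sum(obs)

        def index_of_agreement(obs, sim):
            """Willmott's index of agreement d (0 to 1, 1 = perfect)."""
            num = np.sum((sim - obs) ** 2)
            den = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
            return 1.0 - num / den

        print(f"RMSE = {rmse(measured, simulated):.3f} m")
        print(f"PB   = {percent_bias(measured, simulated):.2f} %")
        print(f"d    = {index_of_agreement(measured, simulated):.4f}")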

  17. Advances in Discrete-Event Simulation for MSL Command Validation

    NASA Technical Reports Server (NTRS)

    Patrikalakis, Alexander; O'Reilly, Taifun

    2013-01-01

    In the last five years, the discrete event simulator, SEQuence GENerator (SEQGEN), developed at the Jet Propulsion Laboratory to plan deep-space missions, has greatly increased uplink operations capacity to deal with increasingly complicated missions. In this paper, we describe how the Mars Science Laboratory (MSL) project makes full use of an interpreted environment to simulate change in more than fifty thousand flight software parameters and conditional command sequences to predict the result of executing a conditional branch in a command sequence, and enable the ability to warn users whenever one or more simulated spacecraft states change in an unexpected manner. Using these new SEQGEN features, operators plan more activities in one sol than ever before.

  18. Simulation of large-scale rule-based models

    SciTech Connect

    Hlavacek, William S; Monnie, Michael I; Colvin, Joshua; Faseder, James

    2008-01-01

    Interactions of molecules, such as signaling proteins, with multiple binding sites and/or multiple sites of post-translational covalent modification can be modeled using reaction rules. Rules comprehensively, but implicitly, define the individual chemical species and reactions that molecular interactions can potentially generate. Although rules can be automatically processed to define a biochemical reaction network, the network implied by a set of rules is often too large to generate completely or to simulate using conventional procedures. To address this problem, we present DYNSTOC, a general-purpose tool for simulating rule-based models. DYNSTOC implements a null-event algorithm for simulating chemical reactions in a homogenous reaction compartment. The simulation method does not require that a reaction network be specified explicitly in advance, but rather takes advantage of the availability of the reaction rules in a rule-based specification of a network to determine if a randomly selected set of molecular components participates in a reaction during a time step. DYNSTOC reads reaction rules written in the BioNetGen language which is useful for modeling protein-protein interactions involved in signal transduction. The method of DYNSTOC is closely related to that of STOCHSIM. DYNSTOC differs from STOCHSIM by allowing for model specification in terms of BNGL, which extends the range of protein complexes that can be considered in a model. DYNSTOC enables the simulation of rule-based models that cannot be simulated by conventional methods. We demonstrate the ability of DYNSTOC to simulate models accounting for multisite phosphorylation and multivalent binding processes that are characterized by large numbers of reactions. DYNSTOC is free for non-commercial use. The C source code, supporting documentation and example input files are available at .
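
    A minimal sketch of a null-event (fixed time step) stochastic simulation in the spirit described above, for a single reversible binding rule A + B <-> AB. The rule set, rate constants, and fixed time step are assumptions chosen for illustration; this toy example does not reproduce DYNSTOC's BNGL input handling or its molecule-level bookkeeping.

        import random

        random.seed(0)

        # Molecule counts for one rule pair: A + B -> AB (binding), AB -> A + B (unbinding).
        n_A, n_B, n_AB = 100, 100, 0
        k_bind, k_unbind = 0.005, 0.1   # assumed rate constants (per time unit)
        dt = 0.01                        # fixed time step; null events keep time uniform
        t, t_end = 0.0, 50.0

        while t < t_end:
            # Probability of each event in this small step; the remainder is a null event.
            p_bind = k_bind * n_A * n_B * dt
            p_unbind = k_unbind * n_AB * dt
            r = random.random()
            if r < p_bind:
                n_A, n_B, n_AB = n_A - 1, n_B - 1, n_AB + 1
            elif r < p_bind + p_unbind:
                n_A, n_B, n_AB = n_A + 1, n_B + 1, n_AB - 1
            # else: null event, nothing reacts during this step
            t += dt

        print(f"t = {t:.1f}: A = {n_A}, B = {n_B}, AB = {n_AB}")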

  19. Advanced Simulation in Undergraduate Pilot Training (ASUPT) Facility Utilization Plan.

    ERIC Educational Resources Information Center

    Hagin, William V.; Smith, James F.

    The capabilities of a flight simulation research facility located at Williams AFB, Arizona are described. Research philosophy to be applied is discussed. Long range and short range objectives are identified. A time phased plan for long range research accomplishment is described. In addition, some examples of near term research efforts which will…

  20. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  1. The Simulation of a Jumbo Jet Transport Aircraft. Volume 2: Modeling Data

    NASA Technical Reports Server (NTRS)

    Hanke, C. R.; Nordwall, D. R.

    1970-01-01

    The manned simulation of a large transport aircraft is described. Aircraft and systems data necessary to implement the mathematical model described in Volume I, and a discussion of how these data are used in the model, are presented. The results of the real-time computations in the NASA Ames Research Center Flight Simulator for Advanced Aircraft are shown and compared to flight test data and to the results obtained in a training simulator known to be satisfactory.

  2. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  3. Uncalibrated Building Energy Simulation Modeling Results 

    E-print Network

    Ahmad, M.; Culp, C.H.

    2006-01-01

    Uncalibrated Building Energy Simulation Modeling Results. Mushtaq Ahmad; Charles H. Culp, PhD, PE. Associate Member ASHRAE; Fellow ASHRAE. Received June 23, 2005; accepted April 17, 2006. Uncalibrated simulations have provided useful... for calibrated simulation procedures and tools. Mushtaq Ahmad is a research engineering associate II in the Energy Systems Laboratory and Charles H. Culp is an associate professor in the Department of Architecture and associate director of the Energy Systems...

  4. Advisor 2.0: A Second-Generation Advanced Vehicle Simulator for Systems Analysis

    SciTech Connect

    Wipke, K.; Cuddy, M.; Bharathan, D.; Burch, S.; Johnson, V.; Markel, A.; Sprik, S.

    1999-03-23

    The National Renewable Energy Laboratory has recently publicly released its second-generation advanced vehicle simulator called ADVISOR 2.0. This software program was initially developed four years ago, and after several years of in-house usage and evolution, the tool is now available to the public through a new vehicle systems analysis World Wide Web page. ADVISOR has been applied to many different systems analysis problems, such as helping to develop the SAE J1711 test procedure for hybrid vehicles and helping to evaluate new technologies as part of the Partnership for a New Generation of Vehicles (PNGV) technology selection process. The model has been and will continue to be benchmarked and validated with other models and with real vehicle test data. After two months of being available on the Web, more than 100 users have downloaded ADVISOR. ADVISOR 2.0 has many new features, including an easy-to-use graphical user interface, a detailed exhaust aftertreatment thermal model, and complete browser-based documentation. Future work will include adding to the library of components available in ADVISOR, including optimization functionality, and linking with a more detailed fuel cell model.

  5. Simulating data processing for an Advanced Ion Mobility Mass Spectrometer

    SciTech Connect

    Chavarría-Miranda, Daniel; Clowers, Brian H.; Anderson, Gordon A.; Belov, Mikhail E.

    2007-11-03

    We have designed and implemented a Cray XD-1-based simulation of data capture and signal processing for an advanced Ion Mobility mass spectrometer (Hadamard transform Ion Mobility). Our simulation is a hybrid application that uses both an FPGA component and a CPU-based software component to simulate Ion Mobility mass spectrometry data processing. The FPGA component includes data capture and accumulation, as well as a more sophisticated deconvolution algorithm based on a PNNL-developed enhancement to standard Hadamard transform Ion Mobility spectrometry. The software portion is in charge of streaming data to the FPGA and collecting results. We expect the computational and memory addressing logic of the FPGA component to be portable to an instrument-attached FPGA board that can be interfaced with a Hadamard transform Ion Mobility mass spectrometer.
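
    A hedged sketch of the basic Hadamard-transform multiplexing and recovery idea underlying the instrument described above, using scipy's Hadamard matrix as the weighing design. The actual instrument uses a pseudo-random gating sequence and the PNNL-enhanced deconvolution, which this toy example does not reproduce; the spectrum, peak positions, and noise level are assumptions.

        import numpy as np
        from scipy.linalg import hadamard

        rng = np.random.default_rng(0)
        n = 64

        # True (unknown) arrival-time spectrum: a few peaks on a flat baseline.
        x = np.zeros(n)
        x[[10, 25, 40]] = [5.0, 3.0, 8.0]

        # Multiplexed measurement: each sample is a Hadamard-weighted sum of the spectrum,
        # plus detector noise. Multiplexing spreads each peak's signal over all samples.
        H = hadamard(n).astype(float)
        y = H @ x + rng.normal(0.0, 0.2, size=n)

        # Recovery (deconvolution): H is orthogonal up to a factor n, so H^T H = n*I.
        x_hat = (H.T @ y) / n

        peaks = np.argsort(x_hat)[-3:]
        print("recovered peak bins:", sorted(peaks.tolist()))
        print("recovered peak heights:", np.round(np.sort(x_hat)[-3:], 2))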

  6. ADVANCES IN COMPREHENSIVE GYROKINETIC SIMULATIONS OF TRANSPORT IN TOKAMAKS

    SciTech Connect

    WALTZ RE; CANDY J; HINTON FL; ESTRADA-MILA C; KINSEY JE

    2004-10-01

    A continuum global gyrokinetic code GYRO has been developed to comprehensively simulate core turbulent transport in actual experimental profiles and enable direct quantitative comparisons to the experimental transport flows. GYRO not only treats the now standard ion temperature gradient (ITG) mode turbulence, but also treats trapped and passing electrons with collisions and finite β, equilibrium ExB shear stabilization, and all in real tokamak geometry. Most importantly the code operates at finite relative gyroradius (ρ*) so as to treat the profile shear stabilization and nonlocal effects which can break gyroBohm scaling. The code operates either in a cyclic flux-tube limit (which allows only gyroBohm scaling) or globally with physical profile variation. Bohm scaling of DIII-D L-mode has been simulated, with power flows matching experiment within error bars on the ion temperature gradient. Mechanisms for broken gyroBohm scaling, neoclassical ion flows embedded in turbulence, turbulent dynamos and profile corrugations, plasma pinches and impurity flow, and simulations at fixed flow rather than fixed gradient are illustrated and discussed.

  7. Modeling, simulation and optimization of

    E-print Network

    Bogliolo, Alessandro

    Lecture slides on a multiprocessor simulation platform (Massimo Poncino): cores with caches and hardware blocks connected to memories through an interconnection. Interconnection options include a shared bus (low cost, not scalable, capacitance grows) versus scalable but more complex alternatives.

  8. Deep Modeling Complex Couplings within Financial Markets Advanced Analytics Institute

    E-print Network

    Cao, Longbing

    Deep Modeling Complex Couplings within Financial Markets. Wei Cao, Advanced Analytics Institute; Shanghai Jiaotong University (lianghu@sjtu.edu.cn); Longbing Cao, Advanced Analytics Institute, University... Contagion to other regions, as well as the long-lasting impact on different markets, show...

  9. SIMULATION MODELING OF GASTROINTESTINAL ABSORPTION

    EPA Science Inventory

    Mathematical dosimetry models incorporate mechanistic determinants of chemical disposition in a living organism to describe relationships between exposure concentration and the internal dose needed for PBPK models and human health risk assessment. Because they rely on determini...

  10. The Advancement Value Chain: An Exploratory Model

    ERIC Educational Resources Information Center

    Leonard, Edward F., III

    2005-01-01

    Since the introduction of the value chain concept in 1985, several varying, yet virtually similar, value chains have been developed for the business enterprise. Shifting to higher education, can a value chain be found that links together the various activities of advancement so that an institution's leaders can actually look at the philanthropic…

  11. Predicting Career Advancement with Structural Equation Modelling

    ERIC Educational Resources Information Center

    Heimler, Ronald; Rosenberg, Stuart; Morote, Elsa-Sofia

    2012-01-01

    Purpose: The purpose of this paper is to use the authors' prior findings concerning basic employability skills in order to determine which skills best predict career advancement potential. Design/methodology/approach: Utilizing survey responses of human resource managers, the employability skills showing the largest relationships to career…

  12. Cattle Uterus: A Novel Animal Laboratory Model for Advanced Hysteroscopic Surgery Training

    PubMed Central

    Ewies, Ayman A. A.; Khan, Zahid R.

    2015-01-01

    In recent years, due to reduced training opportunities, the major shift in surgical training is towards the use of simulation and animal laboratories. Despite the merits of Virtual Reality Simulators, they are far from representing the real challenges encountered in theatres. We introduce the “Cattle Uterus Model” in the hope that it will be adopted in training courses as a low cost and easy-to-set-up tool. It adds new dimensions to the advanced hysteroscopic surgery training experience by providing tactile sensation and simulating intraoperative difficulties. It complements conventional surgical training, aiming to maximise clinical exposure and minimise patients' harm. PMID:26265918

  13. Maui Electrical System Simulation Model Validation

    E-print Network

    Maui Electrical System Simulation Model Validation. Prepared for the U.S. Department of Energy. Baseline Model Validation by GE Global Research, Niskayuna, New York, and the University of Hawaii, Hawaii Natural... to build the models and are summarized in this report.

  14. Computer simulations and physical modelling of erosion

    E-print Network

    Franklin, W. Randolph

    Computer simulations and physical modelling of erosion. C.S. Stuetzle, J. Gross, Z. Chen, B. Cutler. Problem and goals: validation of erosion models for levee overtopping; better modeling of local erosion in terrain (a.k.a. soil) and earthen structures such as levees.

  15. MODELING CONCEPTS FOR BMP/LID SIMULATION

    EPA Science Inventory

    Enhancement of simulation options for stormwater best management practices (BMPs) and hydrologic source control is discussed in the context of the EPA Storm Water Management Model (SWMM). Options for improvement of various BMP representations are presented, with emphasis on inco...

  16. A Simulation To Model Exponential Growth.

    ERIC Educational Resources Information Center

    Appelbaum, Elizabeth Berman

    2000-01-01

    Describes a simulation using dice-tossing students in a population cluster to model the growth of cancer cells. This growth is recorded in a scatterplot and compared to an exponential function graph. (KHR)
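
    A small sketch of the dice-tossing classroom simulation described above: each "cell" rolls a die every round and divides on one chosen outcome, so the expected population grows roughly like N0*(1 + 1/6)^t. The class size, number of rounds, and dividing outcome are arbitrary assumptions.

        import random

        random.seed(42)
        population = 20          # starting number of "cells" (students in the cluster)
        rounds = 15
        history = [population]

        for _ in range(rounds):
            # Each rolled six represents one cell division this round.
            new_cells = sum(1 for _ in range(population) if random.randint(1, 6) == 6)
            population += new_cells
            history.append(population)

        # Compare with the exponential model N(t) = N0 * (1 + 1/6)^t.
        for t, n in enumerate(history):
            expected = history[0] * (1 + 1 / 6) ** t
            print(f"round {t:2d}: simulated {n:4d}   exponential model {expected:7.1f}")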

  17. MATHEMATICAL MODELING OF SIMULATED PHOTOCHEMICAL SMOG

    EPA Science Inventory

    This report deals with the continuing effort to develop a chemical kinetic mechanism to describe the formation of photochemical smog. Using the technique of computer modeling to simulate smog chamber data, several explicit kinetic mechanisms for specific hydrocarbons were analyze...

  18. Mathematical Model Development and Simulation Support

    NASA Technical Reports Server (NTRS)

    Francis, Ronald C.; Tobbe, Patrick A.

    2000-01-01

    This report summarizes the work performed in support of the Contact Dynamics 6DOF Facility and the Flight Robotics Lab at NASA/ MSFC in the areas of Mathematical Model Development and Simulation Support.

  19. Theory, modeling, and simulation annual report, 1992

    SciTech Connect

    Not Available

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  20. Structural model uncertainty in stochastic simulation

    SciTech Connect

    McKay, M.D.; Morrison, J.D.

    1997-09-01

    Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that a usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.

  1. Microwave Processing of Simulated Advanced Nuclear Fuel Pellets

    SciTech Connect

    D.E. Clark; D.C. Folz

    2010-08-29

    Throughout the three-year project funded by the Department of Energy (DOE) and led by Virginia Tech (VT), project tasks were modified by consensus to fit the changing needs of the DOE with respect to developing new inert matrix fuel processing techniques. The focus throughout the project was on the use of microwave energy to sinter fully stabilized zirconia pellets and to evaluate the effectiveness of the techniques that were developed. Additionally, the research team was to propose fundamental concepts for processing radioactive fuels based on the effectiveness of the microwave process in sintering the simulated matrix material.

  2. Advanced flight deck/crew station simulator functional requirements

    NASA Technical Reports Server (NTRS)

    Wall, R. L.; Tate, J. L.; Moss, M. J.

    1980-01-01

    This report documents a study of flight deck/crew system research facility requirements for investigating issues involved with developing systems and procedures for interfacing transport aircraft with air traffic control systems planned for 1985 to 2000. Crew system needs of NASA, the U.S. Air Force, and industry were investigated and reported. A matrix of these is included, as are recommended functional requirements and design criteria for simulation facilities in which to conduct this research. Methods of exploiting the commonality and similarity in facilities are identified, and plans for exploiting this in order to reduce implementation costs and allow efficient transfer of experiments from one facility to another are presented.

  3. Simulation of the heliosphere - Model

    NASA Astrophysics Data System (ADS)

    McNutt, R. L., Jr.; Lyon, John; Goodrich, Charles C.

    1998-02-01

    The problem of the interaction of the solar wind with the very local interstellar medium (VLISM) is complicated by the role played by collisions between the plasma components of the heliosphere and VLISM and the neutral component of the VLISM. We outline the inherent approximations in fluid descriptions of the problem and give formulas for the near-exact charge exchange and elastic collision transfer integrals for particles, momentum, and energy for two drifting Maxwellians with different drift speeds and temperatures. To lowest order, all Boltzmann collision operators can be evaluated analytically in terms of exponentials and error functions. Analytic approximations that have relative errors of less than 2.62 percent compared with the exact expressions can be implemented in large simulation codes. Our formulation avoids approximations used previously by others and leads to a simplification of the formulation and increased accuracy for numerical simulations of the heliosphere/VLISM interaction.

  4. Tutorial on agent-based modeling and simulation.

    SciTech Connect

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2005-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS is a third way of doing science besides deductive and inductive reasoning. Computational advances have made possible a growing number of agent-based applications in a variety of fields. Applications range from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, from modeling consumer behavior to understanding the fall of ancient civilizations, to name a few. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing ABMS models, and provides some thoughts on the relationship between ABMS and traditional modeling techniques.
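
    A minimal agent-based sketch in the spirit of the tutorial described above: autonomous agents with a simple behavioral rule (an SIR-like contagion spread through random encounters) interacting over time. The population size, rule, and parameters are illustrative assumptions, not drawn from the tutorial itself.

        import random

        random.seed(7)

        class Agent:
            def __init__(self):
                self.state = "S"   # S = susceptible, I = infected, R = recovered

            def interact(self, other, p_transmit):
                # Behavioral rule: an infected agent may infect a susceptible partner.
                if self.state == "I" and other.state == "S" and random.random() < p_transmit:
                    other.state = "I"

        agents = [Agent() for _ in range(500)]
        agents[0].state = "I"                     # seed a single infected agent
        p_transmit, p_recover = 0.08, 0.02

        for step in range(100):
            random.shuffle(agents)
            # Each step, agents meet in random pairs and apply the interaction rule.
            for a, b in zip(agents[0::2], agents[1::2]):
                a.interact(b, p_transmit)
                b.interact(a, p_transmit)
            # Infected agents recover with a small per-step probability.
            for agent in agents:
                if agent.state == "I" and random.random() < p_recover:
                    agent.state = "R"

        counts = {s: sum(agent.state == s for agent in agents) for s in ("S", "I", "R")}
        print(counts)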

  5. Stochastic Approximation to Understand Simple Simulation Models

    NASA Astrophysics Data System (ADS)

    Izquierdo, Segismundo S.; Izquierdo, Luis R.

    2013-04-01

    This paper illustrates how a deterministic approximation of a stochastic process can be usefully applied to analyse the dynamics of many simple simulation models. To demonstrate the type of results that can be obtained using this approximation, we present two illustrative examples which are meant to serve as methodological references for researchers exploring this area. Finally, we prove some convergence results for simulations of a family of evolutionary games, namely, intra-population imitation models in n-player games with arbitrary payoffs.
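
    A hedged sketch of the kind of comparison discussed above: a simple stochastic imitation process in a two-strategy coordination game run alongside its deterministic mean-dynamics (ODE) approximation. The payoffs, population size, and revision rule are assumptions chosen only to illustrate the approach, not the specific models analysed in the paper.

        import random

        random.seed(3)

        N = 1000                     # population size (assumed)
        a, b = 2.0, 1.0              # coordination payoffs: A-vs-A pays a, B-vs-B pays b
        K = max(a, b)                # scaling so switch probabilities stay in [0, 1]

        def payoffs(x):
            """Expected payoffs to strategies A and B when a fraction x plays A."""
            return a * x, b * (1.0 - x)

        n_A = int(0.3 * N)           # 30% of agents start with strategy A
        x_ode = n_A / N              # deterministic mean-dynamics approximation
        steps = 20 * N               # 20 generations of revision opportunities
        dt = 1.0 / N

        for _ in range(steps):
            x = n_A / N
            pi_A, pi_B = payoffs(x)
            # One revision opportunity: a random agent observes another and imitates
            # with probability proportional to the (positive) payoff difference.
            revising_plays_A = random.random() < x
            model_plays_A = random.random() < x
            if revising_plays_A != model_plays_A:
                pi_rev, pi_mod = (pi_A, pi_B) if revising_plays_A else (pi_B, pi_A)
                if random.random() < max(0.0, pi_mod - pi_rev) / K:
                    n_A += -1 if revising_plays_A else 1
            # Euler step of the mean dynamics dx/dt = x(1-x)(pi_A - pi_B)/K.
            pa, pb = payoffs(x_ode)
            x_ode += dt * x_ode * (1.0 - x_ode) * (pa - pb) / K

        print(f"stochastic x_A = {n_A / N:.3f}, mean-dynamics x_A = {x_ode:.3f}")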

  6. Atomistic spin model simulations of magnetic nanomaterials

    NASA Astrophysics Data System (ADS)

    Evans, R. F. L.; Fan, W. J.; Chureemart, P.; Ostler, T. A.; Ellis, M. O. A.; Chantrell, R. W.

    2014-03-01

    Atomistic modelling of magnetic materials provides unprecedented detail about the underlying physical processes that govern their macroscopic properties, and allows the simulation of complex effects such as surface anisotropy, ultrafast laser-induced spin dynamics, exchange bias, and microstructural effects. Here we present the key methods used in atomistic spin models which are then applied to a range of magnetic problems. We detail the parallelization strategies used which enable the routine simulation of extended systems with full atomistic resolution.
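
    A minimal sketch of one standard method in this family: a Metropolis Monte Carlo update of a classical Heisenberg spin model with nearest-neighbour exchange on a small cubic lattice. The lattice size, exchange constant, and temperature are illustrative assumptions, and constrained spin dynamics (LLG) and the other effects listed above are not included.

        import numpy as np

        rng = np.random.default_rng(1)
        L = 6                        # lattice edge length (assumed); N = L^3 spins
        J = 1.0                      # ferromagnetic exchange constant (in units of k_B)
        T = 0.5                      # temperature in units of J/k_B (assumed)

        def random_unit_vectors(n):
            v = rng.normal(size=(n, 3))
            return v / np.linalg.norm(v, axis=1, keepdims=True)

        spins = random_unit_vectors(L ** 3).reshape(L, L, L, 3)

        def neighbour_sum(s, i, j, k):
            """Sum of the six nearest-neighbour spins with periodic boundaries."""
            return (s[(i + 1) % L, j, k] + s[(i - 1) % L, j, k]
                    + s[i, (j + 1) % L, k] + s[i, (j - 1) % L, k]
                    + s[i, j, (k + 1) % L] + s[i, j, (k - 1) % L])

        def sweep(s):
            """One Metropolis sweep: propose a new random direction for each spin."""
            for i in range(L):
                for j in range(L):
                    for k in range(L):
                        h = neighbour_sum(s, i, j, k)
                        new_spin = random_unit_vectors(1)[0]
                        # Heisenberg exchange energy E = -J * S_i . sum_j S_j
                        dE = -J * np.dot(new_spin - s[i, j, k], h)
                        if dE <= 0 or rng.random() < np.exp(-dE / T):
                            s[i, j, k] = new_spin

        for _ in range(200):
            sweep(spins)

        m = np.linalg.norm(spins.reshape(-1, 3).mean(axis=0))
        print(f"magnetisation per spin at T = {T}: {m:.3f}")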

  7. Coupled Model Simulation of Snowfall Events over the Black Hills.

    NASA Astrophysics Data System (ADS)

    Wang, J.; Hjelmfelt, M. R.; Capehart, W. J.; Farley, R. D.

    2003-06-01

    Numerical simulations of two snowfall events over the Black Hills of South Dakota are made to demonstrate the use and potential of a coupled atmospheric and land surface model. The Coupled Atmospheric-Hydrologic Model System was used to simulate a moderate topographic snowfall event of 10-11 April 1999 and a blizzard event of 18-23 April 2000. These two cases were chosen to provide a contrast of snowfall amounts, locations, and storm dynamics. The model configuration utilized a nested grid with an outer grid of 16-km spacing driven by numerical forecast model data and an inner grid of 4 km centered over the Black Hills region. Simulations for the first case were made with the atmospheric model, the Advanced Regional Prediction System (ARPS) alone, and with ARPS coupled with the National Center for Atmospheric Research Land Surface Model (LSM). Results indicated that the main features of the precipitation pattern were captured by ARPS alone. However, precipitation amounts were greatly overpredicted. ARPS coupled with the LSM produced a very similar precipitation pattern, but with precipitation amounts much closer to those observed. The coupled model also permits simulation of the resulting snow cover and snowmelt. The simulated percentage of snow melted increased somewhat more rapidly than observed. Snow-rain discrimination may be taken from the precipitation type falling out of the atmospheric model based on the microphysical parameterization, or by the use of a surface temperature criterion, as used in most large-scale models. The resulting snow accumulation patterns and amounts were nearly identical. The coupled model configuration was used to simulate the second case. In this case the simulated precipitation and snow depth maximum over the eastern Black Hills were biased to the east and north by about 24 km. The resulting spatial correlation of the simulated snowfall and observations was only 0.37. If this bias is removed, the shifted pattern over the Black Hills region has a correlation of 0.68. Snow-melting patterns for 21 and 22 April appeared reasonable, given the spatial bias in the snowfall simulation.

  8. Numerical simulations of Hurricane Bertha using a mesoscale atmospheric model

    SciTech Connect

    Buckley, R.L.

    1996-08-01

    The Regional Atmospheric Model System (RAMS) has been used to simulate Hurricane Bertha as it moved toward and onto shore during the period July 10--12, 1996. Using large-scale atmospheric data from 00 UTC, 11 July (Wednesday evening) to initialize the model, a 36-hour simulation was created for a domain centered over the Atlantic Ocean east of the Florida coast near Jacksonville. The simulated onshore impact time of the hurricane was much earlier than observed (due to the use of results from the large-scale model, which predicted early arrival). However, the movement of the hurricane center (eye) as it approached the North Carolina/South Carolina coast as simulated in RAMS was quite good. Observations revealed a northerly storm track off the South Carolina coast as it moved toward land. As it approached landfall, Hurricane Bertha turned to the north-northeast, roughly paralleling the North Carolina coast before moving inland near Wilmington. Large-scale model forecasts were unable to detect this change in advance and predicted landfall near Myrtle Beach, South Carolina; RAMS, however, correctly predicted the parallel coastal movement. For future hurricane activity in the southeast, RAMS is being configured to run in an operational model using input from the large-scale pressure data in hopes of providing more information on predicted hurricane movement and landfall location.

  9. Theory, Modeling and Simulation: Research progress report 1994--1995

    SciTech Connect

    Garrett, B.C.; Dixon, D.A.; Dunning, T.H.

    1997-01-01

    The Pacific Northwest National Laboratory (PNNL) has established the Environmental Molecular Sciences Laboratory (EMSL). In April 1994, construction began on the new EMSL, a collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation (TM and S) program will play a critical role in understanding molecular processes important in restoring DOE's research, development, and production sites, including understanding the migration and reactions of contaminants in soils and ground water, developing processes for isolation and processing of pollutants, developing improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TM and S program are fivefold: to apply available electronic structure and dynamics techniques to study fundamental molecular processes involved in the chemistry of natural and contaminated systems; to extend current electronic structure and dynamics techniques to treat molecular systems of future importance and to develop new techniques for addressing problems that are computationally intractable at present; to apply available molecular modeling techniques to simulate molecular processes occurring in the multi-species, multi-phase systems characteristic of natural and polluted environments; to extend current molecular modeling techniques to treat ever more complex molecular systems and to improve the reliability and accuracy of such simulations; and to develop technologies for advanced parallel architectural computer systems. Research highlights of 82 projects are given.

  10. Recent advances in the simulation of particle-laden flows

    NASA Astrophysics Data System (ADS)

    Harting, J.; Frijters, S.; Ramaioli, M.; Robinson, M.; Wolf, D. E.; Luding, S.

    2014-10-01

    A substantial number of algorithms exists for the simulation of moving particles suspended in fluids. However, finding the best method to address a particular physical problem is often highly non-trivial and depends on the properties of the particles and the involved fluid(s) together. In this report, we provide a short overview on a number of existing simulation methods and provide two state of the art examples in more detail. In both cases, the particles are described using a Discrete Element Method (DEM). The DEM solver is usually coupled to a fluid-solver, which can be classified as grid-based or mesh-free (one example for each is given). Fluid solvers feature different resolutions relative to the particle size and separation. First, a multicomponent lattice Boltzmann algorithm (mesh-based and with rather fine resolution) is presented to study the behavior of particle stabilized fluid interfaces and second, a Smoothed Particle Hydrodynamics implementation (mesh-free, meso-scale resolution, similar to the particle size) is introduced to highlight a new player in the field, which is expected to be particularly suited for flows including free surfaces.
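
    As a minimal, hypothetical sketch of the particle side of such coupled simulations, the snippet below computes the normal contact force used in many Discrete Element Method (DEM) solvers: a linear spring-dashpot acting on two overlapping spheres. The constants are illustrative, and the hydrodynamic forces supplied by a coupled lattice Boltzmann or SPH solver would be added on top of this.

      import math

      K_N = 1.0e4     # normal spring stiffness (N/m), illustrative
      GAMMA_N = 5.0   # normal damping coefficient (N s/m), illustrative

      def dem_normal_force(x1, x2, v1, v2, r1, r2):
          """Force on particle 1 from contact with particle 2 (3-D vectors as tuples)."""
          dx = [a - b for a, b in zip(x1, x2)]
          dist = math.sqrt(sum(d * d for d in dx))
          overlap = (r1 + r2) - dist
          if overlap <= 0.0 or dist == 0.0:
              return [0.0, 0.0, 0.0]                 # no contact
          normal = [d / dist for d in dx]            # unit vector from 2 to 1
          rel_vn = sum((a - b) * n for a, b, n in zip(v1, v2, normal))
          magnitude = K_N * overlap - GAMMA_N * rel_vn
          return [magnitude * n for n in normal]

      # Two slightly overlapping 1 mm particles approaching each other:
      print(dem_normal_force((0.0, 0.0, 0.0), (0.0019, 0.0, 0.0),
                             (0.01, 0.0, 0.0), (-0.01, 0.0, 0.0), 0.001, 0.001))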

  11. ADVANCES IN COMPREHENSIVE GYROKINETIC SIMULATIONS OF TRANSPORT IN TOKAMAKS

    SciTech Connect

    WALTZ,R.E; CANDY,J; HINTON,F.L; ESTRADA-MILA,C; KINSEY,J.E

    2004-10-01

    A continuum global gyrokinetic code GYRO has been developed to comprehensively simulate core turbulent transport in actual experimental profiles and enable direct quantitative comparisons to the experimental transport flows. GYRO not only treats the now standard ion temperature gradient (ITG) mode turbulence, but also treats trapped and passing electrons with collisions and finite β, equilibrium ExB shear stabilization, and all in real tokamak geometry. Most importantly, the code operates at finite relative gyroradius (ρ*) so as to treat the profile shear stabilization and nonlocal effects which can break gyroBohm scaling. The code operates in either a cyclic flux-tube limit (which allows only gyroBohm scaling) or globally with physical profile variation. Bohm scaling of DIII-D L-mode has been simulated with power flows matching experiment within error bars on the ion temperature gradient. Mechanisms for broken gyroBohm scaling, neoclassical ion flows embedded in turbulence, turbulent dynamos and profile corrugations, are illustrated.

  12. GOING TO THE EXTREMES AN INTERCOMPARISON OF MODEL-SIMULATED HISTORICAL AND

    E-print Network

    Meehl, Gerald A.

    ... of climate change on human and natural systems. Modeling advances now provide the opportunity of utilizing ... This analysis provides a first overview of projected changes in climate extremes from the IPCC-AR4 model ...

  13. Minimum-complexity helicopter simulation math model

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinement are discussed. Math model computer programs are defined and listed.

  14. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    NASA Technical Reports Server (NTRS)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This presentation describes the capabilities of a three-dimensional thermal power model of the Advanced Stirling Radioisotope Generator (ASRG). The performance of the ASRG is presented for different scenarios, such as a Venus flyby with or without the auxiliary cooling system.

  15. Analysis and Modeling of Parasitic Capacitances in Advanced Nanoscale Devices 

    E-print Network

    Bekal, Prasanna

    2012-07-16

    and sidewall capacitances in MOSFETs, bipolar transistors and FinFETs in advanced process nodes. We analyze the importance of considering layout and process variables in device extraction by comparing with standard SPICE models. The results are validated...

  16. Advancing Software Architecture Modeling for Large Scale Heterogeneous Systems

    SciTech Connect

    Gorton, Ian; Liu, Yan

    2010-11-07

    In this paper we describe how incorporating technology-specific modeling at the architecture level can help reduce risks and produce better designs for large, heterogeneous software applications. We draw an analogy with established modeling approaches in scientific domains, using groundwater modeling as an example, to help illustrate gaps in current software architecture modeling approaches. We then describe the advances in modeling, analysis and tooling that are required to bring sophisticated modeling and development methods within reach of software architects.

  17. Advanced load modelling for power system studies 

    E-print Network

    Collin, Adam John

    2013-11-28

    Although power system load modelling is a mature research area, there is a renewed interest in updating available load models and formulating improved load modelling methodologies. The main drivers of this interest are the ...

  18. Advances in petascale kinetic plasma simulation with VPIC and Roadrunner

    SciTech Connect

    Bowers, Kevin J; Albright, Brian J; Yin, Lin; Daughton, William S; Roytershteyn, Vadim; Kwan, Thomas J T

    2009-01-01

    VPIC, a first-principles 3D electromagnetic charge-conserving relativistic kinetic particle-in-cell (PIC) code, was recently adapted to run on Los Alamos's Roadrunner, the first supercomputer to break a petaflop (10^15 floating point operations per second) in the TOP500 supercomputer performance rankings. They give a brief overview of the modeling capabilities and optimization techniques used in VPIC and the computational characteristics of petascale supercomputers like Roadrunner. They then discuss three applications enabled by VPIC's unprecedented performance on Roadrunner: modeling laser plasma interaction in upcoming inertial confinement fusion experiments at the National Ignition Facility (NIF), modeling short pulse laser GeV ion acceleration and modeling reconnection in magnetic confinement fusion experiments.
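
    For readers unfamiliar with PIC codes, the sketch below shows the textbook Boris scheme that sits at the heart of the particle push in electromagnetic PIC methods: a half electric kick, a magnetic rotation, and a second half kick. This is a generic, non-relativistic illustration, not VPIC's actual implementation.

      import numpy as np

      def boris_push(v, e_field, b_field, q_over_m, dt):
          """Advance velocity v (m/s) one step dt in fields E (V/m) and B (T)."""
          v_minus = v + 0.5 * q_over_m * dt * e_field       # first half electric kick
          t = 0.5 * q_over_m * dt * b_field                  # rotation vector
          s = 2.0 * t / (1.0 + np.dot(t, t))
          v_prime = v_minus + np.cross(v_minus, t)
          v_plus = v_minus + np.cross(v_prime, s)            # magnetic rotation
          return v_plus + 0.5 * q_over_m * dt * e_field      # second half electric kick

      # Electron gyrating in a uniform magnetic field along z:
      q_over_m = -1.759e11                                   # C/kg
      v = np.array([1.0e6, 0.0, 0.0])
      E = np.zeros(3)
      B = np.array([0.0, 0.0, 0.01])
      for _ in range(100):
          v = boris_push(v, E, B, q_over_m, dt=1.0e-12)
      print("speed change after 100 steps:", abs(np.linalg.norm(v) - 1.0e6))

    With no electric field the scheme conserves the particle speed to machine precision, which is one reason the rotation form is preferred over a naive Euler update.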

  19. Computational Techniques for Modeling and Simulating Biological Systems

    E-print Network

    Clack, Christopher D.

    Computational techniques for modelling and simulating biological systems are surveyed ... Keywords: cellular automata, computational biology, rewriting systems, systems biology, modelling, simulation.

  20. Investigation of advanced fault insertion and simulator methods

    NASA Technical Reports Server (NTRS)

    Dunn, W. R.; Cottrell, D.

    1986-01-01

    The cooperative agreement partly supported research leading to the open-literature publication cited. Additional efforts under the agreement included research into fault modeling of semiconductor devices. Results of this research are presented in this report which is summarized in the following paragraphs. As a result of the cited research, it appears that semiconductor failure mechanism data is abundant but of little use in developing pin-level device models. Failure mode data on the other hand does exist but is too sparse to be of any statistical use in developing fault models. What is significant in the failure mode data is that, unlike classical logic, MSI and LSI devices do exhibit more than 'stuck-at' and open/short failure modes. Specifically they are dominated by parametric failures and functional anomalies that can include intermittent faults and multiple-pin failures. The report discusses methods of developing composite pin-level models based on extrapolation of semiconductor device failure mechanisms, failure modes, results of temperature stress testing and functional modeling. Limitations of this model particularly with regard to determination of fault detection coverage and latency time measurement are discussed. Indicated research directions are presented.

  1. Radiation Damage in Nuclear Fuel for Advanced Burner Reactors: Modeling and Experimental Validation

    SciTech Connect

    Jensen, Niels Gronbech; Asta, Mark; Browning, Nigel; Ozolins, Vidvuds; van de Walle, Axel; Wolverton, Christopher

    2011-12-29

    The consortium has completed its existence and we are here highlighting work and accomplishments. As outlined in the proposal, the objective of the work was to advance the theoretical understanding of advanced nuclear fuel materials (oxides) toward a comprehensive modeling strategy that incorporates the different relevant scales involved in radiation damage in oxide fuels. Approaching this we set out to investigate and develop a set of directions: 1) Fission fragment and ion trajectory studies through advanced molecular dynamics methods that allow for statistical multi-scale simulations. This work also includes an investigation of appropriate interatomic force fields useful for the energetic multi-scale phenomena of high energy collisions; 2) Studies of defect and gas bubble formation through electronic structure and Monte Carlo simulations; and 3) an experimental component for the characterization of materials such that comparisons can be obtained between theory and experiment.

  2. Advanced Simulation Capability for Environmental Management - Current Status and Phase II Demonstration Results - 13161

    SciTech Connect

    Seitz, Roger R.; Flach, Greg; Freshley, Mark D.; Freedman, Vicky; Gorton, Ian; Dixon, Paul; Moulton, J. David; Hubbard, Susan S.; Faybishenko, Boris; Steefel, Carl I.; Finsterle, Stefan; Marble, Justin

    2013-07-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Tool-sets and High-Performance Computing (HPC) Multi-process Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, tool-sets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial tool-sets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations, addressing attenuation-based remedies at the Savannah River Site F Area and performance assessment for a representative waste tank, illustrate integration of linked ASCEM capabilities and initial integration efforts with tools from the Cementitious Barriers Partnership. (authors)

  3. Dynamic modeling and simulation of planetary rovers

    NASA Technical Reports Server (NTRS)

    Lindemann, Randel A.

    1992-01-01

    This paper documents a preliminary study into the dynamic modeling and computer simulation of wheeled surface vehicles. The research centered on the feasibility of using commercially available multibody dynamics codes running on engineering workstations to perform the analysis. The results indicated that physically representative vehicle mechanics can be modeled and simulated in state-of-the-art Computer Aided Engineering environments, but at excessive cost in modeling and computation time. The results lead to the recommendation for the development of an efficient rover mobility-specific software system. This system would be used for vehicle design and simulation in planetary environments; controls prototyping, design, and testing; as well as local navigation simulation and expectation planning.

  4. Trimming an aircraft model for flight simulation

    NASA Technical Reports Server (NTRS)

    Mcfarland, Richard E.

    1987-01-01

    Real-time piloted aircraft simulations with digital computers have been performed at Ames Research Center (ARC) for over two decades. For the simulation of conventional aircraft models, the establishment of initial vehicle and control orientations at various operational flight regimes has been adequately handled by either analog techniques or simple inversion processes. However, exotic helicopter configurations have been introduced recently that require more sophisticated techniques because of their expanded degrees of freedom and environmental vibration levels. At ARC, these techniques are used for the backward solutions to real-time simulation models as required for the generation of trim points. These techniques are presented in this paper with examples from a blade-element helicopter simulation model.
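
    To illustrate the idea of a "backward" trim solution in the simplest possible setting, the sketch below trims a fixed-wing point-mass model by solving for the angle of attack and thrust that zero the longitudinal accelerations at a given airspeed. This is a hypothetical illustration; the helicopter trim techniques described in the paper involve many more degrees of freedom, and all coefficients here are invented.

      import numpy as np
      from scipy.optimize import fsolve

      MASS, G, S, RHO = 5000.0, 9.81, 30.0, 1.225        # kg, m/s^2, m^2, kg/m^3
      CL_ALPHA, CD0, K = 5.0, 0.02, 0.05                  # simple lift slope and drag polar

      def residuals(unknowns, v_air):
          alpha, thrust = unknowns
          q_bar = 0.5 * RHO * v_air ** 2 * S
          lift = q_bar * CL_ALPHA * alpha
          drag = q_bar * (CD0 + K * (CL_ALPHA * alpha) ** 2)
          # Level, unaccelerated flight: vertical and horizontal force balance.
          return [lift + thrust * np.sin(alpha) - MASS * G,
                  thrust * np.cos(alpha) - drag]

      alpha_trim, thrust_trim = fsolve(residuals, x0=[0.05, 5000.0], args=(80.0,))
      print(f"trim alpha = {np.degrees(alpha_trim):.2f} deg, thrust = {thrust_trim:.0f} N")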

  5. High-Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGESBeta

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-11-01

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing (HPC) are essential for accurately modeling them. In the past decade, the DOE SciDAC program has produced such accelerator-modeling tools, which have been employed to tackle some of the most difficult accelerator science problems. In this article we discuss the Synergia beam-dynamics framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. We present the design principles, key physical and numerical models in Synergia and its performance on HPC platforms. Finally, we present the results of Synergia applications for the Fermilab proton source upgrade, known as the Proton Improvement Plan (PIP).

  6. High-Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-11-01

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing (HPC) are essential for accurately modeling them. In the past decade, the DOE SciDAC program has produced such accelerator-modeling tools, which have been employed to tackle some of the most difficult accelerator science problems. In this article we discuss the Synergia beam-dynamics framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. We present the design principles, key physical and numerical models in Synergia and its performance on HPC platforms. Finally, we present the results of Synergia applications for the Fermilab proton source upgrade, known as the Proton Improvement Plan (PIP).

  7. Design and Test of Advanced Thermal Simulators for an Alkali Metal-Cooled Reactor Simulator

    NASA Technical Reports Server (NTRS)

    Garber, Anne E.; Dickens, Ricky E.

    2011-01-01

    The Early Flight Fission Test Facility (EFF-TF) at NASA Marshall Space Flight Center (MSFC) has as one of its primary missions the development and testing of fission reactor simulators for space applications. A key component in these simulated reactors is the thermal simulator, designed to closely mimic the form and function of a nuclear fuel pin using electric heating. Continuing effort has been made to design simple, robust, inexpensive thermal simulators that closely match the steady-state and transient performance of a nuclear fuel pin. A series of these simulators have been designed, developed, fabricated and tested individually and in a number of simulated reactor systems at the EFF-TF. The purpose of the thermal simulators developed under the Fission Surface Power (FSP) task is to ensure that non-nuclear testing can be performed at sufficiently high fidelity to allow a cost-effective qualification and acceptance strategy to be used. Prototype thermal simulator design is founded on the baseline Fission Surface Power reactor design. Recent efforts have been focused on the design, fabrication and test of a prototype thermal simulator appropriate for use in the Technology Demonstration Unit (TDU). While designing the thermal simulators described in this paper, efforts were made to improve the axial power profile matching of the thermal simulators. Simultaneously, a search was conducted for graphite materials with higher resistivities than had been employed in the past. The combination of these two efforts resulted in the creation of thermal simulators with power capacities of 2300-3300 W per unit. Six of these elements were installed in a simulated core and tested in the alkali metal-cooled Fission Surface Power Primary Test Circuit (FSP-PTC) at a variety of liquid metal flow rates and temperatures. This paper documents the design of the thermal simulators, test program, and test results.

  8. MixSIAR: advanced stable isotope mixing models in R

    EPA Science Inventory

    Background/Question/Methods The development of stable isotope mixing models has coincided with modeling products (e.g. IsoSource, MixSIR, SIAR), where methodological advances are published in parity with software packages. However, while mixing model theory has recently been ext...

  9. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.
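
    The sketch below shows, in Python rather than C++, the kind of small, application-neutral interface a rehostable modeling component might expose, with the vehicle and environment models hidden behind it. The class and method names are illustrative assumptions; they are not the LaSRS++ interface.

      from abc import ABC, abstractmethod

      class RehostableModel(ABC):
          """Vehicle plus environment models behind one host-independent interface."""

          @abstractmethod
          def initialize(self, initial_conditions: dict) -> None:
              """Set initial state and configure environment models."""

          @abstractmethod
          def step(self, controls: dict, dt: float) -> None:
              """Advance vehicle and environment one frame of dt seconds."""

          @abstractmethod
          def outputs(self) -> dict:
              """Current state for the host (position, attitude, rates, ...)."""

      class PointMassAircraft(RehostableModel):
          """Trivial concrete component: 1-D point mass under commanded acceleration."""
          def initialize(self, initial_conditions):
              self.x = initial_conditions.get("x", 0.0)
              self.v = initial_conditions.get("v", 0.0)

          def step(self, controls, dt):
              self.v += controls.get("accel", 0.0) * dt
              self.x += self.v * dt

          def outputs(self):
              return {"x": self.x, "v": self.v}

      model = PointMassAircraft()
      model.initialize({"x": 0.0, "v": 60.0})
      for _ in range(100):
          model.step({"accel": 1.0}, dt=0.02)
      print(model.outputs())

    Because the host only sees initialize/step/outputs, the same component can in principle be driven by a real-time simulator, a batch analysis tool, or a design-optimization loop without re-engineering the environment models.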

  10. Off-gas adsorption model and simulation - OSPREY

    SciTech Connect

    Rutledge, V.J.

    2013-07-01

    A capability of accurately simulating the dynamic behavior of advanced fuel cycle separation processes is expected to provide substantial cost savings and many technical benefits. To support this capability, a modeling effort focused on the off-gas treatment system of a used nuclear fuel recycling facility is in progress. The off-gas separation consists of a series of scrubbers and adsorption beds to capture constituents of interest. Dynamic models are being developed to simulate each unit operation involved so each unit operation can be used as a stand-alone model and in series with multiple others. Currently, an adsorption model has been developed within Multi-physics Object Oriented Simulation Environment (MOOSE) developed at the Idaho National Laboratory (INL). Off-gas Separation and Recovery (OSPREY) models the adsorption of offgas constituents for dispersed plug flow in a packed bed under non-isothermal and non-isobaric conditions. Inputs to the model include gas composition, sorbent and column properties, equilibrium and kinetic data, and inlet conditions. The simulation outputs component concentrations along the column length as a function of time from which breakthrough data can be obtained. The breakthrough data can be used to determine bed capacity, which in turn can be used to size columns. In addition to concentration data, the model predicts temperature along the column length as a function of time and pressure drop along the column length. A description of the OSPREY model, results from krypton adsorption modeling and plans for modeling the behavior of iodine, xenon, and tritium will be discussed. (author)
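
    As a simplified sketch of the kind of breakthrough calculation such a model produces, the snippet below integrates an isothermal, isobaric plug-flow column (axial dispersion neglected for brevity) with linear-driving-force adsorption kinetics and a linear isotherm. All parameters are illustrative assumptions, not krypton or sorbent data from the report, and the scheme is a simple explicit upwind method rather than the MOOSE-based OSPREY implementation.

      import numpy as np

      N_CELLS, LENGTH = 200, 1.0            # grid cells, bed length (m)
      VELOCITY = 0.05                        # interstitial gas velocity (m/s)
      EPS = 0.4                              # bed void fraction
      K_LDF = 0.05                           # LDF mass-transfer coefficient (1/s)
      K_HENRY = 5.0                          # linear isotherm q* = K_HENRY * c
      C_FEED = 1.0                           # inlet concentration (mol/m^3)

      dz = LENGTH / N_CELLS
      dt = 0.4 * dz / VELOCITY               # CFL-limited explicit time step
      c = np.zeros(N_CELLS)                  # gas-phase concentration
      q = np.zeros(N_CELLS)                  # adsorbed-phase loading

      breakthrough = []
      for step in range(20000):
          q_star = K_HENRY * c
          dq_dt = K_LDF * (q_star - q)                       # LDF uptake rate
          c_up = np.concatenate(([C_FEED], c[:-1]))          # upwind (inlet) values
          dc_dt = -VELOCITY * (c - c_up) / dz - (1 - EPS) / EPS * dq_dt
          c += dt * dc_dt
          q += dt * dq_dt
          if step % 1000 == 0:
              breakthrough.append((step * dt, c[-1] / C_FEED))

      # breakthrough[] holds (time, outlet/feed concentration) pairs, from which
      # bed capacity at a chosen breakthrough fraction can be estimated.
      print(breakthrough[-1])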

  11. Interactive Visualization to Advance Earthquake Simulation LOUISE H. KELLOGG,1

    E-print Network

    Billen, Magali I.

    with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth and of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR ...

  12. Advanced terahertz imaging system performance model for concealed weapon identification

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Redman, Brian; Espinola, Richard L.; Franck, Charmaine C.; Petkie, Douglas T.; De Lucia, Frank C.; Jacobs, Eddie L.; Griffin, Steven T.; Halford, Carl E.; Reynolds, Joe

    2007-04-01

    The U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) and the U.S. Army Research Laboratory (ARL) have developed a terahertz-band imaging system performance model for detection and identification of concealed weaponry. The details of this MATLAB-based model which accounts for the effects of all critical sensor and display components, and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security and Defence Symposium. The focus of this paper is to report on recent advances to the base model which have been designed to more realistically account for the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system. The advanced terahertz-band imaging system performance model now also accounts for target and background thermal emission, and has been recast into a user-friendly, Windows-executable tool. This advanced THz model has been developed in support of the Defense Advanced Research Project Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. This paper will describe the advanced THz model and its new radiometric sub-model in detail, and provide modeling and experimental results on target observability as a function of target and background orientation.

  13. LAKE WATER TEMPERATURE SIMULATION MODEL

    EPA Science Inventory

    Functional relationships to describe surface wind mixing, vertical turbulent diffusion, convective heat transfer, and radiation penetration based on data from lakes in Minnesota have been developed. These relationships have been introduced by regressing model parameters found eith...

  14. A virtual laboratory notebook for simulation models.

    PubMed

    Winfield, A J

    1998-01-01

    In this paper we describe how we have adopted the laboratory notebook as a metaphor for interacting with computer simulation models. This 'virtual' notebook stores the simulation output and meta-data (which is used to record the scientist's interactions with the simulation). The meta-data stored consists of annotations (equivalent to marginal notes in a laboratory notebook), a history tree and a log of user interactions. The history tree structure records when in 'simulation' time, and from what starting point in the tree changes are made to the parameters by the user. Typically these changes define a new run of the simulation model (which is represented as a new branch of the history tree). The tree shows the structure of the changes made to the simulation and the log is required to keep the order in which the changes occurred. Together they form a record which you would normally find in a laboratory notebook. The history tree is plotted in simulation parameter space. This shows the scientist's interactions with the simulation visually and allows direct manipulation of the parameter information presented, which in turn is used to control directly the state of the simulation. The interactions with the system are graphical and usually involve directly selecting or dragging data markers and other graphical control devices around in parameter space. If the graphical manipulators do not provide precise enough control then textual manipulation is still available which allows numerical values to be entered by hand. The Virtual Laboratory Notebook, by providing interesting interactions with the visual view of the history tree, provides a mechanism for giving the user complex and novel ways of interacting with biological computer simulation models. PMID:9697181
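
    The notebook's central record, the history tree of parameter changes with annotations and an interaction log, can be sketched with a small data structure like the one below. This is an illustrative reconstruction of the idea, not the authors' implementation, and all field names are hypothetical.

      import datetime

      class HistoryNode:
          def __init__(self, sim_time, parameters, parent=None):
              self.sim_time = sim_time              # simulation time at which this run branched
              self.parameters = dict(parameters)    # parameter values defining this run
              self.parent = parent
              self.children = []
              self.annotations = []                 # marginal notes
              self.log = []                         # ordered record of user interactions

          def branch(self, sim_time, changes, note=""):
              """Create a new run by changing parameters at a given simulation time."""
              params = {**self.parameters, **changes}
              child = HistoryNode(sim_time, params, parent=self)
              self.children.append(child)
              self.log.append((datetime.datetime.now(), f"branched at t={sim_time}: {changes}"))
              if note:
                  child.annotations.append(note)
              return child

      root = HistoryNode(0.0, {"growth_rate": 0.1, "carrying_capacity": 500})
      run_a = root.branch(10.0, {"growth_rate": 0.2}, note="faster growth scenario")
      run_b = root.branch(10.0, {"carrying_capacity": 250}, note="halved resources")
      print(len(root.children), run_a.parameters, run_b.annotations)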

  15. Simulation of phase boundaries using constrained cell models

    NASA Astrophysics Data System (ADS)

    Nayhouse, Michael; Heng, Vincent R.; Amlani, Ankur M.; Orkoulas, G.

    2012-09-01

    Despite impressive advances, precise simulation of fluid-fluid and fluid-solid phase transitions still remains a challenging task. The present work focuses on the determination of the phase diagram of a system of particles that interact through a pair potential, φ(r), which is of the form φ(r) = 4ε[(σ/r)^(2n) - (σ/r)^n] with n = 12. The vapor-liquid phase diagram of this model is established from constant-pressure simulations and flat-histogram techniques. The properties of the solid phase are obtained from constant-pressure simulations using constrained cell models. In the constrained cell model, the simulation volume is divided into Wigner-Seitz cells and each particle is confined to moving in a single cell. The constrained cell model is a limiting case of a more general cell model which is constructed by adding a homogeneous external field that controls the relative stability of the fluid and the solid phase. Fluid-solid coexistence at a reduced temperature of 2 is established from constant-pressure simulations of the generalized cell model. The previous fluid-solid coexistence point is used as a reference point in the determination of the fluid-solid phase boundary through a thermodynamic integration type of technique based on histogram reweighting. Since the attractive interaction is of short range, the vapor-liquid transition is metastable against crystallization. In the present work, the phase diagram of the corresponding constrained cell model is also determined. The latter is found to contain a stable vapor-liquid critical point and a triple point.
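
    Two of the ingredients described above are easy to sketch: the generalized Lennard-Jones-type pair potential with n = 12, and a Metropolis trial move that keeps each particle inside its cell (approximated below by a cube of side L around the cell center, which is a simplification of a true Wigner-Seitz cell). The values and the energy-change callback are illustrative assumptions, not the paper's simulation parameters.

      import math, random

      EPSILON, SIGMA, N_EXP = 1.0, 1.0, 12

      def pair_potential(r):
          """phi(r) = 4*eps*[(sigma/r)^(2n) - (sigma/r)^n] with n = 12."""
          x = (SIGMA / r) ** N_EXP
          return 4.0 * EPSILON * (x * x - x)

      def constrained_move(position, cell_center, l_cell, delta, beta, energy_change):
          """Metropolis acceptance for a displacement confined to the particle's cell.

          energy_change(new_pos) is assumed to return the change in total energy;
          the move is rejected outright if it would leave the cell."""
          trial = [p + random.uniform(-delta, delta) for p in position]
          if any(abs(t - c) > 0.5 * l_cell for t, c in zip(trial, cell_center)):
              return position                        # outside the cell: reject
          d_e = energy_change(trial)
          if d_e <= 0.0 or random.random() < math.exp(-beta * d_e):
              return trial                           # accept
          return position

      # The potential minimum sits at r = 2^(1/n) * sigma with depth -epsilon:
      print(round(pair_potential(2.0 ** (1.0 / N_EXP) * SIGMA), 3))   # -> -1.0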

  16. Simulation of phase boundaries using constrained cell models.

    PubMed

    Nayhouse, Michael; Heng, Vincent R; Amlani, Ankur M; Orkoulas, G

    2012-09-19

    Despite impressive advances, precise simulation of fluid-fluid and fluid-solid phase transitions still remains a challenging task. The present work focuses on the determination of the phase diagram of a system of particles that interact through a pair potential, φ(r), which is of the form φ(r) = 4ε[(σ/r)^(2n) - (σ/r)^n] with n = 12. The vapor-liquid phase diagram of this model is established from constant-pressure simulations and flat-histogram techniques. The properties of the solid phase are obtained from constant-pressure simulations using constrained cell models. In the constrained cell model, the simulation volume is divided into Wigner-Seitz cells and each particle is confined to moving in a single cell. The constrained cell model is a limiting case of a more general cell model which is constructed by adding a homogeneous external field that controls the relative stability of the fluid and the solid phase. Fluid-solid coexistence at a reduced temperature of 2 is established from constant-pressure simulations of the generalized cell model. The previous fluid-solid coexistence point is used as a reference point in the determination of the fluid-solid phase boundary through a thermodynamic integration type of technique based on histogram reweighting. Since the attractive interaction is of short range, the vapor-liquid transition is metastable against crystallization. In the present work, the phase diagram of the corresponding constrained cell model is also determined. The latter is found to contain a stable vapor-liquid critical point and a triple point. PMID:22850590

  17. Atmospheric model intercomparison project: Monsoon simulations

    SciTech Connect

    Sperber, K.R.; Palmer, T.N.

    1994-06-01

    The simulation of monsoons, in particular the Indian summer monsoon, has proven to be a critical test of a general circulation model's ability to simulate tropical climate and variability. The Monsoon Numerical Experimentation Group has begun to address questions regarding the predictability of monsoon extremes, in particular conditions associated with El Nino and La Nina conditions that tend to be associated with drought and flood conditions over the Indian subcontinent, through a series of seasonal integrations using analyzed initial conditions from successive days in 1987 and 1988. In this paper the authors present an analysis of simulations associated with the Atmospheric Model Intercomparison Project (AMIP), a coordinated effort to simulate the 1979-1988 decade using standardized boundary conditions with approximately 30 atmospheric general circulation models. The 13 models analyzed to date are listed. Using monthly mean data from these simulations they have calculated indices of precipitation and wind shear in an effort to assess the performance of the models over the course of the AMIP decade.

  18. Advances in finite element simulations of elastosonography for breast lesion detection.

    PubMed

    Celi, Simona; Di Puccio, Francesca; Forte, Paola

    2011-08-01

    Among the available tools for the early diagnosis of breast cancer, the elastographic technique based on ultrasounds has many advantages such as the noninvasive measure, the absence of ionizing effects, the high tolerability by patients, and the wide diffusion of the ecographic machines. However this diagnostic procedure is strongly affected by many subjective factors and is considered not reliable enough even to reduce the number of biopsies used to identify the nature of lesions. Therefore in the literature experimental and numerical simulations on physical and virtual phantoms are presented to test and validate procedures and algorithms and to interpret elastosonographic results. In this work, first a description of the elastographic technique and a review of the principal finite element (FE) models are provided and second diagnostic indexes employed to assess the nature of a lump mass are presented. As advances in FE simulations of elastosonography, axisymmetric phantom, and anthropomorphic models are described, which, with respect to the literature, include some features of breast mechanics. In particular deterministic analyses were used to compare the various details of virtual elastograms and also to investigate diagnostic indexes with respect to the regions where strains were considered. In order to improve the reliability of the elastosonographic procedure, univariate and multivariate sensitivity analyses, based on a probabilistic FE approach, were also performed to identify the parameters that mostly influence the deformation contrast between healthy and cancerous tissues. Moreover, synthetic indicators of the strain field, such as the strain contrast coefficient, were evaluated in different regions of interest in order to identify the most suitable for lesion type assessment. The deterministic analyses show that the malignant lesion is characterized by a uniform strain inside the inclusion due to the firmly bonding condition, while in the benign inclusion (loosely bonded) a strain gradient is observed independently from the elastic modulus contrast. The multivariate analyses reveal that the strain contrast depends linearly on the relative stiffness between the lesion and the healthy tissue and not linearly on the interface friction coefficient. The anthropomorphic model shows other interesting features, such as the layer or curvature effects, which introduce difficulties in selecting a reference region for strain assessment. The results show that a simple axisymmetric model with linear elastic material properties can be suitable to simulate the elastosonographic procedure although the breast curvature and layer distinction play a significant role in the strain assessment. PMID:21950899

  19. Air target models for fuzing simulations

    NASA Astrophysics Data System (ADS)

    Dammann, J. F., Jr.

    1982-09-01

    Radar backscatter models for air targets suitable for computer simulation of radar fuze-air target encounters are described. These models determine the characteristics of the energy reflected to the fuze when the target is illuminated by a fuze radar. When the target models are coupled with fuze models, the time when the fuze detects the presence of the target can be determined for any arbitrary terminal encounter geometry. Fuze detection times for representative trajectories can be compared with fuze specifications to measure fuze performance or can be used as a part of a simulation of an entire system to determine system performance. Following one basic methodology, target models have been written for the Fishbed, Foxbat, and Flogger fighter aircraft; the Hind-D helicopter; and the Backfire, Blinder, and B-1 bombers. All of the models are specular point models where the major return is assumed to come from a small number of glitter points or specular points on the target.

  20. Advanced battery modeling using neural networks 

    E-print Network

    Arikara, Muralidharan Pushpakam

    1993-01-01

    variable of the performance of the battery need not be known a priori. The neural network develops the model by correlating experimental data. A software model was developed and tested for lead-acid batteries using this technique. The results obtained from...

  1. WinSRFR: Current Advances in Software for Surface Irrigation Simulation and Analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Significant advances have been made over the last decade in the development of software for surface irrigation analysis. WinSRFR is an integrated tool that combines unsteady flow simulation with tools for system evaluation/parameter estimation, system design, and for operational optimization. Ongoi...

  2. Transient fault modeling and fault injection simulation 

    E-print Network

    Yuan, Xuejun

    1996-01-01

    An accurate transient fault model is presented in this thesis. A 7-term exponential current upset model is derived from the results of a device-level, 3-dimensional, single-event-upset simulation. A curve-fitting algorithm is used to extract...

  3. Estimating solar radiation for plant simulation models

    NASA Technical Reports Server (NTRS)

    Hodges, T.; French, V.; Leduc, S.

    1985-01-01

    Five algorithms producing daily solar radiation surrogates using daily temperatures and rainfall were evaluated using measured solar radiation data for seven U.S. locations. The algorithms were compared both in terms of accuracy of daily solar radiation estimates and in terms of response when used in a plant growth simulation model (CERES-wheat). Requirements for accuracy of solar radiation for plant growth simulation models are discussed. One algorithm is recommended as being best suited for use in these models when neither measured nor satellite estimated solar radiation values are available.
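
    One well-known member of this class of temperature-based estimators is shown below purely as an illustration; the abstract does not identify which five algorithms were evaluated, so this Hargreaves-type formula should not be assumed to be one of them.

      import math

      def hargreaves_radiation(t_max_c, t_min_c, ra_mj_m2, k_rs=0.16):
          """Estimated daily solar radiation (MJ m^-2 day^-1).

          ra_mj_m2 is extraterrestrial radiation for the day and latitude;
          k_rs is commonly ~0.16 for interior sites and ~0.19 for coastal sites."""
          return k_rs * math.sqrt(max(t_max_c - t_min_c, 0.0)) * ra_mj_m2

      # Example: a clear summer day with a 14 degC diurnal range and Ra = 41 MJ/m2:
      print(round(hargreaves_radiation(30.0, 16.0, 41.0), 1))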

  4. A future Outlook: Web based Simulation of Hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Islam, A. S.; Piasecki, M.

    2003-12-01

    Despite recent advances to present simulation results as 3D graphs or animation contours, the modeling user community still faces some shortcomings when trying to move around and analyze data. Typical problems include the lack of common platforms with standard vocabulary to exchange simulation results from different numerical models, insufficient descriptions about data (metadata), lack of robust search and retrieval tools for data, and difficulties to reuse simulation domain knowledge. This research demonstrates how to create a shared simulation domain in the WWW and run a number of models through multi-user interfaces. Firstly, meta-datasets have been developed to describe hydrodynamic model data based on geographic metadata standard (ISO 19115) that has been extended to satisfy the need of the hydrodynamic modeling community. The Extensible Markup Language (XML) is used to publish this metadata by the Resource Description Framework (RDF). Specific domain ontology for Web Based Simulation (WBS) has been developed to explicitly define vocabulary for the knowledge based simulation system. Subsequently, this knowledge based system is converted into an object model using the Meta Object Facility (MOF). The knowledge based system acts as a Meta model for the object oriented system, which aids in reusing the domain knowledge. Specific simulation software has been developed based on the object oriented model. Finally, all model data is stored in an object relational database. Database back-ends help store, retrieve and query information efficiently. This research uses open source software and technology such as Java Servlet and JSP, Apache web server, Tomcat Servlet Engine, PostgreSQL databases, Protégé ontology editor, RDQL and RQL for querying RDF at the semantic level, and the Jena Java API for RDF. Also, we use international standards such as the ISO 19115 metadata standard, and specifications such as XML, RDF, OWL, XMI, and UML. The final web based simulation product is deployed as Web Archive (WAR) files which are platform and OS independent and can be used on Windows, UNIX, or Linux. Keywords: Apache, ISO 19115, Java Servlet, Jena, JSP, Metadata, MOF, Linux, Ontology, OWL, PostgreSQL, Protégé, RDF, RDQL, RQL, Tomcat, UML, UNIX, Windows, WAR, XML

  5. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  6. A graphical workstation based part-task flight simulator for preliminary rapid evaluation of advanced displays

    NASA Technical Reports Server (NTRS)

    Wanke, Craig; Kuchar, James; Hahn, Edward; Pritchett, A.; Hansman, R. John

    1994-01-01

    Advances in avionics and display technology are significantly changing the cockpit environment in current transport aircraft. The MIT Aeronautical Systems Lab (ASL) developed a part-task flight simulator specifically to study the effects of these new technologies on flight crew situational awareness and performance. The simulator is based on a commercially-available graphics workstation, and can be rapidly reconfigured to meet the varying demands of experimental studies. The simulator was successfully used to evaluate graphical microburst alerting displays, electronic instrument approach plates, terrain awareness and alerting displays, and ATC routing amendment delivery through digital datalinks.

  7. Mars Smart Lander Parachute Simulation Model

    NASA Technical Reports Server (NTRS)

    Queen, Eric M.; Raiszadeh, Ben

    2002-01-01

    A multi-body flight simulation for the Mars Smart Lander has been developed that includes six degree-of-freedom rigid-body models for both the supersonically-deployed and subsonically-deployed parachutes. This simulation is designed to be incorporated into a larger simulation of the entire entry, descent and landing (EDL) sequence. The complete end-to-end simulation will provide attitude history predictions of all bodies throughout the flight as well as loads on each of the connecting lines. Other issues such as recontact with jettisoned elements (heat shield, back shield, parachute mortar covers, etc.), design of parachute and attachment points, and desirable line properties can also be addressed readily using this simulation.

  8. A queuing model for road traffic simulation

    SciTech Connect

    Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

    2015-03-10

    We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model, and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.
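
    A hypothetical discrete-event sketch in the spirit of such a state-dependent road-section queue is shown below: Poisson arrivals, a hard capacity of c vehicles, and a travel (service) time that grows with occupancy through a linear speed-density relation. For brevity the travel time is fixed at entry rather than updated as occupancy changes, and all parameters are illustrative, not taken from the article.

      import heapq, random

      C_CAPACITY = 50                 # vehicles the section can hold
      LENGTH_KM = 1.0
      V_FREE, V_JAM = 90.0, 10.0      # free-flow and near-capacity speeds (km/h)
      ARRIVAL_RATE = 1200.0           # vehicles per hour offered to the section

      def section_speed(n_present):
          """Speed when n_present vehicles occupy the section (linear decay)."""
          return V_FREE - (V_FREE - V_JAM) * (n_present - 1) / (C_CAPACITY - 1)

      random.seed(1)
      t, n, served, blocked = 0.0, 0, 0, 0
      departures = []                                   # heap of scheduled exit times
      next_arrival = random.expovariate(ARRIVAL_RATE)
      while t < 10.0:                                   # simulate 10 hours
          if departures and departures[0] <= next_arrival:
              t = heapq.heappop(departures)             # a vehicle leaves the section
              n -= 1
          else:
              t = next_arrival                          # a vehicle arrives
              next_arrival = t + random.expovariate(ARRIVAL_RATE)
              if n >= C_CAPACITY:
                  blocked += 1                          # section full: arrival is lost
                  continue
              n += 1
              served += 1
              heapq.heappush(departures, t + LENGTH_KM / section_speed(n))

      print(f"served={served}, blocked={blocked}, "
            f"blocking prob={blocked / (served + blocked):.3f}")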

  9. Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward

    2014-01-01

    Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations to be completed quickly compared to more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and in the future support NASA mission planning for Stirling-based power systems.

  10. Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2015-01-01

    Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and also compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations to be completed quickly compared to more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and in the future support NASA mission planning for Stirling-based power systems.

  11. Sharpening advanced land imager multispectral data using a sensor model

    USGS Publications Warehouse

    Lemeshewsky, G.P.

    2005-01-01

    The Advanced Land Imager (ALI) instrument on NASA's Earth Observing One (EO-1) satellite provides for nine spectral bands at 30m ground sample distance (GSD) and a 10m GSD panchromatic band. This report describes an image sharpening technique where the higher spatial resolution information of the panchromatic band is used to increase the spatial resolution of ALI multispectral (MS) data. To preserve the spectral characteristics, this technique combines reported deconvolution deblurring methods for the MS data with highpass filter-based fusion methods for the Pan data. The deblurring process uses the point spread function (PSF) model of the ALI sensor. Information includes calculation of the PSF from pre-launch calibration data. Performance was evaluated using simulated ALI MS data generated by degrading the spatial resolution of high resolution IKONOS satellite MS data. A quantitative measure of performance was the error between sharpened MS data and high resolution reference. This report also compares performance with that of a reported method that includes PSF information. Preliminary results indicate improved sharpening with the method reported here.
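
    A generic sketch of the highpass-filter (HPF) fusion step mentioned above is given below: the spatial detail of the 10 m panchromatic band is extracted with a lowpass filter and injected into each upsampled 30 m multispectral band. This is the standard HPF scheme only, not the report's full PSF-deconvolution method, and the kernel size and gain are illustrative.

      import numpy as np
      from scipy.ndimage import uniform_filter, zoom

      def hpf_sharpen(ms_band_30m, pan_10m, kernel=5, gain=1.0):
          """Return a 10 m sharpened version of one 30 m multispectral band."""
          ms_up = zoom(ms_band_30m, 3, order=1)            # 30 m -> 10 m (bilinear)
          ms_up = ms_up[:pan_10m.shape[0], :pan_10m.shape[1]]
          pan_detail = pan_10m - uniform_filter(pan_10m, size=kernel)
          return ms_up + gain * pan_detail

      # Synthetic example: a 60x60 pan image and a 20x20 MS band at 3x coarser GSD.
      pan = np.random.rand(60, 60)
      ms = pan.reshape(20, 3, 20, 3).mean(axis=(1, 3))
      sharp = hpf_sharpen(ms, pan)
      print(sharp.shape)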

  12. Power electronics system modeling and simulation

    SciTech Connect

    Lai, Jih-Sheng

    1994-12-31

    This paper introduces control-system design software packages, SIMNON and MATLAB/SIMULINK, for power electronics system simulation. A complete power electronics system typically consists of a rectifier bridge along with its smoothing capacitor, an inverter, and a motor. The system components, whether discrete or continuous, linear or nonlinear, are modeled with mathematical equations. Inverter control methods, such as pulse-width-modulation and hysteresis current control, are expressed in either computer algorithms or digital circuits. After describing component models and control methods, computer programs are then developed for complete systems simulation. Simulation results are mainly used for studying system performances, such as input and output current harmonics, torque ripples, and speed responses. Key computer programs and simulation results are demonstrated for educational purposes.

  13. Advances on genetic rat models of epilepsy

    PubMed Central

    Serikawa, Tadao; Mashimo, Tomoji; Kuramoto, Takashi; Voigt, Birger; Ohno, Yukihiro; Sasa, Masashi

    2014-01-01

    Considering the suitability of laboratory rats in epilepsy research, we and other groups have been developing genetic models of epilepsy in this species. After epileptic rats or seizure-susceptible rats were sporadically found in outbred stocks, the epileptic traits were usually genetically-fixed by selective breeding. So far, the absence seizure models GAERS and WAG/Rij, audiogenic seizure models GEPR-3 and GEPR-9, generalized tonic-clonic seizure models IER, NER and WER, and Canavan-disease related epileptic models TRM and SER have been established. Dissection of the genetic bases including causative genes in these epileptic rat models would be a significant step toward understanding epileptogenesis. N-ethyl-N-nitrosourea (ENU) mutagenesis provides a systematic approach which allowed us to develop two novel epileptic rat models: heat-induced seizure susceptible (Hiss) rats with an Scn1a missense mutation and autosomal dominant lateral temporal epilepsy (ADLTE) model rats with an Lgi1 missense mutation. In addition, we have established episodic ataxia type 1 (EA1) model rats with a Kcna1 missense mutation derived from the ENU-induced rat mutant stock, and identified a Cacna1a missense mutation in a N-Methyl-N-nitrosourea (MNU)-induced mutant rat strain GRY, resulting in the discovery of episodic ataxia type 2 (EA2) model rats. Thus, epileptic rat models have been established on the two paths: ‘phenotype to gene’ and ‘gene to phenotype’. In the near future, development of novel epileptic rat models will be extensively promoted by the use of sophisticated genome editing technologies. PMID:25312505

  14. Man-vehicle systems research facility advanced aircraft flight simulator throttle mechanism

    NASA Technical Reports Server (NTRS)

    Kurasaki, S. S.; Vallotton, W. C.

    1985-01-01

    The Advanced Aircraft Flight Simulator is equipped with a motorized mechanism that simulates a two-engine throttle control system that can be operated via a computer-driven performance management system or manually by the pilots. The throttle control system incorporates features to simulate normal engine operations and thrust reverse and vary the force feel to meet a variety of research needs. The additional work required for integration is now principally in software design, since the mechanical aspects already function correctly. The mechanism is an important part of the flight control system and provides the capability to conduct human factors research of flight crews with advanced aircraft systems under various flight conditions such as go-arounds, coupled instrument flight rule approaches, normal and ground operations and emergencies that would or would not normally be experienced in actual flight.

  15. URC Fuzzy Modeling and Simulation of Gene Regulation

    SciTech Connect

    Sokhansanj, B A; Fitch, J P

    2001-05-01

    Recent technological advances in high-throughput data collection give biologists the ability to study increasingly complex systems. A new methodology is needed to develop and test biological models based on experimental observations and predict the effect of perturbations of the network (e.g. genetic engineering, pharmaceuticals, gene therapy). Diverse modeling approaches have been proposed, in two general categories: modeling a biological pathway as (a) a logical circuit or (b) a chemical reaction network. Boolean logic models can not represent necessary biological details. Chemical kinetics simulations require large numbers of parameters that are very difficult to accurately measure. Based on the way biologists have traditionally thought about systems, we propose that fuzzy logic is a natural language for modeling biology. The Union Rule Configuration (URC) avoids combinatorial explosion in the fuzzy rule base, allowing complex system models. We demonstrate the fuzzy modeling method on the commonly studied lac operon of E. coli. Our goal is to develop a modeling and simulation approach that can be understood and applied by biologists without the need for experts in other fields or ''black-box'' software.
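
    A generic fuzzy-rule sketch of gene regulation is shown below purely as an illustration of the flavor of such models; it does not reproduce the paper's Union Rule Configuration or its lac operon model, and all membership functions and rule outputs are invented.

      def membership_low(x):
          """Triangular membership: fully LOW at 0, not LOW at all at 1."""
          return max(0.0, min(1.0, 1.0 - x))

      def membership_high(x):
          return max(0.0, min(1.0, x))

      def transcription_activity(repressor_level):
          """Two rules: IF repressor LOW THEN activity HIGH (1.0);
                        IF repressor HIGH THEN activity LOW (0.1).
          Defuzzified by a weighted average of the rule outputs."""
          w_low = membership_low(repressor_level)
          w_high = membership_high(repressor_level)
          return (w_low * 1.0 + w_high * 0.1) / (w_low + w_high)

      for level in (0.0, 0.25, 0.5, 0.9):
          print(level, round(transcription_activity(level), 2))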

  16. Recent Advances in the LEWICE Icing Model

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Addy, Gene; Struk, Peter; Bartkus, Tadas

    2015-01-01

    This paper will describe two recent modifications to the Glenn ICE software. First, a capability for modeling ice crystals and mixed phase icing has been modified based on recent experimental data. Modifications have been made to the ice particle bouncing and erosion model. This capability has been added as part of a larger effort to model ice crystal ingestion in aircraft engines. Comparisons have been made to ice crystal ice accretions performed in the NRC Research Altitude Test Facility (RATFac). Second, modifications were made to the run back model based on data and observations from thermal scaling tests performed in the NRC Altitude Icing Tunnel.

  17. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri net and object-oriented top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
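
    As a purely illustrative companion to the abstract, the following minimal Python sketch shows the mechanics of a timed Petri net of the kind the approach builds on: a transition fires when all of its input places hold tokens, consumes them, and deposits tokens in its output places after a delay. The place and transition names suggest a toy software process and are not the authors' model.

        import heapq

        # places hold tokens; a transition fires when every input place is marked,
        # consuming the inputs and depositing output tokens after a fixed delay
        places = {"specified": 1, "coded": 0, "tested": 0}
        transitions = [
            {"name": "code", "in": ["specified"], "out": ["coded"], "delay": 5.0},
            {"name": "test", "in": ["coded"], "out": ["tested"], "delay": 2.0},
        ]

        clock, events = 0.0, []  # events: (completion_time, name, transition)
        while True:
            for t in transitions:  # start every currently enabled transition
                if all(places[p] > 0 for p in t["in"]):
                    for p in t["in"]:
                        places[p] -= 1
                    heapq.heappush(events, (clock + t["delay"], t["name"], t))
            if not events:
                break
            clock, name, t = heapq.heappop(events)  # advance to the next completion
            for p in t["out"]:
                places[p] += 1
            print(f"t={clock:4.1f}  {name} done  marking={places}")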

  18. An Advanced Sea-Floor Spreading Model.

    ERIC Educational Resources Information Center

    Dutch, Steven I.

    1986-01-01

    Describes models which (1) illustrate spreading that varies in rate from place to place; (2) clearly show transform faults as arcs of small circles; and (3) illustrate what happens near a pole of rotation. The models are easy to construct and have been well received by students. (JN)

  19. Carbon export algorithm advancements in models

    NASA Astrophysics Data System (ADS)

    Çağlar Yumruktepe, Veli; Salihoğlu, Barış

    2015-04-01

    The rate at which anthropogenic CO2 is absorbed by the oceans remains a critical question under investigation by climate researchers. Construction of a complete carbon budget requires better understanding of air-sea exchanges and the processes controlling the vertical and horizontal transport of carbon in the ocean, particularly the biological carbon pump. Improved parameterization of carbon sequestration within ecosystem models is vital to better understand and predict changes in the global carbon cycle. Due to the complexity of processes controlling particle aggregation, sinking and decomposition, existing ecosystem models necessarily parameterize carbon sequestration using simple algorithms. Development of improved algorithms describing carbon export and sequestration, suitable for inclusion in numerical models, is ongoing work. Existing unique algorithms used in state-of-the-art ecosystem models and new experimental results obtained from mesocosm experiments and open ocean observations have been inserted into a common 1D pelagic ecosystem model for testing purposes. The model was implemented at the time-series stations in the North Atlantic (BATS, PAP and ESTOC) and was evaluated against carbon export datasets. Targeted topics of the algorithms were plankton functional types (PFTs), grazing and vertical movement of zooplankton, and remineralization, aggregation and ballasting dynamics of organic matter. Ultimately it is intended to feed improved algorithms to the 3D modelling community, for inclusion in coupled numerical models.
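
    As a worked example of the kind of export algorithm such studies compare, the Python snippet below evaluates the classic Martin power-law attenuation of sinking particulate organic carbon flux with depth. The parameter values are the commonly quoted canonical ones, not values fitted at BATS, PAP or ESTOC in this work.

        # Martin et al. (1987) power-law attenuation of sinking particulate organic
        # carbon flux, referenced to 100 m; canonical exponent b = 0.858
        def poc_flux(z, flux_100=3.0, b=0.858):
            """Flux (e.g. mmol C m-2 d-1) at depth z (m)."""
            return flux_100 * (z / 100.0) ** (-b)

        for z in (100, 500, 1000, 2000):
            print(z, round(poc_flux(z), 3))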

  20. Advanced Information Technology in Simulation Based Life Cycle Design

    NASA Technical Reports Server (NTRS)

    Renaud, John E.

    2003-01-01

    In this research a Collaborative Optimization (CO) approach for multidisciplinary systems design is used to develop a decision-based design framework for non-deterministic optimization. To date, CO strategies have been developed for application to deterministic systems design problems. In this research, the decision-based design (DBD) framework proposed by Hazelrigg is modified for use in a collaborative optimization framework. The Hazelrigg framework as originally proposed provides a single-level optimization strategy that combines engineering decisions with business decisions. By transforming this framework for use in collaborative optimization, one can decompose the business and engineering decision-making processes. In the new multilevel framework of Decision Based Collaborative Optimization (DBCO), the business decisions are made at the system level. These business decisions result in a set of engineering performance targets that disciplinary engineering design teams seek to satisfy as part of subspace optimizations. The Decision Based Collaborative Optimization framework more accurately models the existing relationship between business and engineering in multidisciplinary systems design.
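
    The schematic Python sketch below conveys the two-level structure described above: a system-level "business" optimizer chooses a performance target, an engineering subspace optimizer tries to meet it, and the discrepancy between target and achievable performance is penalized at the system level. The objective functions and numbers are toy placeholders, not the DBCO formulation itself.

        from scipy.optimize import minimize

        def subspace(target):
            # engineering subspace: choose a design x in [0, 2] whose achieved
            # performance x**2 comes as close as possible to the system target
            res = minimize(lambda x: (x[0] ** 2 - target) ** 2, x0=[1.0],
                           bounds=[(0.0, 2.0)])
            return res.x[0] ** 2  # achieved performance

        def system_objective(t):
            target = float(t[0])
            achieved = subspace(target)
            revenue = 10.0 * target            # toy "business" value of performance
            cost = 3.0 * target ** 2           # toy cost of demanding that performance
            discrepancy = (achieved - target) ** 2
            return -(revenue - cost) + 100.0 * discrepancy  # minimize negative profit

        best = minimize(system_objective, x0=[1.0], bounds=[(0.0, 5.0)])
        print("system-level performance target:", best.x[0])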

  1. Electrical Load Modeling and Simulation

    SciTech Connect

    Chassin, David P.

    2013-01-01

    Electricity consumer demand response and load control are playing an increasingly important role in the development of a smart grid. Smart grid load management technologies such as Grid Friendly™ controls and real-time pricing are making their way into the conventional model of grid planning and operations. However, the behavior of load both affects, and is affected by, the load control strategies that are designed to support electric grid planning and operations. This chapter discusses the natural behavior of electric loads, how it interacts with various load control and demand response strategies, the consequences for new grid operation concepts, and the computing issues these new technologies raise.

  2. Hydrodynamic simulations of propellers: Isothermal model

    NASA Astrophysics Data System (ADS)

    Seiß, M.; Hoffmann, H.; Spahn, F.

    2015-10-01

    Small moons embedded in Saturn's rings can cause S-shaped density structures, called propellers, in their close vicinity. These structures were first predicted on the basis of a combined model involving gravitational scattering of test particles (creating the structure) and diffusion (smearing out the structure) [1, 2]. The propeller model was confirmed later with the help of N-body simulations showing the additional appearance of moon wakes adjacent to the S-shaped gaps [3, 4]. It was a great success of the Cassini mission when propellers were detected in ISS imaging [5, 6] and UVIS occultation data [7]. Here we present an isothermal hydrodynamic simulation of a propeller as a further development of the original model [1, 2], in which gravitational scattering and diffusion had to be treated separately. With this new approach we prove the correctness of the predicted scaling laws for the radial and azimuthal extent of the propeller. Furthermore, we will show a comparison between results of N-body and hydrodynamic simulations. Finally, we will present simulation results of the giant propeller Bleriot, which cannot yet be modeled in its full extent by N-body simulations.

  3. Distributed earth model/orbiter simulation

    NASA Technical Reports Server (NTRS)

    Geisler, Erik; Mcclanahan, Scott; Smith, Gary

    1989-01-01

    Distributed Earth Model/Orbiter Simulation (DEMOS) is a network-based application developed for the UNIX environment that visually monitors or simulates the Earth and any number of orbiting vehicles. Its purpose is to provide Mission Control Center (MCC) flight controllers with a visually accurate three-dimensional (3D) model of the Earth, Sun, Moon and orbiters, driven by real-time or simulated data. The project incorporates a graphical user interface, 3D modelling employing state-of-the-art hardware, and simulation of orbital mechanics in a networked/distributed environment. The user interface is based on the X Window System and the X Ray toolbox. The 3D modelling utilizes the Programmer's Hierarchical Interactive Graphics System (PHIGS) standard and Raster Technologies hardware for rendering/display performance. The simulation of orbiting vehicles uses two methods of vector propagation implemented with standard UNIX/C for portability. Each part is a distinct process that can run on separate nodes of a network, exploiting each node's unique hardware capabilities. The client/server communication architecture of the application can be reused for a variety of distributed applications.
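
    For readers unfamiliar with vector propagation, the sketch below shows one standard way such a simulation can advance an orbiter state vector: direct numerical integration (Cowell's method) of two-body motion with a fixed-step Runge-Kutta scheme. It is a generic Python illustration rather than either of the two propagators actually implemented in DEMOS.

        import math

        MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

        def derivs(state):
            x, y, z, vx, vy, vz = state
            r = math.sqrt(x * x + y * y + z * z)
            k = -MU / r ** 3
            return [vx, vy, vz, k * x, k * y, k * z]

        def rk4_step(state, dt):
            k1 = derivs(state)
            k2 = derivs([s + 0.5 * dt * d for s, d in zip(state, k1)])
            k3 = derivs([s + 0.5 * dt * d for s, d in zip(state, k2)])
            k4 = derivs([s + dt * d for s, d in zip(state, k3)])
            return [s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                    for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

        # circular low Earth orbit (position km, velocity km/s), propagated 10 minutes
        state = [6778.0, 0.0, 0.0, 0.0, 7.6686, 0.0]
        for _ in range(600):
            state = rk4_step(state, 1.0)
        print([round(v, 1) for v in state])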

  4. Modeling and simulation of PCM-enhanced facade systems

    NASA Astrophysics Data System (ADS)

    Al-Saadi, Saleh Nasser

    The building facade contributes to the overall architectural aesthetic but can also be utilized for heat storage when proper systems are incorporated. Latent heat storage, such as using a phase change material (PCM), has gained growing attention recently due to its ability to store significant thermal energy within a small volume, making it one of the most promising technologies for developing energy-efficient buildings. This research is focused on modeling and simulation of PCM when integrated into advanced facade systems. The study first reviews the different mathematical modeling methods generally used for PCM simulations. It categorizes the PCM numerical models that are implemented for standalone facade systems. The study then evaluates the PCM models that are integrated into whole-building simulation tools such as EnergyPlus, TRNSYS, ESP-r, etc. It is revealed that the heat capacity method is the one mostly used in these programs, despite its limitations on time and spatial resolution. Therefore, alternative numerical models are investigated to overcome the above constraints and limitations in current PCM simulation practice. Eight potential computational models based on a fully implicit finite volume method are developed in the MATLAB/SIMULINK environment, validated using experimental results from the literature and verified against well-known building simulation programs. A linearized enthalpy method with a hybrid correction scheme is proposed and validated in this work as an improvement to the existing numerical schemes for implementation into building simulation tools. Through sensitivity analysis achieved by varying the PCM thermal properties, the models have been analyzed for their computational efficiency and prediction accuracy. Some models, for example the heat capacity method, are found sensitive to the melting range of the PCM but less sensitive to variations of latent heat. Among the correction schemes, the non-iterative scheme is inaccurate due to significant temperature spikes when the PCM changes state. The iterative and the hybrid correction schemes are computationally efficient and less sensitive to variations of the PCM's thermal properties. Hence, these two schemes can potentially be implemented for modeling PCM instead of existing slow and unstable numerical algorithms. Based on this conclusion, a library of modules capable of modeling advanced facade systems, entitled the "AdvFacSy" toolbox, is developed in the SIMULINK GUI environment. The toolbox can easily be used to evaluate innovative advanced facade systems with and without PCM. Using this toolbox, two PCM-enhanced facade designs are evaluated and general conclusions have been drawn. Using a novel coupling methodology, several modules from the toolbox are then fully integrated into TRNSYS, a whole-building simulation tool. In addition, a standard TRNSYS module, Type-285, is specifically developed under this research work for modeling multilayer walls with or without PCM. A typical residential building with PCM-embedded walls is analyzed under representative US climates. It is concluded that PCM performs poorly when exposed to natural environmental conditions; however, the performance of PCM is indeed enhanced when it is activated using other passive strategies.
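
    The heat capacity method discussed above can be illustrated in a few lines of code: the latent heat is smeared over the melting range by inflating the specific heat, so an ordinary conduction solver captures the phase change. The Python sketch below uses a 1-D explicit scheme for brevity, whereas the thesis implements fully implicit finite-volume schemes, and all property values are invented for illustration.

        import numpy as np

        k, rho, cp_s, L = 0.2, 800.0, 2000.0, 180e3  # W/m-K, kg/m3, J/kg-K, J/kg
        T_m1, T_m2 = 24.0, 26.0                       # melting range, deg C
        dx, dt, n = 0.002, 0.5, 20                    # node spacing (m), time step (s), nodes

        def cp_eff(T):
            # sensible heat plus latent heat smeared over the melting range
            return cp_s + (L / (T_m2 - T_m1) if T_m1 <= T <= T_m2 else 0.0)

        T = np.full(n, 20.0)                          # initial wall temperature field
        for step in range(20000):
            T_new = T.copy()
            T_new[0] = 35.0                           # hot-side boundary temperature
            for i in range(1, n - 1):
                alpha = k / (rho * cp_eff(T[i]))      # local thermal diffusivity
                T_new[i] = T[i] + alpha * dt / dx ** 2 * (T[i + 1] - 2 * T[i] + T[i - 1])
            T_new[-1] = T_new[-2]                     # adiabatic cold side
            T = T_new
        print(np.round(T, 2))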

  5. Common modeling system for digital simulation

    NASA Technical Reports Server (NTRS)

    Painter, Rick

    1994-01-01

    The Joint Modeling and Simulation System is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open systems architecture, object-based/oriented methodology, standard interface approach to digital model construction, configuration, execution, and post-processing. For years Department of Defense (DOD) agencies have produced various weapon systems/technologies and typically digital representations of the systems/technologies. These digital representations (models) have also been developed for other reasons such as studies and analysis, Cost Effectiveness Analysis (COEA) tradeoffs, etc. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts towards commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, and, as a result, unique maintenance requirements. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g. user interface, database/database management system, data journalizing/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include: duplication of efforts; varying assumptions; lack of credibility/validation; and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.

  6. Advanced Technologies in Energy-Economy Models for Climate Change Assessment

    E-print Network

    Advanced Technologies in Energy-Economy Models for Climate Change Assessment. Jennifer F. Morris. ... is applied to a global economy-wide model to study the roles of low-carbon alternatives in the power sector. ... on the Science and Policy of Global Change combines cutting-edge scientific research with independent policy ...

  7. Advances and applications of occupancy models

    USGS Publications Warehouse

    Bailey, Larissa; MacKenzie, Darry I.; Nichols, James D.

    2013-01-01

    Summary: The past decade has seen an explosion in the development and application of models aimed at estimating species occurrence and occupancy dynamics while accounting for possible non-detection or species misidentification. We discuss some recent occupancy estimation methods and the biological systems that motivated their development. Collectively, these models offer tremendous flexibility, but simultaneously place added demands on the investigator. Unlike many mark–recapture scenarios, investigators utilizing occupancy models have the ability, and responsibility, to define their sample units (i.e. sites), replicate sampling occasions, time period over which species occurrence is assumed to be static and even the criteria that constitute ‘detection’ of a target species. Subsequent biological inference and interpretation of model parameters depend on these definitions and the ability to meet model assumptions. We demonstrate the relevance of these definitions by highlighting applications from a single biological system (an amphibian–pathogen system) and discuss situations where the use of occupancy models has been criticized. Finally, we use these applications to suggest future research and model development.
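
    The core of the basic single-season occupancy model that these extensions build on can be written in a few lines: a site is occupied with probability psi, an occupied site yields a detection on each survey with probability p, and an all-zero detection history can arise either from an unoccupied site or from repeated missed detections. The Python sketch below computes that site-level likelihood; it shows the standard MacKenzie et al. formulation rather than any of the newer models reviewed here.

        import math

        def site_likelihood(history, psi, p):
            # psi: probability the site is occupied; p: per-survey detection probability
            detected_part = psi * math.prod(p if y else (1.0 - p) for y in history)
            if any(history):
                return detected_part
            # all-zero history: occupied but never detected, or truly unoccupied
            return detected_part + (1.0 - psi)

        print(site_likelihood([0, 1, 0], psi=0.6, p=0.4))  # detected on one of three surveys
        print(site_likelihood([0, 0, 0], psi=0.6, p=0.4))  # ambiguous all-zero history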

  8. Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA)

    NASA Technical Reports Server (NTRS)

    Lichtwardt, Jonathan; Paciano, Eric; Jameson, Tina; Fong, Robert; Marshall, David

    2012-01-01

    With the very recent advent of NASA's Environmentally Responsible Aviation Project (ERA), which is dedicated to designing aircraft that will reduce the impact of aviation on the environment, there is a need for research and development of methodologies to minimize fuel burn and emissions and to reduce community noise produced by regional airliners. ERA tackles airframe technology, propulsion technology, and vehicle systems integration to meet performance objectives in the time frame for the aircraft to be at a Technology Readiness Level (TRL) of 4-6 by the year 2020 (deemed N+2). The preceding project that investigated similar goals to ERA was NASA's Subsonic Fixed Wing (SFW). SFW focused on conducting research to improve prediction methods and technologies that will produce lower-noise, lower-emission, and higher-performing subsonic aircraft for the Next Generation Air Transportation System. The work provided in this investigation was performed under NASA Research Announcement (NRA) contract #NNL07AA55C, funded by Subsonic Fixed Wing. The project started in 2007 with a specific goal of conducting a large-scale wind tunnel test along with the development of new and improved predictive codes for advanced powered-lift concepts. Many of the predictive codes were used to refine the outer mold line design of the wind tunnel model. The goal of the large-scale wind tunnel test was to investigate powered-lift technologies and provide an experimental database to validate current and future modeling techniques. The powered-lift concepts investigated were a Circulation Control (CC) wing in conjunction with over-the-wing mounted engines that entrain the exhaust to further increase the lift generated by CC technologies alone. The NRA was a five-year effort; during the first year the objective was to select and refine CESTOL concepts and then to complete a preliminary design of a large-scale wind tunnel model for the large-scale test. During the second, third, and fourth years the large-scale wind tunnel model design would be completed, manufactured, and calibrated. During the fifth year the large-scale wind tunnel test was conducted. This technical memo will describe all phases of the Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA) project and provide a brief summary of the background and modeling efforts involved in the NRA. The conceptual designs considered for this project and the decision process for the selected configuration adapted for a wind tunnel model will be briefly discussed. The internal configuration of AMELIA will be described, along with the internal measurements chosen to satisfy the requirement of obtaining a database of experimental data for future computational model validation. The external experimental techniques employed during the test, along with the large-scale wind tunnel test facility, are covered in great detail. Experimental measurements in the database include forces and moments, surface pressure distributions, local skin friction measurements, boundary and shear layer velocity profiles, far-field acoustic data, and noise signatures from turbofan propulsion simulators. Results for the circulation control performance, the over-the-wing mounted engines, and the combined performance are also discussed in great detail.

  9. Dynamic modeling used for the addition of robotic operation to the Advanced Servomanipulator teleoperator

    SciTech Connect

    Corbett, G.K.; Bailey, J.M.

    1989-01-01

    A robotic mode has been added to the Advanced Servomanipulator (ASM), a 6 degree-of-freedom master/slave teleoperator. In order to understand the requirements for implementation of robotics on an arm designed for teleoperation, a dynamic simulation of the ASM slave arm was developed. The ASM model and modifications of the control system for robotic operation are presented. 7 refs., 3 figs.

  10. Modeling surgical skill learning with cognitive simulation.

    PubMed

    Park, Shi-Hyun; Suh, Irene H; Chien, Jung-hung; Paik, Jaehyon; Ritter, Frank E; Oleynikov, Dmitry; Siu, Ka-Chun

    2011-01-01

    We used a cognitive architecture (ACT-R) to explore the procedural learning of surgical tasks and then to understand the process of perceptual motor learning and skill decay in surgical skill performance. The ACT-R cognitive model simulates declarative memory processes during motor learning. In this ongoing study, four surgical tasks (bimanual carrying, peg transfer, needle passing, and suture tying) were performed using the da Vinci© surgical system. Preliminary results revealed that an ACT-R model produced similar learning effects. Cognitive simulation can be used to demonstrate and optimize the perceptual motor learning and skill decay in surgical skill training. PMID:21335834

  11. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  12. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  13. The role of numerical simulation for the development of an advanced HIFU system

    NASA Astrophysics Data System (ADS)

    Okita, Kohei; Narumi, Ryuta; Azuma, Takashi; Takagi, Shu; Matumoto, Yoichiro

    2014-10-01

    High-intensity focused ultrasound (HIFU) has been used clinically and is undergoing clinical trials to treat various diseases. An advanced HIFU system employs ultrasound techniques for guidance during HIFU treatment, instead of the magnetic resonance imaging used in current HIFU systems. HIFU beam imaging for monitoring the HIFU beam and localized motion imaging for validating the tissue treatment are briefly introduced as real-time ultrasound monitoring techniques. Numerical simulations have a great impact on the development of real-time ultrasound monitoring as well as the improvement of the safety and efficacy of treatment in advanced HIFU systems. A HIFU simulator was developed to reproduce ultrasound propagation through the body in consideration of the elasticity of tissue, and was validated by comparison with in vitro experiments in which the ultrasound emitted from the phased-array transducer propagates through an acrylic plate acting as a bone phantom. As a result, the defocus and distortion of the ultrasound propagating through the acrylic plate in the simulation agree quantitatively with those in the experimental results. Therefore, the HIFU simulator accurately reproduces the ultrasound propagation through a medium whose shape and physical properties are well known. In addition, it is experimentally confirmed that simulation-assisted focus control of the phased-array transducer enables efficient assignment of the focus to the target. Simulation-assisted focus control can contribute to the design of transducers and to treatment planning.

  14. Hot-bench simulation of the active flexible wing wind-tunnel model

    NASA Technical Reports Server (NTRS)

    Buttrill, Carey S.; Houck, Jacob A.

    1990-01-01

    Two simulations, one batch and one real-time, of an aeroelastically-scaled wind-tunnel model were developed. The wind-tunnel model was a full-span, free-to-roll model of an advanced fighter concept. The batch simulation was used to generate and verify the real-time simulation and to test candidate control laws prior to implementation. The real-time simulation supported hot-bench testing of a digital controller, which was developed to actively control the elastic deformation of the wind-tunnel model. Time scaling was required for hot-bench testing. The wind-tunnel model, the mathematical models for the simulations, the techniques employed to reduce the hot-bench time-scale factors, and the verification procedures are described.

  15. Advanced Space Propulsion System Flowfield Modeling

    NASA Technical Reports Server (NTRS)

    Smith, Sheldon

    1998-01-01

    Solar thermal upper stage propulsion systems currently under development utilize small low chamber pressure/high area ratio nozzles. Consequently, the resulting flow in the nozzle is highly viscous, with the boundary layer flow comprising a significant fraction of the total nozzle flow area. Conventional uncoupled flow methods which treat the nozzle boundary layer and inviscid flowfield separately by combining the two calculations via the influence of the boundary layer displacement thickness on the inviscid flowfield are not accurate enough to adequately treat highly viscous nozzles. Navier Stokes models such as VNAP2 can treat these flowfields but cannot perform a vacuum plume expansion for applications where the exhaust plume produces induced environments on adjacent structures. This study is built upon recently developed artificial intelligence methods and user interface methodologies to couple the VNAP2 model for treating viscous nozzle flowfields with a vacuum plume flowfield model (RAMP2) that is currently a part of the Plume Environment Prediction (PEP) Model. This study integrated the VNAP2 code into the PEP model to produce an accurate, practical and user friendly tool for calculating highly viscous nozzle and exhaust plume flowfields.

  16. Modelling the Electricity Market: from Equilibrium Models to Simulation

    E-print Network

    Lavaei, Javad

    Modelling the Electricity Market: from Equilibrium Models to Simulation. Yoann Poirier. Abstract - This paper aims at providing an overview of the different models used in order to describe the Electricity Market: Cournot Equilibrium, Bertrand Equilibrium and Supply Function Equilibrium. I will make ...
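
    As a worked illustration of the first equilibrium concept the paper surveys, the snippet below computes the textbook symmetric Cournot outcome for two identical firms facing linear inverse demand P = a - b(q1 + q2) with constant marginal cost c. The numbers are arbitrary and are not drawn from the paper.

        # linear inverse demand P = a - b*(q1 + q2), identical constant marginal cost c
        a, b, c = 100.0, 1.0, 20.0

        # best response of firm i: q_i = (a - c - b*q_j) / (2b)
        # symmetric Cournot equilibrium: q* = (a - c) / (3b)
        q_star = (a - c) / (3.0 * b)
        price = a - b * 2.0 * q_star
        profit = (price - c) * q_star
        print(q_star, price, profit)  # ~26.67 per firm, price ~46.67, profit ~711.1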

  17. Phenomenological Modeling of Infrared Sources: Recent Advances

    NASA Technical Reports Server (NTRS)

    Leung, Chun Ming; Kwok, Sun (Editor)

    1993-01-01

    Infrared observations from planned space facilities (e.g., ISO (Infrared Space Observatory), SIRTF (Space Infrared Telescope Facility)) will yield a large and uniform sample of high-quality data from both photometric and spectroscopic measurements. To maximize the scientific returns of these space missions, complementary theoretical studies must be undertaken to interpret these observations. A crucial step in such studies is the construction of phenomenological models in which we parameterize the observed radiation characteristics in terms of the physical source properties. In the last decade, models with increasing degree of physical realism (in terms of grain properties, physical processes, and source geometry) have been constructed for infrared sources. Here we review current capabilities available in the phenomenological modeling of infrared sources and discuss briefly directions for future research in this area.

  18. Modeling Innovations Advance Wind Energy Industry

    NASA Technical Reports Server (NTRS)

    2009-01-01

    In 1981, Glenn Research Center scientist Dr. Larry Viterna developed a model that predicted certain elements of wind turbine performance with far greater accuracy than previous methods. The model was met with derision from others in the wind energy industry, but years later, Viterna discovered it had become the most widely used method of its kind, enabling significant wind energy technologies-like the fixed pitch turbines produced by manufacturers like Aerostar Inc. of Westport, Massachusetts-that are providing sustainable, climate friendly energy sources today.

  19. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) in simulation host computer concepts is presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  20. Smart Engines Via Advanced Model Based Controls

    SciTech Connect

    Allain, Marc

    2000-08-20

    A "new" process for developing control systems: less engine testing, a more robust control system, and a shorter development cycle time. A "smarter" approach to engine control: on-board models describe engine behavior, a shorter and systematic calibration process, and customer and legislative requirements designed in.

  1. Advances in Swine Biomedical Model Genomics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The swine has been a major biomedical model species, for transplantation, heart disease, allergies and asthma, as well as normal neonatal development and reproductive physiology. Swine have been used extensively for studies of infectious disease processes and analyses of preventative strategies, inc...

  2. New advances for modelling the debris avalanches

    NASA Astrophysics Data System (ADS)

    Cuomo, Sabatino; Cascini, Leonardo; Pastor, Manuel; Castorino, Giuseppe Claudio

    2013-04-01

    Flow-like landslides are a major global hazard and they occur worldwide, causing a large number of casualties, significant structural damage to property and infrastructure, as well as economic losses. When involving open slopes, these landslides often occur in triangular source areas where initial slides turn into avalanches through further failures and/or eventual soil entrainment. This paper deals with the numerical modelling of the propagation stage of debris avalanches, which provides information such as the propagation pattern of the mobilized material, its velocity, thickness and run-out distance. In the paper, a "depth integrated" model is used which allows: i) adequately taking into account the irregular topography of real slopes, which greatly affects the propagation stage, and ii) using a less time-consuming model than fully 3D approaches. The model used is named "GeoFlow_SPH" and was formerly applied to theoretical, experimental and real case histories (Pastor et al., 2009; Cascini et al., 2012). In this work the behavior of debris avalanches is analyzed with special emphasis on the apical angle, one of the main features of this type of landslide, in relation to soil rheology, hillslope geometry and features of the triggering area. Furthermore, the role of erosion has been investigated with reference to the uppermost parts of open slopes with different steepness. These analyses are firstly carried out for simplified benchmark slopes, using both water-like materials (with no shear strength) and debris-type materials. Then, three important case studies of the Campania region (Cervinara, Nocera Inferiore and Sarno) are analyzed, where debris avalanches involved pyroclastic soils originating from the eruptive products of the Vesuvius volcano. The results achieved for both benchmark slopes and real case histories outline the key role played by erosion in the whole propagation stage of debris avalanches. The results are particularly satisfactory since they indicate the "GeoFlow_SPH" model as a suitable tool for the analysis of these phenomena. References Pastor, M., Haddad, B., Sorbino, G., Cuomo, S., Drempetic V. (2009). A depth-integrated, coupled SPH model for flow-like landslides and related phenomena. International Journal for Numerical and Analytical Methods in Geomechanics, 33, 143-184. Cascini L., Cuomo S., Pastor M., Sorbino G., Piciullo L. (2012). Modeling of propagation and entrainment phenomena for landslides of the flow type: the May 1998 case study. Proc. of 11th Int. Symposium on Landslides: Landslides and Engineered Slopes, Banff, Canada, June 3-8, 2012, Ed. E. Eberhardt, C. Froese, K. Turner, S. Leroueil, ISBN 978-0-415-62423-6, 1723-1729.

  3. Modeling Galaxy CO Simulations as an Observer

    NASA Astrophysics Data System (ADS)

    Kamenetzky, Julia R.; Privon, George C.; Narayanan, Desika

    2016-01-01

    Our new ability to comprehensively model CO emission from J=1-0 to J=13-12 is one legacy of the SPIRE FTS onboard the Herschel Space Observatory. Much attention has been paid to this dataset by both theorists and observers, due to these lines' ability to probe the warm, highly excited molecular gas that may be associated with star formation. We are investigating the CO excitation ladders produced by the numerical simulations of disc galaxies and galaxy mergers (Narayanan and Krumholz 2014). Using one- and two-component non-LTE RADEX models, we compare the physical conditions derived from modeling the unresolved CO as SPIRE observers would with the known gas properties and distributions of the simulations. Our goal is to better understand the meaning of the derived physical conditions when modeling unresolved continuous gas distributions as discrete physical components over the CO ladder.

  4. Simulation model for Infrared Imaging systems

    NASA Astrophysics Data System (ADS)

    Toler, O. E.; Grey, D. S.

    1980-01-01

    A simulation model has been developed to analyze Infrared Imaging systems. This model synthesizes an optical point spread function and convolves it with target and detector configurations to produce a target signal. The model includes the effects on the signal of detector responsivity contours, varying target shapes, noise and various electronic filtering and signal processing methods. The model runs on CDC 6600/7600 computers and requires about 40 seconds processor time per run when exercising all options. Outputs are in the form of tables, printer plots and calcomp plots.

  5. Modeling of advanced ECLSS/ARS with ASPEN

    NASA Technical Reports Server (NTRS)

    Kolodney, M.; Lange, K. E.; Edeen, M. A.

    1991-01-01

    System-level ASPEN models were developed for the CO2 partial reduction subsystem and a bioregenerative life support subsystem (BRLSS). The individual component and subsystem models were integrated into three different system-level atmospheric revitalization subsystem (ARS) models: baseline physico-chemical, BRLSS, and partial reduction of Martian CO2. The Aspen models were based on FORTRAN interfaces necessary for integration with another program, G189A, to perform quasi-transient modeling. Detailed reactor models were prepared for the two CO2 reduction reactors (Bosch and Advanced Carbon Formation), and the low-temperature trace contaminant oxidation reactor.

  6. Realistic modeling of neurons and networks: towards brain simulation

    PubMed Central

    D’Angelo, Egidio; Solinas, Sergio; Garrido, Jesus; Casellato, Claudia; Pedrocchi, Alessandra; Mapelli, Jonathan; Gandolfi, Daniela; Prestori, Francesca

    Summary Realistic modeling is a new advanced methodology for investigating brain functions. Realistic modeling is based on a detailed biophysical description of neurons and synapses, which can be integrated into microcircuits. The latter can, in turn, be further integrated to form large-scale brain networks and eventually to reconstruct complex brain systems. Here we provide a review of the realistic simulation strategy and use the cerebellar network as an example. This network has been carefully investigated at molecular and cellular level and has been the object of intense theoretical investigation. The cerebellum is thought to lie at the core of the forward controller operations of the brain and to implement timing and sensory prediction functions. The cerebellum is well described and provides a challenging field in which one of the most advanced realistic microcircuit models has been generated. We illustrate how these models can be elaborated and embedded into robotic control systems to gain insight into how the cellular properties of cerebellar neurons emerge in integrated behaviors. Realistic network modeling opens up new perspectives for the investigation of brain pathologies and for the neurorobotic field. PMID:24139652

  7. Analysis of Intelligent Transportation Systems Using Model-Driven Simulations.

    PubMed

    Fernández-Isabel, Alberto; Fuentes-Fernández, Rubén

    2015-01-01

    Intelligent Transportation Systems (ITSs) integrate information, sensor, control, and communication technologies to provide transport related services. Their users range from everyday commuters to policy makers and urban planners. Given the complexity of these systems and their environment, their study in real settings is frequently unfeasible. Simulations help to address this problem, but present their own issues: there can be unintended mistakes in the transition from models to code; their platforms frequently bias modeling; and it is difficult to compare works that use different models and tools. In order to overcome these problems, this paper proposes a framework for a model-driven development of these simulations. It is based on a specific modeling language that supports the integrated specification of the multiple facets of an ITS: people, their vehicles, and the external environment; and a network of sensors and actuators conveniently arranged and distributed that operates over them. The framework works with a model editor to generate specifications compliant with that language, and a code generator to produce code from them using platform specifications. There are also guidelines to help researchers in the application of this infrastructure. A case study on advanced management of traffic lights with cameras illustrates its use. PMID:26083232

  8. Analysis of Intelligent Transportation Systems Using Model-Driven Simulations

    PubMed Central

    Fernández-Isabel, Alberto; Fuentes-Fernández, Rubén

    2015-01-01

    Intelligent Transportation Systems (ITSs) integrate information, sensor, control, and communication technologies to provide transport related services. Their users range from everyday commuters to policy makers and urban planners. Given the complexity of these systems and their environment, their study in real settings is frequently unfeasible. Simulations help to address this problem, but present their own issues: there can be unintended mistakes in the transition from models to code; their platforms frequently bias modeling; and it is difficult to compare works that use different models and tools. In order to overcome these problems, this paper proposes a framework for a model-driven development of these simulations. It is based on a specific modeling language that supports the integrated specification of the multiple facets of an ITS: people, their vehicles, and the external environment; and a network of sensors and actuators conveniently arranged and distributed that operates over them. The framework works with a model editor to generate specifications compliant with that language, and a code generator to produce code from them using platform specifications. There are also guidelines to help researchers in the application of this infrastructure. A case study on advanced management of traffic lights with cameras illustrates its use. PMID:26083232

  9. Modeling and Simulation of Count Data

    PubMed Central

    Plan, E L

    2014-01-01

    Count data, or number of events per time interval, are discrete data arising from repeated time to event observations. Their mean count, or piecewise constant event rate, can be evaluated by discrete probability distributions from the Poisson model family. Clinical trial data characterization often involves population count analysis. This tutorial presents the basics and diagnostics of count modeling and simulation in the context of pharmacometrics. Consideration is given to overdispersion, underdispersion, autocorrelation, and inhomogeneity. PMID:25116273
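
    The dispersion issues mentioned in the tutorial are easy to see in simulation: Poisson counts have variance equal to the mean, whereas a gamma-mixed Poisson (negative binomial) is overdispersed. The short Python example below contrasts the two; the rates and dispersion parameter are illustrative and unrelated to any clinical dataset.

        import numpy as np

        rng = np.random.default_rng(1)
        lam, k, n = 4.0, 2.0, 100_000  # mean rate, NB dispersion, sample size

        poisson_counts = rng.poisson(lam, size=n)
        # gamma-mixed Poisson = negative binomial, with variance lam + lam**2 / k
        nb_counts = rng.poisson(rng.gamma(shape=k, scale=lam / k, size=n))

        print("Poisson mean/var:", poisson_counts.mean(), poisson_counts.var())
        print("Neg-bin mean/var:", nb_counts.mean(), nb_counts.var())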

  10. Modelling and Simulating of Rain Derivatives

    E-print Network

    Modelling and Simulating of Rain Derivatives. Master thesis, Cathrin van Emmerich. [The remaining text is table-of-contents and figure-list residue, referring to the payoff of a European put and to rain records for Schleswig, 1947-2003, at 7-day and 28-day intervals.]

  11. Modeling Surgical Skill Learning with Cognitive Simulation

    E-print Network

    Ritter, Frank

    The ACT-R cognitive model simulates declarative memory processes during motor learning. In this ongoing study, four ... the medical profession in recent years. A benefit of VR training is to enhance surgical proficiency of novice ... and correction of errors. However, most VR trainers are only designed for a set of task difficulty levels without ...

  12. Love Kills: Simulations in Penna Ageing Model

    NASA Astrophysics Data System (ADS)

    Stauffer, Dietrich; Cebrat, Stanisław; Penna, T. J. P.; Sousa, A. O.

    The standard Penna ageing model with sexual reproduction is enlarged by adding additional bit-strings for love: Marriage happens only if the male love strings are sufficiently different from the female ones. We simulate at what level of required difference the population dies out.
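
    For readers unfamiliar with the underlying model, the sketch below shows the core mutation-accumulation mechanism of the standard asexual Penna bit-string model on which the sexual "love" extension is built: bit i of the genome is a deleterious mutation that becomes active at age i, an individual dies once T active mutations have accumulated (or through a Verhulst crowding factor), and reproduction after age R adds M fresh mutations to the offspring. Parameters are illustrative, and the love bit-strings themselves are not included.

        import random

        B, T, R, M, N_MAX = 32, 3, 8, 1, 2000  # genome bits, mutation threshold, min.
        random.seed(0)                          # reproduction age, mutations per birth, capacity

        def mutate(genome):
            g = genome
            for _ in range(M):                  # M fresh deleterious mutations
                g |= 1 << random.randrange(B)
            return g

        population = [{"age": 0, "genome": 0} for _ in range(500)]
        for year in range(200):
            survivors, offspring = [], []
            crowding = len(population) / N_MAX  # Verhulst factor
            for ind in population:
                ind["age"] += 1
                active = bin(ind["genome"] & ((1 << ind["age"]) - 1)).count("1")
                if active >= T or ind["age"] >= B or random.random() < crowding:
                    continue                    # death by mutations, old age, or crowding
                survivors.append(ind)
                if ind["age"] >= R:
                    offspring.append({"age": 0, "genome": mutate(ind["genome"])})
            population = survivors + offspring
        print("population after 200 steps:", len(population))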

  13. FAST KINETIC MODELS FOR SIMULATING AMPA, NMDA, GABAA AND GABAB RECEPTORS

    E-print Network

    Destexhe, Alain

    FAST KINETIC MODELS FOR SIMULATING AMPA, NMDA, GABAA AND GABAB RECEPTORS. Alain Destexhe, Zachary ... types, the fast GABAergic receptors ... GABAA and nicotinic acetylcholine (ACh) receptors. In these so-called ionotropic receptors, the ligand is a neurotransmitter and its binding to the complex leads ...
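
    The "fast kinetic models" referred to in this chapter reduce receptor gating to a small number of states. The Python sketch below integrates the simplest such scheme, a two-state (closed/open) receptor driven by a brief square pulse of transmitter, dr/dt = alpha*T*(1 - r) - beta*r, with synaptic current I = g_max*r*(V - E). The rate constants are of the magnitude commonly quoted for AMPA receptors, but the exact numbers here should be treated as illustrative.

        import numpy as np

        alpha, beta = 1.1, 0.19                 # binding/unbinding rates: 1/(mM*ms), 1/ms
        g_max, E_rev, V_hold = 1.0, 0.0, -70.0  # peak conductance (nS), reversal and holding (mV)
        dt, t_end = 0.01, 30.0                  # time step and duration (ms)

        t = np.arange(0.0, t_end, dt)
        T = np.where(t < 1.0, 1.0, 0.0)         # 1 mM transmitter pulse lasting 1 ms
        r = np.zeros_like(t)                    # fraction of receptors in the open state
        for i in range(1, len(t)):              # forward Euler integration
            drdt = alpha * T[i - 1] * (1.0 - r[i - 1]) - beta * r[i - 1]
            r[i] = r[i - 1] + dt * drdt
        I = g_max * r * (V_hold - E_rev)        # synaptic current (pA, since nS * mV)
        print("peak open fraction:", round(float(r.max()), 3),
              " peak current (pA):", round(float(I.min()), 1))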

  14. RECENT ADVANCES IN THE MODELING OF AIRBORNE SUBSTANCES

    EPA Science Inventory

    Since the 1950's, the primary mission of the Atmospheric Modeling Division has been to develop and evaluate air quality simulation models. While the Division has traditionally focused the research on the meteorological aspects of these models, this focus has expanded in recent...

  15. Génie mécanique: Thermal Modelling and Correlation of a Comet Simulation Facility

    E-print Network

    Psaltis, Demetri

    SECTION DE Génie mécanique. Thermal Modelling and Correlation of a Comet Simulation Facility. Author: Thomas Beck; André Bieler. Why simulate comets? The comet simulation facility SCITEAS (Simulation Chamber ...) ... of future investigations on comet CG by the ESA Rosetta mission. A thermal model of the simulation chamber ...

  16. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed in order to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  17. Cogeneration computer model assessment: advanced cogeneration research study

    SciTech Connect

    Rosenberg, L.

    1983-06-01

    Cogeneration computer simulation models were assessed in order to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  18. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Astrophysics Data System (ADS)

    Rosenberg, L.

    1983-06-01

    Cogeneration computer simulation models were assessed in order to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  19. Twitter's tweet method modelling and simulation

    NASA Astrophysics Data System (ADS)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    This paper seeks to propose the concept of Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper leverages the system dynamics paradigm to model Twitter marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The following models have been developed for a Twitter marketing agent/company and tested in real circumstances and with real numbers. These models were finalized through a number of revisions and iterations of the design, develop, simulate, test and evaluate cycle. The paper also addresses the methods best suited to organized, targeted promotion on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are confirmed by the management of the company. The work applies system dynamics concepts to the modelling of Twitter marketing methods and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.

  20. Adaptive System Modeling for Spacecraft Simulation

    NASA Technical Reports Server (NTRS)

    Thomas, Justin

    2011-01-01

    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how someone can create accurate system models from sensor (telemetry) data. The purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models. The software also provides the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: Creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior. Automatically updates/calibrates system models using the latest streaming sensor data. Creates device specific models that capture the exact behavior of devices of the same type. Adapts to evolving systems. Can reduce computational complexity (faster simulations).
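
    One generic way to realize the "continuously update them from the latest streaming sensor data" idea is recursive least squares on a simple linear input-output model, updated one telemetry sample at a time. The Python sketch below shows that approach; it is an assumption-laden illustration of adaptive modeling in general, not the data-stream-mining algorithms actually used for the ISS EPS work.

        import numpy as np

        class RecursiveLeastSquares:
            def __init__(self, n_features, forgetting=0.99):
                self.w = np.zeros(n_features)           # model coefficients
                self.P = np.eye(n_features) * 1e3       # inverse covariance estimate
                self.lam = forgetting                   # <1 discounts old data, tracks drift

            def update(self, x, y):
                x = np.asarray(x, dtype=float)
                Px = self.P @ x
                gain = Px / (self.lam + x @ Px)
                self.w += gain * (y - self.w @ x)
                self.P = (self.P - np.outer(gain, Px)) / self.lam

            def predict(self, x):
                return self.w @ np.asarray(x, dtype=float)

        # toy telemetry stream: "power draw" depends linearly on two sensor channels
        rng = np.random.default_rng(0)
        model = RecursiveLeastSquares(n_features=2)
        for _ in range(2000):
            x = rng.uniform(0.0, 1.0, size=2)
            y = 3.0 * x[0] - 1.5 * x[1] + rng.normal(scale=0.05)
            model.update(x, y)
        print(model.w)  # should approach [3.0, -1.5]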

  1. Computational Spectrum of Agent Model Simulation

    SciTech Connect

    Perumalla, Kalyan S

    2010-01-01

    The study of human social behavioral systems is finding renewed interest in military, homeland security and other applications. Simulation is the most generally applied approach to studying complex scenarios in such systems. Here, we outline some of the important considerations that underlie the computational aspects of simulation-based study of human social systems. The fundamental imprecision underlying questions and answers in social science makes it necessary to carefully distinguish among different simulation problem classes and to identify the most pertinent set of computational dimensions associated with those classes. We identify a few such classes and present their computational implications. The focus is then shifted to the most challenging combinations in the computational spectrum, namely, large-scale entity counts at moderate to high levels of fidelity. Recent developments in furthering the state-of-the-art in these challenging cases are outlined. A case study of large-scale agent simulation is provided in simulating large numbers (millions) of social entities at real-time speeds on inexpensive hardware. Recent computational results are identified that highlight the potential of modern high-end computing platforms to push the envelope with respect to speed, scale and fidelity of social system simulations. Finally, the problem of shielding the modeler or domain expert from the complex computational aspects is discussed and a few potential solution approaches are identified.

  2. Simulation Modeling and Performance Evaluation of Space Networks

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Segui, John

    2006-01-01

    In space exploration missions, the coordinated use of spacecraft as communication relays increases the efficiency of the endeavors. To conduct trade-off studies of the performance and resource usage of different communication protocols and network designs, JPL designed a comprehensive extendable tool, the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE). The design and development of MACHETE began in 2000 and is constantly evolving. Currently, MACHETE contains Consultative Committee for Space Data Systems (CCSDS) protocol standards such as Proximity-1, Advanced Orbiting Systems (AOS), Packet Telemetry/Telecommand, Space Communications Protocol Specification (SCPS), and the CCSDS File Delivery Protocol (CFDP). MACHETE uses the Aerospace Corporation's Satellite Orbital Analysis Program (SOAP) to generate the orbital geometry information and contact opportunities. Matlab scripts provide the link characteristics. At the core of MACHETE is a discrete event simulator, QualNet. Delay Tolerant Networking (DTN) is an end-to-end architecture providing communication in and/or through highly stressed networking environments. Stressed networking environments include those with intermittent connectivity, large and/or variable delays, and high bit error rates. To provide its services, the DTN protocols reside at the application layer of the constituent internets, forming a store-and-forward overlay network. The key capabilities of the bundling protocols include custody-based reliability, ability to cope with intermittent connectivity, ability to take advantage of scheduled and opportunistic connectivity, and late binding of names to addresses. In this presentation, we report on the addition of MACHETE models needed to support DTN, namely, the Bundle Protocol (BP) model. To illustrate the use of MACHETE with the additional DTN model, we provide an example simulation to benchmark its performance. We demonstrate the use of the DTN protocol and discuss statistics gathered concerning the total time needed to simulate numerous bundle transmissions.

  3. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    SciTech Connect

    Cetiner, Mustafa Sacit; Flanagan, George F.; Poore III, Willis P.; Muhlheim, Michael David

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  4. Robust three-body water simulation model.

    PubMed

    Tainter, C J; Pieniazek, P A; Lin, Y-S; Skinner, J L

    2011-05-14

    The most common potentials used in classical simulations of liquid water assume a pairwise additive form. Although these models have been very successful in reproducing many properties of liquid water at ambient conditions, none is able to describe accurately water throughout its complicated phase diagram. The primary reason for this is the neglect of many-body interactions. To this end, a simulation model with explicit three-body interactions was introduced recently [R. Kumar and J. L. Skinner, J. Phys. Chem. B 112, 8311 (2008)]. This model was parameterized to fit the experimental O-O radial distribution function and diffusion constant. Herein we reparameterize the model, fitting to a wider range of experimental properties (diffusion constant, rotational correlation time, density for the liquid, liquid/vapor surface tension, melting point, and the ice Ih density). The robustness of the model is then verified by comparing simulation to experiment for a number of other quantities (enthalpy of vaporization, dielectric constant, Debye relaxation time, temperature of maximum density, and the temperature-dependent second and third virial coefficients), with good agreement. PMID:21568515

  5. Flight Simulation Model Exchange. Volume 1

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce

    2011-01-01

    The NASA Engineering and Safety Center Review Board sponsored an assessment of the draft Standard, Flight Dynamics Model Exchange Standard, BSR/ANSI-S-119-201x (S-119) that was conducted by simulation and guidance, navigation, and control engineers from several NASA Centers. The assessment team reviewed the conventions and formats spelled out in the draft Standard and the actual implementation of two example aerodynamic models (a subsonic F-16 and the HL-20 lifting body) encoded in the Extensible Markup Language grammar. During the implementation, the team kept records of lessons learned and provided feedback to the American Institute of Aeronautics and Astronautics Modeling and Simulation Technical Committee representative. This document contains the results of the assessment.
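
    The central idea of the draft Standard is that an aerodynamic model (variable definitions plus function tables) is exchanged as XML data rather than as source code. The fragment below is a purely illustrative Python sketch of reading such a table and interpolating it; the element and attribute names are hypothetical and are not taken from the S-119 grammar.

        # Illustrative only: parse a small XML-encoded lift-coefficient table and interpolate it.
        # Element and attribute names are hypothetical, not the S-119 grammar.
        import bisect
        import xml.etree.ElementTree as ET

        doc = """<aeroModel>
          <table name="CL" breakpoints="0 4 8 12" values="0.1 0.45 0.8 1.1"/>
        </aeroModel>"""

        def load_table(xml_text, name):
            root = ET.fromstring(xml_text)
            node = root.find(f"table[@name='{name}']")
            bp = [float(x) for x in node.get("breakpoints").split()]
            vals = [float(x) for x in node.get("values").split()]
            return bp, vals

        def interp(bp, vals, x):
            # Clamped piecewise-linear interpolation over the breakpoints.
            if x <= bp[0]:
                return vals[0]
            if x >= bp[-1]:
                return vals[-1]
            i = bisect.bisect_right(bp, x) - 1
            t = (x - bp[i]) / (bp[i + 1] - bp[i])
            return vals[i] + t * (vals[i + 1] - vals[i])

        bp, vals = load_table(doc, "CL")
        print(interp(bp, vals, 6.0))   # lift coefficient at 6 deg angle of attack -> 0.625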

  6. Flight Simulation Model Exchange. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce

    2011-01-01

    The NASA Engineering and Safety Center Review Board sponsored an assessment of the draft Standard, Flight Dynamics Model Exchange Standard, BSR/ANSI-S-119-201x (S-119) that was conducted by simulation and guidance, navigation, and control engineers from several NASA Centers. The assessment team reviewed the conventions and formats spelled out in the draft Standard and the actual implementation of two example aerodynamic models (a subsonic F-16 and the HL-20 lifting body) encoded in the Extensible Markup Language grammar. During the implementation, the team kept records of lessons learned and provided feedback to the American Institute of Aeronautics and Astronautics Modeling and Simulation Technical Committee representative. This document contains the appendices to the main report.

  7. Atmospheric Modeling And Sensor Simulation (AMASS) study

    NASA Technical Reports Server (NTRS)

    Parker, K. G.

    1984-01-01

    The capabilities of the atmospheric modeling and sensor simulation (AMASS) system were studied with a view to enhancing them. This system is used to process atmospheric measurements that are utilized in evaluating sensor performance, conducting design-concept simulation studies, and modeling the physical and dynamical nature of atmospheric processes. The study enumerates tasks proposed both to enhance utilization of the AMASS system and to integrate it with other existing equipment, in order to facilitate the analysis of data for modeling and image processing. The following array processors were evaluated for anticipated effectiveness and/or improvements in throughput by attachment of the device to the P-e: (1) Floating Point Systems AP-120B; (2) Floating Point Systems 5000; (3) CSP, Inc. MAP-400; (4) Analogic AP500; (5) Numerix MARS-432; and (6) Star Technologies, Inc. ST-100.

  8. SIMULATION, MODELING AND ANALYSIS OF A WATER TO AIR

    E-print Network

    SIMULATION, MODELING AND ANALYSIS OF A WATER TO AIR HEAT PUMP. Thesis by Arun Shenoy, December 2004. Only front-matter fragments are available for this record (title page and table of contents, including a section on the BLAST model).

  9. Advanced deposition model for thermal activated chemical vapor deposition

    NASA Astrophysics Data System (ADS)

    Cai, Dang

    Thermal Activated Chemical Vapor Deposition (TACVD) is defined as the formation of a stable solid product on a heated substrate surface from chemical reactions and/or dissociation of gaseous reactants in an activated environment. It has become an essential process for producing solid films, bulk materials, coatings, fibers, powders and monolithic components. The global market for CVD products has reached several billion dollars per year. In recent years the CVD process has been used extensively to manufacture semiconductors and other electronic components such as polysilicon, AlN and GaN. Extensive research effort has been directed at improving deposition quality and throughput. To obtain fast, high-quality deposition, operating conditions such as temperature, pressure, fluid velocity and species concentration, and geometric conditions such as the source-substrate distance, need to be well controlled in a CVD system. This thesis focuses on the design of CVD processes through an understanding of the transport and reaction phenomena in the growth reactor. Since in situ monitoring is nearly impossible for a CVD reactor, many industrial resources have been expended to determine the optimum design by semi-empirical methods and trial-and-error procedures. This approach has allowed improvements in the deposition sequence, but it is beginning to show its limitations, as it cannot always meet the increasingly stringent specifications of the industry. To resolve this problem, numerical simulation is widely used in studying growth techniques. The difficulty of numerically simulating the TACVD crystal growth process lies in the simulation of gas phase and surface reactions, especially the latter, because very limited kinetic information is available in the open literature. In this thesis, an advanced deposition model was developed to study the multi-component fluid flow, homogeneous gas phase reactions inside the reactor chamber, heterogeneous surface reactions on the substrate surface, conductive, convective, inductive and radiative heat transfer, species transport, and thermo-elastic stress distributions. Gas phase and surface reactions are studied thermodynamically and kinetically. Based on experimental results, detailed reaction mechanisms are proposed and the deposition rates are predicted. The proposed deposition model could be used for other experiments with similar operating conditions. Four different growth systems are presented in this thesis to discuss comprehensive transport phenomena in crystal growth from vapor. The first is polysilicon bulk growth by a modified Siemens technique in which a silicon tube is used as the starting material. The research effort has been focused on system design, optimization of geometric and operating parameters, and analysis of heterogeneous and homogeneous silane pyrolysis. The second is GaN thin film growth by the iodine vapor phase epitaxy technique. Heat and mass transport is studied analytically and numerically, gas phase and surface reactions are analyzed thermodynamically and kinetically, and quasi-equilibrium and kinetic deposition models are developed to predict the growth rate. The third is AlN thin film growth by the halide vapor phase epitaxy technique. The effects of gas phase and surface reactions on the crystal growth rate and deposition uniformity are studied. The last is the AlN sublimation growth system.
The research effort has been focused on the effect of thermal environment evolution on the crystal growth process. The thermoelastic stress formed in the as-grown AlN crystal is also calculated.
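
    Kinetic deposition models of the kind described above often reduce, at the simplest level, to a surface reaction rate with Arrhenius temperature dependence multiplied by the local reactant availability. The sketch below shows only that schematic relationship; the rate constants, activation energy and other parameters are placeholders, not fitted values from this thesis.

        # Schematic surface-kinetics deposition-rate estimate (placeholder parameters only).
        import math

        R = 8.314          # J/(mol*K), universal gas constant

        def arrhenius(k0, Ea, T):
            """First-order surface rate constant k = k0 * exp(-Ea / (R*T))."""
            return k0 * math.exp(-Ea / (R * T))

        def growth_rate(k0, Ea, T, c_surface, molar_volume):
            """Linear growth rate (m/s) for a first-order surface reaction:
            flux = k * c_surface; thickness rate = flux * molar volume of the solid."""
            return arrhenius(k0, Ea, T) * c_surface * molar_volume

        # Example: sweep substrate temperature to see the exponential sensitivity.
        for T in (900.0, 1000.0, 1100.0):             # K
            r = growth_rate(k0=1.0e4, Ea=1.5e5, T=T,  # made-up kinetic parameters
                            c_surface=0.5,            # mol/m^3 of reactant at the surface
                            molar_volume=1.2e-5)      # m^3/mol of the deposited solid
            print(T, r * 3.6e9, "um/h")               # convert m/s to micrometers per hour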

  10. Electron precipitation models in global magnetosphere simulations

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Lotko, W.; Brambles, O.; Wiltberger, M.; Lyon, J.

    2015-02-01

    General methods for improving the specification of electron precipitation in global simulations are described and implemented in the Lyon-Fedder-Mobarry (LFM) global simulation model, and the quality of its predictions for precipitation is assessed. LFM's existing diffuse and monoenergetic electron precipitation models are improved, and new models are developed for lower energy, broadband, and direct-entry cusp precipitation. The LFM simulation results for combined diffuse plus monoenergetic electron precipitation exhibit a quadratic increase in the hemispheric precipitation power as the intensity of solar wind driving increases, in contrast with the prediction from the OVATION Prime (OP) 2010 empirical precipitation model, which increases linearly with driving intensity. Broadband precipitation power increases approximately linearly with driving intensity in both models. Comparisons of LFM and OP predictions with estimates of precipitating power derived from inversions of Polar satellite UVI images during a double substorm event (28-29 March 1998) show that the LFM peak precipitating power is >4× larger when using the improved precipitation model and most closely tracks the larger of three different inversion estimates. The OP prediction most closely tracks the double peaks in the intermediate inversion estimate, but it overestimates the precipitating power between the two substorms by a factor >2 relative to all other estimates. LFM's polar pattern of precipitating energy flux tracks that of OP for broadband precipitation, exhibits good correlation with duskside region 1 currents for monoenergetic energy flux that OP misses, and fails to produce sufficient diffuse precipitation power in the prenoon quadrant that is present in OP. The prenoon deficiency is most likely due to the absence of drift kinetic physics in the LFM simulation.

  11. VHub - Cyberinfrastructure for volcano eruption and hazards modeling and simulation

    NASA Astrophysics Data System (ADS)

    Valentine, G. A.; Jones, M. D.; Bursik, M. I.; Calder, E. S.; Gallo, S. M.; Connor, C.; Carn, S. A.; Rose, W. I.; Moore-Russo, D. A.; Renschler, C. S.; Pitman, B.; Sheridan, M. F.

    2009-12-01

    Volcanic risk is increasing as populations grow in active volcanic regions, and as national economies become increasingly intertwined. In addition to their significance to risk, volcanic eruption processes form a class of multiphase fluid dynamics with rich physics on many length and time scales. Risk significance, physics complexity, and the coupling of models to complex dynamic spatial datasets all demand the development of advanced computational techniques and interdisciplinary approaches to understand and forecast eruption dynamics. Innovative cyberinfrastructure is needed to enable global collaboration and novel scientific creativity, while simultaneously enabling computational thinking in real-world risk mitigation decisions - an environment where quality control, documentation, and traceability are key factors. Supported by NSF, we are developing a virtual organization, referred to as VHub, to address this need. Overarching goals of the VHub project are: Dissemination. Make advanced modeling and simulation capabilities and key data sets readily available to researchers, students, and practitioners around the world. Collaboration. Provide a mechanism for participants not only to be users but also co-developers of modeling capabilities, and contributors of experimental and observational data sets for use in modeling and simulation, in a collaborative environment that reaches far beyond local work groups. Comparison. Facilitate comparison between different models in order to provide the practitioners with guidance for choosing the "right" model, depending upon the intended use, and provide a platform for multi-model analysis of specific problems and incorporation into probabilistic assessments. Application. Greatly accelerate access and application of a wide range of modeling tools and related data sets to agencies around the world that are charged with hazard planning, mitigation, and response. Education. Provide resources that will promote the training of the next generation of volcanologists and hazards specialists such that modeling and simulation form part of a tripartite foundation of approaches, alongside observational data and experimentation. Adaptation. Conduct ongoing, rigorous self-assessment to study the impact of the virtual organization and promote continual adaptation to optimize its impact, as well as to understand emergent collective learning and collaborative patterns. VHub development is just beginning and we are very interested in input from the community and the addition of new partners to the effort. Current partners include A. Costa, A. Neri, W. Marzocchi, R.S.J. Sparks, S.J. Cronin, S. Takarada, Joan Marti, J.-C. Komorowski, T.H. Druitt, T. Koyaguchi, J.L. Macias, and S. Dartevelle.

  12. QUEST FOR AN ADVANCED REGIONAL AIR QUALITY MODEL

    EPA Science Inventory

    Organizations interested in advancing the science and technology of regional air quality modeling on the "grand challenge" scale have joined to form CAMRAQ. They plan to leverage their research funds by collaborating on the development and evaluation of CMSs so ambitious in scope ...

  13. Advanced MHD models of anisotropy, flow and chaotic fields

    E-print Network

    Hudson, Stuart

    Advanced MHD models of anisotropy, flow and chaotic fields, by M. J. Hole, M. Fitzgerald and G. Dennis. Only presentation fragments are available for this record; the outline covers anisotropic pressure (different parallel and perpendicular to the field, due mainly to a directed neutral beam), flow, pinches, highlights of recent progress, future directions, and conclusions.

  14. Advances in Games Technology: Software, Models, and Intelligence

    ERIC Educational Resources Information Center

    Prakash, Edmond; Brindle, Geoff; Jones, Kevin; Zhou, Suiping; Chaudhari, Narendra S.; Wong, Kok-Wai

    2009-01-01

    Games technology has undergone tremendous development. In this article, the authors report the rapid advancement that has been observed in the way games software is being developed, as well as in the development of games content using game engines. One area that has gained special attention is modeling the game environment such as terrain and…

  15. ADVANCED WUFI COMPUTER MODELING WORKSHOP FOR WALL DESIGN AND PERFORMANCE

    E-print Network

    Oak Ridge National Laboratory

    ADVANCED WUFI COMPUTER MODELING WORKSHOP FOR WALL DESIGN AND PERFORMANCE (HEAT AND MOISTURE). Only agenda fragments are available for this record; topics include thermal comfort evaluation, WUFI-Plus project work, net-zero energy houses in Dubai (concept design), differences between 1D and 2D calculations (anisotropy, boundary conditions, etc.), and heat and moisture bridges.

  16. Deep Drawing Simulations With Different Polycrystalline Models

    NASA Astrophysics Data System (ADS)

    Duchêne, Laurent; de Montleau, Pierre; Bouvier, Salima; Habraken, Anne Marie

    2004-06-01

    The goal of this research is to study the anisotropic material behavior during forming processes, represented by both complex yield loci and kinematic-isotropic hardening models. The first part of this paper describes the main concepts of the 'Stress-strain interpolation' model that has been implemented in the non-linear finite element code Lagamine. This model consists of a local description of the yield locus based on the texture of the material through the full-constraints Taylor model. The texture evolution due to plastic deformation is computed throughout the FEM simulations. This 'local yield locus' approach was initially linked to the classical isotropic Swift hardening law. Recently, a more complex hardening model was implemented: the physically based microstructural model of Teodosiu. It takes into account intergranular heterogeneity due to the evolution of dislocation structures, which affects both isotropic and kinematic hardening. The influence of the hardening model is compared to the influence of the texture evolution by means of deep drawing simulations.
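
    For reference, the classical isotropic Swift hardening law mentioned above relates the flow stress to the accumulated plastic strain through a power law, sigma_y = K * (eps0 + eps_p)^n. A short sketch follows; the material constants are placeholders for illustration, not values from the paper.

        # Swift isotropic hardening law: flow stress as a power law of plastic strain.
        # Material constants below are placeholders, not values from the paper.
        def swift_flow_stress(eps_p, K=520.0, eps0=0.002, n=0.23):
            """sigma_y = K * (eps0 + eps_p)**n, with K in MPa."""
            return K * (eps0 + eps_p) ** n

        for eps_p in (0.0, 0.05, 0.10, 0.20):
            print(f"plastic strain {eps_p:.2f} -> flow stress {swift_flow_stress(eps_p):.1f} MPa")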

  17. Facebook's personal page modelling and simulation

    NASA Astrophysics Data System (ADS)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we try to define the utility of Facebook's Personal Page marketing method. This tool, which Facebook provides, is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them, and it uses the design science research methodology as a proof of concept for the models and modelling processes. The model has been developed for a Facebook-oriented social media marketing agent/company and tested in real circumstances, and it was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are confirmed by the management of the company. Facebook's Personal Page method can be adjusted, depending on the situation, to maximize the company's total profit, namely to bring in new customers, keep the interest of existing customers and deliver traffic to its website.
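
    The system dynamics (stock-and-flow) structure behind an iThink model of this kind can be prototyped in a few lines: a stock of customers fed by an acquisition inflow from the page, drained by churn, and driving site traffic. The sketch below is a generic illustration of that structure with invented rates; it is not the company's actual model.

        # Generic stock-and-flow sketch of a customer base driven by a social-media page.
        # Rates are invented for illustration; not the model from the paper.
        def simulate(months=12, customers=500.0, reach=20000.0,
                     conversion=0.01, churn=0.05, visits_per_customer=3.0):
            history = []
            for m in range(months):
                acquired = reach * conversion      # inflow: page followers converting this month
                lost = customers * churn           # outflow: existing customers churning
                customers += acquired - lost
                traffic = customers * visits_per_customer
                history.append((m + 1, round(customers), round(traffic)))
            return history

        for month, customers, traffic in simulate():
            print(month, customers, traffic)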

  18. Towards Better Coupling of Hydrological Simulation Models

    NASA Astrophysics Data System (ADS)

    Penton, D.; Stenson, M.; Leighton, B.; Bridgart, R.

    2012-12-01

    Standards for model interoperability and scientific workflow software provide techniques and tools for coupling hydrological simulation models. However, model builders are yet to realize the benefits of these and continue to write ad hoc implementations and scripts. Three case studies demonstrate different approaches to coupling models, the first using tight interfaces (OpenMI), the second using a scientific workflow system (Trident) and the third using a tailored execution engine (Delft Flood Early Warning System - Delft-FEWS). No approach was objectively better than any other approach. The foremost standard for coupling hydrological models is the Open Modeling Interface (OpenMI), which defines interfaces for models to interact. An implementation of the OpenMI standard involves defining interchange terms and writing a .NET/Java wrapper around the model. An execution wrapper such as OatC.GUI or Pipistrelle executes the models. The team built two OpenMI implementations for eWater Source river system models. Once built, it was easy to swap river system models. The team encountered technical challenges with versions of the .Net framework (3.5 calling 4.0) and with the performance of the execution wrappers when running daily simulations. By design, the OpenMI interfaces are general, leaving significant decisions around the semantics of the interfaces to the implementer. Increasingly, scientific workflow tools such as Kepler, Taverna and Trident are able to replace custom scripts. These tools aim to improve the provenance and reproducibility of processing tasks. In particular, Taverna and the myExperiment website have had success making many bioinformatics workflows reusable and sharable. The team constructed Trident activities for hydrological software including IQQM, REALM and eWater Source. They built an activity generator for model builders to build activities for particular river systems. The models were linked at a simulation level, without any daily time-step feedbacks. There was no obvious way to add daily time-step feedbacks without incurring a considerable performance penalty. The Delft-FEWS system connects hydrological models for flood warnings and forecasts in a workflow system. It provides a range of custom facilities for connecting real-time data services. A Delft-FEWS system was constructed to connect a series of eWater Source hydrological models using the batch forecast mode to orchestrate a time-stepping system. The system coupled a series of river models running daily through a service interface. The implementation did not easily support interoperability with other models; however, using command line calls and the file-system did allow a level of language independence. The case-studies covered the coupling of hydrological models through tight interfaces (OpenMI), broad scientific workflow software (Trident) and a tailored execution engine (Delft-FEWS). We found that no approach was objectively better than any other approach. OpenMI had the most flexible interfaces, Trident the best handling of provenance and Delft-FEWS provided a significant set of tools for ingesting and transforming data. The case studies revealed a need for stable execution wrappers, patterns for efficient cross-language interoperability, targeted semantics for hydrological simulation and better handling of daily simulation.
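
    At its core, the "tight interface" style of coupling discussed above means that each model exposes initialize/update/finish calls and named exchange items, and an execution wrapper drives the components time step by time step. The sketch below shows that pattern in simplified Python; the class, method and item names are simplified placeholders and are not the normative OpenMI interface.

        # Simplified linkable-component pattern (placeholder names, not the OpenMI API).
        class RainfallRunoff:
            def initialize(self):
                self.flow = 0.0
            def update(self, rainfall_mm):
                self.flow = 0.6 * rainfall_mm            # toy runoff coefficient
            def get_value(self, item):
                return {"outflow": self.flow}[item]
            def finish(self):
                pass

        class RiverRouting:
            def initialize(self):
                self.storage = 0.0
            def update(self, inflow):
                self.storage = 0.8 * self.storage + inflow   # toy linear-reservoir routing
            def get_value(self, item):
                return {"storage": self.storage}[item]
            def finish(self):
                pass

        # Execution wrapper: drive both components daily, passing runoff outflow to the river model.
        runoff, river = RainfallRunoff(), RiverRouting()
        runoff.initialize()
        river.initialize()
        for day, rain in enumerate([0.0, 12.0, 5.0, 0.0, 20.0], start=1):
            runoff.update(rain)
            river.update(runoff.get_value("outflow"))
            print(day, river.get_value("storage"))
        runoff.finish()
        river.finish()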

  19. Virtual Driving and Eco-Simulation VR City Modeling, Drive Simulation, and Ecological Habits

    E-print Network

    Virtual Driving and Eco-Simulation: VR City Modeling, Drive Simulation, and Ecological Habits. Only an abstract fragment is available for this record: the paper introduces a VR city model developed for research in driving simulation, and then introduces the modeling process of creating road and intersection networks.

  20. eShopper modeling and simulation

    NASA Astrophysics Data System (ADS)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce provides an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as Blue Martini's server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross-selling are arriving. The key problem is what kind of information we need to achieve these goals, or in other words, how we model the customer. The paper is devoted to customer modeling and simulation, with a focus on modeling an individual customer. The model is based on the customer's transaction data, clickstream data, and demographics. It includes a hierarchical profile of the customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, and price sensitivity. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for on-line direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating the customer's features.
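
    One of the simplest versions of the "stochastic models for time intervals between purchases" mentioned above treats inter-purchase times as exponentially distributed and forecasts the next visit from the customer's mean interval. The sketch below is such a baseline, purely for illustration; it is not the paper's model, and the visit history is hypothetical.

        # Baseline next-visit forecast from observed inter-purchase intervals.
        # Purely illustrative; not the customer model described in the paper.
        import math
        import statistics

        def next_visit_forecast(visit_days):
            """Given past visit times (in days), return (expected next visit, 90th-percentile bound)
            under an exponential inter-arrival assumption with the empirical mean gap."""
            gaps = [b - a for a, b in zip(visit_days, visit_days[1:])]
            mean_gap = statistics.mean(gaps)
            # For an exponential distribution, the p-quantile of the gap is -mean * ln(1 - p).
            p90_gap = -mean_gap * math.log(1.0 - 0.90)
            last = visit_days[-1]
            return last + mean_gap, last + p90_gap

        visits = [0, 9, 16, 30, 38]          # days of past store visits (hypothetical)
        expected, upper = next_visit_forecast(visits)
        print(f"expected next visit ~ day {expected:.1f}, 90% chance by day {upper:.1f}")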