Sample records for process simulation model

  1. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with the application of modeling and simulation tools to the optimization of business processes, in particular the optimization of signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of visual models of production and distribution processes.

  2. Process Modeling and Dynamic Simulation for EAST Helium Refrigerator

    NASA Astrophysics Data System (ADS)

    Lu, Xiaofei; Fu, Peng; Zhuang, Ming; Qiu, Lilong; Hu, Liangbing

    2016-06-01

    In this paper, the process modeling and dynamic simulation of the EAST helium refrigerator have been completed. The cryogenic process model is described and the main components are customized in detail. The process model is controlled by a PLC simulator, and real-time communication between the process model and the controllers is achieved by a customized interface. Validation of the process model has been confirmed against EAST experimental data for the cooldown process from 300 K to 80 K. Simulation results indicate that this process simulator reproduces the dynamic behavior of the EAST helium refrigerator very well for long-pulse plasma discharge operation. The cryogenic process simulator based on the control architecture is available for operation optimization and control design of EAST cryogenic systems to cope with long-pulse heat loads in the future. This work was supported by the National Natural Science Foundation of China (No. 51306195) and the Key Laboratory of Cryogenics, Technical Institute of Physics and Chemistry, CAS (No. CRYO201408).

  3. Launch Site Computer Simulation and its Application to Processes

    NASA Technical Reports Server (NTRS)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  4. LISP based simulation generators for modeling complex space processes

    NASA Technical Reports Server (NTRS)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  5. Towards Automatic Processing of Virtual City Models for Simulations

    NASA Astrophysics Data System (ADS)

    Piepereit, R.; Schilling, A.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2016-10-01

    Especially in the field of numerical simulations, such as flow and acoustic simulations, interest in using virtual 3D models to optimize urban systems is increasing. The few instances in which such simulations have already been carried out in practice required an extremely high, and therefore uneconomical, manual effort for the processing of models. The different ways in which models are captured in Geographic Information Systems (GIS) and Computer-Aided Engineering (CAE) further increase the already very high complexity of the processing. To obtain virtual 3D models suitable for simulation, we developed a tool for automatic processing with the goal of establishing ties between the worlds of GIS and CAE. In this paper we introduce a way to use Coons surfaces for the automatic processing of building models in LoD2, and investigate ways to simplify LoD3 models in order to remove information that is unnecessary for a numerical simulation.
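
    For readers unfamiliar with Coons surfaces, the standard bilinearly blended Coons patch is sketched below as background; this is the textbook construction over four boundary curves and is not necessarily the exact formulation implemented by the authors.

```latex
% Bilinearly blended Coons patch from boundary curves c_0(u), c_1(u) (bottom/top)
% and d_0(v), d_1(v) (left/right), with u, v in [0,1] and shared corners P_{ij}.
S(u,v) = (1-v)\,c_0(u) + v\,c_1(u) + (1-u)\,d_0(v) + u\,d_1(v)
       - \bigl[(1-u)(1-v)\,P_{00} + u(1-v)\,P_{10} + (1-u)v\,P_{01} + uv\,P_{11}\bigr]
```

    The blend reproduces the four boundary curves exactly while filling in the interior, which is the property that makes Coons patches convenient for the kind of automatic surface reconstruction described above.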

  6. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource-Constrained Stochastic Project Scheduling Problem (RC-SPSP), exhibits a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation modeling is a useful tool for estimating the makespan of the overall process. However, simulation requires a model to be developed, which is itself a labor- and time-consuming effort. With such a dynamic process, the model would often be out of synchronization with the actual process, limiting the applicability of the simulation answers to the actual estimation problem. Integration of TEAMS model-enabling software with our existing schedule program software is the basis of our solution. This paper explains the approach used to develop an auto-generated simulation model from planning and scheduling efforts and available data.

  7. The numerical modelling and process simulation for the fault diagnosis of rotary kiln incinerator.

    PubMed

    Roh, S D; Kim, S W; Cho, W S

    2001-10-01

    Numerical modelling and process simulation for the fault diagnosis of a rotary kiln incinerator were accomplished. Two models are applied to the modelling within the kiln: a combustion chamber model, including the mass and energy balance equations for the two combustion chambers, and a 3D thermal model. The combustion chamber model predicts the temperature within the kiln, flue gas composition, flux, and heat of combustion. Using the combustion chamber model and the 3D thermal model, production rules for the process simulation can be obtained through interrelation analysis between control and operation variables. The process simulation of the kiln is operated with these production rules for automatic operation. The process simulation aims to provide fundamental solutions to problems in the incineration process by introducing an online expert control system that provides integrity in process control and management. Knowledge-based expert control systems use symbolic logic and heuristic rules to find solutions for various types of problems. The system was implemented as a hybrid intelligent expert control system by connecting it with the process control systems, giving it the capability of process diagnosis, analysis and control.

  8. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes and have now been ported to microcomputers. Graphic and animation capabilities were added to many of these languages to help users build models and evaluate simulation results. With all these languages and added features, the user is still burdened with learning the simulation language. Furthermore, the time to construct and then validate the simulation model is always greater than originally anticipated. One approach to minimizing the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the assistant, its elements, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.

  9. Reduced order model based on principal component analysis for process simulation and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lang, Y.; Malacina, A.; Biegler, L.

    2009-01-01

    It is well-known that distributed parameter computational fluid dynamics (CFD) models provide more accurate results than the conventional, lumped-parameter unit operation models used in process simulation. Consequently, the use of CFD models in process/equipment co-simulation offers the potential to optimize overall plant performance with respect to complex thermal and fluid flow phenomena. Because solving CFD models is time-consuming compared to the overall process simulation, we consider the development of fast reduced order models (ROMs) based on CFD results to closely approximate the high-fidelity equipment models in the co-simulation. By considering process equipment items with complicated geometries and detailed thermodynamic property models, this study proposes a strategy to develop ROMs based on principal component analysis (PCA). Taking advantage of commercial process simulation and CFD software (for example, Aspen Plus and FLUENT), we are able to develop systematic CFD-based ROMs for equipment models in an efficient manner. In particular, we show that the validity of the ROM is more robust within a well-sampled input domain and that the CPU time is significantly reduced. Typically, it takes at most several CPU seconds to evaluate the ROM compared to several CPU hours or more to solve the CFD model. Two case studies, involving two power plant equipment examples, are described and demonstrate the benefits of using the proposed ROM methodology for process simulation and optimization.
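
    As an illustration of the general PCA-based ROM idea described in this record (not the authors' Aspen Plus/FLUENT workflow), the sketch below builds a reduced-order surrogate from synthetic 'CFD-like' snapshot data: field snapshots are compressed with PCA and a regression model maps process inputs to the retained PCA coefficients. The data, dimensions and choice of regressor are all illustrative assumptions.

```python
# Illustrative PCA-based reduced-order model (ROM) on synthetic data.
# In practice the snapshots would come from CFD runs; here they are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Hypothetical design of experiments: 40 CFD runs, 2 process inputs each.
X = rng.uniform([300.0, 1.0], [900.0, 5.0], size=(40, 2))   # e.g. temperature, flow rate
# Hypothetical "CFD" output fields: 500 nodal values per run (synthetic stand-in).
fields = np.array([np.sin(x[0] / 300.0 * np.linspace(0, np.pi, 500)) * x[1] for x in X])

# Step 1: compress the snapshot matrix with PCA (keep a handful of modes).
pca = PCA(n_components=5)
coeffs = pca.fit_transform(fields)            # (40 runs, 5 modal coefficients)

# Step 2: regress modal coefficients on the process inputs (the fast ROM).
rom = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1e-3)).fit(X, coeffs)

# Step 3: evaluate the ROM at a new operating point in CPU-seconds, not CPU-hours.
x_new = np.array([[600.0, 2.5]])
field_pred = pca.inverse_transform(rom.predict(x_new))
print(field_pred.shape)                       # (1, 500) reconstructed field
```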

  10. Architectural Improvements and New Processing Tools for the Open XAL Online Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, Christopher K; Pelaia II, Tom; Freed, Jonathan M

    The online model is the component of Open XAL providing accelerator modeling, simulation, and dynamic synchronization to live hardware. Significant architectural changes and feature additions have been recently made in two separate areas: 1) the managing and processing of simulation data, and 2) the modeling of RF cavities. Simulation data and data processing have been completely decoupled. A single class manages all simulation data while standard tools were developed for processing the simulation results. RF accelerating cavities are now modeled as composite structures where parameter and dynamics computations are distributed. The beam and hardware models both maintain their relative phase information, which allows for dynamic phase slip and elapsed time computation.

  11. Hydrological and water quality processes simulation by the integrated MOHID model

    NASA Astrophysics Data System (ADS)

    Epelde, Ane; Antiguedad, Iñaki; Brito, David; Eduardo, Jauch; Neves, Ramiro; Sauvage, Sabine; Sánchez-Pérez, José Miguel

    2016-04-01

    Different modelling approaches have been used in recent decades to study the water quality degradation caused by non-point source pollution. In this study, the MOHID fully distributed and physics-based model has been employed to simulate hydrological processes and nitrogen dynamics in a nitrate vulnerable zone: the Alegria River watershed (Basque Country, Northern Spain). The results of this study indicate that the MOHID code is suitable for hydrological process simulation at the watershed scale, as the model shows satisfactory performance in simulating the discharge (NSE: 0.74 and 0.76 during the calibration and validation periods, respectively). The agronomical component of the code allowed the simulation of agricultural practices, which led to adequate crop yield simulation in the model. Furthermore, the nitrogen exportation also shows satisfactory performance (NSE: 0.64 and 0.69 during the calibration and validation periods, respectively). While the lack of field measurements does not allow the nutrient cycling processes to be evaluated in depth, it has been observed that the MOHID model simulates annual denitrification within the general ranges established for agricultural watersheds (in this study, 9 kg N ha-1 year-1). In addition, the model coherently simulated the spatial distribution of the denitrification process, which is directly linked to the simulated hydrological conditions: the highest rates are located near the discharge zone of the aquifer and where the aquifer thickness is low. These results evidence the strength of this model for simulating watershed-scale hydrological processes as well as crop production and the water quality degradation derived from agricultural activity (considering both nutrient exportation and nutrient cycling processes).
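
    The NSE values quoted above refer to the Nash-Sutcliffe efficiency, a standard goodness-of-fit measure for hydrological simulations; a minimal, generic implementation (not tied to MOHID) looks like this:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means no better than the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Example with made-up discharge values (m3/s):
print(nash_sutcliffe([1.2, 3.4, 2.8, 4.1], [1.0, 3.1, 3.0, 4.4]))
```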

  12. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  13. Modeling microbiological and chemical processes in municipal solid waste bioreactor, Part II: Application of numerical model BIOKEMOD-3P.

    PubMed

    Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh

    2010-02-01

    Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires knowledge of the various process reactions and corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model and is often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species. Increasing the number of species and user-defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely simulate the experimental data. The results from this study may help in applying this model to full-scale landfill operation.

  14. Combined Log Inventory and Process Simulation Models for the Planning and Control of Sawmill Operations

    Treesearch

    Guillermo A. Mendoza; Roger J. Meimban; Philip A. Araman; William G. Luppold

    1991-01-01

    A log inventory model and a real-time hardwood process simulation model were developed and combined into an integrated production planning and control system for hardwood sawmills. The log inventory model was designed to monitor and periodically update the status of the logs in the log yard. The process simulation model was designed to estimate various sawmill...

  15. Virtual milk for modelling and simulation of dairy processes.

    PubMed

    Munir, M T; Zhang, Y; Yu, W; Wilson, D I; Young, B R

    2016-05-01

    The modeling of dairy processing using a generic process simulator suffers from shortcomings, given that many simulators do not contain milk components in their component libraries. Recently, pseudo-milk components for a commercial process simulator were proposed for simulation and the current work extends this pseudo-milk concept by studying the effect of both total milk solids and temperature on key physical properties such as thermal conductivity, density, viscosity, and heat capacity. This paper also uses expanded fluid and power law models to predict milk viscosity over the temperature range from 4 to 75°C and develops a succinct regressed model for heat capacity as a function of temperature and fat composition. The pseudo-milk was validated by comparing the simulated and actual values of the physical properties of milk. The milk thermal conductivity, density, viscosity, and heat capacity showed differences of less than 2, 4, 3, and 1.5%, respectively, between the simulated results and actual values. This work extends the capabilities of the previously proposed pseudo-milk and of a process simulator to model dairy processes, processing different types of milk (e.g., whole milk, skim milk, and concentrated milk) with different intrinsic compositions, and to predict correct material and energy balances for dairy processes. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  16. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  17. Modeling and Simulation of Quenching and Tempering Process in steels

    NASA Astrophysics Data System (ADS)

    Deng, Xiaohu; Ju, Dongying

    Quenching and tempering (Q&T) is a combined heat treatment process used to achieve maximum toughness and ductility at a specified hardness and strength. It is important to develop a mathematical model of the quenching and tempering process that satisfies mechanical property requirements at low cost. This paper presents a modified model to predict structural evolution and hardness distribution during the quenching and tempering of steels. The model takes into account tempering parameters, carbon content, and isothermal and non-isothermal transformations. Moreover, the precipitation of transition carbides, the decomposition of retained austenite and the precipitation of cementite can each be simulated. Hardness distributions of the quenched and tempered workpiece are predicted by an experimental regression equation. To validate the model, it is employed to predict the tempering of 80MnCr5 steel. The predicted precipitation dynamics of transition carbides and cementite are consistent with previous experimental and simulated results from the literature. The model is then implemented within the framework of the simulation code COSMAP to simulate microstructure, stress and distortion in the heat-treated component, and is applied to simulate the Q&T process of J55 steel. The calculated results show good agreement with the experimental ones, indicating that the model is effective for the simulation of the Q&T process of steels.
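
    The abstract does not give the authors' kinetic equations, but models of this kind are commonly built on Johnson-Mehl-Avrami-Kolmogorov (JMAK) type isothermal kinetics and, for the quench step, a Koistinen-Marburger type relation; the generic forms are recalled below purely for orientation.

```latex
% Generic JMAK isothermal transformation kinetics (transformed fraction X at time t)
X(t) = 1 - \exp\!\bigl(-k(T)\,t^{\,n}\bigr)
% Koistinen-Marburger relation for the martensite fraction formed on quenching below M_s
f_M(T) = 1 - \exp\!\bigl(-\alpha\,(M_s - T)\bigr), \qquad \alpha \approx 0.011\ \mathrm{K^{-1}}\ \text{for many steels}
```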

  18. Modeling and Simulation of Metallurgical Process Based on Hybrid Petri Net

    NASA Astrophysics Data System (ADS)

    Ren, Yujuan; Bao, Hong

    2016-11-01

    In order to achieve the energy-saving and emission-reduction goals of iron and steel enterprises, an increasing number of modeling and simulation technologies are used to research and analyse the metallurgical production process. In this paper, the basic principles of hybrid Petri nets are used to model and analyse the metallurgical process. Firstly, the definition of the Hybrid Petri Net System of Metallurgical Process (MPHPNS) and its modeling theory are proposed. Secondly, an MPHPNS model based on material flow is constructed. The dynamic flow of materials and the real-time change of each technological state in the metallurgical process are simulated vividly using this model. The simulation allows interaction between the continuous event dynamic system and the discrete event dynamic system at the same level, and plays a positive role in production decision making.

  19. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  20. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
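
    The study itself used Arena; as a language-neutral illustration of the same discrete-event approach, the sketch below models a drastically simplified two-stage planning queue in Python with SimPy. The stage names, durations, staffing levels and arrival rate are invented for illustration and are not taken from the BC Cancer Agency model.

```python
# Minimal discrete-event sketch of a two-stage planning process (illustrative only).
import random
import simpy

random.seed(1)
PLANS_TO_SIMULATE = 200

def plan(env, oncologist, planner, completion_times):
    start = env.now
    with oncologist.request() as req:                    # oncologist contouring/review
        yield req
        yield env.timeout(random.expovariate(1 / 4.0))   # mean 4 h (assumed)
    with planner.request() as req:                       # dosimetry planning
        yield req
        yield env.timeout(random.expovariate(1 / 8.0))   # mean 8 h (assumed)
    completion_times.append(env.now - start)

def arrivals(env, oncologist, planner, completion_times):
    for _ in range(PLANS_TO_SIMULATE):
        env.process(plan(env, oncologist, planner, completion_times))
        yield env.timeout(random.expovariate(1 / 3.0))   # new case every ~3 h (assumed)

env = simpy.Environment()
oncologist = simpy.Resource(env, capacity=2)
planner = simpy.Resource(env, capacity=3)
completion_times = []
env.process(arrivals(env, oncologist, planner, completion_times))
env.run()
print(f"mean planning time: {sum(completion_times) / len(completion_times):.1f} h")
```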

  1. 40 CFR Appendix C to Part 75 - Missing Data Estimation Procedures

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... certification of a parametric, empirical, or process simulation method or model for calculating substitute data... available process simulation methods and models. 1.2Petition Requirements Continuously monitor, determine... desulfurization, a corresponding empirical correlation or process simulation parametric method using appropriate...

  2. Process Modeling of Composite Materials for Wind-Turbine Rotor Blades: Experiments and Numerical Modeling

    PubMed Central

    Wieland, Birgit; Ropte, Sven

    2017-01-01

    The production of rotor blades for wind turbines is still a predominantly manual process. Process simulation is an adequate way of improving blade quality without a significant increase in production costs. This paper introduces a module for tolerance simulation for rotor-blade production processes. The investigation focuses on the simulation of temperature distribution for one-sided, self-heated tooling and thick laminates. Experimental data from rotor-blade production and down-scaled laboratory tests are presented. Based on influencing factors that are identified, a physical model is created and implemented as a simulation. This provides an opportunity to simulate temperature and cure-degree distribution for two-dimensional cross sections. The aim of this simulation is to support production processes. Hence, it is modelled as an in situ simulation with direct input of temperature data and real-time capability. A monolithic part of the rotor blade, the main girder, is used as an example for presenting the results. PMID:28981458

  3. Process Modeling of Composite Materials for Wind-Turbine Rotor Blades: Experiments and Numerical Modeling.

    PubMed

    Wieland, Birgit; Ropte, Sven

    2017-10-05

    The production of rotor blades for wind turbines is still a predominantly manual process. Process simulation is an adequate way of improving blade quality without a significant increase in production costs. This paper introduces a module for tolerance simulation for rotor-blade production processes. The investigation focuses on the simulation of temperature distribution for one-sided, self-heated tooling and thick laminates. Experimental data from rotor-blade production and down-scaled laboratory tests are presented. Based on influencing factors that are identified, a physical model is created and implemented as a simulation. This provides an opportunity to simulate temperature and cure-degree distribution for two-dimensional cross sections. The aim of this simulation is to support production processes. Hence, it is modelled as an in situ simulation with direct input of temperature data and real-time capability. A monolithic part of the rotor blade, the main girder, is used as an example for presenting the results.

  4. VARTM Process Modeling of Aerospace Composite Structures

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.

    2003-01-01

    A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The model-predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than those measured, most likely due to inaccurate preform permeability values used in the simulation.

  5. Software Framework for Advanced Power Plant Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Widmann; Sorin Munteanu; Aseem Jain

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  6. Modelling and Simulation as a Recognizing Method in Education

    ERIC Educational Resources Information Center

    Stoffa, Veronika

    2004-01-01

    Computer animation-simulation models of complex processes and events, used as a method of instruction, can be an effective didactic device. Gaining deeper knowledge about the objects modelled helps in planning simulation experiments oriented towards the processes and events under study. Animation experiments realized on multimedia computers can aid easier…

  7. A Software Development Simulation Model of a Spiral Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  8. Assessment of effectiveness of geologic isolation systems. Geologic-simulation model for a hypothetical site in the Columbia Plateau. Volume 2: results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, M.G.; Petrie, G.M.; Baldwin, A.J.

    1982-06-01

    This report contains the input data and computer results for the Geologic Simulation Model. This model is described in detail in the following report: Petrie, G.M., et al. 1981. Geologic Simulation Model for a Hypothetical Site in the Columbia Plateau, Pacific Northwest Laboratory, Richland, Washington. The Geologic Simulation Model is a quasi-deterministic process-response model which simulates, for a million years into the future, the development of the geologic and hydrologic systems of the ground-water basin containing the Pasco Basin. Effects of natural processes on the ground-water hydrologic system are modeled principally by rate equations. The combined effects and synergistic interactions of different processes are approximated by linear superposition of their effects during discrete time intervals in a stepwise-integration approach.
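
    The report's approach of modelling each natural process with a rate equation and combining processes by linear superposition over discrete time steps amounts to a simple explicit time-stepping loop; the toy sketch below shows that structure with two hypothetical processes (the state variable and rates are invented, not taken from the report).

```python
# Toy stepwise integration with linear superposition of independent process rates.
# State: a single hypothetical quantity (e.g., basin floor elevation in metres).

def uplift_rate(t_years, state):        # hypothetical tectonic uplift process
    return 1.0e-4                       # m/yr, constant for illustration

def erosion_rate(t_years, state):       # hypothetical erosion process
    return -5.0e-5 * max(state, 0.0)    # m/yr, proportional to current elevation

processes = [uplift_rate, erosion_rate]

state = 100.0                           # initial elevation (m), invented
dt = 1000.0                             # time step (years)
t = 0.0
while t < 1.0e6:                        # march one million years into the future
    # Linear superposition: sum each process's contribution over the step.
    state += sum(rate(t, state) * dt for rate in processes)
    t += dt

print(f"elevation after 1 Myr: {state:.1f} m")
```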

  9. Practical Unitary Simulator for Non-Markovian Complex Processes

    NASA Astrophysics Data System (ADS)

    Binder, Felix C.; Thompson, Jayne; Gu, Mile

    2018-06-01

    Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction for simulators for a large class of stochastic processes hence directly opening the possibility for experimental implementations with current platforms for quantum computation. The results are illustrated for an example process.

  10. Using manufacturing simulators to evaluate important processing decisions in the furniture and cabinet industries

    Treesearch

    Janice K. Wiedenbeck; Philip A. Araman

    1995-01-01

    We've been telling the wood industry about our process simulation modeling research and development work for several years. We've demonstrated our crosscut-first and rip-first rough mill simulation and animation models. We've advised companies on how they could use simulation modeling to help make critically important, pending decisions related to mill layout...

  11. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
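
    The general idea behind such an index can be summarized as a first-order variance decomposition in which the conditioning on a process runs over both its alternative models and their parameters; the expression below is a schematic restatement of that framework, not a verbatim reproduction of the index derived in the paper.

```latex
% Schematic process sensitivity index for process A and model output \Delta
PS_A = \frac{\operatorname{Var}_A\!\bigl[\mathbb{E}(\Delta \mid A)\bigr]}{\operatorname{Var}(\Delta)},
\qquad
\mathbb{E}(\Delta \mid A) = \sum_{k} P(M_{A,k}) \int \Delta\bigl(M_{A,k}, \theta_{A,k}\bigr)\, p(\theta_{A,k} \mid M_{A,k})\, \mathrm{d}\theta_{A,k}
```

    Here M_{A,k} are the alternative process models for process A, P(M_{A,k}) their model probabilities, and theta_{A,k} their parameters, so the resulting index reflects both model and parametric uncertainty, as described in the abstract.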

  12. Bacterial transformation and biodegradation processes simulation in horizontal subsurface flow constructed wetlands using CWM1-RETRASO.

    PubMed

    Llorens, Esther; Saaltink, Maarten W; Poch, Manel; García, Joan

    2011-01-01

    The performance and reliability of the CWM1-RETRASO model for simulating processes in horizontal subsurface flow constructed wetlands (HSSF CWs) and the relative contribution of different microbial reactions to organic matter (COD) removal in an HSSF CW treating urban wastewater were evaluated. Various approaches with diverse influent configurations were simulated. According to the simulations, anaerobic processes were more widespread in the simulated wetland and contributed to a higher COD removal rate [72-79%] than anoxic [0-1%] and aerobic reactions [20-27%] did. In all the cases tested, the reaction that most contributed to COD removal was methanogenesis [58-73%]. All results provided by the model were in consonance with the literature and experimental field observations, suggesting good performance and reliability of CWM1-RETRASO. Given the quality of these simulation predictions, CWM1-RETRASO is the first mechanistic model able to successfully simulate the processes described by the CWM1 model in HSSF CWs. Copyright © 2010 Elsevier Ltd. All rights reserved.

  13. Modeling and simulation: A key to future defense technology

    NASA Technical Reports Server (NTRS)

    Muccio, Anthony B.

    1993-01-01

    The purpose of this paper is to express the rationale for continued technological and scientific development of the modeling and simulation process for the defense industry. The defense industry, along with a variety of other industries, is currently being forced into making sacrifices in response to the current economic hardships. These sacrifices, which must not compromise the safety of our nation nor jeopardize our current standing as the world's peace officer, must be concentrated in areas which will withstand the needs of the changing world. Therefore, the need for cost-effective alternatives on defense issues must be examined. This paper provides support that modeling and simulation is an economically feasible process which will ensure our nation's safety as well as provide and keep up with the future technological developments and demands required by the defense industry. The outline of this paper is as follows: introduction, which defines and describes the modeling and simulation process; discussion, which details the purpose and benefits of modeling and simulation and provides specific examples of how the process has been successful; and conclusion, which summarizes the specifics of modeling and simulation of defense issues and lends support for its continued use in the defense arena.

  14. A simple analytical infiltration model for short-duration rainfall

    NASA Astrophysics Data System (ADS)

    Wang, Kaiwen; Yang, Xiaohua; Liu, Xiaomang; Liu, Changming

    2017-12-01

    Many infiltration models have been proposed to simulate the infiltration process. Different initial soil conditions and non-uniform initial water content can lead to infiltration simulation errors, especially for short-duration rainfall (SHR). Few infiltration models are specifically derived to eliminate the errors caused by complex initial soil conditions. We present a simple analytical infiltration model for SHR infiltration simulation, the Short-duration Infiltration Process model (SHIP model). The infiltration simulated by 5 models (i.e., the SHIP (high), SHIP (middle), SHIP (low), Philip and Parlange models) was compared based on numerical experiments and soil column experiments. In the numerical experiments, the SHIP (middle) and Parlange models had robust solutions for SHR infiltration simulation of 12 typical soils under different initial soil conditions: the absolute values of percent bias were less than 12% and the Nash-Sutcliffe efficiency values were greater than 0.83. Additionally, in the soil column experiments, the infiltration rate fluctuated within a range because of non-uniform initial water content. The SHIP (high) and SHIP (low) models can simulate an infiltration range, which successfully covered the fluctuation range of the observed infiltration rate. Given the robustness of its solutions and its coverage of the fluctuation range of the infiltration rate, the SHIP model can be integrated into hydrologic models to simulate the SHR infiltration process and benefit flood forecasting.
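
    Of the comparison models listed above, the Philip model has a compact closed form worth recalling for context (the SHIP model's own equations are not stated in this abstract):

```latex
% Philip two-term infiltration model
I(t) = S\,t^{1/2} + A\,t, \qquad f(t) = \frac{\mathrm{d}I}{\mathrm{d}t} = \tfrac{1}{2}\,S\,t^{-1/2} + A
```

    Here I is cumulative infiltration, f the infiltration rate, S the sorptivity, and A a parameter related to the saturated hydraulic conductivity; because sorptivity depends strongly on the initial water content, the early-time (short-duration) part of the process is particularly sensitive to the initial soil conditions discussed above.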

  15. Seasonal changes in the atmospheric heat balance simulated by the GISS general circulation model

    NASA Technical Reports Server (NTRS)

    Stone, P. H.; Chow, S.; Helfand, H. M.; Quirk, W. J.; Somerville, R. C. J.

    1975-01-01

    Tests of the ability of numerical general circulation models to simulate the atmosphere have so far focused on simulations of the January climatology. These models generally prescribe boundary conditions such as sea surface temperature, but this does not prevent testing their ability to simulate the seasonal changes in atmospheric processes that accompany prescribed seasonal changes in boundary conditions. Experiments simulating changes in the zonally averaged heat balance are discussed, since many simplified models of climatic processes are based solely on this balance.

  16. Multi-scale Modeling of Arctic Clouds

    NASA Astrophysics Data System (ADS)

    Hillman, B. R.; Roesler, E. L.; Dexheimer, D.

    2017-12-01

    The presence and properties of clouds are critically important to the radiative budget in the Arctic, but clouds are notoriously difficult to represent in global climate models (GCMs). The challenge stems partly from a disconnect between the scales at which these models are formulated and the scales of the physical processes important to the formation of clouds (e.g., convection and turbulence). Because of this, these processes are parameterized in large-scale models. Over the past decades, new approaches have been explored in which a cloud system resolving model (CSRM), or in the extreme a large eddy simulation (LES), is embedded into each gridcell of a traditional GCM to replace the cloud and convective parameterizations and explicitly simulate more of these important processes. This approach is attractive in that it allows for more explicit simulation of small-scale processes while also allowing for interaction between the small- and large-scale processes. The goal of this study is to quantify the performance of this framework in simulating Arctic clouds relative to a traditional global model, and to explore the limitations of such a framework using coordinated high-resolution (eddy-resolving) simulations. Simulations from the global model are compared with satellite retrievals of cloud fraction partitioned by cloud phase from CALIPSO, and limited-area LES simulations are compared with ground-based and tethered-balloon measurements from the ARM Barrow and Oliktok Point measurement facilities.

  17. Learning-Testing Process in Classroom: An Empirical Simulation Model

    ERIC Educational Resources Information Center

    Buda, Rodolphe

    2009-01-01

    This paper presents an empirical micro-simulation model of the teaching and the testing process in the classroom (Programs and sample data are available--the actual names of pupils have been hidden). It is a non-econometric micro-simulation model describing informational behaviors of the pupils, based on the observation of the pupils'…

  18. The Use of Particle/Substrate Material Models in Simulation of Cold-Gas Dynamic-Spray Process

    NASA Astrophysics Data System (ADS)

    Rahmati, Saeed; Ghaei, Abbas

    2014-02-01

    Cold spray is a coating deposition method in which solid particles are accelerated toward the substrate using a low-temperature supersonic gas flow. Many numerical studies have been carried out in the literature in order to study this process in more depth. Despite the inability of the Johnson-Cook plasticity model to predict material behavior at high strain rates, it is the model most frequently used in simulations of cold spray. Therefore, this research was devoted to comparing the performance of different material models in the simulation of the cold spray process. Six different material models, appropriate for high strain-rate plasticity, were employed in finite element simulations of the cold spray process for copper. The results showed that the material model had a considerable effect on the predicted deformed shapes.
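
    For reference, the Johnson-Cook flow stress mentioned above has the standard multiplicative form, with A, B, n, C and m as material constants fitted to test data:

```latex
% Johnson-Cook flow stress
\sigma = \bigl(A + B\,\varepsilon_p^{\,n}\bigr)
         \left(1 + C \ln \frac{\dot{\varepsilon}_p}{\dot{\varepsilon}_0}\right)
         \left(1 - \left(\frac{T - T_{\mathrm{room}}}{T_{\mathrm{melt}} - T_{\mathrm{room}}}\right)^{\!m}\right)
```

    The purely logarithmic strain-rate term is one commonly cited reason the model can underpredict strengthening at the very high strain rates (roughly 10^4 1/s and above) reached during cold spray particle impact.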

  19. Simulation of salt production process

    NASA Astrophysics Data System (ADS)

    Muraveva, E. A.

    2017-10-01

    In this paper an approach to using the simulation software iThink to simulate a salt production system is proposed. The dynamic processes of the original system are replaced by processes simulated in the abstract model, in compliance with the basic rules of the original system, which accelerates the research and reduces its cost. As a result, a stable, workable simulation model was obtained that can display the rate of salt exhaustion and many other parameters that are important for business planning.
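
    iThink is a stock-and-flow (system dynamics) tool; the sketch below reproduces the same mechanics in plain Python for an invented one-stock model of a salt reserve drawn down by production. None of the rates or capacities come from the paper; the point is only to show how such a model steps forward in time.

```python
# Minimal stock-and-flow sketch: salt reserve depleted by production (illustrative).
reserve = 5.0e6          # tonnes of recoverable salt (invented)
production = 2.0e4       # tonnes per month (invented)
dt = 1.0                 # time step: one month

history = []
month = 0
while reserve > 0 and month < 600:
    outflow = min(production, reserve / dt)   # cannot extract more than remains
    reserve -= outflow * dt                   # Euler update of the stock
    history.append((month, reserve))
    month += 1

print(f"reserve exhausted after about {month} months")
```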

  20. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  1. Integrated modeling and heat treatment simulation of austempered ductile iron

    NASA Astrophysics Data System (ADS)

    Hepp, E.; Hurevich, V.; Schäfer, W.

    2012-07-01

    The integrated modeling and simulation of the casting and heat treatment processes for producing austempered ductile iron (ADI) castings is presented. The focus is on describing different models to simulate the austenitization, quenching and austempering steps during ADI heat treatment. The starting point for the heat treatment simulation is the simulated microstructure after solidification and cooling. The austenitization model considers the transformation of the initial ferrite-pearlite matrix into austenite as well as the dissolution of graphite in austenite to attain a uniform carbon distribution. The quenching model is based on measured CCT diagrams. Measurements have been carried out to obtain these diagrams for different alloys with varying Cu, Ni and Mo contents. The austempering model includes nucleation and growth kinetics of the ADI matrix. The model of ADI nucleation is based on experimental measurements made for varied Cu, Ni, Mo contents and austempering temperatures. The ADI kinetic model uses a diffusion controlled approach to model the growth. The models have been integrated in a tool for casting process simulation. Results are shown for the optimization of the heat treatment process of a planetary carrier casting.

  2. A simulation framework for mapping risks in clinical processes: the case of in-patient transfers.

    PubMed

    Dunn, Adam G; Ong, Mei-Sing; Westbrook, Johanna I; Magrabi, Farah; Coiera, Enrico; Wobcke, Wayne

    2011-05-01

    To model how individual violations in routine clinical processes cumulatively contribute to the risk of adverse events in hospital using an agent-based simulation framework. An agent-based simulation was designed to model the cascade of common violations that contribute to the risk of adverse events in routine clinical processes. Clinicians and the information systems that support them were represented as a group of interacting agents using data from direct observations. The model was calibrated using data from 101 patient transfers observed in a hospital and results were validated for one of two scenarios (a misidentification scenario and an infection control scenario). Repeated simulations using the calibrated model were undertaken to create a distribution of possible process outcomes. The likelihood of end-of-chain risk is the main outcome measure, reported for each of the two scenarios. The simulations demonstrate end-of-chain risks of 8% and 24% for the misidentification and infection control scenarios, respectively. Over 95% of the simulations in both scenarios are unique, indicating that the in-patient transfer process diverges from prescribed work practices in a variety of ways. The simulation allowed us to model the risk of adverse events in a clinical process, by generating the variety of possible work subject to violations, a novel prospective risk analysis method. The in-patient transfer process has a high proportion of unique trajectories, implying that risk mitigation may benefit from focusing on reducing complexity rather than augmenting the process with further rule-based protocols.
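
    A much-simplified numerical picture of the 'cascade of violations' idea is sketched below: each transfer passes through a chain of checks, each check is independently violated with some probability, and end-of-chain risk materializes only when every barrier fails. The chain length and probabilities are invented for illustration and are not the calibrated values from the study, which used a richer agent-based formulation.

```python
# Monte Carlo sketch of end-of-chain risk from cascading violations (illustrative).
import random

random.seed(42)

# Hypothetical probability that each successive barrier/check is violated.
violation_probs = [0.30, 0.25, 0.20, 0.15]   # invented values

def transfer_ends_in_risk():
    # End-of-chain risk only arises if every barrier in the chain is breached.
    return all(random.random() < p for p in violation_probs)

n_runs = 100_000
risk = sum(transfer_ends_in_risk() for _ in range(n_runs)) / n_runs
print(f"estimated end-of-chain risk: {risk:.3%}")
```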

  3. PLYMAP : a computer simulation model of the rotary peeled softwood plywood manufacturing process

    Treesearch

    Henry Spelter

    1990-01-01

    This report documents a simulation model of the plywood manufacturing process. Its purpose is to enable a user to make quick estimates of the economic impact of a particular process change within a mill. The program was designed to simulate the processing of plywood within a relatively simplified mill design. Within that limitation, however, it allows a wide range of...

  4. Design process for applying the nonlocal thermal transport iSNB model to a Polar-Drive ICF simulation

    NASA Astrophysics Data System (ADS)

    Cao, Duc; Moses, Gregory; Delettrez, Jacques; Collins, Timothy

    2014-10-01

    A design process is presented for the nonlocal thermal transport iSNB (implicit Schurtz, Nicolai, and Busquet) model to provide reliable nonlocal thermal transport in polar-drive ICF simulations. Results from the iSNB model are known to be sensitive to changes in the SNB ``mean free path'' formula, and the latter's original form required modification to obtain realistic preheat levels. In the presented design process, SNB mean free paths are first modified until the model can match temperatures from Goncharov's thermal transport model in 1D temperature relaxation simulations. Afterwards the same mean free paths are tested in a 1D polar-drive surrogate simulation to match adiabats from Goncharov's model. After passing the two previous steps, the model can then be run in a full 2D polar-drive simulation. This research is supported by the University of Rochester Laboratory for Laser Energetics.

  5. Development of a physically-based planar inductors VHDL-AMS model for integrated power converter design

    NASA Astrophysics Data System (ADS)

    Ammouri, Aymen; Ben Salah, Walid; Khachroumi, Sofiane; Ben Salah, Tarek; Kourda, Ferid; Morel, Hervé

    2014-05-01

    The design of integrated power converters needs prototype-less approaches. Specific simulations are required for the investigation and validation process. Simulation relies on active and passive device models. Models of planar devices, for instance, are still not available in power simulator tools, which is a specific limitation when simulating integrated power systems. The paper focuses on the development of a physically-based planar inductor model and its validation inside a power converter during transient switching. The planar inductor remains a complex device to model, particularly when the skin, proximity and parasitic capacitance effects are taken into account. A heterogeneous simulation scheme, including circuit and device models, is successfully implemented in the VHDL-AMS language and simulated in the Simplorer platform. The mixed simulation results have been favorably tested and compared with practical measurements. It is found that the multi-domain simulation results and measurement data are in close agreement.
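
    As background to why the parasitic effects mentioned above matter, the sketch below evaluates the impedance of a minimal lumped equivalent circuit for an inductor, a series R-L branch in parallel with a parasitic capacitance, which produces a self-resonance. The component values are illustrative placeholders, not parameters extracted from the paper's VHDL-AMS model, and the sketch is in Python rather than VHDL-AMS purely for brevity.

```python
# Impedance of a minimal inductor equivalent circuit: (R + jwL) in parallel with C.
import numpy as np

R = 0.15          # ohm, series resistance (placeholder)
L = 2.2e-6        # henry, inductance (placeholder)
C = 12e-12        # farad, parasitic winding capacitance (placeholder)

f = np.logspace(4, 9, 6)            # 10 kHz ... 1 GHz
w = 2 * np.pi * f
z_series = R + 1j * w * L           # series R-L branch
z_cap = 1 / (1j * w * C)            # parasitic capacitance branch
z_total = z_series * z_cap / (z_series + z_cap)

f_res = 1 / (2 * np.pi * np.sqrt(L * C))   # approximate self-resonant frequency
print(f"self-resonance near {f_res / 1e6:.1f} MHz")
for fi, zi in zip(f, z_total):
    print(f"{fi:10.3e} Hz  |Z| = {abs(zi):10.3e} ohm")
```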

  6. Logistics of Trainsets Creation with the Use of Simulation Models

    NASA Astrophysics Data System (ADS)

    Sedláček, Michal; Pavelka, Hynek

    2016-12-01

    This paper focuses on rail transport, specifically on the train formation operational process, studied using computer simulations. The problem has been solved using SIMUL8 and applied to a specific train formation station in the Czech Republic. The paper describes a proposed simulation model of train formation work. Experimental modeling, an assessment of the results achieved, and a design solution for optimizing the train formation operational process are also presented.

  7. Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Dali; Yuan, Fengming; Hernandez, Benjamin

    Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.

  8. Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations

    DOE PAGES

    Wang, Dali; Yuan, Fengming; Hernandez, Benjamin; ...

    2017-01-01

    Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.

  9. The Australian Computational Earth Systems Simulator

    NASA Astrophysics Data System (ADS)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government, together with a consortium of universities and research institutions, has funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator, or computational virtual earth, will provide the Australian earth systems science community with the research infrastructure required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five-year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete element/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models, etc.) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global-scale dynamics and mineralisation processes, crustal-scale processes including plate tectonics, mountain building and interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives, including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; and new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.

  10. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  11. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  12. Commentary on the Integration of Model Sharing and Reproducibility Analysis to Scholarly Publishing Workflow in Computational Biomechanics

    PubMed Central

    Erdemir, Ahmet; Guess, Trent M.; Halloran, Jason P.; Modenese, Luca; Reinbolt, Jeffrey A.; Thelen, Darryl G.; Umberger, Brian R.

    2016-01-01

    Objective The overall goal of this document is to demonstrate that dissemination of models and analyses for assessing the reproducibility of simulation results can be incorporated in the scientific review process in biomechanics. Methods As part of a special issue on model sharing and reproducibility in IEEE Transactions on Biomedical Engineering, two manuscripts on computational biomechanics were submitted: A. Rajagopal et al., IEEE Trans. Biomed. Eng., 2016 and A. Schmitz and D. Piovesan, IEEE Trans. Biomed. Eng., 2016. Models used in these studies were shared with the scientific reviewers and the public. In addition to the standard review of the manuscripts, the reviewers downloaded the models and performed simulations that reproduced results reported in the studies. Results There was general agreement between the simulation results of the authors and those of the reviewers. Discrepancies were resolved during the necessary revisions. The manuscripts and instructions for download and simulation were updated in response to the reviewers’ feedback, capturing changes that might otherwise have been missed if explicit model sharing and simulation reproducibility analysis had not been conducted in the review process. The increased burden on the authors and the reviewers of facilitating model sharing and repeating simulations was noted. Conclusion When the authors of computational biomechanics studies provide access to models and data, the scientific reviewers can download and thoroughly explore the model, perform simulations, and evaluate simulation reproducibility beyond the traditional manuscript-only review process. Significance Model sharing and reproducibility analysis in scholarly publishing will result in a more rigorous review process, which will enhance the quality of modeling and simulation studies and inform future users of computational models. PMID:28072567

  13. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
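    As a rough illustration of the Monte Carlo approach described above, the sketch below samples interarrival and service times from assumed exponential distributions to estimate the average patient wait at a single clinic station; the distributions, parameters and replication count are illustrative assumptions, not taken from the article.

```python
import random

def simulate_clinic_day(n_patients=40, mean_interarrival=10.0,
                        mean_service=9.0, seed=1):
    """Monte Carlo sketch: exponential arrivals and service at one station."""
    random.seed(seed)
    clock, server_free_at, waits = 0.0, 0.0, []
    for _ in range(n_patients):
        clock += random.expovariate(1.0 / mean_interarrival)  # next arrival
        start = max(clock, server_free_at)                    # wait if busy
        waits.append(start - clock)
        server_free_at = start + random.expovariate(1.0 / mean_service)
    return sum(waits) / len(waits)

# Replicate the day many times to build a distribution of average waits.
avg_waits = [simulate_clinic_day(seed=s) for s in range(500)]
print("mean of average waits (min):", round(sum(avg_waits) / len(avg_waits), 2))
```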

  14. Neurolinguistically constrained simulation of sentence comprehension: integrating artificial intelligence and brain theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gigley, H.M.

    1982-01-01

    An artificial intelligence approach to the simulation of neurolinguistically constrained processes in sentence comprehension is developed using control strategies for simulation of cooperative computation in associative networks. The desirability of this control strategy in contrast to ATN and production system strategies is explained. A first pass implementation of HOPE, an artificial intelligence simulation model of sentence comprehension, constrained by studies of aphasic performance, psycholinguistics, neurolinguistics, and linguistic theory is described. Claims that the model could serve as a basis for sentence production simulation and for a model of language acquisition as associative learning are discussed. HOPE is a model that performs in a normal state and includes a lesion simulation facility. HOPE is also a research tool. Its modifiability and use as a tool to investigate hypothesized causes of degradation in comprehension performance by aphasic patients are described. Issues of using behavioral constraints in modelling and obtaining appropriate data for simulated process modelling are discussed. Finally, problems of validation of the simulation results are raised; and issues of how to interpret clinical results to define the evolution of the model are discussed. Conclusions with respect to the feasibility of artificial intelligence simulation process modelling are discussed based on the current state of research.

  15. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach for modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
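    The spreadsheet logic described above (uniform "rand()" durations, nested "if()" routing, first-in-first-served sequencing) can be mimicked in a few lines of Python; the sketch below is a loose analogue under assumed activities and duration ranges, not a reproduction of the paper's 28 modeling elements.

```python
import random

# Hypothetical activities and duration ranges (minutes); the paper's actual
# activity set and distributions are not reproduced here.
activities = [("admission", 10, 30), ("anaesthesia", 20, 40),
              ("surgery", 60, 180), ("recovery", 90, 240)]

def first_in_first_served(n_patients=5, seed=0):
    random.seed(seed)
    free_at = {name: 0.0 for name, *_ in activities}   # resource availability
    schedule = []
    for p in range(n_patients):
        t = p * 30.0  # staggered arrivals every 30 minutes (assumption)
        for name, lo, hi in activities:
            start = max(t, free_at[name])               # queue if busy
            duration = random.uniform(lo, hi)           # spreadsheet-style rand()
            t = start + duration
            free_at[name] = t
            schedule.append((p, name, round(start, 1), round(t, 1)))
    return schedule

for row in first_in_first_served():
    print(row)
```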

  16. Simulating the decentralized processes of the human immune system in a virtual anatomy model.

    PubMed

    Sarpe, Vladimir; Jacob, Christian

    2013-01-01

    Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model, such as immune cells, viruses and cytokines, interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments, namely (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation and interaction of immune system agents between these sites. The distribution of simulated processes, which can communicate across multiple local CPUs or through a network of machines, provides a starting point for building decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model. We also demonstrate how to distribute a collection of connected simulations over a network of computers. As a future endeavour, we plan to use parameter tuning techniques on our model to further enhance its biological credibility. We consider these in silico experiments and their associated modeling and optimization techniques as essential components in further enhancing our capabilities of simulating a whole-body, decentralized immune system, to be used both for medical education and research as well as for virtual studies in immunoinformatics.

  17. Simulation based analysis of laser beam brazing

    NASA Astrophysics Data System (ADS)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing, with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality, which satisfies the highest requirements. However, the laser beam brazing process is very complex and its process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are shown exemplarily for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  18. Numerical and experimental studies on effects of moisture content on combustion characteristics of simulated municipal solid wastes in a fixed bed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Rui, E-mail: Sunsr@hit.edu.cn; Ismail, Tamer M., E-mail: temoil@aucegypt.edu; Ren, Xiaohan

    Highlights: • The effects of moisture content on the burning process of MSW are investigated. • A two-dimensional mathematical model was built to simulate the combustion process. • Temperature distributions, process rates and gas species were measured and simulated. • The conversion ratios of C/CO and N/NO in MSW are inverse to moisture content. - Abstract: In order to reveal the features of the combustion process in the porous bed of a waste incinerator, a two-dimensional unsteady state model and an experimental study were employed to investigate the combustion process of municipal solid waste (MSW) in a fixed bed reactor. Conservation equations of the waste bed were implemented to describe the incineration process. The gas phase turbulence was modeled using the k–ε turbulence model and the particle phase was modeled using the kinetic theory of granular flow. The rates of moisture evaporation, devolatilization and char burnout were calculated according to the waste property characteristics. The simulation results were then compared with experimental data for different moisture contents of MSW, which shows that the incineration process of waste in the fixed bed is reasonably simulated. The simulation results for solid temperature, gas species and process rates in the bed accord with the experimental data. Due to the high moisture content of the fuel, moisture evaporation consumes a vast amount of heat, and the evaporation takes up most of the combustion time (about 2/3 of the whole combustion process). The whole bed combustion process is greatly slowed as MSW moisture content increases. The experimental and simulation results provide direction for the design and optimization of fixed beds for MSW.

  19. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
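    A hedged numerical sketch of the idea follows: for a toy system with two competing recharge models and two competing conductivity models, each with random parameters, the process sensitivity index of the recharge process is estimated as the variance of the conditional mean of the output (conditioned on the recharge model and its parameter) divided by the total output variance. All model forms, weights and parameter ranges are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy competing process models (assumptions, not the paper's models):
# recharge process: two models converting precipitation to recharge
recharge_models = [lambda p, a: a * p, lambda p, a: a * np.sqrt(p)]
# geology process: two models of hydraulic conductivity
geology_models = [lambda b: 10.0 * b, lambda b: 5.0 + b]

def output(recharge, conductivity):
    return recharge * conductivity  # toy system response

def sample_process(models, n):
    """Draw (model index, parameter) samples with equal model weights."""
    idx = rng.integers(len(models), size=n)
    par = rng.uniform(0.5, 1.5, size=n)
    return idx, par

N_outer, N_inner, precip = 200, 200, 100.0
r_idx, r_par = sample_process(recharge_models, N_outer)

# Conditional mean of the output given the recharge-process realization,
# averaging over geology models and parameters (inner loop).
cond_means = np.empty(N_outer)
for i in range(N_outer):
    g_idx, g_par = sample_process(geology_models, N_inner)
    R = recharge_models[r_idx[i]](precip, r_par[i])
    K = np.array([geology_models[j](b) for j, b in zip(g_idx, g_par)])
    cond_means[i] = output(R, K).mean()

# Total variance from a full joint sample.
g_idx, g_par = sample_process(geology_models, N_outer)
y = np.array([output(recharge_models[ri](precip, rp), geology_models[gi](gp))
              for ri, rp, gi, gp in zip(r_idx, r_par, g_idx, g_par)])

PS_recharge = cond_means.var() / y.var()
print("process sensitivity index (recharge):", round(PS_recharge, 2))
```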

  20. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  1. Simulating The Technological Movements Of The Equipment Used For Manufacturing Prosthetic Devices Using 3D Models

    NASA Astrophysics Data System (ADS)

    Chicea, Anca-Lucia

    2015-09-01

    The paper presents the process of building geometric and kinematic models of technological equipment used in the process of manufacturing prosthetic devices. First, the process of building the model for a six-axis industrial robot is presented. In the second part of the paper, the process of building the model for a five-axis CNC milling machining center is also shown. Both models can be used for accurate simulation of cutting processes for complex parts, such as prosthetic devices.

  2. Development of a parallel FE simulator for modeling the whole trans-scale failure process of rock from meso- to engineering-scale

    NASA Astrophysics Data System (ADS)

    Li, Gen; Tang, Chun-An; Liang, Zheng-Zhao

    2017-01-01

    Multi-scale high-resolution modeling of rock failure process is a powerful means in modern rock mechanics studies to reveal the complex failure mechanism and to evaluate engineering risks. However, multi-scale continuous modeling of rock, from deformation, damage to failure, has raised high requirements on the design, implementation scheme and computation capacity of the numerical software system. This study is aimed at developing the parallel finite element procedure, a parallel rock failure process analysis (RFPA) simulator that is capable of modeling the whole trans-scale failure process of rock. Based on the statistical meso-damage mechanical method, the RFPA simulator is able to construct heterogeneous rock models with multiple mechanical properties, deal with and represent the trans-scale propagation of cracks, in which the stress and strain fields are solved for the damage evolution analysis of representative volume element by the parallel finite element method (FEM) solver. This paper describes the theoretical basis of the approach and provides the details of the parallel implementation on a Windows - Linux interactive platform. A numerical model is built to test the parallel performance of FEM solver. Numerical simulations are then carried out on a laboratory-scale uniaxial compression test, and field-scale net fracture spacing and engineering-scale rock slope examples, respectively. The simulation results indicate that relatively high speedup and computation efficiency can be achieved by the parallel FEM solver with a reasonable boot process. In laboratory-scale simulation, the well-known physical phenomena, such as the macroscopic fracture pattern and stress-strain responses, can be reproduced. In field-scale simulation, the formation process of net fracture spacing from initiation, propagation to saturation can be revealed completely. In engineering-scale simulation, the whole progressive failure process of the rock slope can be well modeled. It is shown that the parallel FE simulator developed in this study is an efficient tool for modeling the whole trans-scale failure process of rock from meso- to engineering-scale.

  3. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic nature of representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated, threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.

  4. BioNetSim: a Petri net-based modeling tool for simulations of biochemical processes.

    PubMed

    Gao, Junhui; Li, Li; Wu, Xiaolin; Wei, Dong-Qing

    2012-03-01

    BioNetSim, a Petri net-based software package for modeling and simulating biochemical processes, is developed; its design and implementation are presented in this paper, including the logic construction, real-time access to KEGG (Kyoto Encyclopedia of Genes and Genomes), and the BioModel database. Furthermore, glycolysis is simulated as an example of its application. BioNetSim is a helpful tool for researchers to download data, model biological networks, and simulate complicated biochemical processes. Gene regulatory networks, metabolic pathways, signaling pathways, and kinetics of cell interaction are all available in BioNetSim, which makes modeling more efficient and effective. Similar to other Petri net-based software, BioNetSim does well in graphical application and mathematical construction. Moreover, it offers several powerful advantages. (1) It creates models in a database. (2) It realizes real-time access to KEGG and BioModel and transfers the data to Petri nets. (3) It provides qualitative analysis, such as the computation of constants. (4) It generates graphs for tracing the concentration of every molecule during the simulation process.
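    The token-game semantics underlying Petri net simulators of this kind can be sketched in a few lines; the example below fires a single illustrative transition (the hexokinase step of glycolysis) until it is no longer enabled. It is a didactic sketch, not BioNetSim's implementation.

```python
# Minimal discrete Petri net sketch: places hold token counts (molecule
# counts), transitions fire when all input places have enough tokens.
marking = {"glucose": 5, "ATP": 5, "G6P": 0, "ADP": 0}

# Hexokinase step of glycolysis as a single transition (illustrative only).
transitions = [{"in": {"glucose": 1, "ATP": 1}, "out": {"G6P": 1, "ADP": 1}}]

def enabled(t, m):
    return all(m[p] >= w for p, w in t["in"].items())

def fire(t, m):
    for p, w in t["in"].items():
        m[p] -= w
    for p, w in t["out"].items():
        m[p] += w

step = 0
while any(enabled(t, marking) for t in transitions):
    for t in transitions:
        if enabled(t, marking):
            fire(t, marking)
    step += 1
    print(step, marking)
```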

  5. Simulation of Triple Oxidation Ditch Wastewater Treatment Process

    NASA Astrophysics Data System (ADS)

    Yang, Yue; Zhang, Jinsong; Liu, Lixiang; Hu, Yongfeng; Xu, Ziming

    2010-11-01

    This paper presents the modeling mechanism and method for a sewage treatment system. A triple oxidation ditch process at a WWTP was simulated based on the activated sludge model ASM2D with the GPS-X software. In order to identify the adequate model structure to be implemented in the GPS-X environment, the oxidation ditch was divided into several completely stirred tank reactors depending on the distribution of aeration devices and dissolved oxygen concentration. The removal efficiencies of COD, ammonia nitrogen, total nitrogen, total phosphorus and SS were simulated with the GPS-X software using influent quality data of this WWTP from June to August 2009, to investigate the differences between the simulated results and the actual results. The results showed that the simulated values could well reflect the actual condition of the triple oxidation ditch process. The mathematical modeling method was appropriate for effluent quality prediction and process optimization.
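    The tanks-in-series representation mentioned above can be sketched with a simple mass balance; the example below tracks one lumped substrate through five completely stirred tank reactors with assumed first-order removal rates. ASM2D itself resolves many more state variables, so this is only a structural illustration.

```python
import numpy as np

# Tanks-in-series sketch of an oxidation ditch: each completely stirred
# tank reactor (CSTR) removes a lumped substrate at a first-order rate.
n_tanks, V, Q = 5, 500.0, 100.0               # tank volume (m3), flow (m3/h)
k = np.array([0.30, 0.25, 0.10, 0.25, 0.30])  # 1/h, higher where aerated (assumed)
S_in, dt, hours = 250.0, 0.05, 48

S = np.zeros(n_tanks)                         # substrate concentration per tank (g/m3)
for _ in range(int(hours / dt)):
    inflow = np.concatenate(([S_in], S[:-1]))  # upstream concentration
    dSdt = Q / V * (inflow - S) - k * S        # mass balance per CSTR
    S += dt * dSdt

print("effluent substrate (g/m3):", round(S[-1], 1))
print("overall removal (%):", round(100 * (1 - S[-1] / S_in), 1))
```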

  6. Optimal segmentation and packaging process

    DOEpatents

    Kostelnik, Kevin M.; Meservey, Richard H.; Landon, Mark D.

    1999-01-01

    A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D&D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation and sequence of the segmentation and packaging of the contaminated items is determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are then actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded.

  7. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri net and object-oriented, top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.

  8. Model-Based Verification and Validation of the SMAP Uplink Processes

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  9. Multi-model ensemble hydrological simulation using a BP Neural Network for the upper Yalongjiang River Basin, China

    NASA Astrophysics Data System (ADS)

    Li, Zhanjie; Yu, Jingshan; Xu, Xinyi; Sun, Wenchao; Pang, Bo; Yue, Jiajia

    2018-06-01

    Hydrological models are important and effective tools for detecting complex hydrological processes. Different models have different strengths when capturing the various aspects of hydrological processes. Relying on a single model usually leads to simulation uncertainties. Ensemble approaches, based on multi-model hydrological simulations, can improve application performance over single models. In this study, the upper Yalongjiang River Basin was selected for a case study. Three commonly used hydrological models (SWAT, VIC, and BTOPMC) were selected and used for independent simulations with the same input and initial values. Then, the BP neural network method was employed to combine the results from the three models. The results show that the accuracy of BP ensemble simulation is better than that of the single models.
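    A hedged sketch of the ensemble step follows: synthetic discharge series stand in for the three model simulations, and a small back-propagation (MLP) network learns to map them to the observed series. The data, network size and skill score are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-ins for daily discharge simulated by three hydrological
# models and for the observed discharge (not the Yalongjiang data).
n_days = 1000
observed = 50 + 30 * np.sin(np.linspace(0, 20, n_days)) + rng.normal(0, 5, n_days)
sims = np.column_stack([observed * b + rng.normal(0, s, n_days)
                        for b, s in [(0.9, 8), (1.1, 10), (1.0, 12)]])

X_tr, X_te, y_tr, y_te = train_test_split(sims, observed, test_size=0.3,
                                          random_state=0)

# Small back-propagation network mapping the three simulations to a single
# ensemble estimate of the observed discharge.
ens = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                 random_state=0))
ens.fit(X_tr, y_tr)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, a common hydrological skill score."""
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

print("best single-model NSE:", round(max(nse(y_te, X_te[:, j]) for j in range(3)), 3))
print("BP ensemble NSE:      ", round(nse(y_te, ens.predict(X_te)), 3))
```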

  10. Modeling hospital surgical delivery process design using system simulation: optimizing patient flow and bed capacity as an illustration.

    PubMed

    Kumar, Sameer

    2011-01-01

    It is increasingly recognized that hospital operation is an intricate system with limited resources and many interacting sources of both positive and negative feedback. The purpose of this study is to design a surgical delivery process in a county hospital in the U.S. where patient flow through a surgical ward is optimized. System simulation modeling is used to address questions of capacity planning, throughput management and interacting resources, which constitute the constantly changing complexity that characterizes designing a contemporary surgical delivery process in a hospital. The steps in building a system simulation model are demonstrated using the example of building a county hospital in a small city in the US, which is used to illustrate modular system simulation modeling of patient surgery process flows. The system simulation model development will show planners and designers how they can build overall efficiencies into a healthcare facility through optimal bed capacity for the peak patient flow of emergency and routine patients.

  11. Temporal Gillespie Algorithm: Fast Simulation of Contagion Processes on Time-Varying Networks

    PubMed Central

    Vestergaard, Christian L.; Génois, Mathieu

    2015-01-01

    Stochastic simulations are one of the cornerstones of the analysis of dynamical processes on complex networks, and are often the only accessible way to explore their behavior. The development of fast algorithms is paramount to allow large-scale simulations. The Gillespie algorithm can be used for fast simulation of stochastic processes, and variants of it have been applied to simulate dynamical processes on static networks. However, its adaptation to temporal networks remains non-trivial. We here present a temporal Gillespie algorithm that solves this problem. Our method is applicable to general Poisson (constant-rate) processes on temporal networks, stochastically exact, and up to multiple orders of magnitude faster than traditional simulation schemes based on rejection sampling. We also show how it can be extended to simulate non-Markovian processes. The algorithm is easily applicable in practice, and as an illustration we detail how to simulate both Poissonian and non-Markovian models of epidemic spreading. Namely, we provide pseudocode and its implementation in C++ for simulating the paradigmatic Susceptible-Infected-Susceptible and Susceptible-Infected-Recovered models and a Susceptible-Infected-Recovered model with non-constant recovery rates. For empirical networks, the temporal Gillespie algorithm is here typically from 10 to 100 times faster than rejection sampling. PMID:26517860
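    The pseudocode provided by the authors is accompanied by a C++ implementation; the sketch below is an independent, simplified Python rendering of the temporal Gillespie idea for SIS spreading on per-step contact lists (constant rates, equal-length time steps). It is a hedged illustration of the scheme, not the authors' reference implementation.

```python
import random

def current_events(edges, infected, beta, mu):
    """All transitions possible right now for SIS dynamics: infections along
    S-I contacts and recoveries of infected nodes."""
    events = []
    for a, b in edges:
        if a in infected and b not in infected:
            events.append(("inf", b, beta))
        elif b in infected and a not in infected:
            events.append(("inf", a, beta))
    events += [("rec", n, mu) for n in infected]
    return events

def temporal_gillespie_sis(snapshots, dt, beta, mu, seeds, rng_seed=0):
    """Simplified temporal Gillespie sketch for SIS spreading on a
    time-varying network given as one edge list per time step."""
    random.seed(rng_seed)
    infected = set(seeds)
    tau = random.expovariate(1.0)              # normalized waiting time ~ Exp(1)
    prevalence = []
    for edges in snapshots:
        remaining = dt                         # real time left in this step
        events = current_events(edges, infected, beta, mu)
        total = sum(rate for *_, rate in events)
        while total > 0 and tau < total * remaining:
            remaining -= tau / total           # event fires after tau/total
            x, acc = random.uniform(0, total), 0.0
            for kind, node, rate in events:    # pick event proportional to rate
                acc += rate
                if x <= acc:
                    if kind == "inf":
                        infected.add(node)
                    else:
                        infected.discard(node)
                    break
            tau = random.expovariate(1.0)      # fresh waiting time
            events = current_events(edges, infected, beta, mu)
            total = sum(rate for *_, rate in events)
        tau -= total * remaining               # consume the rest of this step
        prevalence.append(len(infected))
    return prevalence

# Toy usage: 30 random contacts per step among 50 nodes, 200 steps of length 1.
snaps = [[(random.randrange(50), random.randrange(50)) for _ in range(30)]
         for _ in range(200)]
print(temporal_gillespie_sis(snaps, dt=1.0, beta=0.3, mu=0.1, seeds={0})[-1])
```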

  12. Temporal Gillespie Algorithm: Fast Simulation of Contagion Processes on Time-Varying Networks.

    PubMed

    Vestergaard, Christian L; Génois, Mathieu

    2015-10-01

    Stochastic simulations are one of the cornerstones of the analysis of dynamical processes on complex networks, and are often the only accessible way to explore their behavior. The development of fast algorithms is paramount to allow large-scale simulations. The Gillespie algorithm can be used for fast simulation of stochastic processes, and variants of it have been applied to simulate dynamical processes on static networks. However, its adaptation to temporal networks remains non-trivial. We here present a temporal Gillespie algorithm that solves this problem. Our method is applicable to general Poisson (constant-rate) processes on temporal networks, stochastically exact, and up to multiple orders of magnitude faster than traditional simulation schemes based on rejection sampling. We also show how it can be extended to simulate non-Markovian processes. The algorithm is easily applicable in practice, and as an illustration we detail how to simulate both Poissonian and non-Markovian models of epidemic spreading. Namely, we provide pseudocode and its implementation in C++ for simulating the paradigmatic Susceptible-Infected-Susceptible and Susceptible-Infected-Recovered models and a Susceptible-Infected-Recovered model with non-constant recovery rates. For empirical networks, the temporal Gillespie algorithm is here typically from 10 to 100 times faster than rejection sampling.

  13. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

    In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost-savings of newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in the work practices studied by the design team played a significant role in how work actually got done: actual lived work. Multi-tasking, informal assistance and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm. In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but would be distinctively able to represent the relations of people, locations, systems, artifacts, communication and information content.

  14. Mathematical modeling of high-pH chemical flooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhuyan, D.; Lake, L.W.; Pope, G.A.

    1990-05-01

    This paper describes a generalized compositional reservoir simulator for high-pH chemical flooding processes. This simulator combines the reaction chemistry associated with these processes with the extensive physical- and flow-property modeling schemes of an existing micellar/polymer flood simulator, UTCHEM. Application of the model is illustrated for cases from a simple alkaline preflush to surfactant-enhanced alkaline-polymer flooding.

  15. Process simulation and dynamic control for marine oily wastewater treatment using UV irradiation.

    PubMed

    Jing, Liang; Chen, Bing; Zhang, Baiyu; Li, Pu

    2015-09-15

    UV irradiation and advanced oxidation processes have recently been regarded as promising solutions for removing polycyclic aromatic hydrocarbons (PAHs) from marine oily wastewater. However, such treatment methods are generally not sufficiently understood in terms of reaction mechanisms, process simulation and process control. These deficiencies can drastically hinder their application in the shipping and offshore petroleum industries, which produce bilge/ballast water and produced water as the main streams of marine oily wastewater. In this study, a factorial design of experiments was carried out to investigate the degradation mechanism of a typical PAH, namely naphthalene, under UV irradiation in seawater. Based on the experimental results, a three-layer feed-forward artificial neural network simulation model was developed to simulate the treatment process and to forecast the removal performance. A simulation-based dynamic mixed integer nonlinear programming (SDMINP) approach was then proposed to intelligently control the treatment process by integrating the developed simulation model, a genetic algorithm and multi-stage programming. The applicability and effectiveness of the developed approach were further tested through a case study. The experimental results showed that the influences of fluence rate and temperature on the removal of naphthalene were greater than those of salinity and initial concentration. The developed simulation model could well predict the UV-induced removal process under varying conditions. The case study suggested that the SDMINP approach, with the aid of the multi-stage control strategy, was able to significantly reduce treatment cost compared to traditional single-stage process optimization. The developed approach and its concept/framework have high potential applicability in other environmental fields where a treatment process is involved and experimentation and modeling are used for process simulation and control. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Simulating effects of fire on northern Rocky Mountain landscapes with the ecological process model FIRE-BGC.

    PubMed

    Keane, R E; Ryan, K C; Running, S W

    1996-03-01

    A mechanistic, biogeochemical succession model, FIRE-BGC, was used to investigate the role of fire on long-term landscape dynamics in northern Rocky Mountain coniferous forests of Glacier National Park, Montana, USA. FIRE-BGC is an individual-tree model, created by merging the gap-phase process-based model FIRESUM with the mechanistic ecosystem biogeochemical model FOREST-BGC, that has mixed spatial and temporal resolution in its simulation architecture. Ecological processes that act at a landscape level, such as fire and seed dispersal, are simulated annually from stand and topographic information. Stand-level processes, such as tree establishment, growth and mortality, organic matter accumulation and decomposition, and undergrowth plant dynamics, are simulated both daily and annually. Tree growth is mechanistically modeled based on the ecosystem process approach of FOREST-BGC, where carbon is fixed daily by forest canopy photosynthesis at the stand level. Carbon allocated to the tree stem at the end of the year generates the corresponding diameter and height growth. The model also explicitly simulates fire behavior and effects on landscape characteristics. We simulated the effects of fire on the ecosystem characteristics of net primary productivity, evapotranspiration, standing crop biomass, nitrogen cycling and leaf area index over 200 years for the 50,000-ha McDonald Drainage in Glacier National Park. Results show increases in net primary productivity and available nitrogen when fires are included in the simulation. Standing crop biomass and evapotranspiration decrease under a fire regime. Shade-intolerant species dominate the landscape when fires are excluded. Model tree increment predictions compared well with field data.

  17. Event-based hydrological modeling for detecting dominant hydrological process and suitable model strategy for semi-arid catchments

    NASA Astrophysics Data System (ADS)

    Huang, Pengnian; Li, Zhijia; Chen, Ji; Li, Qiaoling; Yao, Cheng

    2016-11-01

    Properly simulating hydrological processes in semi-arid areas is still challenging. This study assesses the impact of different modeling strategies on simulating flood processes in semi-arid catchments. Four classic hydrological models, TOPMODEL, XINANJIANG (XAJ), SAC-SMA and TANK, were selected and applied to three semi-arid catchments in North China. Based on analysis and comparison of the simulation results of these classic models, four new flexible models were constructed and used to further investigate the suitability of various modeling strategies for semi-arid environments. Numerical experiments were also designed to examine the performance of the models. The results show that in semi-arid catchments a suitable model needs to include at least one nonlinear component to simulate the main process of surface runoff generation. If there are more than two nonlinear components in the hydrological model, they should be arranged in parallel rather than in series. In addition, the results show that parallel nonlinear components should be combined by multiplication rather than addition. Moreover, this study reveals that the key hydrological process over semi-arid catchments is infiltration-excess surface runoff, a nonlinear component.

  18. Validation of mathematical model for CZ process using small-scale laboratory crystal growth furnace

    NASA Astrophysics Data System (ADS)

    Bergfelds, Kristaps; Sabanskis, Andrejs; Virbulis, Janis

    2018-05-01

    The present material is focused on the modelling of small-scale laboratory NaCl-RbCl crystal growth furnace. First steps towards fully transient simulations are taken in the form of stationary simulations that deal with the optimization of material properties to match the model to experimental conditions. For this purpose, simulation software primarily used for the modelling of industrial-scale silicon crystal growth process was successfully applied. Finally, transient simulations of the crystal growth are presented, giving a sufficient agreement to experimental results.

  19. Evaluation of Boreal Summer Monsoon Intraseasonal Variability in the GASS-YOTC Multi-Model Physical Processes Experiment

    NASA Astrophysics Data System (ADS)

    Mani, N. J.; Waliser, D. E.; Jiang, X.

    2014-12-01

    While boreal summer monsoon intraseasonal variability (BSISV) exerts a profound influence on the South Asian monsoon, the capability of present-day dynamical models in simulating and predicting the BSISV is still limited. The global model evaluation project on the vertical structure and diabatic processes of the Madden-Julian Oscillation (MJO) is a joint venture, coordinated by the Working Group on Numerical Experimentation (WGNE) MJO Task Force and the GEWEX Atmospheric System Study (GASS) program, for assessing model deficiencies in simulating the ISV and for improving our understanding of the underlying processes. In this study the simulation of the northward-propagating BSISV is investigated in 26 climate models, with a special focus on the vertical diabatic heating structure and clouds. Following parallel lines of inquiry to those the MJO Task Force has pursued for the eastward-propagating MJO, we utilize previously proposed and newly developed model performance metrics and process diagnostics and apply them to the global climate model simulations of the BSISV.

  20. Workflow for Integrating Mesoscale Heterogeneities in Materials Structure with Process Simulation of Titanium Alloys (Postprint)

    DTIC Science & Technology

    2014-10-01

    ... offer a practical solution to calculating the grain-scale heterogeneity present in the deformation field. Consequently, crystal plasticity models ... (iii) protocols for direct and efficient linking of materials models/databases into process/performance simulation codes (e.g., the crystal plasticity finite element method). Subject terms: ICME; microstructure informatics; higher ...

  1. Simulating the flow of entangled polymers.

    PubMed

    Masubuchi, Yuichi

    2014-01-01

    To optimize automation for polymer processing, attempts have been made to simulate the flow of entangled polymers. In industry, fluid dynamics simulations with phenomenological constitutive equations have been practically established. However, to account for molecular characteristics, a method to obtain the constitutive relationship from the molecular structure is required. Molecular dynamics simulations with atomic description are not practical for this purpose; accordingly, coarse-grained models with reduced degrees of freedom have been developed. Although the modeling of entanglement is still a challenge, mesoscopic models with a priori settings to reproduce entangled polymer dynamics, such as tube models, have achieved remarkable success. To use the mesoscopic models as staging posts between atomistic and fluid dynamics simulations, studies have been undertaken to establish links from the coarse-grained model to the atomistic and macroscopic simulations. Consequently, integrated simulations from materials chemistry to predict the macroscopic flow in polymer processing are forthcoming.

  2. A Simulation Model Articulation of the REA Ontology

    NASA Astrophysics Data System (ADS)

    Laurier, Wim; Poels, Geert

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

  3. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical properties calculation routines are capable of generating physical properties required for process simulation, not all required physical property data are available, and must be user-entered.

  4. FACE-IT. A Science Gateway for Food Security Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montella, Raffaele; Kelly, David; Xiong, Wei

    Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.

  5. Synchronization Of Parallel Discrete Event Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S.

    1992-01-01

    Breathing Time Buckets is an adaptive, parallel, discrete-event-simulation synchronization algorithm developed in the Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. The algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while the simulation is in progress, combining the best of optimistic and conservative synchronization strategies while avoiding their major disadvantages. It is well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.

  6. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.

  7. Simulation of the Onset of the Southeast Asian Monsoon During 1997 and 1998: The Impact of Surface Processes

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Lau, W.; Baker, R.

    2004-01-01

    The onset of the southeast Asian monsoon during 1997 and 1998 was simulated with a coupled mesoscale atmospheric model (MM5) and a detailed land surface model. The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The simulation with the land surface model captured the basic signatures of the monsoon onset processes and the associated rainfall statistics. The sensitivity tests indicated that land surface processes had a greater impact on the simulated rainfall results than a small sea surface temperature change during the onset period. In both the 1997 and 1998 cases, the simulations were significantly improved by including the land surface processes. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indo-China peninsula and the northern cold front intrusion from southern China. The surface sensible and latent heat exchange between the land and atmosphere modified the low-level temperature distribution and gradient, and therefore the low-level wind. The more realistic forcing of the sensible and latent heat from the detailed land surface model improved the monsoon rainfall and associated wind simulation. The model results will be compared to the simulation of the 6-7 May 2000 Missouri flash flood event. In addition, the impact of model initialization and land surface treatment on the timing, intensity, and location of extreme precipitation will be examined.

  8. Simulation of the Onset of the Southeast Asian Monsoon during 1997 and 1998: The Impact of Surface Processes

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Wang, Y.; Lau, W.; Baker, R. D.

    2004-01-01

    The onset of the southeast Asian monsoon during 1997 and 1998 was simulated with a coupled mesoscale atmospheric model (MM5) and a detailed land surface model. The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The simulation with the land surface model captured the basic signatures of the monsoon onset processes and the associated rainfall statistics. The sensitivity tests indicated that land surface processes had a greater impact on the simulated rainfall results than a small sea surface temperature change during the onset period. In both the 1997 and 1998 cases, the simulations were significantly improved by including the land surface processes. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indo-China peninsula and the northern cold front intrusion from southern China. The surface sensible and latent heat exchange between the land and atmosphere modified the low-level temperature distribution and gradient, and therefore the low-level wind. The more realistic forcing of the sensible and latent heat from the detailed land surface model improved the monsoon rainfall and associated wind simulation. The model results will be compared to the simulation of the 6-7 May 2000 Missouri flash flood event. In addition, the impact of model initialization and land surface treatment on the timing, intensity, and location of extreme precipitation will be examined.

  9. Design, development, and application of LANDIS-II, a spatial landscape simulation model with flexible temporal and spatial resolution

    Treesearch

    Robert M. Scheller; James B. Domingo; Brian R. Sturtevant; Jeremy S. Williams; Arnold Rudy; Eric J. Gustafson; David J. Mladenoff

    2007-01-01

    We introduce LANDIS-II, a landscape model designed to simulate forest succession and disturbances. LANDIS-II builds upon and preserves the functionality of previous LANDIS forest landscape simulation models. LANDIS-II is distinguished by the inclusion of variable time steps for different ecological processes; our use of a rigorous development and testing process used...

  10. Rapid Automated Aircraft Simulation Model Updating from Flight Data

    NASA Technical Reports Server (NTRS)

    Brian, Geoff; Morelli, Eugene A.

    2011-01-01

    Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.

  11. Identification of the dominant hydrological process and appropriate model structure of a karst catchment through stepwise simplification of a complex conceptual model

    NASA Astrophysics Data System (ADS)

    Chang, Yong; Wu, Jichun; Jiang, Guanghui; Kang, Zhiqiang

    2017-05-01

    Conceptual models often suffer from the over-parameterization problem due to the limited data available for calibration. This leads to parameter non-uniqueness and equifinality, which may introduce considerable uncertainty into the simulation results. How to find the appropriate model structure supported by the available data to simulate a catchment is still a big challenge in hydrological research. In this paper, we adopt a multi-model framework to identify the dominant hydrological process and appropriate model structure of a karst spring located in Guilin city, China. For this catchment, the spring discharge is the only data available for model calibration. The framework starts with a relatively complex conceptual model based on the perception of the catchment, and this complex model is then simplified into several different models by gradually removing model components. A multi-objective approach is used to compare the performance of these different models, and regional sensitivity analysis (RSA) is used to investigate parameter identifiability. The results show that this karst spring is mainly controlled by two different hydrological processes, one of which is threshold-driven, consistent with the fieldwork investigation. However, the appropriate model structure for simulating the discharge of this spring is much simpler than the actual aquifer structure and the hydrological process understanding from the fieldwork investigation. A simple linear reservoir with two different outlets is enough to simulate the spring discharge. The detailed runoff processes in the catchment are not needed in the conceptual model to simulate the spring discharge. A more complex model would need additional data to avoid serious deterioration of model predictions.
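    The structure identified above, a linear reservoir with two outlets, one of them threshold-driven, can be sketched directly; the code below uses assumed coefficients, threshold and synthetic recharge purely for illustration.

```python
import numpy as np

def two_outlet_reservoir(recharge, k1=0.05, k2=0.4, threshold=30.0, s0=10.0):
    """Hedged sketch: linear reservoir with a lower outlet (always active)
    and an upper outlet that only drains storage above a threshold."""
    storage, discharge = s0, []
    for r in recharge:                           # one value per day (mm)
        q1 = k1 * storage                        # slow, always-active outlet
        q2 = k2 * max(storage - threshold, 0.0)  # threshold-driven outlet
        storage += r - q1 - q2
        discharge.append(q1 + q2)
    return np.array(discharge)

# Synthetic daily recharge: mostly dry with occasional storm pulses (assumed).
rng = np.random.default_rng(1)
recharge = np.where(rng.random(365) < 0.1, rng.exponential(20.0, 365), 0.0)
q = two_outlet_reservoir(recharge)
print("peak discharge:", round(q.max(), 1), " baseflow:", round(q.min(), 2))
```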

  12. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    USDA-ARS?s Scientific Manuscript database

    When the news media talks about models they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  13. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8284, JAN 2018, US Army Research Laboratory: Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation.

  14. Optimal Estimation with Two Process Models and No Measurements

    DTIC Science & Technology

    2015-08-01

    An observer is derived for optimally blending two independent process models when no measurements are present. The observer follows a derivation similar to that of the discrete-time Kalman filter. A simulation example is provided in which a process model based on the dynamics of a ballistic projectile is blended with a second model; the benefit of blending the models is lost if either of the models includes deterministic modeling errors.
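
    The central idea, optimally weighting two independent process-model estimates in the absence of measurements, can be illustrated with inverse-covariance weighting, the same combination rule that appears in the Kalman filter update. The Python sketch below is a generic illustration under that assumption, not the report's derivation; the state and noise values are arbitrary.

        import numpy as np

        def blend(x1, P1, x2, P2):
            """Fuse two independent, unbiased estimates (x1, P1) and (x2, P2)
            by inverse-covariance weighting. Works for scalars or for state
            vectors with full covariance matrices."""
            x1, x2 = np.atleast_1d(x1).astype(float), np.atleast_1d(x2).astype(float)
            P1, P2 = np.atleast_2d(P1).astype(float), np.atleast_2d(P2).astype(float)
            W1, W2 = np.linalg.inv(P1), np.linalg.inv(P2)
            P = np.linalg.inv(W1 + W2)            # blended covariance
            x = P @ (W1 @ x1 + W2 @ x2)           # blended state estimate
            return x, P

        # Two models predict the same quantity with different uncertainties;
        # the blend is pulled toward the more certain model.
        x, P = blend(1020.0, 25.0, 995.0, 100.0)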

  15. Lattice Boltzmann simulations of immiscible displacement process with large viscosity ratios

    NASA Astrophysics Data System (ADS)

    Rao, Parthib; Schaefer, Laura

    2017-11-01

    Immiscible displacement is a key physical mechanism in enhanced oil recovery and carbon sequestration processes. This multiphase flow phenomenon involves a complex interplay of viscous, capillary, inertial and wettability effects. The lattice Boltzmann (LB) method is an accurate and efficient technique for modeling and simulating multiphase/multicomponent flows, especially in complex flow configurations and media. We present numerical simulation results for the displacement process in long, thin channels. The results are based on a new pseudo-potential multicomponent LB model with a multiple-relaxation-time (MRT) collision model and an explicit forcing scheme. We demonstrate that the proposed model is capable of accurately simulating displacement processes involving fluids with a wide range of viscosity ratios (>100), yields viscosity-independent interfacial tension, and reduces some important numerical artifacts.
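
    For reference, pseudo-potential (Shan-Chen type) multicomponent LB models typically represent the fluid-fluid interaction through a force of the generic form below; the specific forcing scheme and MRT collision operator proposed by the authors may differ, so this should be read only as the standard starting point:

        \mathbf{F}_{\sigma}(\mathbf{x}) \;=\; -\,G\,\psi_{\sigma}(\mathbf{x}) \sum_{i} w_i\, \psi_{\bar{\sigma}}(\mathbf{x}+\mathbf{c}_i\,\delta t)\, \mathbf{c}_i

    where G sets the strength of the interaction between components sigma and sigma-bar, psi is the pseudo-potential of each component, and w_i and c_i are the lattice weights and discrete velocities.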

  16. A Low Cost Microcomputer System for Process Dynamics and Control Simulations.

    ERIC Educational Resources Information Center

    Crowl, D. A.; Durisin, M. J.

    1983-01-01

    Discusses a video simulator microcomputer system used to provide real-time demonstrations to strengthen students' understanding of process dynamics and control. Also discusses hardware/software and simulations developed using the system. The four simulations model various configurations of a process liquid level tank system. (JN)

  17. A simulation study on garment manufacturing process

    NASA Astrophysics Data System (ADS)

    Liong, Choong-Yeun; Rahim, Nur Azreen Abdul

    2015-02-01

    The garment industry is an important industry that continues to evolve to meet consumers' high demands, so innovation and improvement are essential. In this work, studies were conducted at a local company to model the sewing process of clothes manufacturing using simulation modeling. Clothes manufacturing at the company involves 14 main processes: connecting the pattern, center sewing and side neatening, pocket sewing, backside sewing, attaching the front and back, sleeve preparation, attaching the sleeves and overlocking, collar preparation, collar sewing, bottom-edge sewing, buttonhole sewing, removing excess thread, marking buttons, and button cross sewing. These fourteen processes are carried out by only six tailors, with the last four processes done by a single tailor. Data were collected by on-site observation, and the probability distribution of the processing time for each process was determined using @RISK's BestFit. A simulation model was then developed in Arena software based on the collected data, and an animated simulation model was built to facilitate understanding and to verify that the model represents the actual system. With such a model, what-if analyses and different operating scenarios can be experimented with virtually. The animation and improvement models will be presented in further work.
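
    To indicate how such a sewing line can be expressed as a discrete-event simulation, the Python/SimPy sketch below pushes garments through an abridged sequence of operations served by a pool of six tailors. The station list, the triangular processing times and the release interval are illustrative assumptions, since the fitted distributions from BestFit are not given in the abstract, and the sketch is not the authors' Arena model.

        import random
        import simpy

        STATIONS = ["connect pattern", "center sewing", "pocket sewing",
                    "attach sleeves", "collar sewing", "finishing"]   # abridged list

        def garment(env, name, tailors):
            """One garment flowing through the abridged sequence of operations."""
            for station in STATIONS:
                with tailors.request() as req:                 # wait for a free tailor
                    yield req
                    yield env.timeout(random.triangular(2, 8, 4))  # minutes (assumed)
            print(f"{name} finished at t = {env.now:.1f} min")

        def arrivals(env, tailors, n_garments=20, interval=5.0):
            for i in range(n_garments):
                env.process(garment(env, f"garment-{i}", tailors))
                yield env.timeout(interval)                    # release interval (assumed)

        env = simpy.Environment()
        tailors = simpy.Resource(env, capacity=6)              # the six tailors of the study
        env.process(arrivals(env, tailors))
        env.run(until=8 * 60)                                  # one 8-hour shift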

  18. A Simplified Finite Element Simulation for Straightening Process of Thin-Walled Tube

    NASA Astrophysics Data System (ADS)

    Zhang, Ziqian; Yang, Huilin

    2017-12-01

    The finite element simulation is an effective way to study the behavior of thin-walled tubes in the two-cross-roll straightening process. To determine the accurate radius of curvature of the roll profile more efficiently, a simplified finite element model, based on the technical parameters of an actual two-cross-roll straightening machine, was developed to simulate the complex straightening process. A dynamic simulation was then carried out using the ANSYS LS-DYNA program. The results imply that the simplified finite element model is reasonable for simulating the two-cross-roll straightening process, and that the radius of curvature of the roll profile can be obtained for a tube straightness of 2 mm/m.

  19. A Modified Isotropic-Kinematic Hardening Model to Predict the Defects in Tube Hydroforming Process

    NASA Astrophysics Data System (ADS)

    Jin, Kai; Guo, Qun; Tao, Jie; Guo, Xun-zhong

    2017-11-01

    Numerical simulations of the tube hydroforming process for hollow crankshafts were conducted using the finite element analysis method. A modified model, which integrates an isotropic-kinematic hardening model with a ductile fracture criterion, was used to more accurately optimize process parameters such as internal pressure, feed distance and friction coefficient. Hydroforming experiments were subsequently performed based on the simulation results. The comparison between experimental and simulation results indicated that the prediction of tube deformation, cracking and wrinkling was quite accurate for the tube hydroforming process. Finally, hollow crankshafts with high thickness uniformity were obtained, and the thickness distributions from the simulations and the experiments were in good agreement.

  20. Integrated water flow model and modflow-farm process: A comparison of theory, approaches, and features of two integrated hydrologic models

    USGS Publications Warehouse

    Dogrul, Emin C.; Schmid, Wolfgang; Hanson, Randall T.; Kadir, Tariq; Chung, Francis

    2016-01-01

    Effective modeling of conjunctive use of surface and subsurface water resources requires simulation of land-use-based root zone and surface flow processes as well as groundwater flows, streamflows, and their interactions. Recently, two computer models developed for this purpose, the Integrated Water Flow Model (IWFM) from the California Department of Water Resources and MODFLOW with the Farm Process (MF-FMP) from the US Geological Survey, have been applied to complex basins such as the Central Valley of California. As both IWFM and MF-FMP are publicly available for download and can be applied to other basins, there is a need to objectively compare the main approaches and features used in both models. This paper compares the concepts as well as the methods and simulation features of each hydrologic model pertaining to groundwater, surface water, and landscape processes. The comparison focuses on the integrated simulation of water demand and supply, water use, and the flow between coupled hydrologic processes. The differences in the capabilities and features of these two models could affect the outcome and the types of water resource problems that can be simulated.

  1. A Process for the Creation of T-MATS Propulsion System Models from NPSS data

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    A modular thermodynamic simulation package called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) has been developed for the creation of dynamic simulations. The T-MATS software is designed as a plug-in for Simulink (Math Works, Inc.) and allows a developer to create system simulations of thermodynamic plants (such as gas turbines) and controllers in a single tool. Creation of such simulations can be accomplished by matching data from actual systems, or by matching data from steady state models and inserting appropriate dynamics, such as the rotor and actuator dynamics for an aircraft engine. This paper summarizes the process for creating T-MATS turbo-machinery simulations using data and input files obtained from a steady state model created in the Numerical Propulsion System Simulation (NPSS). The NPSS is a thermodynamic simulation environment that is commonly used for steady state gas turbine performance analysis. Completion of all the steps involved in the process results in a good match between T-MATS and NPSS at several steady state operating points. Additionally, the T-MATS model extended to run dynamically provides the possibility of simulating and evaluating closed loop responses.

  2. A Process for the Creation of T-MATS Propulsion System Models from NPSS Data

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    A modular thermodynamic simulation package called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) has been developed for the creation of dynamic simulations. The T-MATS software is designed as a plug-in for Simulink(Trademark) and allows a developer to create system simulations of thermodynamic plants (such as gas turbines) and controllers in a single tool. Creation of such simulations can be accomplished by matching data from actual systems, or by matching data from steady state models and inserting appropriate dynamics, such as the rotor and actuator dynamics for an aircraft engine. This paper summarizes the process for creating T-MATS turbo-machinery simulations using data and input files obtained from a steady state model created in the Numerical Propulsion System Simulation (NPSS). The NPSS is a thermodynamic simulation environment that is commonly used for steady state gas turbine performance analysis. Completion of all the steps involved in the process results in a good match between T-MATS and NPSS at several steady state operating points. Additionally, the T-MATS model extended to run dynamically provides the possibility of simulating and evaluating closed loop responses.

  3. A Process for the Creation of T-MATS Propulsion System Models from NPSS Data

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    A modular thermodynamic simulation package called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) has been developed for the creation of dynamic simulations. The T-MATS software is designed as a plug-in for Simulink(Registered TradeMark) and allows a developer to create system simulations of thermodynamic plants (such as gas turbines) and controllers in a single tool. Creation of such simulations can be accomplished by matching data from actual systems, or by matching data from steady state models and inserting appropriate dynamics, such as the rotor and actuator dynamics for an aircraft engine. This paper summarizes the process for creating T-MATS turbo-machinery simulations using data and input files obtained from a steady state model created in the Numerical Propulsion System Simulation (NPSS). The NPSS is a thermodynamic simulation environment that is commonly used for steady state gas turbine performance analysis. Completion of all the steps involved in the process results in a good match between T-MATS and NPSS at several steady state operating points. Additionally, the T-MATS model extended to run dynamically provides the possibility of simulating and evaluating closed loop responses.

  4. An Interactive Teaching System for Bond Graph Modeling and Simulation in Bioengineering

    ERIC Educational Resources Information Center

    Roman, Monica; Popescu, Dorin; Selisteanu, Dan

    2013-01-01

    The objective of the present work was to implement a teaching system useful in modeling and simulation of biotechnological processes. The interactive system is based on applications developed using 20-sim modeling and simulation software environment. A procedure for the simulation of bioprocesses modeled by bond graphs is proposed and simulators…

  5. Optimal segmentation and packaging process

    DOEpatents

    Kostelnik, K.M.; Meservey, R.H.; Landon, M.D.

    1999-08-10

    A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints, including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D and D), and remediating a nuclear facility that involves the segmentation and packaging of contaminated items in waste containers so as to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created and the contaminated items are differentiated. The optimal locations, orientations, and sequence of the segmentation and packaging of the contaminated items are determined using the simulated model, the algorithms, and various constraints, including container limitations. The cut locations and orientations are transposed to the simulated model, and the contaminated items are then actually segmented and packaged. The segmentation and packaging may be simulated beforehand, and the contaminated items may be cataloged and recorded. 3 figs.

  6. Modeling of settlers in wastewater treatment plants: application to the primary sludge fermentation-elutriation process [Modelacio de sedimentadors en plantes de tractament d'aigues residuals. Aplicacio al proces de fermentacio - elutracio de fangs primaris]

    NASA Astrophysics Data System (ADS)

    Ribes Bertomeu, Josep

    Wastewater treatment requires many conversion processes to be carried out simultaneously and/or consecutively, which makes these systems a difficult object of study. Moreover, the complexity of treatment processes keeps increasing, not only because of more stringent effluent standards but also because of new trends towards sustainable development, which for wastewater treatment are mainly focused on energy saving and nutrient recovery in order to improve the life cycle of the plants. For this reason it becomes necessary to use simulation tools able to represent all these processes by means of a suitable mathematical model; such tools help in determining and predicting the behaviour of different treatment schemes and have become essential for the design, control and optimization of wastewater treatment plants (WWTPs). Settling processes play a significant role in meeting effluent standards and in the correct operation of the plant. However, many models currently employed for WWTP design and simulation either do not take settling processes into account or handle them in a very simple way, neglecting the biochemical processes that can occur during sedimentation. The CALAGUA research group has focused its efforts on a new philosophy of simulating treatment plants, based on the use of a single model to represent all the physical, chemical and biological processes taking place in WWTPs. Within this research line, the group has developed a general quality model that considers biological conversion processes carried out by different microorganism groups, acid-base chemical interactions affecting the pH of the system, and gas-liquid transfer processes. A generalized use of such a quality model, however, requires its combination with a flux model, principally for those units in which complete mixing cannot be assumed, such as settlers and thickeners. The main objective of this work has been the development and validation of a general settling model that allows the main settling operations taking place in a WWTP to be simulated, considering both primary and secondary settlers as well as thickeners. It is a one-dimensional model based on the flux theory of Kynch and the double-exponential settling velocity function of Takacs, and it takes flocculation, hindered settling and compression into account. The model is applied to the simulation of settlers and thickeners by splitting the unit into several horizontal layers, each treated as a completely mixed reactor, interconnected by the mass fluxes obtained from the settling model. In order to simulate the conversion processes taking place during sedimentation, the general quality model BNRM1 has been added, and an iterative procedure has been proposed for solving the equations of each layer into which the settler is divided. The settling flux model, together with the quality model, has been validated by applying them to the simulation of a primary sludge fermentation-elutriation process studied in a pilot plant located at the Carraixet WWTP in Alboraia (Valencia). To simulate the observed decrease in solids separation efficiency in this fermentation-elutriation process, the quality model has been modified with the addition of a new process called "disintegration of complex particulate material". This process influences the settleability of the sludge because the disintegrated solids are considered to become non-settleable solids. The modification implies the addition of two new kinetic parameters (the specific disintegration velocities for volatile and for non-volatile particulate material), while the settling parameter representing the non-settleable fraction of total suspended solids is removed from the model and transformed into an experimental variable that is easy to measure. The result is a more general model, applicable to the fermentation-elutriation process under any operating condition. Finally, the behaviour and capabilities of the developed model have been tested by simulating a complete WWTP in the DESASS simulation software developed by the research group. This example includes the most important processes that can be used in a WWTP: biological nutrient removal, primary sludge fermentation and sludge digestion. The model allows settling processes and biochemical processes to be considered as a whole (denitrification in secondary settlers, primary sludge fermentation and VFA elutriation, phosphorus release in thickeners due to PAO decay, etc.). The developed model represents an important advance in the study of new wastewater treatment processes because it allows global process optimization problems to be addressed through full-plant simulation, and it is very useful for studying the effect of a change in the operating conditions of one element on the operation of the other elements of the WWTP. (Abstract shortened by UMI.)
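
    The double-exponential settling velocity function of Takacs mentioned above is commonly written as

        v_s(X) \;=\; \max\!\Big[\,0,\ \min\!\Big(v_0',\ v_0\big(e^{-r_h\,(X-X_{ns})}-e^{-r_p\,(X-X_{ns})}\big)\Big)\Big]

    where X is the local solids concentration, X_{ns} the non-settleable concentration, v_0 and v_0' the maximum theoretical and practical settling velocities, and r_h and r_p the hindered-settling and low-concentration parameters. This is the textbook form of the function; the thesis may use slightly different notation or bounds.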

  7. RFI and SCRIMP Model Development and Verification

    NASA Technical Reports Server (NTRS)

    Loos, Alfred C.; Sayre, Jay

    2000-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies for manufacturing primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work is still needed to reduce the costly trial-and-error methods of VARTM processing that are currently in practice. A computer simulation model of the VARTM process would provide a cost-effective tool for manufacturing composites with this technique. The objective of this research was therefore to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify the model through the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted the experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure was capable of reducing the simulated infiltration times by as much as 6%, whereas gravity was found to be negligible in all cases. Finally, the VARTM model was used as a process analysis tool, enabling the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from varying the process constraints in the modeling of several different composite panels and was chosen by considering such factors as infiltration time, the number of vacuum ports, and possible areas of void entrapment.

  8. Modeling and FE Simulation of Quenchable High Strength Steels Sheet Metal Hot Forming Process

    NASA Astrophysics Data System (ADS)

    Liu, Hongsheng; Bao, Jun; Xing, Zhongwen; Zhang, Dejin; Song, Baoyu; Lei, Chengxi

    2011-08-01

    The high strength steel (HSS) sheet metal hot forming process is investigated by means of numerical simulations. For a reliable numerical process design, knowledge of the thermal and thermo-mechanical properties is essential. In this article, tensile tests are performed to examine the flow stress of the material HSS 22MnB5 at different strains, strain rates, and temperatures. A constitutive model based on a phenomenological approach is developed to describe the thermo-mechanical properties of 22MnB5 by fitting the experimental data. A 2D coupled thermo-mechanical finite element (FE) model is developed to simulate the HSS sheet metal hot forming process for a U-channel part. ABAQUS/Explicit is used to conduct the hot forming stage simulations, and ABAQUS/Implicit is used to accurately predict the springback that occurs at the end of the hot forming stage. Material modeling and FE numerical simulations are carried out to investigate the effect of the processing parameters on the hot forming process. The processing parameters have a significant influence on the microstructure of the U-channel part, and the springback after the hot forming stage is the main factor impairing the shape precision of the hot-formed part. The mechanism of springback is proposed and verified through numerical simulations and tensile loading-unloading tests. Creep strain is found in the tensile loading-unloading test under isothermal conditions and has a distinct effect on springback. According to the numerical and experimental results, it can be concluded that springback is mainly caused by the different cooling rates and the nonhomogeneous shrinkage of the material during the hot forming process, and that creep strain is the main factor influencing the amount of springback.

  9. Numerical Uncertainties in the Simulation of Reversible Isentropic Processes and Entropy Conservation.

    NASA Astrophysics Data System (ADS)

    Johnson, Donald R.; Lenzen, Allen J.; Zapotocny, Tom H.; Schaack, Todd K.

    2000-11-01

    A challenge common to weather, climate, and seasonal numerical prediction is the need to simulate accurately reversible isentropic processes in combination with an appropriate determination of sources and sinks of energy and entropy. Ultimately, this task includes the distribution and transport of internal, gravitational, and kinetic energies, the energies of water substances in all forms, and the related thermodynamic processes of phase changes involved with clouds, including condensation, evaporation, and precipitation. All of these processes involve the entropies of matter, radiation, and chemical substances, their conservation during transport, and/or changes in entropy by physical processes internal to the atmosphere. With respect to the entropy of matter, one means to study a model's accuracy in simulating internal hydrologic processes is to determine its capability to conserve potential and equivalent potential temperature, as surrogates of dry and moist entropy, under reversible adiabatic processes in which clouds form, evaporate, and precipitate. In this study, a statistical strategy utilizing the concept of 'pure error' is set forth to assess the numerical accuracy with which models simulate reversible processes during 10-day integrations of the global circulation, a period corresponding to the global residence time of water vapor. During the integrations, the sums of squared differences between the equivalent potential temperature θe simulated by the governing equations of mass, energy, water vapor, and cloud water and a proxy equivalent potential temperature θte simulated as a conservative property are monitored. Inspection of the differences between θe and θte in time and space, and of the relative frequency distribution of the differences, reveals the bias and random errors that develop from nonlinear numerical inaccuracies in the advection and transport of potential temperature and water substances within the global atmosphere. A series of nine global simulations employing various versions of the Community Climate Models CCM2 and CCM3 (all-Eulerian spectral numerics, all semi-Lagrangian numerics, and mixed Eulerian spectral and semi-Lagrangian numerics) and the University of Wisconsin-Madison (UW) isentropic-sigma gridpoint model provides an interesting comparison of numerical accuracy in the simulation of reversibility. By day 10, large bias and random differences were identified in the simulation of reversible processes in all of the models except the UW isentropic-sigma model. The CCM2 and CCM3 simulations yielded systematic differences that varied zonally, vertically, and temporally. Within the comparison, the UW isentropic-sigma model was superior in transporting water vapor and cloud water/ice and in simulating reversibility involving the conservation of dry and moist entropy. The only relative frequency distribution of differences that appeared optimal, in that it remained unbiased and equilibrated with minimal variance while remaining statistically stationary, was the distribution from the UW isentropic-sigma model. All other distributions revealed nonstationary characteristics, with spreading and/or shifting of the maxima as the biases and variances of the numerical differences between θe and θte amplified.
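
    For reference, the surrogates of dry and moist entropy referred to above are, in their standard (approximate) forms,

        \theta \;=\; T\left(\frac{p_0}{p}\right)^{R_d/c_p}, \qquad \theta_e \;\approx\; \theta\,\exp\!\left(\frac{L_v\,q_v}{c_p\,T}\right),

    which are materially conserved under reversible dry and moist adiabatic processes, respectively; their numerical non-conservation during transport is what the pure-error statistics diagnose. The exact definition of θe used in the paper may be more complete than this common approximation.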

  10. Watershed Simulation of Nutrient Processes

    EPA Science Inventory

    In this presentation, nitrogen processes simulated in watershed models were reviewed and compared. Furthermore, current research on nitrogen losses from agricultural fields was also reviewed. Finally, applications of those models were reviewed and selected successful and u...

  11. Second Generation Crop Yield Models Review

    NASA Technical Reports Server (NTRS)

    Hodges, T. (Principal Investigator)

    1982-01-01

    Second generation yield models, including crop growth simulation models and plant process models, may be suitable for large area crop yield forecasting in the yield model development project. Subjective and objective criteria for model selection are defined and models which might be selected are reviewed. Models may be selected to provide submodels as input to other models; for further development and testing; or for immediate testing as forecasting tools. A plant process model may range in complexity from several dozen submodels simulating (1) energy, carbohydrates, and minerals; (2) change in biomass of various organs; and (3) initiation and development of plant organs, to a few submodels simulating key physiological processes. The most complex models cannot be used directly in large area forecasting but may provide submodels which can be simplified for inclusion into simpler plant process models. Both published and unpublished models which may be used for development or testing are reviewed. Several other models, currently under development, may become available at a later date.

  12. A simplified computational memory model from information processing.

    PubMed

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model is consistent with memory phenomena from the information processing view.

  13. CFD simulation of reverse water-hammer induced by collapse of draft-tube cavity in a model pump-turbine during runaway process

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoxi; Cheng, Yongguang; Xia, Linsheng; Yang, Jiandong

    2016-11-01

    This paper reports preliminary progress in the CFD simulation of the reverse water-hammer induced by the collapse of a draft-tube cavity in a model pump-turbine during the runaway process. First, the Fluent-customized 1D-3D coupling model for hydraulic transients and the Schnerr & Sauer cavitation model for cavity development are introduced. The methods are then validated by simulating the benchmark reverse water-hammer in a long pipe caused by instant valve closure; the simulated head history at the valve agrees well with measured data in the literature. After that, the more complicated reverse water-hammer in the draft tube of a runaway model pump-turbine, installed in a model pumped-storage power plant, is simulated. The dynamic evolution of a vapor cavity, from generation and expansion to shrinkage and collapse, is shown. After the cavity collapses, a sudden increase in pressure is clearly observed. The process is characterized by a locally expanding and collapsing vapor cavity around the runner cone, which differs from the conventional picture of violent water-column separation. This work demonstrates the feasibility of simulating the reverse water-hammer phenomenon in turbines by 3D CFD.
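
    Although the paper resolves the collapse with 3D CFD, the order of magnitude of the pressure rise when the separated water columns rejoin can be anticipated with the classical Joukowsky relation,

        \Delta p \;=\; \rho\, a\, \Delta v,

    where rho is the water density, a the pressure-wave speed in the draft tube, and Delta v the velocity change at column rejoining. This estimate is offered only as background and is not part of the authors' model.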

  14. Internal Catchment Process Simulation in a Snow-Dominated Basin: Performance Evaluation with Spatiotemporally Variable Runoff Generation and Groundwater Dynamics

    NASA Astrophysics Data System (ADS)

    Kuras, P. K.; Weiler, M.; Alila, Y.; Spittlehouse, D.; Winkler, R.

    2006-12-01

    Hydrologic models have been increasingly used in forest hydrology to overcome the limitations of paired watershed experiments, where vegetative recovery and natural variability obscure the inferences and conclusions that can be drawn from such studies. Models, however, are also plagued by uncertainty stemming from a limited understanding of hydrological processes in forested catchments, and parameter equifinality is a common concern. This has created the necessity to improve our understanding of how hydrological systems work, through the development of hydrological measures, analyses and models that address the question: are we getting the right answers for the right reasons? Hence, physically based, spatially distributed hydrologic models should be validated with high-quality experimental data describing multiple concurrent internal catchment processes under a range of hydrologic regimes. The distributed hydrology soil vegetation model (DHSVM), frequently used in forest management applications, is an example of a process-based model used to address the aforementioned circumstances, and this study takes a novel approach by collectively examining the ability of a pre-calibrated model application to realistically simulate outlet flows along with the spatiotemporal variation of internal catchment processes, including continuous groundwater dynamics at 9 locations, stream and road network flow at 67 locations for six individual days throughout the freshet, and pre-melt season snow distribution. Model efficiency was improved over prior evaluations due to continuing efforts to improve the quality of meteorological data in the watershed. Road and stream network flows were very well simulated for a range of hydrological conditions, and the spatial distribution of the pre-melt season snowpack was in general agreement with observed values. The model was effective in simulating the spatial variability of subsurface flow generation, except at locations where strong stream-groundwater interactions existed, as the model is not capable of simulating such processes and subsurface flows always drain to the stream network. Overall, the model has proven quite capable of realistically simulating internal catchment processes in the watershed, which creates more confidence in future model applications exploring the effects of various forest management scenarios on the watershed's hydrological processes.

  15. Process-Oriented Diagnostics of Tropical Cyclones in Global Climate Models

    NASA Astrophysics Data System (ADS)

    Moon, Y.; Kim, D.; Camargo, S. J.; Wing, A. A.; Sobel, A. H.; Bosilovich, M. G.; Murakami, H.; Reed, K. A.; Vecchi, G. A.; Wehner, M. F.; Zarzycki, C. M.; Zhao, M.

    2017-12-01

    Simulating tropical cyclone (TC) activity with global climate models (GCMs) remains a challenging problem. While some GCMs are able to simulate TC activity in good agreement with observations, many other models exhibit strong biases. Decreasing the horizontal grid spacing of GCM simulations tends to improve the characteristics of simulated TCs, but this enhancement alone does not necessarily lead to greater skill in simulating TC activity. This study uses process-based diagnostics to identify model characteristics that could explain why some GCM simulations are able to produce more realistic TC activity than others. The diagnostics examine how convection, moisture, clouds and related processes are coupled at individual grid points, which yields useful insight into how convective parameterizations interact with the resolved model dynamics. These diagnostics share similarities with those originally developed to examine the Madden-Julian Oscillation in climate models. This study examines TCs in eight different GCM simulations performed at NOAA/GFDL, NCAR and NASA that have different horizontal resolutions and ocean coupling. Preliminary results suggest that stronger TCs are closely associated with greater rainfall, and thus greater diabatic heating, in the inner-core regions of the storms, which is consistent with previous theoretical studies. Other storm characteristics that can be used to infer why GCM simulations with comparable horizontal grid spacings produce different TC activity will also be examined.

  16. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    PubMed

    O'Hara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. The nurses' role as knowledge experts can be expanded further, to leading efforts that integrate clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows the optimal merging of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach and a standard for integrating the sciences with real client data to offer solutions for improving patient care.

  17. Spatiotemporal stochastic models for earth science and engineering applications

    NASA Astrophysics Data System (ADS)

    Luo, Xiaochun

    1998-12-01

    Spatiotemporal processes occur in many areas of the earth sciences and engineering. However, most of the available theoretical tools and techniques for space-time data processing have been designed to operate exclusively in time or in space, and the importance of spatiotemporal variability was not fully appreciated until recently. To address this problem, a systematic framework of spatiotemporal random field (S/TRF) models for geoscience and engineering applications is presented and developed in this thesis. The characterization of space-time continuity is one of the most important aspects of S/TRF modelling: space-time continuity is displayed with experimental spatiotemporal variograms, summarized in terms of space-time continuity hypotheses, and modelled using spatiotemporal variogram functions. Permissible spatiotemporal covariance/variogram models are addressed through permissibility criteria appropriate to spatiotemporal processes. The estimation of spatiotemporal processes is developed in terms of spatiotemporal kriging techniques, with particular emphasis on the singularity analysis of spatiotemporal kriging systems. The impacts of covariance functions, trend forms, and data configurations on the singularity of spatiotemporal kriging systems are discussed, and the tensorial invariance of universal spatiotemporal kriging systems is investigated in terms of the space-time trend. The conditional simulation of spatiotemporal processes is addressed through the development of sequential group Gaussian simulation (SGGS), a family of sequential simulation algorithms associated with different group sizes. The simulation error is analyzed for different covariance models and simulation grids. A simulated annealing technique honoring experimental variograms is also proposed, providing a way of performing conditional simulation without the covariance model fitting that is a prerequisite for most simulation algorithms. The proposed techniques were first applied to modelling of the pressure system in a carbonate reservoir, and then to modelling of spring water contents in the Dyle watershed. The results of these case studies, as well as the theory, suggest that these techniques are realistic and feasible.
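
    As background for the singularity analysis mentioned above, the ordinary kriging weights lambda_i at a space-time estimation point u_0 = (s_0, t_0) are obtained from a linear system built from the spatiotemporal covariance C, with a Lagrange multiplier mu enforcing unbiasedness:

        \begin{pmatrix}
        C(\mathbf{u}_1,\mathbf{u}_1) & \cdots & C(\mathbf{u}_1,\mathbf{u}_n) & 1\\
        \vdots & & \vdots & \vdots\\
        C(\mathbf{u}_n,\mathbf{u}_1) & \cdots & C(\mathbf{u}_n,\mathbf{u}_n) & 1\\
        1 & \cdots & 1 & 0
        \end{pmatrix}
        \begin{pmatrix}\lambda_1\\ \vdots\\ \lambda_n\\ \mu\end{pmatrix}
        =
        \begin{pmatrix}C(\mathbf{u}_1,\mathbf{u}_0)\\ \vdots\\ C(\mathbf{u}_n,\mathbf{u}_0)\\ 1\end{pmatrix}

    Whether this matrix is invertible depends on the permissibility of the covariance model and on the data configuration; the universal kriging systems analyzed in the thesis add further rows and columns for the space-time trend functions. This generic form is given only for orientation.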

  18. Integration of Tuyere, Raceway and Shaft Models for Predicting Blast Furnace Process

    NASA Astrophysics Data System (ADS)

    Fu, Dong; Tang, Guangwu; Zhao, Yongfu; D'Alessio, John; Zhou, Chenn Q.

    2018-06-01

    A novel modeling strategy is presented for simulating the blast furnace ironmaking process. The physical and chemical phenomena involved take place across a wide range of length and time scales, and three models are developed to simulate different regions of the blast furnace: the tuyere model, the raceway model and the shaft model. This paper focuses on the integration of the three models to predict the entire blast furnace process. Mapping of outputs and inputs between models and an iterative scheme are developed to establish communication between the models. The effects of tuyere operation and burden distribution on blast furnace fuel efficiency are investigated numerically. The integration of different models provides a way to realistically simulate the blast furnace by improving the modeling resolution of local phenomena and minimizing model assumptions.

  19. Development of an alkaline/surfactant/polymer compositional reservoir simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhuyan, D.

    1989-01-01

    The mathematical formulation of a generalized three-dimensional compositional reservoir simulator for high-pH chemical flooding processes is presented in this work. The model assumes local thermodynamic equilibrium with respect to both reaction chemistry and phase behavior and calculates equilibrium electrolyte and phase compositions as a function of time and position. The reaction chemistry considers aqueous electrolytic chemistry, precipitation/dissolution of minerals, ion exchange reactions on the matrix surface, reaction of the acidic components of crude oil with bases in the aqueous solution, and cation exchange reactions with the micelles. The simulator combines this detailed reaction chemistry with the extensive physical and flow property modeling schemes of an existing chemical flood simulator (UTCHEM) to model multiphase, multidimensional displacement processes. The formulation of the chemical equilibrium model is quite general and is adaptable to a variety of chemical descriptions. In addition to its use in the simulation of high-pH chemical flooding processes, the model will find application in the simulation of other reactive flow problems, such as groundwater contamination, reinjection of produced water, and chemical waste disposal, in one, two or three dimensions and under multiphase flow conditions. In this work, the model is used to simulate several hypothetical cases of high-pH chemical floods, ranging from a simple alkaline preflush of a micellar/polymer flood to surfactant-enhanced alkaline-polymer flooding, and the results are analyzed. Finally, a few published alkaline, alkaline-polymer and surfactant-alkaline-polymer corefloods are simulated and compared with the experimental results.

  20. Simulation and optimization of an experimental membrane wastewater treatment plant using computational intelligence methods.

    PubMed

    Ludwig, T; Kern, P; Bongards, M; Wolf, C

    2011-01-01

    The optimization of relaxation and filtration times of submerged microfiltration flat modules in membrane bioreactors used for municipal wastewater treatment is essential for efficient plant operation. However, the optimization and control of such plants and their filtration processes is a challenging problem due to the underlying highly nonlinear and complex processes. This paper presents the use of genetic algorithms for this optimization problem in conjunction with a fully calibrated simulation model, as computational intelligence methods are perfectly suited to the nonconvex multi-objective nature of the optimization problems posed by these complex systems. The simulation model is developed and calibrated using membrane modules from the wastewater simulation software GPS-X based on the Activated Sludge Model No.1 (ASM1). Simulation results have been validated at a technical reference plant. They clearly show that filtration process costs for cleaning and energy can be reduced significantly by intelligent process optimization.
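
    The coupling of a genetic algorithm to a calibrated simulator can be illustrated with a few lines of Python. In the sketch below the GPS-X/ASM1 model is replaced by a placeholder cost function of the filtration time t_f and relaxation time t_r, and the GA operators are deliberately minimal, so this is only a structural illustration of the approach, not the authors' setup.

        import random

        def plant_cost(t_f, t_r):
            """Placeholder for the calibrated plant simulation: returns a
            combined cleaning + energy cost for a filtration time t_f and a
            relaxation time t_r (minutes). A real study would run the
            simulation model here instead of this toy expression."""
            fouling = t_f ** 2 / (1.0 + t_r)     # crude fouling proxy
            downtime = t_r / t_f                 # lost permeate production proxy
            return fouling + 50.0 * downtime

        def evolve(pop_size=30, generations=40):
            """Minimal genetic algorithm: truncation selection, midpoint
            crossover and Gaussian mutation on the (t_f, t_r) pair."""
            pop = [(random.uniform(5, 15), random.uniform(0.5, 3.0))
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=lambda p: plant_cost(*p))
                parents = pop[: pop_size // 2]
                children = []
                while len(parents) + len(children) < pop_size:
                    a, b = random.sample(parents, 2)
                    t_f = max(1.0, (a[0] + b[0]) / 2 + random.gauss(0, 0.3))
                    t_r = max(0.1, (a[1] + b[1]) / 2 + random.gauss(0, 0.1))
                    children.append((t_f, t_r))
                pop = parents + children
            return min(pop, key=lambda p: plant_cost(*p))

        best_t_f, best_t_r = evolve()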

  1. Modeling the Gas Nitriding Process of Low Alloy Steels

    NASA Astrophysics Data System (ADS)

    Yang, M.; Zimmerman, C.; Donahue, D.; Sisson, R. D.

    2013-07-01

    Efforts to simulate the nitriding process have been ongoing for the last 20 years, with most of the work devoted to simulating the nitriding of pure iron. In the present work, a series of experiments has been performed to understand the effects of nitriding process parameters such as nitriding potential, temperature, and time, as well as surface condition, on the gas nitriding of steels. A compound layer growth model has been developed to simulate the nitriding of AISI 4140 steel. In this paper the fundamentals of the model are presented and discussed, including the kinetics of compound layer growth and the determination of the nitrogen diffusivity in the diffusion zone. Excellent agreement between the experimental data and the simulation results has been achieved for both as-washed and pre-oxidized nitrided AISI 4140. The nitrogen diffusivity in the diffusion zone is found to be constant, depending only on the nitriding temperature; it is approximately 5 × 10⁻⁹ cm²/s at 548 °C. This demonstrates the concept of applying the compound layer growth model to other steels, so that the nitriding of various steels can be modeled and predicted in the future.
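
    The diffusion-zone part of such a model can be sketched as an explicit finite-difference solution of Fick's second law using the constant diffusivity reported above (about 5 × 10⁻⁹ cm²/s at 548 °C). The Python fragment below is only an illustrative assumption of how that sub-problem looks: the surface nitrogen content, grid and boundary conditions are placeholders, and the compound-layer growth kinetics of the actual model are not included.

        import numpy as np

        D = 5.0e-9            # nitrogen diffusivity in the diffusion zone, cm^2/s (548 C)
        L = 0.05              # modeled depth, cm
        nx = 200
        dx = L / nx
        dt = 0.4 * dx**2 / D  # stable explicit time step (FTCS condition)
        c_surface = 0.5       # assumed surface nitrogen content, wt.% (illustrative)

        c = np.zeros(nx)      # initial nitrogen profile in the steel
        c[0] = c_surface

        t_total = 10 * 3600.0             # 10 h of nitriding
        for _ in range(int(t_total / dt)):
            # explicit update of Fick's second law, dc/dt = D d2c/dx2
            c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
            c[0] = c_surface              # fixed surface concentration
            c[-1] = c[-2]                 # zero-flux condition at depth L

        case_depth = np.argmax(c < 0.05) * dx   # depth where N drops below 0.05 wt.%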

  2. Fast simulation of reconstructed phylogenies under global time-dependent birth-death processes.

    PubMed

    Höhna, Sebastian

    2013-06-01

    Diversification rates and patterns may be inferred from reconstructed phylogenies. Both the time-dependent and the diversity-dependent birth-death process can produce the same observed patterns of diversity over time. To develop and test new models describing the macro-evolutionary process of diversification, generic and fast algorithms to simulate under these models are necessary. Simulations are not only important for testing and developing models but also play an influential role in the assessment of model fit. In the present article, I consider a global time-dependent birth-death process in which each species has the same rates, but the rates may vary over time. For this model, I derive the likelihood of the speciation times from a reconstructed phylogenetic tree and show that each speciation event is independent and identically distributed. This fact can be used to efficiently simulate reconstructed phylogenetic trees when conditioning on the number of species, the time of the process, or both. I show the usability of the simulation by approximating the posterior predictive distribution of a birth-death process with decreasing diversification rates applied to a published bird phylogeny (family Cettiidae). The methods described in this manuscript are implemented in the R package TESS, available from the repository CRAN (http://cran.r-project.org/web/packages/TESS/). Supplementary data are available at Bioinformatics online.
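
    By way of contrast with the efficient i.i.d. sampling of speciation times described in the paper, the brute-force alternative is a forward (Gillespie-style) simulation with rejection on the final species count. The Python sketch below uses constant, arbitrary rates and is not the TESS algorithm; it only illustrates what conditioning on the number of species means and why specialized simulation algorithms are needed.

        import random

        def forward_bd(birth=1.0, death=0.5, t_max=5.0):
            """Gillespie simulation of a constant-rate birth-death process
            started from one species; returns the number of species alive
            at t_max (0 if the clade went extinct)."""
            n, t = 1, 0.0
            while n > 0:
                t += random.expovariate(n * (birth + death))
                if t >= t_max:
                    return n
                if random.random() < birth / (birth + death):
                    n += 1                    # speciation event
                else:
                    n -= 1                    # extinction event
            return 0

        def trials_until_conditioned(n_target=20, **kwargs):
            """Rejection-sample histories until one ends with exactly
            n_target extant species; returns the number of attempts,
            showing the cost of naive conditioning."""
            trials = 0
            while True:
                trials += 1
                if forward_bd(**kwargs) == n_target:
                    return trials

        print(trials_until_conditioned(20))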

  3. Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems

    NASA Technical Reports Server (NTRS)

    Allada, Rama Kumar; Lange, Kevin; Anderson, Molly

    2011-01-01

    Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which becomes extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic simulations of chemical processes for primary processor technologies including the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, and the Wiped-Film Rotating Disk (WFRD), as well as post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA), developed using the Aspen Custom Modeler and Aspen Plus process simulation tools. The results expand upon previous work on water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.

  4. Improved simulation of poorly drained forests using Biome-BGC.

    PubMed

    Bond-Lamberty, Ben; Gower, Stith T; Ahl, Douglas E

    2007-05-01

    Forested wetlands and peatlands are important in boreal and terrestrial biogeochemical cycling, but most general-purpose forest process models are designed and parameterized for upland systems. We describe changes made to Biome-BGC, an ecophysiological process model, that improve its ability to simulate poorly drained forests. The model changes allowed for: (1) lateral water inflow from a surrounding watershed, and variable surface and subsurface drainage; (2) adverse effects of anoxic soil on decomposition and nutrient mineralization; (3) closure of leaf stomata in flooded soils; and (4) growth of nonvascular plants (i.e., bryophytes). Bryophytes were treated as ectohydric broadleaf evergreen plants with zero stomatal conductance, whose cuticular conductance to CO2 was dependent on plant water content. Individual model changes were parameterized with published data, and ecosystem-level model performance was assessed by comparing simulated output with field data from the northern BOREAS site in Manitoba, Canada. The poorly drained forest simulation exhibited reduced decomposition and vascular plant growth (-90%) compared with the well-drained forest simulation; the integrated bryophyte photosynthetic response accorded well with published data. Simulated net primary production, biomass and soil carbon accumulation broadly agreed with field measurements, although simulated net primary production was higher than observed in well-drained stands. Simulated net primary production in the poorly drained forest was most sensitive to the oxygen restriction on soil processes, and secondarily to stomatal closure under flooded conditions. The modified Biome-BGC remains unable to simulate true wetlands subject to prolonged flooding, because it does not track organic soil formation, water table changes, soil redox potential or anaerobic processes.

  5. Numerical Simulation of Sintering Process in Ceramic Powder Injection Moulded Components

    NASA Astrophysics Data System (ADS)

    Song, J.; Barriere, T.; Liu, B.; Gelin, J. C.

    2007-05-01

    A phenomenological model based on a viscoplastic constitutive law is presented to describe the sintering of ceramic components obtained by powder injection moulding. The parameters entering the model are identified through sintering experiments in a dilatometer using the proposed optimization method. Finite element simulations are carried out to predict the density variations and dimensional changes of the components during sintering. A simulation example on the sintering of an alumina hip implant has been conducted, and the simulation results have been compared with the experimental ones; good agreement is obtained.

  6. A Parallel Sliding Region Algorithm to Make Agent-Based Modeling Possible for a Large-Scale Simulation: Modeling Hepatitis C Epidemics in Canada.

    PubMed

    Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla

    2016-11-01

    Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate the emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, their main limitation is the high computational cost of large-scale simulations. To improve the computational efficiency of large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABMs and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on real demographic data from Saskatchewan, Canada. The first simulation used the SRA, which processed each postal code subregion in turn; the second simulation processed the entire population simultaneously. It was concluded that the parallelizable SRA produced comparable results with a saving in computational time in a province-wide simulation. Using the same method, the SRA can be generalized to a country-wide simulation; this parallel algorithm thus makes it possible to use ABMs for large-scale simulation with limited computational resources.
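
    The essence of processing agents region by region, and doing so in parallel, can be shown with Python's standard library. The sketch below is a heavily simplified placeholder for the hepatitis C contact and disease dynamics of the paper: each region updates its own agents independently within a time step, and the toy within-region transmission rule, region names and parameter values are assumptions for illustration only.

        from concurrent.futures import ProcessPoolExecutor
        import random

        def step_region(args):
            """Advance all agents of one region by one time step.
            Each agent is a dict with an 'infected' flag; the transmission
            rule is a toy stand-in for the real epidemic model."""
            region_id, agents, beta = args
            n_inf = sum(a["infected"] for a in agents)
            p = 1.0 - (1.0 - beta) ** n_inf          # toy per-step infection probability
            for a in agents:
                if not a["infected"] and random.random() < p:
                    a["infected"] = True
            return region_id, agents

        def simulate(regions, steps=10, beta=0.01):
            for _ in range(steps):
                work = [(rid, agents, beta) for rid, agents in regions.items()]
                with ProcessPoolExecutor() as pool:       # one worker per region batch
                    for rid, agents in pool.map(step_region, work):
                        regions[rid] = agents             # collect updated sub-populations
            return regions

        if __name__ == "__main__":
            regions = {f"region-{i}": [{"infected": random.random() < 0.02}
                                       for _ in range(1000)] for i in range(8)}
            out = simulate(regions)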

  7. A constitutive model and numerical simulation of sintering processes at macroscopic level

    NASA Astrophysics Data System (ADS)

    Wawrzyk, Krzysztof; Kowalczyk, Piotr; Nosewicz, Szymon; Rojek, Jerzy

    2018-01-01

    This paper presents the modelling of both single- and double-phase powder sintering processes at the macroscopic level, in particular the constitutive formulation, its numerical implementation and numerical tests. The macroscopic constitutive model is based on the assumption that the sintered material is a continuous medium. The parameters of the constitutive model for the material under sintering are determined by simulating sintering at the microscopic level using a micro-scale model. Numerical tests were carried out for a cylindrical specimen under hydrostatic and uniaxial pressure, and the results of the macroscopic analysis are compared against the microscopic model results. Moreover, the numerical simulations are validated by comparison with experimental results. The simulations and the preparation of the model are carried out in Abaqus FEA, a software package for finite element analysis and computer-aided engineering; the mechanical model is defined through the user subroutine VUMAT, developed by the first author in the Fortran programming language. The modelling presented in the paper can be used to optimize and to better understand the process.

  8. Animated-simulation modeling facilitates clinical-process costing.

    PubMed

    Zelman, W N; Glick, N D; Blackmore, C C

    2001-09-01

    Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
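
    The point about reporting a cost range rather than a single estimate is essentially Monte Carlo propagation of input uncertainty. The Python illustration below is not the article's model: the triangular time distributions, labor rate and supply costs are made-up values used only to show how a cost range emerges from uncertain inputs.

        import random
        import statistics

        def sample_episode_cost():
            """One simulated clinical episode: triage + procedure + recovery.
            All distributions and rates are illustrative assumptions."""
            triage = random.triangular(5, 20, 10)          # minutes
            procedure = random.triangular(30, 120, 60)     # minutes
            recovery = random.triangular(20, 90, 45)       # minutes
            rate_per_min = random.uniform(1.5, 2.5)        # blended staff cost, $/min
            supplies = random.triangular(40, 200, 80)      # $ per episode
            return (triage + procedure + recovery) * rate_per_min + supplies

        costs = sorted(sample_episode_cost() for _ in range(10_000))
        low, high = costs[int(0.05 * len(costs))], costs[int(0.95 * len(costs))]
        print(f"median ${statistics.median(costs):.0f}, 90% range ${low:.0f}-${high:.0f}")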

  9. Simulation Assessment Validation Environment (SAVE). Software User’s Manual

    DTIC Science & Technology

    2000-09-01

    requirements and decisions are made. The integration leverages work from other DoD organizations so that high-end results are attainable much faster than...planning through the modeling and simulation data capture and visualization process. The planners can complete the manufacturing process plan with a high ...technologies. This tool is also used to perform "high level" factory process simulation prior to full CAD model development and to help define feasible

  10. A modified dynamical model of drying process of polymer blend solution coated on a flat substrate

    NASA Astrophysics Data System (ADS)

    Kagami, Hiroyuki

    2008-05-01

    We have proposed and modified a model of the drying process of a polymer solution coated on a flat substrate for flat polymer film fabrication. Numerical simulation of the model reproduces, for example, a typical thickness profile of the polymer film formed after drying. We have also clarified how the distribution of polymer molecules on a flat substrate depends on various parameters, based on analysis of the numerical simulations, and we have derived nonlinear equations of the drying process from the dynamical model and reported those results. The above studies were limited to solutions containing one kind of solute, although the model can essentially deal with solutions containing several kinds of solutes. A treatment of the drying of solutions with several kinds of solutes is now needed, because such processes appear in many industrial settings: polymer blend solutions are one instance, and a typical resist consists of a few kinds of polymers. We therefore previously introduced a dynamical model of the drying process of a polymer blend solution coated on a flat substrate and reported results of numerical simulations of that model. That model, however, was the simplest one. In this study, we modify the dynamical model of the drying process of a polymer blend solution by adding effects in which some parameters change with time as functions of other variables. We then examine the essence of the drying process of polymer blend solutions by comparing the results of numerical simulations of the modified model with those of the former model.

  11. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
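
    A minimal sketch of the repeat-the-waterfall idea is given below; it is not the PATT/IEEE 12207 model, and the effort shares, productivity, and defect rates are invented for illustration only.

```python
# Illustrative spiral-process roll-up (not the PATT/IEEE 12207 model): each
# iteration runs the same sequence of waterfall steps on a slice of the
# product, accumulating effort and injected/detected defects.
steps = {"requirements": 0.10, "design": 0.20, "code": 0.40,
         "test": 0.20, "evaluate": 0.10}          # share of iteration effort

loc_total = 50_000          # product size, lines of code (assumed)
productivity = 12.0         # lines of code per hour (assumed)
defects_per_kloc = 8.0      # injected defects per 1000 LOC (assumed)
detect_fraction = 0.7       # fraction of open defects found per test step

iterations, effort, open_defects, corrected = 4, 0.0, 0.0, 0.0
for it in range(iterations):
    loc = loc_total / iterations
    base_hours = loc / productivity
    for step, share in steps.items():
        effort += base_hours * share
    open_defects += defects_per_kloc * loc / 1000.0
    found = detect_fraction * open_defects
    open_defects -= found
    corrected += found
print(f"effort {effort:.0f} h, corrected {corrected:.0f}, escaped {open_defects:.0f}")
```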

  12. Stochastic simulation by image quilting of process-based geological models

    NASA Astrophysics Data System (ADS)

    Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef

    2017-09-01

    Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time condition them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad-hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse-a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.

  13. Polar Processes in a 50-year Simulation of Stratospheric Chemistry and Transport

    NASA Technical Reports Server (NTRS)

    Kawa, S.R.; Douglass, A. R.; Patrick, L. C.; Allen, D. R.; Randall, C. E.

    2004-01-01

    The unique chemical, dynamical, and microphysical processes that occur in the winter polar lower stratosphere are expected to interact strongly with changing climate and trace gas abundances. Significant changes in ozone have been observed, and prediction of future ozone and climate interactions depends on modeling these processes successfully. We have conducted an off-line model simulation of the stratosphere for trace gas conditions representative of 1975-2025 using meteorology from the NASA finite-volume general circulation model. The objective of this simulation is to examine the sensitivity of stratospheric ozone and chemical change to varying meteorology and trace gas inputs. This presentation will examine the dependence of ozone and related processes in polar regions on the climatological and trace gas changes in the model. The model's past performance is baselined against available observations, and a future ozone recovery scenario is forecast. Overall the model ozone simulation is quite realistic, but initial analysis of the detailed evolution of some observable processes suggests systematic shortcomings in our description of the polar chemical rates and/or mechanisms. Model sensitivities, strengths, and weaknesses will be discussed with implications for uncertainty and confidence in coupled climate-chemistry predictions.

  14. Numerical investigation of coupled density-driven flow and hydrogeochemical processes below playas

    NASA Astrophysics Data System (ADS)

    Hamann, Enrico; Post, Vincent; Kohfahl, Claus; Prommer, Henning; Simmons, Craig T.

    2015-11-01

    Numerical modeling approaches of varying complexity were explored to investigate coupled groundwater flow and geochemical processes in saline basins. Long-term model simulations of a playa system provide insights into the complex feedback mechanisms between density-driven flow and the spatiotemporal patterns of precipitating evaporites and evolving brines. Using a reactive multicomponent transport modeling approach, the simulations reproduced, for the first time in a numerical study, the evaporite precipitation sequences frequently observed in saline basins ("bull's eyes"). Playa-specific flow, evapoconcentration, and chemical divides were found to be the primary controls on the location of the evaporites formed and on the resulting brine chemistry. Comparative simulations with computationally far less demanding surrogate single-species transport models showed that these were still able to replicate the major flow patterns obtained by the more complex reactive transport simulations. However, the simulated degree of salinization was clearly lower than in the reactive multicomponent transport simulations. For example, in the late stages of the simulations, when the brine becomes halite-saturated, the nonreactive simulation overestimated the solute mass by almost 20%. The simulations highlight the importance of considering reactive transport processes for understanding and quantifying geochemical patterns, concentrations of individual dissolved solutes, and evaporite evolution.

  15. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
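
    The core of any such simulator is an event queue that is processed until it is empty; the sketch below shows a generic event-queue loop in Python (with an invented valve example), not the patented tool itself.

```python
import heapq

def run(events):
    """Minimal discrete-event loop: pop time-ordered events until the queue
    is empty; handlers may schedule follow-on events (illustrative only)."""
    queue = list(events)
    heapq.heapify(queue)
    while queue:
        time, name, handler = heapq.heappop(queue)
        for delay, new_name, new_handler in handler(time):
            heapq.heappush(queue, (time + delay, new_name, new_handler))
        print(f"t={time:5.1f}  {name}")

# A valve that opens, stays open 3 time units, then closes (invented example).
def close(t):  return []
def open_(t):  return [(3.0, "valve_closed", close)]
run([(1.0, "valve_opened", open_)])
```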

  16. Study of CFB Simulation Model with Coincidence at Multi-Working Condition

    NASA Astrophysics Data System (ADS)

    Wang, Z.; He, F.; Yang, Z. W.; Li, Z.; Ni, W. D.

    A two-stage simulation model of a circulating fluidized bed (CFB) was developed. To make the model results agree with design or real operating values at specified working conditions while remaining capable of real-time calculation, only the main key processes were taken into account, and the dominant factors were further abstracted from these key processes. The simulation results showed good agreement across multiple working conditions and confirmed the advantage of the two-stage model over the original single-stage simulation model. The combustion-supporting effect of the secondary air was investigated using the two-stage model. The model provides a solid platform for investigating the pant-leg structured CFB furnace, which is now being designed for a supercritical power plant.

  17. 78 FR 6269 - Amendment to the International Traffic in Arms Regulations: Revision of U.S. Munitions List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-30

    ... remain subject to USML control are modeling or simulation tools that model or simulate the environments... USML revision process, the public is asked to provide specific examples of nuclear-related items whose...) Modeling or simulation tools that model or simulate the environments generated by nuclear detonations or...

  18. Modelling and Simulation for Requirements Engineering and Options Analysis

    DTIC Science & Technology

    2010-05-01

    should be performed to work successfully in the domain; and process-based techniques model the processes that occur in the work domain. There is a crisp ...acad/sed/sedres/dm/erg/cwa. DRDC Toronto CR 2010-049 39 23. Can the current technique for developing simulation models for assessments

  19. A Method for Combining Experimentation and Molecular Dynamics Simulation to Improve Cohesive Zone Models for Metallic Microstructures

    NASA Technical Reports Server (NTRS)

    Hochhalter, J. D.; Glaessgen, E. H.; Ingraffea, A. R.; Aquino, W. A.

    2009-01-01

    Fracture processes within a material begin at the nanometer length scale at which the formation, propagation, and interaction of fundamental damage mechanisms occur. Physics-based modeling of these atomic processes quickly becomes computationally intractable as the system size increases. Thus, a multiscale modeling method, based on the aggregation of fundamental damage processes occurring at the nanoscale within a cohesive zone model, is under development and will enable computationally feasible and physically meaningful microscale fracture simulation in polycrystalline metals. This method employs atomistic simulation to provide an optimization loop with an initial prediction of a cohesive zone model (CZM). This initial CZM is then applied at the crack front region within a finite element model. The optimization procedure iterates upon the CZM until the finite element model acceptably reproduces the near-crack-front displacement fields obtained from experimental observation. With this approach, a comparison can be made between the original CZM predicted by atomistic simulation and the converged CZM that is based on experimental observation. Comparison of the two CZMs gives insight into how atomistic simulation scales.
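
    The optimization loop can be illustrated with a stand-in forward model: the sketch below replaces the finite element crack-front solve with a bilinear traction-separation law, uses synthetic "experimental" data, and adjusts the CZM parameters with a general-purpose optimizer. Everything in it (the law, the starting parameters, the optimizer choice) is an assumption for illustration, not the authors' procedure.

```python
import numpy as np
from scipy.optimize import minimize

def forward_model(params, separations):
    """Surrogate for the FE crack-front model: a bilinear traction-separation
    law standing in for the near-crack-front response field (illustrative)."""
    t_max, delta_c = params
    return np.where(separations < delta_c,
                    t_max * separations / delta_c,
                    np.maximum(0.0, t_max * (2.0 - separations / delta_c)))

separations = np.linspace(0.0, 2.0, 50)
czm_from_md = np.array([3.0, 0.8])        # initial CZM parameters from atomistics (assumed)
response_exp = forward_model(np.array([2.4, 1.1]), separations)  # synthetic "experimental" data

def mismatch(params):
    return np.sum((forward_model(params, separations) - response_exp) ** 2)

result = minimize(mismatch, czm_from_md, method="Nelder-Mead")
print("converged CZM parameters (t_max, delta_c):", result.x)
```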

  20. Comparison of tropical cyclogenesis processes in climate model and cloud-resolving model simulations using moist static energy budget analysis

    NASA Astrophysics Data System (ADS)

    Wing, Allison; Camargo, Suzana; Sobel, Adam; Kim, Daehyun; Murakami, Hiroyuki; Reed, Kevin; Vecchi, Gabriel; Wehner, Michael; Zarzycki, Colin; Zhao, Ming

    2017-04-01

    In recent years, climate models have improved such that high-resolution simulations are able to reproduce the climatology of tropical cyclone activity with some fidelity and show some skill in seasonal forecasting. However, biases remain in many models, motivating a better understanding of what factors control the representation of tropical cyclone activity in climate models. We explore the tropical cyclogenesis processes in five high-resolution climate models, including both coupled and uncoupled configurations. Our analysis framework focuses on how convection, moisture, clouds and related processes are coupled, and employs budgets of column moist static energy and of the spatial variance of column moist static energy. The latter was originally developed to study the mechanisms of tropical convective organization in idealized cloud-resolving models, and allows us to quantify the different feedback processes responsible for the amplification of moist static energy anomalies associated with the organization of convection and cyclogenesis. We track the formation and evolution of tropical cyclones in the climate model simulations and apply our analysis both along the individual tracks and composited over many tropical cyclones. We then compare the genesis processes, in particular the role of cloud-radiation interactions, to those of spontaneous tropical cyclogenesis in idealized cloud-resolving model simulations.

  1. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    USGS Publications Warehouse

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. Parameters are estimated using nonlinear regression: a weighted least-squares objective function is minimized with respect to the parameter values using a modified Gauss-Newton method or a double-dogleg technique. Sensitivities needed for the method can be read from files produced by process models that can calculate sensitivities, such as MODFLOW-2000, or can be calculated by UCODE_2005 using a more general, but less accurate, forward- or central-difference perturbation technique. Problems resulting from inaccurate sensitivities and solutions related to the perturbation techniques are discussed in the report. Statistics are calculated and printed for use in (1) diagnosing inadequate data and identifying parameters that probably cannot be estimated; (2) evaluating estimated parameter values; and (3) evaluating how well the model represents the simulated processes. Results from UCODE_2005 and codes RESIDUAL_ANALYSIS and RESIDUAL_ANALYSIS_ADV can be used to evaluate how accurately the model represents the processes it simulates. Results from LINEAR_UNCERTAINTY can be used to quantify the uncertainty of model simulated values if the model is sufficiently linear. Results from MODEL_LINEARITY and MODEL_LINEARITY_ADV can be used to evaluate model linearity and, thereby, the accuracy of the LINEAR_UNCERTAINTY results. UCODE_2005 can also be used to calculate nonlinear confidence and predictions intervals, which quantify the uncertainty of model simulated values when the model is not linear. 
CORFAC_PLUS can be used to produce factors that allow intervals to account for model intrinsic nonlinearity and small-scale variations in system characteristics that are not explicitly accounted for in the model or the observation weighting. The six post-processing programs are independent of UCODE_2005 and can use the results of other programs that produce the required data-exchange files. UCODE_2005 and the other six codes are intended for use on any computer operating system. The programs con
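
    The regression machinery described above (a weighted least-squares objective minimized by a modified Gauss-Newton method, with perturbation sensitivities) can be sketched generically as below; this is not UCODE_2005 itself, and the tiny synthetic "process model" and noise level are invented.

```python
import numpy as np

def gauss_newton(simulate, p, obs, weights, n_iter=8, rel_perturb=0.01):
    """Generic weighted least-squares Gauss-Newton sketch (not UCODE_2005).
    simulate(p) must return the simulated equivalents of the observations."""
    for _ in range(n_iter):
        r = obs - simulate(p)                      # residual vector
        J = np.empty((len(obs), len(p)))           # forward-difference sensitivities
        for j in range(len(p)):
            dp = np.zeros_like(p)
            dp[j] = rel_perturb * (abs(p[j]) + 1e-3)
            J[:, j] = (simulate(p + dp) - simulate(p)) / dp[j]
        W = np.diag(weights)
        p = p + np.linalg.solve(J.T @ W @ J, J.T @ W @ r)   # Gauss-Newton step
    return p

# Tiny synthetic "process model": drawdown = a*log(t) + b (illustrative only)
t = np.linspace(1.0, 10.0, 8)
obs = 2.0 * np.log(t) + 0.5 + np.random.default_rng(0).normal(0.0, 0.05, t.size)
p_est = gauss_newton(lambda p: p[0] * np.log(t) + p[1],
                     np.array([1.0, 0.0]), obs, np.ones_like(obs))
print("estimated parameters (a, b):", p_est)
```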

  2. From Particles and Point Clouds to Voxel Models: High Resolution Modeling of Dynamic Landscapes in Open Source GIS

    NASA Astrophysics Data System (ADS)

    Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.

    2012-12-01

    Multitemporal data acquired by modern mapping technologies provide unique insights into processes driving land surface dynamics. These high resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom python scripts to model and analyze dynamics of coastal topography (Figure 1), and we outline development of a coastal analysis toolbox. The simulations focus on a particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models, already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability.
    Figure 1. Isosurfaces representing the evolution of the shoreline and the z = 4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.

  3. Modeling and simulation of offshore wind farm O&M processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joschko, Philip, E-mail: joschko@informatik.uni-hamburg.de; Widok, Andi H., E-mail: a.widok@htw-berlin.de; Appel, Susanne, E-mail: susanne.appel@hs-bremen.de

    2015-04-15

    This paper describes a holistic approach to operation and maintenance (O&M) processes in the domain of offshore wind farm power generation. The acquisition and process visualization is followed by a risk analysis of all relevant processes. A tool was then designed that can model the defined processes in BPMN 2.0 notation, as well as connect and simulate them. Furthermore, the notation was enriched with new elements representing other relevant factors that were, to date, only displayable with much higher effort. In that regard, a variety of more complex situations were integrated, such as new process interactions depending on different weather influences, in which case a stochastic weather generator was combined with the business simulation, as well as other wind farm aspects important to the smooth running of offshore wind farms. In addition, the choices of different methodologies, such as the simulation framework or the business process notation, are presented and elaborated with respect to the impact they had on the development of the approach and the software solution. - Highlights: • Analysis of operation and maintenance processes of offshore wind farms • Process modeling with BPMN 2.0 • Domain-specific simulation tool.

  4. A fortran program for Monte Carlo simulation of oil-field discovery sequences

    USGS Publications Warehouse

    Bohling, Geoffrey C.; Davis, J.C.

    1993-01-01

    We have developed a program for performing Monte Carlo simulation of oil-field discovery histories. A synthetic parent population of fields is generated as a finite sample from a distribution of specified form. The discovery sequence then is simulated by sampling without replacement from this parent population in accordance with a probabilistic discovery process model. The program computes a chi-squared deviation between synthetic and actual discovery sequences as a function of the parameters of the discovery process model, the number of fields in the parent population, and the distributional parameters of the parent population. The program employs the three-parameter log gamma model for the distribution of field sizes and employs a two-parameter discovery process model, allowing the simulation of a wide range of scenarios. © 1993.
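
    A stripped-down version of this kind of discovery-process simulation is sketched below: a synthetic parent population is drawn (a lognormal stands in for the three-parameter log-gamma), and fields are discovered without replacement with probability proportional to size raised to an exponent (a one-parameter stand-in for the paper's two-parameter discovery model). All values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic parent population of field sizes; a lognormal stands in for the
# paper's three-parameter log-gamma distribution (values are illustrative).
sizes = rng.lognormal(mean=3.0, sigma=1.2, size=200)

def simulate_discovery_sequence(sizes, beta=1.5):
    """Sample fields without replacement with probability ~ size**beta,
    a one-parameter stand-in for the two-parameter discovery process model."""
    remaining = list(range(len(sizes)))
    sequence = []
    while remaining:
        w = sizes[remaining] ** beta
        pick = rng.choice(len(remaining), p=w / w.sum())
        sequence.append(remaining.pop(pick))
    return sequence

seq = simulate_discovery_sequence(sizes)
print("mean size of first 20 discoveries:", sizes[seq[:20]].mean())
print("mean size of last 20 discoveries :", sizes[seq[-20:]].mean())
```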

  5. Modeling laser velocimeter signals as triply stochastic Poisson processes

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1976-01-01

    Previous models of laser Doppler velocimeter (LDV) systems have not adequately described dual-scatter signals in a manner useful for analysis and simulation of low-level photon-limited signals. At low photon rates, an LDV signal at the output of a photomultiplier tube is a compound nonhomogeneous filtered Poisson process, whose intensity function is another (slower) Poisson process with the nonstationary rate and frequency parameters controlled by a random flow (slowest) process. In the present paper, generalized Poisson shot noise models are developed for low-level LDV signals. Theoretical results useful in detection error analysis and simulation are presented, along with measurements of burst amplitude statistics. Computer generated simulations illustrate the difference between Gaussian and Poisson models of low-level signals.
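
    For the nonhomogeneous (rate-modulated) part of such a model, a standard way to generate arrivals is Lewis-Shedler thinning; the sketch below uses an invented Doppler-burst-shaped rate function and illustrative rates, and omits the slower flow process and photodetector filtering described in the paper.

```python
import math
import random

def burst_rate(t, rate_peak=5e5, t0=5e-5, tau=1e-5, f_doppler=2e5):
    """Illustrative Doppler-burst photon rate: a Gaussian envelope modulated
    at the Doppler frequency (not the paper's exact signal model)."""
    envelope = math.exp(-((t - t0) / tau) ** 2)
    return rate_peak * envelope * 0.5 * (1.0 + math.cos(2.0 * math.pi * f_doppler * t))

def thinning(rate, rate_max, t_end, rng=random.Random(1)):
    """Lewis-Shedler thinning: candidate arrivals at rate_max are accepted
    with probability rate(t)/rate_max, giving a nonhomogeneous Poisson process."""
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_max)
        if t > t_end:
            return arrivals
        if rng.random() < rate(t) / rate_max:
            arrivals.append(t)

photons = thinning(burst_rate, rate_max=5e5, t_end=1e-4)
print(f"{len(photons)} photon arrivals in one simulated burst")
```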

  6. Continuity-based model interfacing for plant-wide simulation: a general approach.

    PubMed

    Volcke, Eveline I P; van Loosdrecht, Mark C M; Vanrolleghem, Peter A

    2006-08-01

    In plant-wide simulation studies of wastewater treatment facilities, existing models of different origins often need to be coupled. However, as these submodels are likely to contain different state variables, their coupling is not straightforward. The continuity-based interfacing method (CBIM) provides a general framework to construct model interfaces for models of wastewater systems, taking into account conservation principles. In this contribution, the CBIM approach is applied to study the effect of sludge digestion reject water treatment with a SHARON-Anammox process on a plant-wide scale. Separate models were available for the SHARON process and for the Anammox process. The Benchmark simulation model no. 2 (BSM2) is used to simulate the behaviour of the complete WWTP including sludge digestion. The CBIM approach is followed to develop three different model interfaces. At the same time, the generally applicable CBIM approach was further refined, and particular issues that arise when coupling models in which pH is considered a state variable are pointed out.

  7. [Numerical simulation and operation optimization of biological filter].

    PubMed

    Zou, Zong-Sen; Shi, Han-Chang; Chen, Xiang-Qiang; Xie, Xiao-Qing

    2014-12-01

    BioWin software and two sensitivity analysis methods were used to simulate the Denitrification Biological Filter (DNBF) + Biological Aerated Filter (BAF) process at the Yuandang Wastewater Treatment Plant. Based on the BioWin model of the DNBF + BAF process, the operation data of September 2013 were used for sensitivity analysis and model calibration, and the operation data of October 2013 were used for model validation. The results indicated that the calibrated model could accurately simulate the practical DNBF + BAF process, and that the most sensitive parameters were those related to biofilm, OHOs and aeration. After validation and calibration, the model was used for process optimization by simulating operation results under different conditions. The results showed that the best operating condition for discharge standard B was: reflux ratio = 50%, no methanol addition, influent C/N = 4.43; while the best operating condition for discharge standard A was: reflux ratio = 50%, influent COD = 155 mg·L(-1) after methanol addition, influent C/N = 5.10.

  8. Modelling, simulation and verification of the screening process of a swing-bar sieve based on the DEM

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Yu, Jianqun; Yu, Yajun

    2018-05-01

    To address the difficulties of DEM simulation of the screening process on a swing-bar sieve, we propose the real-virtual boundary method to build the geometrical model of the screen deck of a swing-bar sieve. The motion of the swing-bar sieve is modelled by planar multi-body kinematics. A coupled model of the discrete element method (DEM) with multi-body kinematics (MBK) is presented to simulate the flow and passage of soybean particles on the screen deck. Comparison of the simulated results with the experimental results for the screening process of the LA-LK laboratory-scale swing-bar sieve verifies the feasibility and validity of the proposed real-virtual boundary method and coupled DEM-MBK model. This work provides a basis for the optimization design of swing-bar sieves with circular apertures and complex motion.

  9. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an icon-driven, Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend-based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.

  10. Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process

    NASA Astrophysics Data System (ADS)

    Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph

    2012-08-01

    Even powerful computational techniques like simulation are subject to limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. The fidelity, accuracy and validity of simulation models must therefore be monitored in context throughout the design phases to build confidence that the goals of modelling and simulation are achieved. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, originally developed for the space programme, to the aircraft design process in order to assess the quality of simulations. The proposed eight quality-assurance metrics aggregate information to indicate the level of confidence in the results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal finite element analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.

  11. Understanding Islamist political violence through computational social simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  12. A simplified computational memory model from information processing

    PubMed Central

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated in terms of intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. The memory phenomena and the functions of memorization and strengthening are simulated by information processing algorithms. The theoretical analysis and the simulation results show that the model is consistent with memory phenomena from an information processing view. PMID:27876847

  13. Simulation of the Onset of the Southeast Asian Monsoon During 1997 and 1998: The Impact of Surface Processes

    NASA Technical Reports Server (NTRS)

    Wang, Yansen; Tao, W.-K.; Lau, K.-M.; Wetzel, Peter J.

    2003-01-01

    The onset of the southeast Asian monsoon during 1997 and 1998 was simulated with a coupled mesoscale atmospheric model (MM5) and a detailed land surface model. The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The simulation with the land surface model captured the basic signatures of the monsoon onset processes and the associated rainfall statistics. The sensitivity tests indicated that land surface processes had a greater impact on the simulated rainfall than a small sea surface temperature change during the onset period. In both the 1997 and 1998 cases, the simulations were significantly improved by including the land surface processes. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indo-China peninsula and the northern cold front intrusion from southern China. The surface sensible and latent heat exchange between the land and atmosphere modified the low-level temperature distribution and gradient, and therefore the low-level flow. The more realistic forcing of the sensible and latent heat from the detailed land surface model improved the monsoon rainfall and the associated wind simulation.

  14. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

    Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
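
    The dependency-graph idea can be illustrated with a few lines of Python: each variable declares its dependencies and an evaluator, and evaluation resolves the graph recursively with caching. The variables and formulas below are invented placeholders, not Arcos code.

```python
# Generic dependency-graph evaluator (illustrative only, not the Arcos code):
# each variable declares the variables it depends on and a function that
# computes it; evaluation resolves dependencies recursively and caches results.
graph = {
    "porosity":      ((), lambda: 0.4),
    "saturation":    ((), lambda: 0.8),
    "water_content": (("porosity", "saturation"), lambda por, sat: por * sat),
    "thermal_cond":  (("water_content",), lambda wc: 0.25 + 0.6 * wc),
}

def evaluate(name, cache=None):
    cache = {} if cache is None else cache
    if name not in cache:
        deps, func = graph[name]
        cache[name] = func(*(evaluate(d, cache) for d in deps))
    return cache[name]

print("thermal conductivity:", evaluate("thermal_cond"))
```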

  15. Superior memory efficiency of quantum devices for the simulation of continuous-time stochastic processes

    NASA Astrophysics Data System (ADS)

    Elliott, Thomas J.; Gu, Mile

    2018-03-01

    Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous-time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.

  16. Three-Dimension Visualization for Primary Wheat Diseases Based on Simulation Model

    NASA Astrophysics Data System (ADS)

    Shijuan, Li; Yeping, Zhu

    Crop simulation models have become the core of agricultural production management and resource optimization. Displaying the crop growth process lets users observe crop growth and development intuitively. On the basis of the occurrence conditions, prevalent seasons and key impact factors of the main wheat diseases (stripe rust, leaf rust, stem rust, head blight and powdery mildew), gathered from research material and the literature, we designed a 3D visualization model for wheat growth and disease occurrence. The model system will help farmers, technicians and decision-makers use crop growth simulation models better and will provide decision-making support. A 3D visualization model for wheat growth based on the simulation model has now been developed, and the visualization model for the primary wheat diseases is under development.

  17. Modelling and simulation of a pervaporation process using tubular module for production of anhydrous ethanol

    NASA Astrophysics Data System (ADS)

    Hieu, Nguyen Huu

    2017-09-01

    Pervaporation is a potential process for the final step of ethanol biofuel production. In this study, a mathematical model was developed based on the resistance-in-series model, and a simulation was carried out using the specialized simulation software COMSOL Multiphysics to describe a tubular-type pervaporation module with membranes for the dehydration of ethanol solution. The membrane permeances, operating conditions, and feed conditions used in the simulation were taken from experimental data previously reported in the literature. Accordingly, the simulated temperature and density profiles of pure water and the ethanol-water mixture were validated against existing published data.
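
    The resistance-in-series idea amounts to dividing the partial-pressure driving force by the sum of the transport resistances; the sketch below does this for water permeation with placeholder resistances, an assumed activity coefficient, and a common Antoine fit for water vapour pressure. None of the numbers are taken from the paper.

```python
import math

def water_flux(x_water, T_feed_K, p_permeate_kPa,
               r_boundary=2.0, r_membrane=8.0):
    """Resistance-in-series pervaporation sketch (illustrative values only).

    Driving force: water partial pressure difference across the membrane,
    flux = (p_feed_side - p_permeate_side) / (sum of transport resistances).
    Units: pressures in kPa, resistances in kPa*m^2*h/kg, flux in kg/(m^2*h).
    """
    # Antoine-type fit for water vapour pressure (kPa), T in K
    p_sat = math.exp(16.3872 - 3885.7 / (T_feed_K - 42.98))
    gamma = 2.3                      # assumed activity coefficient of water in ethanol
    p_feed_side = gamma * x_water * p_sat
    return (p_feed_side - p_permeate_kPa) / (r_boundary + r_membrane)

print(f"water flux: {water_flux(x_water=0.10, T_feed_K=353.15, p_permeate_kPa=1.0):.2f} kg/(m2 h)")
```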

  18. Application of two hydrologic models with different runoff mechanisms to a hillslope dominated watershed in the northeastern US: A comparison of HSPF and SMR

    USGS Publications Warehouse

    Johnson, M.S.; Coon, W.F.; Mehta, V.K.; Steenhuis, T.S.; Brooks, E.S.; Boll, J.

    2003-01-01

    Differences in the simulation of hydrologic processes by watershed models directly affect the accuracy of results. Surface runoff generation can be simulated as either: (1) infiltration-excess (or Hortonian) overland flow, or (2) saturation-excess overland flow. This study compared the Hydrological Simulation Program - FORTRAN (HSPF) and the Soil Moisture Routing (SMR) models, each representing one of these mechanisms. These two models were applied to a 102 km2 watershed in the upper part of the Irondequoit Creek basin in central New York State over a seven-year simulation period. The models differed in both the complexity of simulating snowmelt and baseflow processes and the detail in which the geographic information was preserved by each model. Despite their differences in structure and representation of hydrologic processes, the two models simulated streamflow with almost equal accuracy. Since streamflow is an integral response and depends mainly on the watershed water balance, this was not unexpected. Model efficiency values for the seven-year simulation period were 0.67 and 0.65 for SMR and HSPF, respectively. HSPF simulated winter streamflow slightly better than SMR as a result of its complex snowmelt routine, whereas SMR simulated summer flows better than HSPF as a result of its runoff and baseflow processes. An important difference between model results was the ability to predict the spatial distribution of soil moisture content. HSPF aggregates soil moisture content, which is generally related to a specific pervious land unit across the entire watershed, whereas SMR predictions of moisture content distribution are geographically specific and matched field observations reasonably well. Importantly, the saturated area was predicted well by SMR, which confirms the validity of using a saturation-excess mechanism for this hillslope-dominated watershed. © 2003 Elsevier B.V. All rights reserved.
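
    The model efficiency values quoted above are presumably Nash-Sutcliffe efficiencies; under that assumption, the metric is computed as below, with invented streamflow values rather than the Irondequoit Creek data.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe model efficiency: 1 - SSE / variance of the observations
    about their mean; 1 is a perfect fit, 0 is no better than the observed mean."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

# Illustrative daily streamflows (m^3/s), not the Irondequoit Creek data
obs = np.array([1.2, 1.5, 3.8, 2.9, 2.0, 1.7, 1.4])
sim = np.array([1.1, 1.6, 3.2, 3.1, 2.3, 1.6, 1.3])
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
```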

  19. Exact simulation of max-stable processes.

    PubMed

    Dombry, Clément; Engelke, Sebastian; Oesting, Marco

    2016-06-01

    Max-stable processes play an important role as models for spatial extreme events. Their complex structure as the pointwise maximum over an infinite number of random functions makes their simulation difficult. Algorithms based on finite approximations are often inexact and computationally inefficient. We present a new algorithm for exact simulation of a max-stable process at a finite number of locations. It relies on the idea of simulating only the extremal functions, that is, those functions in the construction of a max-stable process that effectively contribute to the pointwise maximum. We further generalize the algorithm by Dieker & Mikosch (2015) for Brown-Resnick processes and use it for exact simulation via the spectral measure. We study the complexity of both algorithms, prove that our new approach via extremal functions is always more efficient, and provide closed-form expressions for their implementation that cover most popular models for max-stable processes and multivariate extreme value distributions. For simulation on dense grids, an adaptive design of the extremal function algorithm is proposed.

  20. DoD Lead System Integrator (LSI) Transformation - Creating a Model Based Acquisition Framework (MBAF)

    DTIC Science & Technology

    2014-04-30

    cost to acquire systems as design maturity could be verified incrementally as the system was developed vice waiting for specific large “big bang ...Framework (MBAF) be applied to simulate or optimize process variations on programs? LSI Roles and Responsibilities A review of the roles and...the model/process optimization process. It is the current intent that NAVAIR will use the model to run simulations on process changes in an attempt to

  1. Towards Systematic Benchmarking of Climate Model Performance

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2014-12-01

    The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (the CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research. Making the results from routine performance tests readily accessible will help advance a more transparent model evaluation process.

  2. Simulation of aerobic and anaerobic biodegradation processes at a crude oil spill site

    USGS Publications Warehouse

    Essaid, Hedeff I.; Bekins, Barbara A.; Godsy, E. Michael; Warren, Ean; Baedecker, Mary Jo; Cozzarelli, Isabelle M.

    1995-01-01

    A two-dimensional, multispecies reactive solute transport model with sequential aerobic and anaerobic degradation processes was developed and tested. The model was used to study the field-scale solute transport and degradation processes at the Bemidji, Minnesota, crude oil spill site. The simulations included the biodegradation of volatile and nonvolatile fractions of dissolved organic carbon by aerobic processes, manganese and iron reduction, and methanogenesis. Model parameter estimates were constrained by published Monod kinetic parameters, theoretical yield estimates, and field biomass measurements. Despite the considerable uncertainty in the model parameter estimates, results of simulations reproduced the general features of the observed groundwater plume and the measured bacterial concentrations. In the simulation, 46% of the total dissolved organic carbon (TDOC) introduced into the aquifer was degraded. Aerobic degradation accounted for 40% of the TDOC degraded. Anaerobic processes accounted for the remaining 60% of degradation of TDOC: 5% by Mn reduction, 19% by Fe reduction, and 36% by methanogenesis. Thus anaerobic processes account for more than half of the removal of DOC at this site.
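
    The Monod-limited degradation that underlies such simulations can be sketched as a single-substrate, single-biomass system; the parameter values below are invented for illustration and are not the calibrated Bemidji values, and the full model additionally couples transport and several electron-acceptor pathways.

```python
# Single-substrate Monod degradation with biomass growth (illustrative
# parameters, not the Bemidji model's calibrated values).
mu_max, K_s, Y, b = 0.8, 2.0, 0.5, 0.05   # 1/d, mg/L, mgX/mgC, 1/d

def rates(C, X):
    growth = mu_max * C / (K_s + C) * X   # biomass growth rate, mg/L/d
    return -growth / Y, growth - b * X    # dC/dt, dX/dt

C, X, dt = 20.0, 0.1, 0.01                # initial DOC, biomass, time step (d)
for step in range(int(60 / dt)):          # 60 days, explicit Euler
    dC, dX = rates(C, X)
    C, X = max(C + dt * dC, 0.0), max(X + dt * dX, 0.0)
print(f"DOC after 60 d: {C:.2f} mg/L, biomass: {X:.2f} mg/L")
```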

  3. Absorptivity Measurements and Heat Source Modeling to Simulate Laser Cladding

    NASA Astrophysics Data System (ADS)

    Wirth, Florian; Eisenbarth, Daniel; Wegener, Konrad

    The laser cladding process is gaining importance, as it allows not only the application of surface coatings but also the additive manufacturing of three-dimensional parts. In both cases, process simulation can contribute to process optimization. Heat source modeling is one of the main issues for an accurate model and simulation of the laser cladding process. While the laser beam intensity distribution is readily known, the other two main effects on the process heat input are non-trivial: the absorptivity of the applied materials and the powder attenuation. Therefore, calorimetry measurements were carried out. The measurement method and the measurement results for laser cladding of Stellite 6 on structural steel S 235 and for the processing of Inconel 625 are presented, for both a CO2 laser and a high-power diode laser (HPDL). Additionally, a heat source model is deduced.

  4. Finite Element Simulation of Compression Molding of Woven Fabric Carbon Fiber/Epoxy Composites: Part I Material Model Development

    DOE PAGES

    Li, Yang; Zhao, Qiangsheng; Mirdamadi, Mansour; ...

    2016-01-06

    Woven fabric carbon fiber/epoxy composites made through compression molding are one of the promising choices of material for the vehicle light-weighting strategy. Previous studies have shown that the processing conditions can have substantial influence on the performance of this type of material. Therefore the optimization of the compression molding process is of great importance to manufacturing practice. An efficient way to achieve an optimized design of this process is through finite element (FE) simulations of compression molding for woven fabric carbon fiber/epoxy composites. However, performing such simulation remains a challenging task for FE, as multiple types of physics are involved during the compression molding process, including the epoxy resin curing and the complex mechanical behavior of the woven fabric structure. In the present study, the FE simulation of the compression molding process of resin-based woven fabric composites at the continuum level is conducted, enabled by the implementation of an integrated material modeling methodology in LS-Dyna. Specifically, the chemo-thermo-mechanical problem of compression molding is solved through the coupling of three material models, i.e., one thermal model for the temperature history in the resin, one mechanical model to update the curing-dependent properties of the resin, and another mechanical model to simulate the behavior of the woven fabric composites. Preliminary simulations of the carbon fiber/epoxy woven fabric composites in LS-Dyna are presented as a demonstration, while validations and models with real part geometry are planned for future work.

  5. Vertical structure and physical processes of the Madden-Julian oscillation: Exploring key model physics in climate simulations

    DOE PAGES

    Jiang, Xianan; Waliser, Duane E.; Xavier, Prince K.; ...

    2015-05-27

    Aimed at reducing deficiencies in representing the Madden-Julian oscillation (MJO) in general circulation models (GCMs), a global model evaluation project on the vertical structure and physical processes of the MJO was coordinated. In this paper, results from the climate simulation component of this project are reported. It is shown that the MJO remains a great challenge in these latest-generation GCMs. The systematic eastward propagation of the MJO is only well simulated in about one fourth of the total participating models. The observed vertical westward tilt with altitude of the MJO is well simulated in good MJO models but not in the poor ones. Damped Kelvin wave responses to the east of convection in the lower troposphere could be responsible for the missing MJO preconditioning process in these poor MJO models. Several process-oriented diagnostics were conducted to discriminate key processes for realistic MJO simulations. While large-scale rainfall partition and low-level mean zonal winds over the Indo-Pacific in a model are not found to be closely associated with its MJO skill, two metrics, the low-level relative humidity difference between high- and low-rain events and the seasonal mean gross moist stability, exhibit statistically significant correlations with MJO performance. It is further indicated that increased cloud-radiative feedback tends to be associated with reduced amplitude of intraseasonal variability, which is incompatible with the radiative instability theory previously proposed for the MJO. Finally, results in this study confirm that inclusion of air-sea interaction can lead to significant improvement in simulating the MJO.

  6. The Monash University Interactive Simple Climate Model

    NASA Astrophysics Data System (ADS)

    Dommenget, D.

    2013-12-01

    The Monash University interactive simple climate model is a web-based interface that allows students and the general public to explore the physical simulation of the climate system with a real global climate model. It is based on the Globally Resolved Energy Balance (GREB) model, a climate model published by Dommenget and Floeter [2011] in the international peer-reviewed journal Climate Dynamics. The model simulates most of the main physical processes in the climate system in a very simplistic way and therefore allows very fast and simple climate model simulations on a normal PC. Despite its simplicity, the model simulates the climate response to external forcings, such as a doubling of the CO2 concentration, very realistically (similar to state-of-the-art climate models). The Monash simple climate model web interface allows you to study the results of more than 2000 different model experiments in an interactive way, to work through a number of tutorials on the interactions of physical processes in the climate system, and to solve some puzzles. By switching physical processes off and on you can deconstruct the climate and learn how the different processes interact to generate the observed climate and how they interact to generate the IPCC-predicted climate change for an anthropogenic CO2 increase. The presentation will illustrate how this web-based tool works and what the possibilities for teaching students with this tool are.
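
    GREB itself resolves the globe; as a much cruder illustration of the switch-processes-off idea, the sketch below uses a zero-dimensional grey-atmosphere energy balance (not GREB) in which setting the effective emissivity to zero turns the greenhouse effect off. The albedo and emissivity values are standard textbook numbers, not GREB parameters.

```python
# Zero-dimensional energy-balance toy (not the GREB model): switching the
# greenhouse term off and on illustrates deconstructing the climate response.
SIGMA, S0, ALBEDO = 5.67e-8, 1361.0, 0.30

def equilibrium_temperature(emissivity):
    """Surface temperature for a one-layer grey atmosphere with the given
    effective emissivity (emissivity = 0 switches the greenhouse effect off)."""
    absorbed = (1.0 - ALBEDO) * S0 / 4.0
    return (absorbed / (SIGMA * (1.0 - emissivity / 2.0))) ** 0.25

print(f"no greenhouse : {equilibrium_temperature(0.00) - 273.15:6.1f} C")
print(f"present-day   : {equilibrium_temperature(0.78) - 273.15:6.1f} C")
```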

  7. Evaluating Vertical Moisture Structure of the Madden-Julian Oscillation in Contemporary GCMs

    NASA Astrophysics Data System (ADS)

    Guan, B.; Jiang, X.; Waliser, D. E.

    2013-12-01

    The Madden-Julian Oscillation (MJO) remains a major challenge in our understanding and modeling of tropical convection and circulation. Many models have trouble realistically simulating key characteristics of the MJO, such as its strength, period, and eastward propagation. For models that do simulate aspects of the MJO, it remains to be understood which parameters and processes are most critical in determining the quality of the simulations. This study focuses on the vertical structure of moisture in MJO simulations, with the aim of identifying and understanding the relationship between MJO simulation quality and key parameters related to moisture. A series of 20-year simulations conducted by 26 GCMs are analyzed, including four that are coupled to ocean models and two that have a two-dimensional cloud-resolving model embedded (i.e., superparameterized). TRMM precipitation and the ERA-Interim reanalysis are used to evaluate the model simulations. MJO simulation quality is evaluated based on pattern correlations of lead/lag regressions of precipitation, a measure of the model representation of the eastward-propagating MJO convection. Models with the strongest and weakest MJOs (top and bottom quartiles) are compared in terms of differences in moisture content, moisture convergence, moistening rate, and moist static energy. It is found that models with the strongest MJOs have better representations of the observed vertical tilt of moisture. The relative importance of convection, advection, the boundary layer, and large-scale convection/precipitation is discussed in terms of their contribution to the moistening process. The results highlight the overall importance of the vertical moisture structure in MJO simulations. The work contributes to the climatological component of the joint WCRP-WWRP/THORPEX YOTC MJO Task Force and the GEWEX Atmosphere System Study (GASS) global model evaluation project focused on the vertical structure and diabatic processes of the MJO.

  8. The Gravitational Process Path (GPP) model (v1.0) - a GIS-based simulation framework for gravitational processes

    NASA Astrophysics Data System (ADS)

    Wichmann, Volker

    2017-09-01

    The Gravitational Process Path (GPP) model can be used to simulate the process path and run-out area of gravitational processes based on a digital terrain model (DTM). The conceptual model combines several components (process path, run-out length, sink filling and material deposition) to simulate the movement of a mass point from an initiation site to the deposition area. For each component several modeling approaches are provided, which makes the tool configurable for different processes such as rockfall, debris flows or snow avalanches. The tool can be applied to regional-scale studies such as natural hazard susceptibility mapping but also contains components for scenario-based modeling of single events. Both the modeling approaches and precursor implementations of the tool have proven their applicability in numerous studies, also including geomorphological research questions such as the delineation of sediment cascades or the study of process connectivity. This is the first open-source implementation, completely re-written, extended and improved in many ways. The tool has been committed to the main repository of the System for Automated Geoscientific Analyses (SAGA) and thus will be available with every SAGA release.

  9. Simulation modelling of central order processing system under resource sharing strategy in demand-driven garment supply chains

    NASA Astrophysics Data System (ADS)

    Ma, K.; Thomassey, S.; Zeng, X.

    2017-10-01

    In this paper we proposed a central order processing system under a resource sharing strategy for demand-driven garment supply chains to increase supply chain performance. We examined this system using simulation technology. Simulation results showed that significant improvements in various performance indicators were obtained in the new collaborative model with the proposed system.
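
    A discrete-event flavour of such a study can be sketched with the SimPy library (purely illustrative; the arrival rates, processing times and shared capacity below are invented and the authors' actual model is not reproduced): orders from several manufacturers queue at one shared, centralized order-processing resource and the mean order lead time is collected.

        import random
        import simpy

        RANDOM_SEED = 42
        N_MANUFACTURERS = 3          # hypothetical number of supply chain members
        SHARED_CAPACITY = 2          # clerks shared by the central order processing unit
        SIM_TIME = 8 * 60            # one shift, in minutes

        def order_source(env, name, center, lead_times):
            """Each manufacturer generates orders with exponential inter-arrival times."""
            i = 0
            while True:
                yield env.timeout(random.expovariate(1 / 20))   # ~1 order / 20 min
                i += 1
                env.process(process_order(env, f"{name}-order{i}", center, lead_times))

        def process_order(env, order_id, center, lead_times):
            arrival = env.now
            with center.request() as req:                       # wait for a shared clerk
                yield req
                yield env.timeout(random.uniform(5, 15))        # order processing time
            lead_times.append(env.now - arrival)

        random.seed(RANDOM_SEED)
        env = simpy.Environment()
        center = simpy.Resource(env, capacity=SHARED_CAPACITY)
        lead_times = []
        for m in range(N_MANUFACTURERS):
            env.process(order_source(env, f"manufacturer{m}", center, lead_times))
        env.run(until=SIM_TIME)
        print(f"orders completed: {len(lead_times)}, "
              f"mean order lead time: {sum(lead_times) / len(lead_times):.1f} min")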

  10. Computer simulation to predict energy use, greenhouse gas emissions and costs for production of fluid milk using alternative processing methods

    USDA-ARS?s Scientific Manuscript database

    Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...

  11. Combining Simulation and Optimization Models for Hardwood Lumber Production

    Treesearch

    G.A. Mendoza; R.J. Meimban; W.G. Luppold; Philip A. Araman

    1991-01-01

    Published literature contains a number of optimization and simulation models dealing with the primary processing of hardwood and softwood logs. Simulation models have been developed primarily as descriptive models for characterizing the general operations and performance of a sawmill. Optimization models, on the other hand, were developed mainly as analytical tools for...

  12. Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems

    NASA Technical Reports Server (NTRS)

    Allada, Rama Kumar; Lange, Kevin E.; Anderson, Molly S.

    2012-01-01

    Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which becomes extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic simulations of chemical processes for primary processor technologies including: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, the Wiped-Film Rotating Disk (WFRD), and post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA). These dynamic models were developed using the Aspen Custom Modeler® and Aspen Plus® process simulation tools. The results expand upon previous work on water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.

  13. The technique for Simulation of Transient Combustion Processes in the Rocket Engine Operating with Gaseous Fuel “Hydrogen and Oxygen”

    NASA Astrophysics Data System (ADS)

    Zubanov, V. M.; Stepanov, D. V.; Shabliy, L. S.

    2017-01-01

    The article describes a method for simulating transient combustion processes in a rocket engine operating on gaseous propellants: oxygen and hydrogen. Combustion simulation was performed using the ANSYS CFX software. Three reaction mechanisms for the stationary mode were considered and described in detail. The reaction mechanisms were taken from several sources and verified. The method for converting ozone properties from the Shomate equation to the NASA-polynomial format is described in detail. A way of obtaining quick CFD results with intermediate combustion components using an EDM model was found. Modeling difficulties with the Finite Rate Chemistry combustion model, associated with a large scatter in the reference data, were identified and described. The way to generate the Flamelet library with CFX-RIF is described. The reaction mechanisms formulated and verified at steady state were also tested in transient simulations. The Flamelet combustion model was recognized as adequate for the transient mode, with the variation of the integral parameters consistent with the values obtained during stationary simulation. A cyclic irregularity of the temperature field, caused by precession of the vortex core, was detected in the chamber with the proposed simulation technique. Investigation of unsteady rocket-engine processes, including ignition, is proposed as an area of application for the described simulation technique.
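
    The Shomate-to-NASA conversion mentioned in the abstract concerns two standard heat-capacity forms: the Shomate expression Cp = A + B·t + C·t² + D·t³ + E/t² with t = T/1000 K, and the NASA 7-coefficient polynomial Cp/R = a1 + a2·T + a3·T² + a4·T³ + a5·T⁴. The sketch below fits the five NASA Cp coefficients to values tabulated from a Shomate expression by least squares; it covers only the heat-capacity part (not the enthalpy/entropy constants a6 and a7), and the Shomate coefficients used are placeholders, not the NIST values for ozone.

        import numpy as np

        R = 8.314462618  # J mol^-1 K^-1

        def cp_shomate(T, A, B, C, D, E):
            """Shomate heat capacity, J mol^-1 K^-1 (t = T/1000)."""
            t = T / 1000.0
            return A + B * t + C * t**2 + D * t**3 + E / t**2

        def fit_nasa_cp(T, cp):
            """Least-squares fit of the 5 NASA-polynomial Cp coefficients:
            Cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4."""
            X = np.vander(T, 5, increasing=True)      # columns: 1, T, T^2, T^3, T^4
            a, *_ = np.linalg.lstsq(X, cp / R, rcond=None)
            return a

        # placeholder Shomate coefficients (illustrative only, not NIST ozone data)
        A, B, C, D, E = 21.0, 79.0, -66.0, 22.0, -0.16
        T = np.linspace(300.0, 1000.0, 50)
        a = fit_nasa_cp(T, cp_shomate(T, A, B, C, D, E))
        cp_fit = R * np.vander(T, 5, increasing=True) @ a
        print("a1..a5 =", np.array2string(a, precision=3))
        print("max Cp error: %.3e J/mol/K"
              % np.max(np.abs(cp_fit - cp_shomate(T, A, B, C, D, E))))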

  14. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with the UTCHEM production output. Task 4 covers validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.
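
    The experimental-design/response-surface idea in item (1) can be illustrated with a small quadratic response-surface fit: a handful of coded design points stand in for simulator runs, a synthetic recovery response is generated purely to exercise the fit, and the fitted surface is queried for an apparent optimum. Variable names and numbers are hypothetical and unrelated to UTCHEM.

        import numpy as np

        def quadratic_design_matrix(x1, x2):
            """Full quadratic response-surface terms: 1, x1, x2, x1*x2, x1^2, x2^2."""
            return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

        # hypothetical two-factor central-composite-like design (coded units)
        x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.4, 1.4, 0.0])
        x2 = np.array([-1, 1, -1, 1, 0, -1.4, 1.4, 0.0, 0.0, 0.0])

        # synthetic "simulated oil recovery" used only to exercise the fit
        rng = np.random.default_rng(3)
        recovery = 40 + 6 * x1 + 3 * x2 - 4 * x1**2 - 2 * x2**2 + rng.normal(0, 0.5, x1.size)

        beta, *_ = np.linalg.lstsq(quadratic_design_matrix(x1, x2), recovery, rcond=None)

        # query the fitted response surface on a grid to locate the apparent optimum
        g1, g2 = np.meshgrid(np.linspace(-1.4, 1.4, 29), np.linspace(-1.4, 1.4, 29))
        surface = quadratic_design_matrix(g1.ravel(), g2.ravel()) @ beta
        best = np.argmax(surface)
        print("fitted coefficients:", np.round(beta, 2))
        print("apparent optimum at x1=%.2f, x2=%.2f" % (g1.ravel()[best], g2.ravel()[best]))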

  15. A Framework to Design and Optimize Chemical Flooding Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with the UTCHEM production output. Task 4 covers validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.

  16. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with the UTCHEM production output. Task 4 covers validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.

  17. Development of capability for microtopography-resolving simulations of hydrologic processes in permafrost affected regions

    NASA Astrophysics Data System (ADS)

    Painter, S.; Moulton, J. D.; Berndt, M.; Coon, E.; Garimella, R.; Lewis, K. C.; Manzini, G.; Mishra, P.; Travis, B. J.; Wilson, C. J.

    2012-12-01

    The frozen soils of the Arctic and subarctic regions contain vast amounts of stored organic carbon. This carbon is vulnerable to release to the atmosphere as temperatures warm and permafrost degrades. Understanding the response of the subsurface and surface hydrologic system to degrading permafrost is key to understanding the rate, timing, and chemical form of potential carbon releases to the atmosphere. Simulating the hydrologic system in degrading permafrost regions is challenging because of the potential for topographic evolution and associated drainage network reorganization as permafrost thaws and massive ground ice melts. The critical process models required for simulating hydrology include subsurface thermal hydrology of freezing/thawing soils, thermal processes within ice wedges, mechanical deformation processes, overland flow, and surface energy balances including snow dynamics. A new simulation tool, the Arctic Terrestrial Simulator (ATS), is being developed to simulate these coupled processes. The computational infrastructure must accommodate fully unstructured grids that track evolving topography, allow accurate solutions on distorted grids, provide robust and efficient solutions on highly parallel computer architectures, and enable flexibility in the strategies for coupling among the various processes. The ATS is based on Amanzi (Moulton et al. 2012), an object-oriented multi-process simulator written in C++ that provides much of the necessary computational infrastructure. Status and plans for the ATS including major hydrologic process models and validation strategies will be presented. Highly parallel simulations of overland flow using high-resolution digital elevation maps of polygonal patterned ground landscapes demonstrate the feasibility of the approach. Simulations coupling three-phase subsurface thermal hydrology with a simple thaw-induced subsidence model illustrate the strong feedbacks among the processes. D. Moulton, M. Berndt, M. Day, J. Meza, et al., High-Level Design of Amanzi, the Multi-Process High Performance Computing Simulator, Technical Report ASCEM-HPC-2011-03-1, DOE Environmental Management, 2012.

  18. PopGen Fishbowl: A Free Online Simulation Model of Microevolutionary Processes

    ERIC Educational Resources Information Center

    Jones, Thomas C.; Laughlin, Thomas F.

    2010-01-01

    Natural selection and other components of evolutionary theory are known to be particularly challenging concepts for students to understand. To help illustrate these concepts, we developed a simulation model of microevolutionary processes. The model features all the components of Hardy-Weinberg theory, with population size, selection, gene flow,…
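
    The microevolutionary machinery described here (Hardy-Weinberg genotype frequencies plus population size, selection and gene flow) can be sketched in a few lines; the sketch below is a generic illustration with arbitrary parameter values, not the PopGen Fishbowl implementation.

        import numpy as np

        def simulate_allele_freq(p0=0.5, pop_size=100, generations=200,
                                 fitness=(1.0, 1.0, 0.9), migration=0.01, p_migrant=0.5,
                                 seed=0):
            """Track the frequency p of allele A under selection (genotype fitnesses
            w_AA, w_Aa, w_aa), one-way gene flow and binomial genetic drift."""
            rng = np.random.default_rng(seed)
            w_AA, w_Aa, w_aa = fitness
            p = p0
            history = [p]
            for _ in range(generations):
                q = 1.0 - p
                w_bar = p * p * w_AA + 2 * p * q * w_Aa + q * q * w_aa
                p = (p * p * w_AA + p * q * w_Aa) / w_bar          # selection
                p = (1 - migration) * p + migration * p_migrant     # gene flow
                p = rng.binomial(2 * pop_size, p) / (2 * pop_size)  # drift
                history.append(p)
            return history

        traj = simulate_allele_freq()
        print("final frequency of allele A: %.2f" % traj[-1])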

  19. Simulation modeling of forest landscape disturbances: An overview

    Treesearch

    Ajith H. Perera; Brian R. Sturtevant; Lisa J. Buse

    2015-01-01

    Quantification of ecological processes and formulation of the mathematical expressions that describe those processes in computer models has been a cornerstone of landscape ecology research and its application. Consequently, the body of publications on simulation models in landscape ecology has grown rapidly in recent decades. This trend is also evident in the subfield...

  20. Computer Simulation in Predicting Biochemical Processes and Energy Balance at WWTPs

    NASA Astrophysics Data System (ADS)

    Drewnowski, Jakub; Zaborowska, Ewa; Hernandez De Vega, Carmen

    2018-02-01

    Nowadays, the use of mathematical models and computer simulation allows analysis of many different technological solutions and testing of various scenarios in a short time and at low cost, in order to simulate a scenario under conditions typical of the real system and to help find the best solution in the design or operation process. The aim of the study was to evaluate different concepts of biochemical process and energy balance modelling using the simulation platform GPS-x and the comprehensive model Mantis2. The paper presents an example of the calibration and validation processes in the biological reactor as well as scenarios showing the influence of operational parameters on the WWTP energy balance. The results of batch tests and a full-scale campaign obtained in earlier work were used to predict biochemical and operational parameters in a newly developed plant model. The model was extended with sludge treatment devices, including an anaerobic digester. Primary sludge removal efficiency was found to be a significant factor determining biogas production and hence renewable energy production in cogeneration. Water and wastewater utilities, which run and control WWTPs, are interested in optimizing the process in order to protect the environment, save their budget and decrease pollutant emissions to water and air. In this context, computer simulation can be the easiest and a very useful tool to improve efficiency without interfering with the actual process performance.
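
    As a back-of-the-envelope companion to the energy-balance theme, the sketch below estimates the electricity recovered in cogeneration from biogas produced by digesting primary sludge. Every figure (sludge load, volatile-solids destruction, biogas yield, methane content, heating value, CHP efficiency) is an illustrative assumption, not a value from the GPS-x/Mantis2 study.

        # all figures are illustrative assumptions, not plant data from the study
        primary_sludge_vs = 2000.0        # kg volatile solids per day sent to the digester
        vs_destruction = 0.45             # fraction of VS destroyed in anaerobic digestion
        specific_biogas_yield = 0.9       # m3 biogas per kg VS destroyed
        methane_fraction = 0.62           # CH4 content of the biogas
        ch4_lower_heating_value = 35.8e6  # J per m3 CH4
        chp_electrical_efficiency = 0.35  # electrical efficiency of the cogeneration unit

        biogas = primary_sludge_vs * vs_destruction * specific_biogas_yield        # m3/day
        energy_in = biogas * methane_fraction * ch4_lower_heating_value            # J/day
        electricity_kwh = energy_in * chp_electrical_efficiency / 3.6e6            # kWh/day

        print(f"biogas production: {biogas:.0f} m3/day")
        print(f"electricity from cogeneration: {electricity_kwh:.0f} kWh/day")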

  1. Real-time simulation of a spiking neural network model of the basal ganglia circuitry using general purpose computing on graphics processing units.

    PubMed

    Igarashi, Jun; Shouno, Osamu; Fukai, Tomoki; Tsujino, Hiroshi

    2011-11-01

    Real-time simulation of a biologically realistic spiking neural network is necessary for evaluation of its capacity to interact with real environments. However, the real-time simulation of such a neural network is difficult due to its high computational costs that arise from two factors: (1) vast network size and (2) the complicated dynamics of biologically realistic neurons. In order to address these problems, mainly the latter, we chose to use general purpose computing on graphics processing units (GPGPUs) for simulation of such a neural network, taking advantage of the powerful computational capability of a graphics processing unit (GPU). As a target for real-time simulation, we used a model of the basal ganglia that has been developed according to electrophysiological and anatomical knowledge. The model consists of heterogeneous populations of 370 spiking model neurons, including computationally heavy conductance-based models, connected by 11,002 synapses. Simulation of the model has not yet been performed in real-time using a general computing server. By parallelization of the model on the NVIDIA Geforce GTX 280 GPU in data-parallel and task-parallel fashion, faster-than-real-time simulation was robustly realized with only one-third of the GPU's total computational resources. Furthermore, we used the GPU's full computational resources to perform faster-than-real-time simulation of three instances of the basal ganglia model; these instances consisted of 1100 neurons and 33,006 synapses and were synchronized at each calculation step. Finally, we developed software for simultaneous visualization of faster-than-real-time simulation output. These results suggest the potential power of GPGPU techniques in real-time simulation of realistic neural networks. Copyright © 2011 Elsevier Ltd. All rights reserved.
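
    The data-parallel pattern that makes GPGPU simulation attractive, i.e. updating every neuron of a population with the same arithmetic in lock-step, can be conveyed with a vectorized NumPy sketch of leaky integrate-and-fire neurons. This only illustrates the parallelization pattern; it is not the conductance-based basal ganglia model of the paper, and all constants are invented.

        import numpy as np

        # hypothetical population: far simpler than the paper's conductance-based neurons
        N = 370                 # number of neurons
        DT = 0.1                # time step, ms
        TAU, V_REST, V_THRESH, V_RESET = 20.0, -65.0, -50.0, -65.0

        rng = np.random.default_rng(7)
        weights = rng.normal(0.0, 0.3, (N, N)) * (rng.random((N, N)) < 0.08)  # sparse coupling
        v = np.full(N, V_REST)
        spikes = np.zeros(N, dtype=bool)
        spike_count = np.zeros(N)

        for _ in range(int(1000 / DT)):             # 1 s of simulated time
            i_syn = weights @ spikes                # synaptic input from the previous step's spikes
            i_ext = rng.normal(1.0, 0.5, N)         # noisy external drive (arbitrary units)
            # every neuron runs the same arithmetic, so this update maps directly onto GPU threads
            v = v + DT * (-(v - V_REST) / TAU + i_syn + i_ext)
            spikes = v >= V_THRESH
            v[spikes] = V_RESET
            spike_count += spikes

        print("mean rate over 1 s: %.1f spikes/s" % spike_count.mean())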

  2. A Hybrid Multiscale Framework for Subsurface Flow and Transport Simulations

    DOE PAGES

    Scheibe, Timothy D.; Yang, Xiaofan; Chen, Xingyuan; ...

    2015-06-01

    Extensive research efforts have been invested in reducing model errors to improve the predictive ability of biogeochemical earth and environmental system simulators, with applications ranging from contaminant transport and remediation to impacts of biogeochemical elemental cycling (e.g., carbon and nitrogen) on local ecosystems and regional to global climate. While the bulk of this research has focused on improving model parameterizations in the face of observational limitations, the more challenging type of model error/uncertainty to identify and quantify is model structural error which arises from incorrect mathematical representations of (or failure to consider) important physical, chemical, or biological processes, properties, or system states in model formulations. While improved process understanding can be achieved through scientific study, such understanding is usually developed at small scales. Process-based numerical models are typically designed for a particular characteristic length and time scale. For application-relevant scales, it is generally necessary to introduce approximations and empirical parameterizations to describe complex systems or processes. This single-scale approach has been the best available to date because of limited understanding of process coupling combined with practical limitations on system characterization and computation. While computational power is increasing significantly and our understanding of biological and environmental processes at fundamental scales is accelerating, using this information to advance our knowledge of the larger system behavior requires the development of multiscale simulators. Accordingly there has been much recent interest in novel multiscale methods in which microscale and macroscale models are explicitly coupled in a single hybrid multiscale simulation. A limited number of hybrid multiscale simulations have been developed for biogeochemical earth systems, but they mostly utilize application-specific and sometimes ad-hoc approaches for model coupling. We are developing a generalized approach to hierarchical model coupling designed for high-performance computational systems, based on the Swift computing workflow framework. In this presentation we will describe the generalized approach and provide two use cases: 1) simulation of a mixing-controlled biogeochemical reaction coupling pore- and continuum-scale models, and 2) simulation of biogeochemical impacts of groundwater – river water interactions coupling fine- and coarse-grid model representations. This generalized framework can be customized for use with any pair of linked models (microscale and macroscale) with minimal intrusiveness to the at-scale simulators. It combines a set of python scripts with the Swift workflow environment to execute a complex multiscale simulation utilizing an approach similar to the well-known Heterogeneous Multiscale Method. User customization is facilitated through user-provided input and output file templates and processing function scripts, and execution within a high-performance computing environment is handled by Swift, such that minimal to no user modification of at-scale codes is required.

  3. JAMS - a software platform for modular hydrological modelling

    NASA Astrophysics Data System (ADS)

    Kralisch, Sven; Fischer, Christian

    2015-04-01

    Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand for an ever-increasing integration of data and process knowledge in corresponding simulation models. Software frameworks that allow for a seamless creation of integrated models based on less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an Open-Source software framework that has been especially designed to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach for representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models were implemented and successfully applied during the last years. We will present the JAMS core concepts and give an overview of models, simulation components and support tools available for that framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.

  4. Study on wet scavenging of atmospheric pollutants in south Brazil

    NASA Astrophysics Data System (ADS)

    Wiegand, Flavio; Pereira, Felipe Norte; Teixeira, Elba Calesso

    2011-09-01

    The present paper presents the study of in-cloud and below-cloud SO₂ and SO₄²⁻ scavenging processes by applying numerical models in the Candiota region, located in the state of Rio Grande do Sul, South Brazil. The BRAMS (Brazilian Regional Atmospheric Modeling System) model was applied to simulate the vertical structure of the clouds, and the B.V.2 (Below-Cloud Beheng Version 2) scavenging model was applied to simulate in-cloud and below-cloud scavenging processes of the pollutants SO₂ and SO₄²⁻. Five events in 2004 were selected for this study and were sampled at the Candiota Airport station. The concentrations of SO₂ and SO₄²⁻ sampled in the air and the simulated meteorological parameters of rainfall episodes were used as input data in the B.V.2, which simulates raindrop interactions associated with the scavenging process. Results for the Candiota region showed that in-cloud scavenging processes were more significant than below-cloud scavenging processes for two of the five events studied, with a contribution of approximately 90-100% of SO₂ and SO₄²⁻ concentrations in rainwater. A few adjustments to the original version of B.V.2 were made to allow simulation of scavenging processes in several types of clouds, not only cumulus humilis and cumulus congestus.
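
    The below-cloud (washout) part of such a budget is commonly represented as first-order decay of the air concentration, dC/dt = -ΛC, with a scavenging coefficient Λ that depends on rain intensity. The sketch below integrates this decay over a rain event for assumed values of C0, Λ and duration; it is only a schematic illustration of the washout term, not the B.V.2 model.

        import numpy as np

        def washout(c0, scavenging_coeff, duration_s):
            """First-order below-cloud washout: dC/dt = -Lambda * C.
            Returns the remaining air concentration and the scavenged fraction."""
            c = c0 * np.exp(-scavenging_coeff * duration_s)
            return c, 1.0 - c / c0

        # assumed (illustrative) values: SO2 at 10 ug/m3, Lambda = 1e-4 s^-1, 1 h of rain
        c_air, fraction = washout(10.0, 1.0e-4, 3600.0)
        print("air concentration after rain: %.2f ug/m3 (%.0f%% scavenged)"
              % (c_air, 100 * fraction))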

  5. Modelling and simulation of the consolidation behavior during thermoplastic prepreg composites forming process

    NASA Astrophysics Data System (ADS)

    Xiong, H.; Hamila, N.; Boisse, P.

    2017-10-01

    Pre-impregnated thermoplastic composites have recently attracted increasing interest in the automotive industry for their excellent mechanical properties and their rapid-cycle manufacturing process. Modelling and numerical simulation of forming processes for composite parts with complex geometry is necessary to predict and optimize manufacturing practices, especially the consolidation effects. A viscoelastic relaxation model is proposed to characterize the consolidation behavior of thermoplastic prepregs based on compaction tests over a range of temperatures. The intimate contact model is employed to predict the evolution of the consolidation, which permits prediction of the microstructure of voids present through the prepreg. Within a hyperelastic framework, several simulation tests are run by combining a newly developed solid-shell finite element with the consolidation models.
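
    A generic form often used to characterize such stress relaxation is a Prony series (generalized Maxwell model), sigma(t) = s_inf + s1·exp(-t/tau1) + s2·exp(-t/tau2). The sketch below fits such a series to a synthetic relaxation curve standing in for one compaction test; it illustrates only the fitting step under that assumed form, not the constitutive model actually proposed in the paper.

        import numpy as np
        from scipy.optimize import curve_fit

        def prony_relaxation(t, s_inf, s1, tau1, s2, tau2):
            """Two-term generalized Maxwell (Prony series) stress relaxation, MPa."""
            return s_inf + s1 * np.exp(-t / tau1) + s2 * np.exp(-t / tau2)

        # synthetic relaxation curve standing in for a compaction test at one temperature
        t = np.linspace(0.0, 300.0, 200)                        # s
        rng = np.random.default_rng(5)
        stress = prony_relaxation(t, 0.05, 0.30, 8.0, 0.15, 90.0) + rng.normal(0, 0.003, t.size)

        p0 = [0.05, 0.3, 10.0, 0.1, 100.0]                      # rough initial guess
        popt, _ = curve_fit(prony_relaxation, t, stress, p0=p0)
        print("fitted (s_inf, s1, tau1, s2, tau2):", np.round(popt, 3))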

  6. Evaluating the Credibility of Transport Processes in the Global Modeling Initiative 3D Model Simulations of Ozone Recovery

    NASA Technical Reports Server (NTRS)

    Strahan, Susan E.; Douglass, Anne R.

    2003-01-01

    The Global Modeling Initiative has integrated two 35-year simulations of an ozone recovery scenario with an offline chemistry and transport model using two different meteorological inputs. Physically based diagnostics, derived from satellite and aircraft data sets, are described and then used to evaluate the realism of temperature and transport processes in the simulations. Processes evaluated include barrier formation in the subtropics and polar regions, and extratropical wave-driven transport. Some diagnostics are especially relevant to simulation of lower stratospheric ozone, but most are applicable to any stratospheric simulation. The temperature evaluation, which is relevant to gas phase chemical reactions, showed that both sets of meteorological fields have near climatological values at all latitudes and seasons at 30 hPa and below. Both simulations showed weakness in upper stratospheric wave driving. The simulation using input from a general circulation model (GMI-GCM) showed a very good residual circulation in the tropics and northern hemisphere. The simulation with input from a data assimilation system (GMI-DAS) performed better in the midlatitudes than at high latitudes. Neither simulation forms a realistic barrier at the vortex edge, leading to uncertainty in the fate of ozone-depleted vortex air. Overall, tracer transport in the offline GMI-GCM has greater fidelity throughout the stratosphere than in the GMI-DAS.

  7. Surrogate safety measures from traffic simulation models

    DOT National Transportation Integrated Search

    2003-01-01

    This project investigates the potential for deriving surrogate measures of safety from existing microscopic traffic simulation models for intersections. The process of computing the measures in the simulation, extracting the required data, and summar...

  8. Simulations of ecosystem hydrological processes using a unified multi-scale model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaofan; Liu, Chongxuan; Fang, Yilin

    2015-01-01

    This paper presents a unified multi-scale model (UMSM) that we developed to simulate hydrological processes in an ecosystem containing both surface water and groundwater. The UMSM approach modifies the Navier–Stokes equation by adding a Darcy force term to formulate a single set of equations to describe fluid momentum and uses a generalized equation to describe fluid mass balance. The advantage of the approach is that the single set of equations can describe hydrological processes in both surface water and groundwater, where different models are traditionally required to simulate fluid flow. This feature of the UMSM significantly facilitates modelling of hydrological processes in ecosystems, especially at locations where soil/sediment may be frequently inundated and drained in response to precipitation and regional hydrological and climate changes. In this paper, the UMSM was benchmarked against WASH123D, a model commonly used for simulating coupled surface water and groundwater flow. The Disney Wilderness Preserve (DWP) site at Kissimmee, Florida, where active field monitoring and measurements are ongoing to understand hydrological and biogeochemical processes, was then used as an example to illustrate the UMSM modelling approach. The simulation results demonstrated that the DWP site is subject to frequent changes in soil saturation, the geometry and volume of surface water bodies, and groundwater and surface water exchange. All the hydrological phenomena in surface water and groundwater components, including inundation and draining, river bank flow, groundwater table change, soil saturation, hydrological interactions between groundwater and surface water, and the migration of surface water and groundwater interfaces, can be simultaneously simulated using the UMSM. Overall, the UMSM offers a cross-scale approach that is particularly suitable for simulating coupled surface and ground water flow in ecosystems with strong surface water and groundwater interactions.

  9. NASA Handbook for Models and Simulations: An Implementation Guide for NASA-STD-7009

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2013-01-01

    The purpose of this Handbook is to provide technical information, clarification, examples, processes, and techniques to help institute good modeling and simulation practices in the National Aeronautics and Space Administration (NASA). As a companion guide to NASA-STD-7009, Standard for Models and Simulations, this Handbook provides a broader scope of information than may be included in a Standard and promotes good practices in the production, use, and consumption of NASA modeling and simulation products. NASA-STD-7009 specifies what a modeling and simulation activity shall or should do (in the requirements) but does not prescribe how the requirements are to be met, which varies with the specific engineering discipline, or who is responsible for complying with the requirements, which depends on the size and type of project. A guidance document, which is not constrained by the requirements of a Standard, is better suited to address these additional aspects and provide necessary clarification. This Handbook stems from the Space Shuttle Columbia Accident Investigation (2003), which called for Agency-wide improvements in the "development, documentation, and operation of models and simulations" that subsequently elicited additional guidance from the NASA Office of the Chief Engineer to include "a standard method to assess the credibility of the models and simulations." General methods applicable across the broad spectrum of model and simulation (M&S) disciplines were sought to help guide the modeling and simulation processes within NASA and to provide for consistent reporting of M&S activities and analysis results. From this, the standardized process for the M&S activity was developed. The major contents of this Handbook are the implementation details of the general M&S requirements of NASA-STD-7009, including explanations, examples, and suggestions for improving the credibility assessment of an M&S-based analysis.

  10. Simulating an underwater vehicle self-correcting guidance system with Simulink

    NASA Astrophysics Data System (ADS)

    Fan, Hui; Zhang, Yu-Wen; Li, Wen-Zhe

    2008-09-01

    Underwater vehicles have already adopted self-correcting directional guidance algorithms based on multi-beam self-guidance systems, without waiting for research to determine the most effective algorithms. The main challenges facing research on these guidance systems have been effective modeling of the guidance algorithm and a means of analyzing the simulation results. A simulation structure based on Simulink that dealt with both issues was proposed. Initially, a mathematical model of the relative motion between the vehicle and the target was developed, which was then encapsulated as a subsystem. Next, the steps for constructing a model of the self-correcting guidance algorithm based on the Stateflow module were examined in detail. Finally, a 3-D model of the vehicle and target was created in VRML, and by processing the mathematical results the model was shown moving in a visual environment. This process gives more intuitive results for analyzing the simulation. The results showed that the simulation structure performs well. The simulation program heavily used modularization and encapsulation, so it has broad applicability to simulations of other dynamic systems.

  11. FE Simulation Models for Hot Stamping an Automobile Component with Tailor-Welded High-Strength Steels

    NASA Astrophysics Data System (ADS)

    Tang, Bingtao; Wang, Qiaoling; Wei, Zhaohui; Meng, Xianju; Yuan, Zhengjun

    2016-05-01

    Ultra-high strength in sheet metal parts can be achieved with the hot stamping process. To improve crash performance and save vehicle weight, it is necessary to produce components with tailored properties. The use of tailor-welded high-strength steel is a relatively new hot stamping approach for saving weight and obtaining the desired local stiffness and crash performance. The simulation of hot stamping boron steel, especially tailor-welded blank (TWB) stamping, is more complex and challenging. Detailed information about the thermal/mechanical properties of the tools and sheet materials, heat transfer, and friction between the deforming material and the tools is required. In this study, the boron-manganese steel B1500HS and the high-strength low-alloy steel B340LA are tailor welded and hot stamped. In order to precisely simulate the hot stamping process, modeling and simulation of hot stamping tailor-welded high-strength steels, including phase transformation modeling, thermal modeling, and thermal-mechanical modeling, is investigated. The welding zone of the tailor-welded blanks must also be described with sufficient accuracy in terms of its thermal, mechanical, and metallurgical parameters. An FE simulation model using TWBs with a thickness combination of 1.6 mm boron steel and 1.2 mm low-alloy steel is established. In order to evaluate the mechanical properties of the hot stamped automotive component (mini b-pillar), the hardness and microstructure in each region are investigated. Comparisons between simulated results and experimental observations show the reliability of the thermo-mechanical and metallurgical modeling strategies for the TWB hot stamping process.

  12. Material model validation for laser shock peening process simulation

    NASA Astrophysics Data System (ADS)

    Amarchinta, H. K.; Grandhi, R. V.; Langer, K.; Stargel, D. S.

    2009-01-01

    Advanced mechanical surface enhancement techniques have been used successfully to increase the fatigue life of metallic components. These techniques impart deep compressive residual stresses into the component to counter potentially damage-inducing tensile stresses generated under service loading. Laser shock peening (LSP) is an advanced mechanical surface enhancement technique used predominantly in the aircraft industry. To reduce costs and make the technique available on a large-scale basis for industrial applications, simulation of the LSP process is required. Accurate simulation of the LSP process is a challenging task, because the process has many parameters such as laser spot size, pressure profile and material model that must be precisely determined. This work focuses on investigating the appropriate material model that could be used in simulation and design. In the LSP process material is subjected to strain rates of 10⁶ s⁻¹, which is very high compared with conventional strain rates. The importance of an accurate material model increases because the material behaves significantly different at such high strain rates. This work investigates the effect of multiple nonlinear material models for representing the elastic-plastic behavior of materials. Elastic perfectly plastic, Johnson-Cook and Zerilli-Armstrong models are used, and the performance of each model is compared with available experimental results.
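
    Of the models compared, the Johnson-Cook flow stress is the most widely quoted: sigma = (A + B·eps^n)(1 + C·ln(eps_dot/eps_dot0))(1 - T*^m), where eps_dot0 is a reference strain rate and T* is the homologous temperature. The sketch below evaluates it at a quasi-static and an LSP-like strain rate; the parameter values are placeholders, not calibrated data for any particular alloy.

        import numpy as np

        def johnson_cook_stress(strain, strain_rate, temp,
                                A, B, n, C, m,
                                ref_rate=1.0, t_room=293.0, t_melt=1800.0):
            """Johnson-Cook flow stress:
            sigma = (A + B*eps^n) * (1 + C*ln(eps_dot/eps_dot0)) * (1 - T*^m)."""
            t_star = np.clip((temp - t_room) / (t_melt - t_room), 0.0, 1.0)
            return ((A + B * strain**n)
                    * (1.0 + C * np.log(strain_rate / ref_rate))
                    * (1.0 - t_star**m))

        # placeholder parameters (stress in MPa); not calibrated values for a real alloy
        params = dict(A=350.0, B=300.0, n=0.3, C=0.02, m=1.0)
        for rate in (1.0e-3, 1.0e6):   # quasi-static vs. LSP-like strain rate, s^-1
            s = johnson_cook_stress(strain=0.05, strain_rate=rate, temp=293.0, **params)
            print(f"flow stress at {rate:.0e} 1/s: {s:.0f} MPa")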

  13. Different modelling approaches to evaluate nitrogen transport and turnover at the watershed scale

    NASA Astrophysics Data System (ADS)

    Epelde, Ane Miren; Antiguedad, Iñaki; Brito, David; Jauch, Eduardo; Neves, Ramiro; Garneau, Cyril; Sauvage, Sabine; Sánchez-Pérez, José Miguel

    2016-08-01

    This study presents the simulation of hydrological processes and nutrient transport and turnover processes using two integrated numerical models: the Soil and Water Assessment Tool (SWAT) (Arnold et al., 1998), an empirical and semi-distributed numerical model, and Modelo Hidrodinâmico (MOHID) (Neves, 1985), a physics-based and fully distributed numerical model. This work shows that both models satisfactorily reproduce water and nitrate export at the watershed scale on an annual and daily basis, with MOHID providing slightly better results. At the watershed scale, both SWAT and MOHID simulated the denitrification amount similarly and satisfactorily. However, as the MOHID numerical model was the only one able to reproduce adequately the spatial variation of the soil hydrological conditions and water table level fluctuations, it proved to be the only model capable of reproducing the spatial variation of the nutrient cycling processes that depend on the soil hydrological conditions, such as denitrification. This demonstrates the strength of fully distributed, physics-based models for simulating the spatial variability of nutrient cycling processes that depend on the hydrological conditions of the soils.

  14. SSEM: A model for simulating runoff and erosion of saline-sodic soil slopes under coastal reclamation

    NASA Astrophysics Data System (ADS)

    Liu, Dongdong; She, Dongli

    2018-06-01

    Current physically based erosion models do not carefully consider the dynamic variation of soil properties during rainfall and are unable to simulate saline-sodic soil slope erosion processes. The aim of this work was to build a complete model framework, SSEM, to simulate runoff and erosion processes for saline-sodic soils by coupling dynamic saturated hydraulic conductivity Ks and soil erodibility Kτ. Sixty rainfall simulation experiments (2 soil textures × 5 sodicity levels × 2 slope gradients × 3 duplicates) provided data for model calibration and validation. SSEM worked very well for simulating the runoff and erosion processes of saline-sodic silty clay. The runoff and erosion processes of saline-sodic silt loam were more complex than those of non-saline soils or soils with higher clay contents; thus, SSEM did not perform very well for some validation events. We further examined the model performance of four concepts: dynamic Ks and Kτ (Case 1, SSEM), dynamic Ks and constant Kτ (Case 2), constant Ks and dynamic Kτ (Case 3) and constant Ks and constant Kτ (Case 4). The results demonstrated that the model that considers dynamic variations in soil saturated hydraulic conductivity and soil erodibility provides more reasonable runoff and erosion predictions for saline-sodic soils.

  15. Simulation of groundwater flow in the glacial aquifer system of northeastern Wisconsin with variable model complexity

    USGS Publications Warehouse

    Juckem, Paul F.; Clark, Brian R.; Feinstein, Daniel T.

    2017-05-04

    The U.S. Geological Survey, National Water-Quality Assessment seeks to map estimated intrinsic susceptibility of the glacial aquifer system of the conterminous United States. Improved understanding of the hydrogeologic characteristics that explain spatial patterns of intrinsic susceptibility, commonly inferred from estimates of groundwater age distributions, is sought so that methods used for the estimation process are properly equipped. An important step beyond identifying relevant hydrogeologic datasets, such as glacial geology maps, is to evaluate how incorporation of these resources into process-based models using differing levels of detail could affect resulting simulations of groundwater age distributions and, thus, estimates of intrinsic susceptibility. This report describes the construction and calibration of three groundwater-flow models of northeastern Wisconsin that were developed with differing levels of complexity to provide a framework for subsequent evaluations of the effects of process-based model complexity on estimations of groundwater age distributions for withdrawal wells and streams. Preliminary assessments, which focused on the effects of model complexity on simulated water levels and base flows in the glacial aquifer system, illustrate that simulation of vertical gradients using multiple model layers improves simulated heads more in low-permeability units than in high-permeability units. Moreover, simulation of heterogeneous hydraulic conductivity fields in coarse-grained and some fine-grained glacial materials produced a larger improvement in simulated water levels in the glacial aquifer system compared with simulation of uniform hydraulic conductivity within zones. The relation between base flows and model complexity was less clear; however, the relation generally seemed to follow a similar pattern as water levels. Although increased model complexity resulted in improved calibrations, future application of the models using simulated particle tracking is anticipated to evaluate if these model design considerations are similarly important for understanding the primary modeling objective - to simulate reasonable groundwater age distributions.

  16. Multithreaded Stochastic PDES for Reactions and Diffusions in Neurons.

    PubMed

    Lin, Zhongwei; Tropper, Carl; Mcdougal, Robert A; Patoary, Mohammand Nazrul Ishlam; Lytton, William W; Yao, Yiping; Hines, Michael L

    2017-07-01

    Cells exhibit stochastic behavior when the number of molecules is small. Hence a stochastic reaction-diffusion simulator capable of working at scale can provide a more accurate view of molecular dynamics within the cell. This paper describes a parallel discrete event simulator, Neuron Time Warp-Multi Thread (NTW-MT), developed for the simulation of reaction diffusion models of neurons. To the best of our knowledge, this is the first parallel discrete event simulator oriented towards stochastic simulation of chemical reactions in a neuron. The simulator was developed as part of the NEURON project. NTW-MT is optimistic and thread-based, which attempts to capitalize on multi-core architectures used in high performance machines. It makes use of a multi-level queue for the pending event set and a single roll-back message in place of individual anti-messages to disperse contention and decrease the overhead of processing rollbacks. Global Virtual Time is computed asynchronously both within and among processes to get rid of the overhead for synchronizing threads. Memory usage is managed in order to avoid locking and unlocking when allocating and de-allocating memory and to maximize cache locality. We verified our simulator on a calcium buffer model. We examined its performance on a calcium wave model, comparing it to the performance of a process based optimistic simulator and a threaded simulator which uses a single priority queue for each thread. Our multi-threaded simulator is shown to achieve superior performance to these simulators. Finally, we demonstrated the scalability of our simulator on a larger CICR model and a more detailed CICR model.

  17. The Soil Model Development and Intercomparison Panel (SoilMIP) of the International Soil Modeling Consortium (ISMC)

    NASA Astrophysics Data System (ADS)

    Vanderborght, Jan; Priesack, Eckart

    2017-04-01

    The Soil Model Development and Intercomparison Panel (SoilMIP) is an initiative of the International Soil Modeling Consortium. Its mission is to foster the further development of soil models that can predict soil functions and their changes (i) due to soil use and land management and (ii) due to external impacts of climate change and pollution. Since soil functions and soil threats are diverse but linked with each other, the overall aim is to develop holistic models that represent the key functions of the soil system and the links between them. These models should be scaled up and integrated into terrestrial system models that describe the feedbacks between processes in the soil and the other terrestrial compartments. We propose and illustrate a few steps that could be taken to achieve these goals. A first step is the development of scenarios that compare simulations by models that predict the same or different soil services. Scenarios can be considered at three different levels of comparison: scenarios that compare the numerics (accuracy but also speed) of models, scenarios that compare the effect of differences in process descriptions, and scenarios that compare simulations with experimental data. A second step involves the derivation of metrics or summary statistics that effectively compare model simulations and disentangle parameterization from model-concept differences. These metrics can be used to evaluate how more complex model simulations can be represented by simpler models using an appropriate parameterization. A third step relates to the parameterization of models. Application of simulation models implies that appropriate model parameters have to be defined for a range of environmental conditions and locations. Spatial modelling approaches are used to derive parameter distributions. Considering that soils and their properties emerge from the interaction between physical, chemical and biological processes, the combination of spatial models with process models would lead to consistent parameter distributions and correlations and could potentially represent self-organizing processes in soils and landscapes.

  18. Exposing earth surface process model simulations to a large audience

    NASA Astrophysics Data System (ADS)

    Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform to reach audiences outside the science community is through museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors, that can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with a focus on Hurricane Katrina and Superstorm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least two model datasets a year, and will soon provide displays of global river sediment fluxes and changes in the sea-ice-free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Dataset storyboards and teacher follow-up materials associated with the simulations are developed to address common core K-12 science standards. CSDMS dataset documentation aims to make people aware of the fact that they are looking at numerical model results, that the underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical modeling as a tool to create understanding of environmental processes.

  19. A density-adaptive SPH method with kernel gradient correction for modeling explosive welding

    NASA Astrophysics Data System (ADS)

    Liu, M. B.; Zhang, Z. L.; Feng, D. L.

    2017-09-01

    Explosive welding involves processes such as the detonation of the explosive, the impact of metal structures and strong fluid-structure interaction, yet the whole process of explosive welding has not previously been well modeled. In this paper, a novel smoothed particle hydrodynamics (SPH) model is developed to simulate explosive welding. In the SPH model, a kernel gradient correction algorithm is used to achieve better computational accuracy. A density-adapting technique which can effectively treat large density ratios is also proposed. The developed SPH model is first validated by simulating a benchmark problem of one-dimensional TNT detonation and an impact welding problem. The SPH model is then successfully applied to simulate the whole process of explosive welding. It is demonstrated that the presented SPH method can capture the typical physics in explosive welding, including the explosion wave, welding surface morphology, jet flow and acceleration of the flyer plate. The welding angle obtained from the SPH simulation agrees well with that from a kinematic analysis.
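
    The kernel gradient correction can be illustrated in one dimension: a correction factor L_i = 1 / sum_j V_j (x_j - x_i) dW_ij/dx rescales the kernel gradient so that the gradient of a linear field is reproduced exactly, even where the kernel support is truncated at a boundary. The sketch below (a minimal 1-D illustration with a cubic spline kernel, not the authors' 2-D/3-D code) compares the standard and corrected SPH gradients of f(x) = 2x at the end of a particle array.

        import numpy as np

        def cubic_spline_dw(r, h):
            """Derivative dW/dr of the 1-D cubic spline kernel (normalization 2/(3h))."""
            q = np.abs(r) / h
            sigma = 2.0 / (3.0 * h)
            dwdq = np.where(q < 1.0, -3.0 * q + 2.25 * q**2,
                            np.where(q < 2.0, -0.75 * (2.0 - q)**2, 0.0))
            return sigma * dwdq / h

        def sph_gradients(x, f, h, volume):
            """Standard and gradient-corrected SPH estimates of df/dx."""
            dx = x[:, None] - x[None, :]                     # x_i - x_j
            grad_w = cubic_spline_dw(dx, h) * np.sign(dx)    # dW_ij/dx_i
            standard = (volume * (f[None, :] - f[:, None]) * grad_w).sum(axis=1)
            # 1-D correction factor L_i = 1 / sum_j V_j (x_j - x_i) dW_ij/dx_i
            L = 1.0 / (volume * (-dx) * grad_w).sum(axis=1)
            return standard, L * standard

        spacing = 0.1                                        # particle spacing (= particle volume in 1-D)
        x = np.arange(0.0, 1.0 + 1e-9, spacing)
        f = 2.0 * x                                          # linear field with exact gradient 2
        std, corr = sph_gradients(x, f, h=1.2 * spacing, volume=spacing)
        print("standard  SPH gradient at the left end: %.3f" % std[0])
        print("corrected SPH gradient at the left end: %.3f" % corr[0])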

  20. Analysis of mixed model in gear transmission based on ADAMS

    NASA Astrophysics Data System (ADS)

    Li, Xiufeng; Wang, Yabin

    2012-09-01

    Traditional methods of simulating mechanical gear drives include the gear pair method and the solid-to-solid contact method. The former has higher solution efficiency but lower accuracy; the latter usually obtains higher precision, but the calculation process is complex and does not converge easily. Currently, most research is focused on the description of geometric models and the definition of boundary conditions, but neither approach solves these problems fundamentally. To improve simulation efficiency while ensuring high accuracy, a mixed model method, which uses gear tooth profiles in place of the solid gear to simulate gear movement, is presented. In the modeling process, the solid models of the mechanism are first built in SolidWorks; then the point coordinates of the outline curves of the gear are collected using the SolidWorks API and fitted curves are created in Adams based on these point coordinates; next, the positions of the fitted curves are adjusted according to the position of the contact area; finally, the loading conditions, boundary conditions and simulation parameters are defined. The method provides gear shape information through tooth profile curves, simulates the meshing process through tooth-profile curve-to-curve contact, and supplies mass and inertia data via the solid gear models. The simulation process combines the two models to complete the gear driving analysis. In order to verify the validity of the presented method, both a theoretical derivation and a numerical simulation of a runaway escapement are conducted. The results show that the computational efficiency of the mixed model method is 1.4 times that of the traditional solid-to-solid contact method, while the simulation results are closer to theoretical calculations. Consequently, the mixed model method has high application value for the study of the dynamics of gear mechanisms.
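
    The "point coordinates of outline curves" step can be illustrated with the canonical tooth-flank curve, the involute of the base circle, x(t) = r_b(cos t + t·sin t), y(t) = r_b(sin t - t·cos t). The sketch below samples such points from the base circle to the tip circle for a hypothetical spur gear (module, tooth number and pressure angle are invented), in the spirit of exporting a profile for curve-to-curve contact.

        import numpy as np

        def involute_flank(module, teeth, pressure_angle_deg=20.0, n_points=50):
            """Sample points on one involute tooth flank, from the base circle
            out to the addendum (tip) circle."""
            r_pitch = 0.5 * module * teeth
            r_base = r_pitch * np.cos(np.radians(pressure_angle_deg))
            r_tip = r_pitch + module                      # standard addendum = 1 module
            # involute parameter t at radius r: r = r_base * sqrt(1 + t^2)
            t_max = np.sqrt((r_tip / r_base) ** 2 - 1.0)
            t = np.linspace(0.0, t_max, n_points)
            x = r_base * (np.cos(t) + t * np.sin(t))
            y = r_base * (np.sin(t) - t * np.cos(t))
            return np.column_stack([x, y])

        # hypothetical spur gear: module 2 mm, 30 teeth
        pts = involute_flank(module=2.0, teeth=30)
        print("first point (on base circle):", np.round(pts[0], 3))
        print("last point  (at tip circle): ", np.round(pts[-1], 3))
        print("tip radius check: %.3f mm" % np.hypot(*pts[-1]))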

  1. Assessment of input uncertainty by seasonally categorized latent variables using SWAT

    USDA-ARS?s Scientific Manuscript database

    Watershed processes have been explored with sophisticated simulation models for the past few decades. It has been stated that uncertainty attributed to alternative sources such as model parameters, forcing inputs, and measured data should be incorporated during the simulation process. Among varyin...

  2. Evaluating crown fire rate of spread predictions from physics-based models

    Treesearch

    C. M. Hoffman; J. Ziegler; J. Canfield; R. R. Linn; W. Mell; C. H. Sieg; F. Pimont

    2015-01-01

    Modeling the behavior of crown fires is challenging due to the complex set of coupled processes that drive the characteristics of a spreading wildfire and the large range of spatial and temporal scales over which these processes occur. Detailed physics-based modeling approaches such as FIRETEC and the Wildland Urban Interface Fire Dynamics Simulator (WFDS) simulate...

  3. Dynamic Simulation of a Helium Liquefier

    NASA Astrophysics Data System (ADS)

    Maekawa, R.; Ooba, K.; Nobutoki, M.; Mito, T.

    2004-06-01

    Dynamic behavior of a helium liquefier has been studied in detail with a Cryogenic Process REal-time SimulaTor (C-PREST) at the National Institute for Fusion Science (NIFS). C-PREST is being developed to integrate large-scale helium cryogenic plant design, operation and maintenance for establishing an optimum process. As a first step, a simulation of cooldown to 4.5 K with the helium liquefier model is conducted, which provides a plant-process validation platform. The helium liquefier consists of seven heat exchangers, a liquid-nitrogen (LN2) precooler, two expansion turbines and a liquid-helium (LHe) reservoir. Process simulations are carried out with sequence programs, implemented in C-PREST based on an existing liquefier operation. The interactions of a JT valve, a JT-bypass valve and a reservoir-return valve have been dynamically simulated. The paper discusses various aspects of refrigeration process simulation, including difficulties such as the balance between the complexity of the adopted models and CPU time.

  4. Realization of process improvement at a diagnostic radiology department with aid of simulation modeling.

    PubMed

    Oh, Hong-Choon; Toh, Hong-Guan; Giap Cheong, Eddy Seng

    2011-11-01

    Using the classical process improvement framework of Plan-Do-Study-Act (PDSA), the diagnostic radiology department of a tertiary hospital identified several patient cycle time reduction strategies. Experimenting with these strategies (which included procurement of new machines, hiring of new staff, redesign of the queue system, etc.) through pilot-scale implementation was impractical because it might incur substantial expenditure or be operationally disruptive. With this in mind, simulation modeling was used to test these strategies by performing "what if" analyses. Using the output generated by the simulation model, the team was able to identify a cost-free cycle time reduction strategy, which subsequently led to a reduction of patient cycle time and achievement of a management-defined performance target. As healthcare professionals work continually to improve healthcare operational efficiency in response to rising healthcare costs and patient expectations, simulation modeling offers an effective scientific framework that can complement established process improvement frameworks like PDSA to realize healthcare process enhancement. © 2011 National Association for Healthcare Quality.

  5. P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)

    NASA Astrophysics Data System (ADS)

    Kropp, Derek L.

    2009-05-01

    One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.

  6. TKKMOD: A computer simulation program for an integrated wind diesel system. Version 1.0: Document and user guide

    NASA Astrophysics Data System (ADS)

    Manninen, L. M.

    1993-12-01

    The document describes TKKMOD, a simulation model developed at Helsinki University of Technology for a specific wind-diesel system layout, with special emphasis on the battery submodel and its use in simulation. The model has been included into the European wind-diesel modeling software package WDLTOOLS under the CEC JOULE project 'Engineering Design Tools for Wind-Diesel Systems' (JOUR-0078). WDLTOOLS serves as the user interface and processes the input and output data of different logistic simulation models developed by the project participants. TKKMOD cannot be run without this shell. The report only describes the simulation principles and model specific parameters of TKKMOD and gives model specific user instructions. The input and output data processing performed outside this model is described in the documentation of the shell. The simulation model is utilized for calculation of long-term performance of the reference system configuration for given wind and load conditions. The main results are energy flows, losses in the system components, diesel fuel consumption, and the number of diesel engine starts.

  7. A large-scale forest landscape model incorporating multi-scale processes and utilizing forest inventory data

    Treesearch

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson III; David R. Larsen; Jacob S. Fraser; Jian Yang

    2013-01-01

    Two challenges confronting forest landscape models (FLMs) are how to simulate fine, stand-scale processes while making large-scale (i.e., >10⁷ ha) simulation possible, and how to take advantage of extensive forest inventory data such as U.S. Forest Inventory and Analysis (FIA) data to initialize and constrain model parameters. We present the LANDIS PRO model that...

  8. ISPE: A knowledge-based system for fluidization studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddy, S.

    1991-01-01

    Chemical engineers use mathematical simulators to design, model, optimize and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine if all specified "goals" are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information) and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE with the user, ASPEN, LASER and the C language is shown in Figure 1.

  9. ASPEN simulation of a fixed-bed integrated gasification combined-cycle power plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, K.R.

    1986-03-01

    A fixed-bed integrated gasification combined-cycle (IGCC) power plant has been modeled using the Advanced System for Process ENgineering (ASPEN). The ASPEN simulation is based on a conceptual design of a 509-MW IGCC power plant that uses British Gas Corporation (BGC)/Lurgi slagging gasifiers and the Lurgi acid gas removal process. The 39.3-percent thermal efficiency of the plant that was calculated by the simulation compares very favorably with the 39.4 percent that was reported by EPRI. The simulation addresses only thermal performance and does not calculate capital cost or process economics. Portions of the BGC-IGCC simulation flowsheet are based on the SLAGGER fixed-bed gasifier model (Stefano May 1985), and the Kellogg-Rust-Westinghouse (KRW) IGCC and Texaco-IGCC simulations (Stone July 1985) that were developed at the Department of Energy (DOE), Morgantown Energy Technology Center (METC). The simulation runs in 32 minutes of Central Processing Unit (CPU) time on the VAX-11/780. The BGC-IGCC simulation was developed to give accurate mass and energy balances and to track coal tars and environmental species such as SOx and NOx for a fixed-bed, coal-to-electricity system. This simulation is the third in a series of three IGCC simulations that represent fluidized-bed, entrained-flow, and fixed-bed gasification processes. Alternate process configurations can be considered by adding, deleting, or rearranging unit operation blocks. The gasifier model is semipredictive; it can properly respond to a limited range of coal types and gasifier operating conditions. However, some models in the flowsheet are based on correlations that were derived from the EPRI study, and are therefore limited to coal types and operating conditions that are reasonably close to those given in the EPRI design. 4 refs., 7 figs., 2 tabs.

  10. Multi-material 3D Models for Temporal Bone Surgical Simulation.

    PubMed

    Rose, Austin S; Kimbell, Julia S; Webster, Caroline E; Harrysson, Ola L A; Formeister, Eric J; Buchman, Craig A

    2015-07-01

    A simulated, multicolor, multi-material temporal bone model can be created using 3-dimensional (3D) printing that will prove both safe and beneficial in training for actual temporal bone surgical cases. As the process of additive manufacturing, or 3D printing, has become more practical and affordable, a number of applications for the technology in the field of Otolaryngology-Head and Neck Surgery have been considered. One area of promise is temporal bone surgical simulation. Three-dimensional representations of human temporal bones were created from temporal bone computed tomography (CT) scans using biomedical image processing software. Multi-material models were then printed and dissected in a temporal bone laboratory by attending and resident otolaryngologists. A 5-point Likert scale was used to grade the models for their anatomical accuracy and suitability as a simulation of cadaveric and operative temporal bone drilling. The models produced for this study demonstrate significant anatomic detail and a likeness to human cadaver specimens for drilling and dissection. Simulated temporal bones created by this process have potential benefit in surgical training, preoperative simulation for challenging otologic cases, and the standardized testing of temporal bone surgical skills. © The Author(s) 2015.

  11. Simulation of wheat growth and development based on organ-level photosynthesis and assimilate allocation.

    PubMed

    Evers, J B; Vos, J; Yin, X; Romero, P; van der Putten, P E L; Struik, P C

    2010-05-01

    Intimate relationships exist between form and function of plants, determining many processes governing their growth and development. However, in most crop simulation models that have been created to simulate plant growth and, for example, predict biomass production, plant structure has been neglected. In this study, a detailed simulation model of growth and development of spring wheat (Triticum aestivum) is presented, which integrates degree of tillering and canopy architecture with organ-level light interception, photosynthesis, and dry-matter partitioning. An existing spatially explicit 3D architectural model of wheat development was extended with routines for organ-level microclimate, photosynthesis, assimilate distribution within the plant structure according to organ demands, and organ growth and development. Outgrowth of tiller buds was made dependent on the ratio between assimilate supply and demand of the plants. Organ-level photosynthesis, biomass production, and bud outgrowth were simulated satisfactorily. However, to improve crop simulation results, more effort is needed to mechanistically model other major plant physiological processes such as nitrogen uptake and distribution, tiller death, and leaf senescence. Nevertheless, the work presented here is a significant step forward towards a mechanistic functional-structural plant model, which integrates plant architecture with key plant processes.

  12. Simulating carbon and water fluxes at Arctic and boreal ecosystems in Alaska by optimizing the modified BIOME-BGC with eddy covariance data

    NASA Astrophysics Data System (ADS)

    Ueyama, M.; Kondo, M.; Ichii, K.; Iwata, H.; Euskirchen, E. S.; Zona, D.; Rocha, A. V.; Harazono, Y.; Nakai, T.; Oechel, W. C.

    2013-12-01

    To better predict carbon and water cycles in Arctic ecosystems, we modified a process-based ecosystem model, BIOME-BGC, by introducing new processes: changes in active layer depth on permafrost and the phenology of tundra vegetation. The modified BIOME-BGC was then calibrated through parameter optimization. The model was constrained using gross primary productivity (GPP) and net ecosystem exchange (NEE) at 23 eddy covariance sites in Alaska, and vegetation/soil carbon from a literature survey. The model was used to simulate regional carbon and water fluxes of Alaska from 1900 to 2011. Simulated regional fluxes were validated against upscaled GPP, ecosystem respiration (RE), and NEE based on two methods: (1) a machine learning technique and (2) a top-down model. Our initial simulation suggests that the original BIOME-BGC with default ecophysiological parameters substantially underestimated GPP and RE for tundra and overestimated those fluxes for boreal forests. We will discuss how optimization using the eddy covariance data impacts the historical simulation by comparing the new version of the model with simulated results from the original BIOME-BGC with default ecophysiological parameters. These results suggest that incorporating active layer depth and plant phenology processes is important when simulating carbon and water fluxes in Arctic ecosystems.

  13. A finite element simulation of biological conversion processes in landfills.

    PubMed

    Robeck, M; Ricken, T; Widmann, R

    2011-04-01

    Landfills are the most common way of waste disposal worldwide. Biological processes convert the organic material into an environmentally harmful landfill gas, which has an impact on the greenhouse effect. After the depositing of waste has been stopped, the conversion processes continue and emissions last for several decades, even up to 100 years and longer. A good prediction of these processes is of high importance for landfill operators as well as for authorities, but available models for a realistic description of landfill processes are still rather poor. In order to take the strongly coupled conversion processes into account, a constitutive three-dimensional model based on the multiphase Theory of Porous Media (TPM) has been developed at the University of Duisburg-Essen. The theoretical formulations are implemented in the finite element code FEAP. With the presented calculation concept we are able to simulate the coupled processes that occur in an actual landfill. The model's theoretical background, the results of the simulations, and a successfully completed simulation of a real landfill body are presented in the following. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Emissions model of waste treatment operations at the Idaho Chemical Processing Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schindler, R.E.

    1995-03-01

    An integrated model of the waste treatment systems at the Idaho Chemical Processing Plant (ICPP) was developed using commercially available process simulation software (ASPEN Plus) to calculate atmospheric emissions of hazardous chemicals for use in an application for an environmental permit to operate (PTO). The processes covered by the model are the Process Equipment Waste evaporator, the High Level Liquid Waste evaporator, the New Waste Calcining Facility, and the Liquid Effluent Treatment and Disposal facility. The processes are described along with the model and its assumptions. The model calculates emissions of NOx, CO, volatile acids, hazardous metals, and organic chemicals. Some calculated relative emissions are summarized and insights on building simulations are discussed.

  15. Simulating bimodal tall fescue growth with a degree-day-based process-oriented plant model

    USDA-ARS?s Scientific Manuscript database

    Plant growth simulation models have a temperature response function driving development, with a base temperature and an optimum temperature defined. Such growth simulation models often function well when plant development rate shows a continuous change throughout the growing season. This approach ...

  16. Fabrication and characterization of resonant SOI micromechanical silicon sensors based on DRIE micromachining, freestanding release process and silicon direct bonding

    NASA Astrophysics Data System (ADS)

    Gigan, Olivier; Chen, Hua; Robert, Olivier; Renard, Stephane; Marty, Frederic

    2002-11-01

    This paper is dedicated to the fabrication and technological aspects of a silicon microresonator sensor. The entire project includes the fabrication processes, the system modelling/simulation, and the electronic interface. The mechanical model of such a resonator is presented, including a description of the frequency stability and hysteresis behaviour of the electrostatically driven resonator. A numerical model and FEM simulations are used to simulate the system's dynamic behaviour. The complete fabrication process is based on standard microelectronics technology with specific MEMS technological steps. The key steps are described: micromachining on SOI by Deep Reactive Ion Etching (DRIE), specific release processes to prevent sticking (resist and HF-vapour release processes) and collective vacuum encapsulation by Silicon Direct Bonding (SDB). The complete process has been validated and prototypes have been fabricated. The ASIC was designed to interface the sensor and to control the vibration amplitude. The electronics were simulated, designed to work up to 200°C, and implemented in a standard 0.6 μm CMOS technology. Sensor prototypes were characterized both mechanically and electrostatically. These measurements showed good agreement with theory and FEM simulations.

  17. Appendices to the model description document for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A Model Description Document for the Emulation Simulation Computer Model was previously published. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from testing. Slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is the creation of two system simulations using these models. The first system presented consists of one air and one water processing system. The second consists of a potential air revitalization system.

  18. Numerical Simulation and Experimental Casting of Nickel-Based Single-Crystal Superalloys by HRS and LMC Directional Solidification Processes

    NASA Astrophysics Data System (ADS)

    Yan, Xuewei; Wang, Run'nan; Xu, Qingyan; Liu, Baicheng

    2017-04-01

    Mathematical models for dynamic heat radiation and convection boundaries in directional solidification processes are established to simulate the temperature fields. The cellular automaton (CA) method and the Kurz-Giovanola-Trivedi (KGT) growth model are used to describe nucleation and growth. Primary dendritic arm spacing (PDAS) and secondary dendritic arm spacing (SDAS) are calculated by the Ma-Sham (MS) and Furer-Wunderlin (FW) models, respectively. The mushy zone shape is investigated based on the temperature fields for both the high-rate solidification (HRS) and liquid metal cooling (LMC) processes. The evolution of the microstructure and the crystallographic orientation are analyzed by simulation and by the electron back-scattered diffraction (EBSD) technique, respectively. Comparison of the simulated PDAS and SDAS with experimental results reveals good agreement. The results show that the LMC process can provide both dendritic refinement and superior casting performance due to the increased cooling rate and thermal gradient.

  19. Validation of an intermediate-complexity model for simulating marine biogeochemistry under anoxic conditions in the modern Black Sea

    NASA Astrophysics Data System (ADS)

    Romaniello, Stephen J.; Derry, Louis A.

    2010-08-01

    We test the ability of a new 1-D intermediate-complexity box model (ICBM) that includes process-based C, N, P, O, and S biogeochemistry to simulate profiles and fluxes of biogeochemically reactive species across a wide range of ocean redox states. The ICBM was developed to simulate whole ocean processes for paleoceanographic applications and has been tested with data from the modern global ocean. Here we adapt the circulation submodel of the ICBM to simulate water mass exchange and eddy diffusion processes in the Black Sea but make only very minor changes to the biogeochemical submodel. We force the model with estimated natural and anthropogenic inputs of tracers and nutrients to the Black Sea and compare the results of the simulations to modern observations. Ventilation of the Black Sea is modeled by depth-dependent entrainment of Cold Intermediate Layer water into Bosphorus plume water and subsequent intrusion into deep layers. The simulated profiles of circulation tracers θ, salinity, CFC-12, and radiocarbon agree well with available data, suggesting that the model does a reasonable job of representing physical exchange. Vertical profiles of biogeochemically active components are in good overall agreement with observations. The lack of trace metal (Mn and Fe) cycling in the model results in some discrepancies between the simulated profiles and observation across the suboxic zone; however, the overall redox balance is not sensitive to this difference. We compare modeled basin-wide biogeochemical fluxes to available estimates, but in a number of cases uncertainties in modern budgets limit our ability to test the model rigorously. In agreement with earlier work we find that fixed N losses via thiodenitrification are likely a major pathway in the Black Sea N cycle. Overall, the same biogeochemical submodel used to simulate the modern global ocean appears to perform well in simulating Black Sea processes without requiring significant modification. The ability of a single model to perform across a wide range of redox states is an important prerequisite for applying the ICBM to deep time paleoceanographic problems. The model source code is available as MATLAB™ 7 m-files provided as auxiliary material.

  20. TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY

    PubMed Central

    Somogyi, Endre; Hagar, Amit; Glazier, James A.

    2017-01-01

    Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: 1: Dynamic sets of objects participate simultaneously in multiple processes, 2: Processes may be either continuous or discrete, and their activity may be conditional, 3: Objects and processes form complex, heterogeneous relationships and structures, 4: Objects and processes may be hierarchically composed, 5: Processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models. PMID:29282379

  1. TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY.

    PubMed

    Somogyi, Endre; Hagar, Amit; Glazier, James A

    2016-12-01

    Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: 1: Dynamic sets of objects participate simultaneously in multiple processes, 2: Processes may be either continuous or discrete, and their activity may be conditional, 3: Objects and processes form complex, heterogeneous relationships and structures, 4: Objects and processes may be hierarchically composed, 5: Processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models.

  2. An efficient surrogate-based simulation-optimization method for calibrating a regional MODFLOW model

    NASA Astrophysics Data System (ADS)

    Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.

    2017-01-01

    Simulation-optimization entails a large number of model simulations, which is computationally intensive or even prohibitive if the model simulation is extremely time-consuming. Statistical models have been examined as surrogates of the high-fidelity physical model during the simulation-optimization process to tackle this problem. Among them, Multivariate Adaptive Regression Splines (MARS), a non-parametric adaptive regression method, is superior in overcoming problems of high dimensions and discontinuities in the data. Furthermore, the stability and accuracy of a MARS model can be improved by bootstrap aggregating, namely bagging. In this paper, the Bagging MARS (BMARS) method is integrated into a surrogate-based simulation-optimization framework to calibrate a three-dimensional MODFLOW model, which is developed to simulate the groundwater flow in an arid hardrock-alluvium region in northwestern Oman. The physical MODFLOW model is replaced by the statistical surrogate model developed using the BMARS algorithm. The surrogate model, which is fitted and validated using a training dataset generated by the physical model, can approximate solutions rapidly. An efficient Sobol' method is employed to calculate global sensitivities of head outputs to input parameters, which are used to analyze their importance for the model outputs spatiotemporally. Only sensitive parameters are included in the calibration process to further improve the computational efficiency. The normalized root mean square error (NRMSE) between measured and simulated heads at observation wells is used as the objective function to be minimized during optimization. The reasonable history match between the simulated and observed heads demonstrates the feasibility of this highly efficient calibration framework.
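
    The sketch below is a minimal illustration of a surrogate-assisted calibration loop in the spirit of the BMARS framework described above; it is not the authors' implementation. The function run_modflow() and the array observed_heads are hypothetical stand-ins for the expensive groundwater model and the field data, and the bagged surrogate uses regression trees as base learners, where a MARS implementation (e.g. py-earth's Earth class) could be substituted.

    ```python
    # Surrogate-based calibration sketch: train a bagged regressor on NRMSE values
    # computed from a limited number of expensive model runs, then optimize the
    # cheap surrogate. Placeholders (run_modflow, observed_heads, bounds) are assumed.
    import numpy as np
    from scipy.optimize import differential_evolution
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    n_params, n_train = 6, 200
    bounds = [(0.0, 1.0)] * n_params                 # normalized parameter ranges (assumed)
    observed_heads = rng.normal(size=30)             # stand-in for measured heads

    def run_modflow(theta):
        """Placeholder for the expensive physical model: returns simulated heads."""
        return np.tanh(theta.sum()) + 0.1 * np.sin(10.0 * theta[0]) * np.ones(30)

    def nrmse(sim, obs):
        """Normalized root mean square error, the calibration objective."""
        return np.sqrt(np.mean((sim - obs) ** 2)) / (obs.max() - obs.min())

    # 1) Build a training set by running the physical model at sampled parameter sets.
    X_train = rng.uniform(size=(n_train, n_params))
    y_train = np.array([nrmse(run_modflow(x), observed_heads) for x in X_train])

    # 2) Fit the bagged surrogate to predict the objective directly.
    surrogate = BaggingRegressor(DecisionTreeRegressor(), n_estimators=50,
                                 random_state=0).fit(X_train, y_train)

    # 3) Optimize the cheap surrogate instead of the physical model.
    result = differential_evolution(lambda x: float(surrogate.predict([x])[0]),
                                    bounds, seed=0)
    print("surrogate-optimal parameters:", result.x)
    ```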

  3. Parameter-induced uncertainty quantification of crop yields, soil N2O and CO2 emission for 8 arable sites across Europe using the LandscapeDNDC model

    NASA Astrophysics Data System (ADS)

    Santabarbara, Ignacio; Haas, Edwin; Kraus, David; Herrera, Saul; Klatt, Steffen; Kiese, Ralf

    2014-05-01

    When using biogeochemical models to estimate greenhouse gas emissions at site to regional/national levels, the assessment and quantification of the uncertainties of simulation results are of significant importance. The uncertainties in simulation results of process-based ecosystem models may result from uncertainties in the process parameters that describe the model's processes, from model structure inadequacy, as well as from uncertainties in the observations. Data for developing and testing the uncertainty analysis were crop yield observations and measurements of soil fluxes of nitrous oxide (N2O) and carbon dioxide (CO2) from 8 arable sites across Europe. Using the process-based biogeochemical model LandscapeDNDC for simulating crop yields, N2O and CO2 emissions, our aim is to assess the simulation uncertainty by setting up a Bayesian framework based on the Metropolis-Hastings algorithm. Using the Gelman statistics convergence criterion and parallel computing techniques, multiple Markov chains run independently in parallel and perform a random walk to estimate the joint model parameter distribution. From the resulting distribution we limit the parameter space, obtain probabilities of parameter values, and find the complex dependencies among them. With this parameter distribution, which determines soil-atmosphere C and N exchange, we are able to obtain the parameter-induced uncertainty of simulation results and compare it with the measurement data.
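
    A minimal random-walk Metropolis-Hastings sketch of the calibration idea described above is given below; it is not the study's actual setup. The function run_model(), the flat prior, the Gaussian likelihood and all numbers are illustrative stand-ins for LandscapeDNDC and the site observations. In practice several chains would run in parallel and convergence would be checked with the Gelman-Rubin statistic.

    ```python
    # Random-walk Metropolis-Hastings for parameter calibration (toy example).
    import numpy as np

    rng = np.random.default_rng(1)
    obs = rng.normal(loc=2.0, scale=0.3, size=50)    # stand-in observations (e.g. N2O fluxes)
    sigma_obs = 0.3                                  # assumed observation error

    def run_model(theta):
        """Placeholder process model: maps two parameters to predicted fluxes."""
        return np.full_like(obs, theta[0] + 0.5 * theta[1])

    def log_posterior(theta):
        if np.any(theta < 0.0) or np.any(theta > 5.0):   # flat prior on [0, 5]^2
            return -np.inf
        resid = obs - run_model(theta)
        return -0.5 * np.sum((resid / sigma_obs) ** 2)   # Gaussian log-likelihood

    n_steps, step = 20000, 0.1
    theta = np.array([1.0, 1.0])
    logp = log_posterior(theta)
    chain = np.empty((n_steps, theta.size))
    for i in range(n_steps):
        proposal = theta + step * rng.normal(size=theta.size)   # random-walk proposal
        logp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:            # Metropolis accept/reject
            theta, logp = proposal, logp_prop
        chain[i] = theta

    burn_in = n_steps // 2
    print("posterior mean:", chain[burn_in:].mean(axis=0))
    ```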

  4. LES ARM Symbiotic Simulation and Observation (LASSO) Implementation Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustafson Jr., WI; Vogelmann, AM

    2015-09-01

    This document illustrates the design of the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) workflow to provide a routine, high-resolution modeling capability to augment the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility’s high-density observations. LASSO will create a powerful new capability for furthering ARM’s mission to advance understanding of cloud, radiation, aerosol, and land-surface processes. The combined observational and modeling elements will enable a new level of scientific inquiry by connecting processes and context to observations and providing needed statistics for details that cannot be measured. The result will be improved process understanding that facilitates concomitant improvements in climate model parameterizations. The initial LASSO implementation will be for ARM’s Southern Great Plains site in Oklahoma and will focus on shallow convection, which is poorly simulated by climate models due in part to clouds’ typically small spatial scale compared to model grid spacing, and because the convection involves complicated interactions of microphysical and boundary layer processes.

  5. Numerical simulation of the laser welding process for the prediction of temperature distribution on welded aluminium aircraft components

    NASA Astrophysics Data System (ADS)

    Tsirkas, S. A.

    2018-03-01

    The present investigation is focused on the modelling of the temperature field in aluminium aircraft components welded by a CO2 laser. A three-dimensional finite element model has been developed to simulate the laser welding process and predict the temperature distribution in T-joint laser-welded plates with fillet material. The simulation of the laser beam welding process was performed using a nonlinear heat transfer analysis, based on a keyhole formation model analysis. The model employs the "element birth and death" technique in order to simulate the weld fillet. Various phenomena associated with welding, such as temperature-dependent material properties and heat losses through convection and radiation, were accounted for in the model. The materials considered were 6056-T78 and 6013-T4 aluminium alloys, commonly used for aircraft components. The temperature distribution during the laser welding process has been calculated numerically and validated by experimental measurements at different locations of the welded structure. The numerical results are in good agreement with the experimental measurements.

  6. GPU based 3D feature profile simulation of high-aspect ratio contact hole etch process under fluorocarbon plasmas

    NASA Astrophysics Data System (ADS)

    Chun, Poo-Reum; Lee, Se-Ah; Yook, Yeong-Geun; Choi, Kwang-Sung; Cho, Deog-Geun; Yu, Dong-Hun; Chang, Won-Seok; Kwon, Deuk-Chul; Im, Yeon-Ho

    2013-09-01

    Although plasma etch profile simulation has attracted much interest for developing reliable plasma etching, large gaps still exist between the current state of research and predictive modeling due to the inherent complexity of plasma processes. As an effort to address this issue, we present a 3D feature profile simulation coupled with a well-defined plasma-surface kinetic model for the silicon dioxide etching process under fluorocarbon plasmas. To capture realistic plasma-surface reaction behaviors, a polymer-layer-based surface kinetic model was proposed to consider simultaneous polymer deposition and oxide etching. This realistic plasma-surface model was then used to calculate the speed function for the 3D topology simulation, which consists of a multiple-level-set-based moving algorithm and a ballistic transport module. In addition, the time-consuming computations in the ballistic transport calculation were accelerated drastically by GPU-based numerical computation, enabling near-real-time computation. Finally, we demonstrated that the surface kinetic model could be coupled successfully for 3D etch profile simulations of high-aspect-ratio contact hole plasma etching.
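
    The following is a 2D toy of the level-set update only: an interface advected outward with a constant speed F using the first-order upwind (Osher-Sethian) scheme valid for positive speed. In the work above the speed function comes from the plasma-surface kinetics and the ballistic transport module, the geometry is 3D and the computation is GPU-accelerated; none of that is reproduced here, and all parameters are illustrative.

    ```python
    # Level-set advection of a circle with constant outward speed (toy example).
    import numpy as np

    n, steps = 128, 60
    h = 1.0 / n
    F, dt = 1.0, 0.5 * h                     # constant speed, CFL-limited time step
    x = np.linspace(0.0, 1.0, n)
    X, Y = np.meshgrid(x, x, indexing="ij")
    phi = np.sqrt((X - 0.5) ** 2 + (Y - 0.5) ** 2) - 0.1   # signed distance to a circle

    for _ in range(steps):
        # One-sided differences (periodic boundaries via np.roll, fine for this toy).
        dmx = (phi - np.roll(phi, 1, axis=0)) / h
        dpx = (np.roll(phi, -1, axis=0) - phi) / h
        dmy = (phi - np.roll(phi, 1, axis=1)) / h
        dpy = (np.roll(phi, -1, axis=1) - phi) / h
        # Upwind gradient magnitude valid for a positive speed function.
        grad_plus = np.sqrt(np.maximum(dmx, 0.0) ** 2 + np.minimum(dpx, 0.0) ** 2 +
                            np.maximum(dmy, 0.0) ** 2 + np.minimum(dpy, 0.0) ** 2)
        phi -= dt * F * grad_plus            # the phi = 0 contour moves outward

    print("area fraction inside the interface:", float((phi < 0.0).mean()))
    ```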

  7. Surrogate modelling for the prediction of spatial fields based on simultaneous dimensionality reduction of high-dimensional input/output spaces.

    PubMed

    Crevillén-García, D

    2018-04-01

    Time-consuming numerical simulators for solving groundwater flow and dissolution models of physico-chemical processes in deep aquifers normally require some of the model inputs to be defined in high-dimensional spaces in order to return realistic results. Sometimes, the outputs of interest are spatial fields leading to high-dimensional output spaces. Although Gaussian process emulation has been satisfactorily used for computing faithful and inexpensive approximations of complex simulators, these have been mostly applied to problems defined in low-dimensional input spaces. In this paper, we propose a method for simultaneously reducing the dimensionality of very high-dimensional input and output spaces in Gaussian process emulators for stochastic partial differential equation models while retaining the qualitative features of the original models. This allows us to build a surrogate model for the prediction of spatial fields in such time-consuming simulators. We apply the methodology to a model of convection and dissolution processes occurring during carbon capture and storage.
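
    Below is a minimal sketch of the surrogate idea summarized above: compress a high-dimensional output field with PCA and emulate each retained component with a Gaussian process. The paper additionally reduces the input space and applies the approach to a convection-dissolution model for carbon storage; here the inputs stay low-dimensional and all data are synthetic stand-ins.

    ```python
    # PCA-reduced Gaussian-process emulation of a spatial output field (toy example).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)
    n_train, n_inputs, n_grid = 80, 5, 400                      # 400 spatial points per field
    X = rng.uniform(size=(n_train, n_inputs))                   # design of model inputs
    grid = np.linspace(0.0, 1.0, n_grid)
    Y = np.array([np.sin(4.0 * grid * x[0]) + x[1] * grid for x in X])  # synthetic fields

    # 1) Reduce the output space to a few principal components.
    pca = PCA(n_components=4).fit(Y)
    scores = pca.transform(Y)

    # 2) Emulate each component score with an independent GP.
    gps = [GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6,
                                    normalize_y=True).fit(X, scores[:, j])
           for j in range(scores.shape[1])]

    def emulate_field(x_new):
        """Predict the full spatial field for a new input vector."""
        s = np.array([gp.predict(x_new.reshape(1, -1))[0] for gp in gps])
        return pca.inverse_transform(s.reshape(1, -1))[0]

    x_test = rng.uniform(size=n_inputs)
    print("emulated field, first five values:", emulate_field(x_test)[:5])
    ```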

  8. Simulation of Structural Transformations in Heating of Alloy Steel

    NASA Astrophysics Data System (ADS)

    Kurkin, A. S.; Makarov, E. L.; Kurkin, A. B.; Rubtsov, D. E.; Rubtsov, M. E.

    2017-07-01

    A mathematical model for computer simulation of structural transformations in an alloy steel under the conditions of the thermal cycle of multipass welding is presented. The austenitic transformation under heating and the processes of decomposition of bainite and martensite under repeated heating are considered. A method for determining the necessary temperature-time parameters of the model from the chemical composition of the steel is described. Published data are processed and the results used to derive regression models of the temperature ranges and parameters of transformation kinetics of alloy steels. The method developed is used in computer simulation of the process of multipass welding of pipes by the finite-element method.

  9. State of the Art Assessment of Simulation in Advanced Materials Development

    NASA Technical Reports Server (NTRS)

    Wise, Kristopher E.

    2008-01-01

    Advances in both the underlying theory and in the practical implementation of molecular modeling techniques have increased their value in the advanced materials development process. The objective is to accelerate the maturation of emerging materials by tightly integrating modeling with the other critical processes: synthesis, processing, and characterization. The aims of this report are to summarize the state of the art of existing modeling tools and to highlight a number of areas in which additional development is required. In an effort to maintain focus and limit length, this survey is restricted to classical simulation techniques including molecular dynamics and Monte Carlo simulations.

  10. Experimental and Numerical Simulations of Phase Transformations Occurring During Continuous Annealing of DP Steel Strips

    NASA Astrophysics Data System (ADS)

    Wrożyna, Andrzej; Pernach, Monika; Kuziak, Roman; Pietrzyk, Maciej

    2016-04-01

    Due to their exceptional strength properties combined with good workability, Advanced High-Strength Steels (AHSS) are commonly used in the automotive industry. Manufacturing of these steels is a complex process which requires precise control of technological parameters during thermo-mechanical treatment. Design of these processes can be significantly improved by numerical models of phase transformations. The objective of the paper was to evaluate the predictive capabilities of such models with respect to their applicability in simulating thermal cycles for AHSS. Two models were considered: the former was an upgrade of the JMAK equation, while the latter was an upgrade of the Leblond model. The models can be applied to any AHSS, though the examples quoted in the paper refer to Dual Phase (DP) steel. Three series of experimental simulations were performed. The first included various thermal cycles going beyond the limitations of continuous annealing lines; the objective was to validate the models' behavior in more complex cooling conditions. The second set of tests included experimental simulations of the thermal cycle characteristic of continuous annealing lines, and the capability of the models to properly describe phase transformations in this process was evaluated. The third set included data from an industrial continuous annealing line. Validation and verification of the models confirmed their good predictive capabilities. Since it does not require application of the additivity rule, the upgrade of the Leblond model was selected as the better one for simulation of industrial processes in AHSS production.
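
    For reference, the sketch below shows the isothermal JMAK (Johnson-Mehl-Avrami-Kolmogorov) relation underlying the first model family named above, X(t) = 1 - exp(-k t^n). The rate constant k and exponent n are purely illustrative; in the paper they depend on temperature and steel chemistry, and non-isothermal cycles require the additivity rule or the Leblond-type formulation.

    ```python
    # Isothermal JMAK transformed-fraction curve with illustrative parameters.
    import numpy as np

    def jmak_fraction(t, k, n):
        """Transformed phase fraction after time t at constant temperature."""
        return 1.0 - np.exp(-k * t ** n)

    times = np.linspace(0.0, 60.0, 7)          # seconds (illustrative)
    for t, X in zip(times, jmak_fraction(times, k=1e-3, n=2.5)):
        print(f"t = {t:5.1f} s  ->  X = {X:.3f}")
    ```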

  11. Numerical and experimental studies on effects of moisture content on combustion characteristics of simulated municipal solid wastes in a fixed bed.

    PubMed

    Sun, Rui; Ismail, Tamer M; Ren, Xiaohan; Abd El-Salam, M

    2015-05-01

    In order to reveal the features of the combustion process in the porous bed of a waste incinerator, a two-dimensional unsteady-state model and an experimental study were employed to investigate the combustion process of municipal solid waste (MSW) in a fixed bed reactor. Conservation equations for the waste bed were implemented to describe the incineration process. The gas phase turbulence was modeled using the k-ε turbulence model and the particle phase was modeled using the kinetic theory of granular flow. The rates of moisture evaporation, devolatilization, and char burnout were calculated according to the waste properties. The simulation results were then compared with experimental data for different moisture contents of MSW, which shows that the incineration process of waste in the fixed bed is reasonably simulated. The simulated solid temperature, gas species and process rates in the bed are consistent with experimental data. Due to the high moisture content of the fuel, moisture evaporation consumes a vast amount of heat, and the evaporation takes up most of the combustion time (about 2/3 of the whole combustion process). The whole bed combustion process slows greatly as MSW moisture content increases. The experimental and simulation results provide direction for the design and optimization of fixed beds for MSW. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. An approach to developing an integrated pyroprocessing simulator

    NASA Astrophysics Data System (ADS)

    Lee, Hyo Jik; Ko, Won Il; Choi, Sung Yeol; Kim, Sung Ki; Kim, In Tae; Lee, Han Soo

    2014-02-01

    Pyroprocessing has been studied for a decade as one of the promising fuel recycling options in Korea. We have built a pyroprocessing integrated inactive demonstration facility (PRIDE) to assess the feasibility of integrated pyroprocessing technology and the scale-up issues of the processing equipment. Even though such a facility cannot replace a real integrated facility using spent nuclear fuel (SF), many insights can be obtained in terms of the world's largest integrated pyroprocessing operation. In order to complement or overcome such limited test-based research, a pyroprocessing modelling and simulation study began in 2011. The Korea Atomic Energy Research Institute (KAERI) suggested a modelling architecture for the development of a multi-purpose pyroprocessing simulator consisting of three-tiered models: a unit process model, an operation model, and a plant-level model. The unit process model can be addressed using governing equations or empirical equations as a continuous system (CS). In contrast, the operation model describes operational behaviors as a discrete event system (DES). The plant-level model is an integrated model of the unit process and operation models with various analysis modules. An interface with different systems, the incorporation of different codes, a process-centered database design, and a dynamic material flow are discussed as necessary components for building a framework for the plant-level model. A sample model containing methods addressing the above engineering issues was thoroughly reviewed, thereby verifying the architecture for building the plant-level model. By analyzing a combined process and operation model, we showed that the suggested approach is effective for comprehensively understanding an integrated dynamic material flow. This paper addresses the current status of the pyroprocessing modelling and simulation activity at KAERI and also outlines its path forward.

  13. An approach to developing an integrated pyroprocessing simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyo Jik; Ko, Won Il; Choi, Sung Yeol

    Pyroprocessing has been studied for a decade as one of the promising fuel recycling options in Korea. We have built a pyroprocessing integrated inactive demonstration facility (PRIDE) to assess the feasibility of integrated pyroprocessing technology and the scale-up issues of the processing equipment. Even though such a facility cannot replace a real integrated facility using spent nuclear fuel (SF), many insights can be obtained in terms of the world's largest integrated pyroprocessing operation. In order to complement or overcome such limited test-based research, a pyroprocessing modelling and simulation study began in 2011. The Korea Atomic Energy Research Institute (KAERI) suggested a modelling architecture for the development of a multi-purpose pyroprocessing simulator consisting of three-tiered models: a unit process model, an operation model, and a plant-level model. The unit process model can be addressed using governing equations or empirical equations as a continuous system (CS). In contrast, the operation model describes operational behaviors as a discrete event system (DES). The plant-level model is an integrated model of the unit process and operation models with various analysis modules. An interface with different systems, the incorporation of different codes, a process-centered database design, and a dynamic material flow are discussed as necessary components for building a framework for the plant-level model. A sample model containing methods addressing the above engineering issues was thoroughly reviewed, thereby verifying the architecture for building the plant-level model. By analyzing a combined process and operation model, we showed that the suggested approach is effective for comprehensively understanding an integrated dynamic material flow. This paper addresses the current status of the pyroprocessing modelling and simulation activity at KAERI and also outlines its path forward.

  14. Numerical simulation of hot-melt extrusion processes for amorphous solid dispersions using model-based melt viscosity.

    PubMed

    Bochmann, Esther S; Steffens, Kristina E; Gryczke, Andreas; Wagner, Karl G

    2018-03-01

    Simulation of hot-melt extrusion (HME) processes is a valuable tool for increased process understanding and ease of scale-up. However, the experimental determination of all required input parameters is tedious, namely the melt rheology of the amorphous solid dispersion (ASD) in question. Hence, a procedure to simplify the application of HME simulation for forming ASDs is presented. The commercial 1D simulation software Ludovic® was used to conduct (i) simulations using a full experimental data set of all input variables including melt rheology and (ii) simulations using model-based melt viscosity data based on the ASD's glass transition and the physical properties of the polymeric matrix only. Both types of HME computation were further compared to experimental HME results. Variation in physical properties (e.g. heat capacity, density) and several process characteristics of HME (residence time distribution, energy consumption) among the simulations and experiments were evaluated. The model-based melt viscosity was calculated using the glass transition temperature (Tg) of the investigated blend and the melt viscosity of the polymeric matrix by means of a Tg-viscosity correlation. The results for measured melt viscosity and model-based melt viscosity were similar with only few exceptions, leading to similar HME simulation outcomes. In the end, the experimental effort prior to HME simulation could be minimized, and the procedure provides a good starting point for the rational development of ASDs by means of HME. As model excipients, vinylpyrrolidone-vinyl acetate copolymer (COP) in combination with various APIs (carbamazepine, dipyridamole, indomethacin, and ibuprofen) or polyethylene glycol (PEG 1500) as plasticizer were used to form the ASDs. Copyright © 2017 Elsevier B.V. All rights reserved.
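
    As a hedged illustration of a Tg-based melt-viscosity estimate, the sketch below uses the Williams-Landel-Ferry (WLF) equation. The paper derives viscosity from the ASD's glass transition and polymer matrix data via a Tg-viscosity correlation; whether that correlation is exactly WLF, and the constants C1, C2 and the reference viscosity used here, are assumptions for illustration only.

    ```python
    # WLF shift of a reference melt viscosity to a new temperature, with Tg as reference.
    def wlf_viscosity(T, Tg, eta_ref, T_ref, C1=17.44, C2=51.6):
        """Shift a reference melt viscosity (Pa*s) measured at T_ref to temperature T."""
        log_aT = -C1 * (T - Tg) / (C2 + (T - Tg))              # log10 shift factor at T
        log_aT_ref = -C1 * (T_ref - Tg) / (C2 + (T_ref - Tg))  # log10 shift factor at T_ref
        return eta_ref * 10.0 ** (log_aT - log_aT_ref)

    # Example: an ASD whose Tg was lowered by the API, extruded at 160 degC, with a
    # (hypothetical) matrix viscosity of 1000 Pa*s measured at 180 degC.
    print(wlf_viscosity(T=160.0, Tg=105.0, eta_ref=1000.0, T_ref=180.0))
    ```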

  15. Comparing Noah-MP simulations of energy and water fluxes in the soil-vegetation-atmosphere continuum with plot scale measurements

    NASA Astrophysics Data System (ADS)

    Gayler, Sebastian; Wöhling, Thomas; Högy, Petra; Ingwersen, Joachim; Wizemann, Hans-Dieter; Wulfmeyer, Volker; Streck, Thilo

    2013-04-01

    During the last years, land-surface models have proven to perform well in several studies that compared simulated fluxes of water and energy from the land surface to the atmosphere against measured fluxes at the plot-scale. In contrast, considerable deficits of land-surface models have been identified to simulate soil water fluxes and vertical soil moisture distribution. For example, Gayler et al. (2013) showed that simplifications in the representation of root water uptake can result in insufficient simulations of the vertical distribution of soil moisture and its dynamics. However, in coupled simulations of the terrestrial water cycle, both sub-systems, the atmosphere and the subsurface hydrogeo-system, must fit together and models are needed, which are able to adequately simulate soil moisture, latent heat flux, and their interrelationship. Consequently, land-surface models must be further improved, e.g. by incorporation of advanced biogeophysics models. To improve the conceptual realism in biophysical and hydrological processes in the community land surface model Noah, this model was recently enhanced to Noah-MP by a multi-options framework to parameterize individual processes (Niu et al., 2011). Thus, in Noah-MP the user can choose from several alternative models for vegetation and hydrology processes that can be applied in different combinations. In this study, we evaluate the performance of different Noah-MP model settings to simulate water and energy fluxes across the land surface at two contrasting field sites in South-West Germany. The evaluation is done in 1D offline-mode, i.e. without coupling to an atmospheric model. The atmospheric forcing is provided by measured time series of the relevant variables. Simulation results are compared with eddy covariance measurements of turbulent fluxes and measured time series of soil moisture at different depths. The aims of the study are i) to carve out the most appropriate combination of process parameterizations in Noah-MP to simultaneously match the different components of the water and energy cycle at the field sites under consideration, and ii) to estimate the uncertainty in model structure. We further investigate the potential to improve simulation results by incorporating concepts of more advanced root water uptake models from agricultural field scale models into the land-surface-scheme. Gayler S, Ingwersen J, Priesack E, Wöhling T, Wulfmeyer V, Streck T (2013): Assessing the relevance of sub surface processes for the simulation of evapotranspiration and soil moisture dynamics with CLM3.5: Comparison with field data and crop model simulations. Environ. Earth Sci., 69(2), under revision. Niu G-Y, Yang Z-L, Mitchell KE, Chen F, Ek MB, Barlage M, Kumar A, Manning K, Niyogi D, Rosero E, Tewari M and Xia Y (2011): The community Noah land surface model with multiparameterization options (Noah-MP): 1. Model description and evaluation with local-scale measurements. Journal of Geophysical Research 116(D12109).

  16. Range Process Simulation Tool

    NASA Technical Reports Server (NTRS)

    Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga

    2005-01-01

    Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule- and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are the provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
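
    A minimal simulated-annealing sketch of the kind of schedule optimization the abstract attributes to RPST is shown below. The toy objective (total completion time of a task permutation), the cooling schedule and all parameters are illustrative stand-ins, not RPST's actual models or algorithms.

    ```python
    # Simulated annealing over task orderings (toy schedule optimization).
    import math
    import random

    random.seed(0)
    durations = [7, 3, 9, 4, 6, 8, 2, 5]                 # toy task durations

    def cost(order):
        """Total completion time of the schedule (lower is better)."""
        t, total = 0, 0
        for task in order:
            t += durations[task]
            total += t
        return total

    state = list(range(len(durations)))
    best = list(state)
    temperature = 10.0
    for _ in range(5000):
        i, j = random.sample(range(len(state)), 2)       # propose swapping two tasks
        candidate = list(state)
        candidate[i], candidate[j] = candidate[j], candidate[i]
        delta = cost(candidate) - cost(state)
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            state = candidate                            # accept improving or some worse moves
            if cost(state) < cost(best):
                best = list(state)
        temperature *= 0.999                             # geometric cooling
    print("best order:", best, "total completion time:", cost(best))
    ```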

  17. Use NU-WRF and GCE Model to Simulate the Precipitation Processes During MC3E Campaign

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Wu, Di; Matsui, Toshi; Li, Xiaowen; Zeng, Xiping; Peter-Lidard, Christa; Hou, Arthur

    2012-01-01

    One of the major CRM approaches to studying precipitation processes is sometimes referred to as "cloud ensemble modeling". This approach allows many clouds of various sizes and stages of their lifecycles to be present at any given simulation time. Large-scale effects derived from observations are imposed on CRMs as forcing, and cyclic lateral boundaries are used. The advantage of this approach is that model results in terms of rainfall, Q1 and Q2 are usually in good agreement with observations. In addition, the model results provide cloud statistics that represent different types of clouds/cloud systems during their lifetime (life cycle). The large-scale forcing derived from MC3E will be used to drive GCE model simulations. The model-simulated results will be compared with observations from MC3E. These GCE model-simulated datasets are especially valuable for LH algorithm developers. In addition, the very high-resolution regional-scale model, NASA-Unified WRF (NU-WRF), was also used for real-time forecasting during the MC3E campaign to ensure that precipitation and other meteorological forecasts were available to the flight planning team and to interpret the forecast results in terms of proposed flight scenarios. Post-mission simulations are conducted to examine the sensitivity of cloud and precipitation processes and rainfall to initial and lateral boundary conditions. We will compare model results in terms of precipitation and surface rainfall using the GCE model and NU-WRF.

  18. Comparative simulation of a fluidised bed reformer using industrial process simulators

    NASA Astrophysics Data System (ADS)

    Bashiri, Hamed; Sotudeh-Gharebagh, Rahmat; Sarvar-Amini, Amin; Haghtalab, Ali; Mostoufi, Navid

    2016-08-01

    A simulation model is developed using commercial simulators in order to predict the performance of a fluidised bed reformer. As many physical and chemical phenomena take place in the reformer, two sub-models (a hydrodynamic and a reaction sub-model) are needed. The hydrodynamic sub-model is based on the dynamic two-phase model and the reaction sub-model is derived from the literature. In the overall model, the bed is divided into several sections. In each section, the flow of the gas is considered as plug flow through the bubble phase and perfectly mixed through the emulsion phase. Experimental data from the literature were used to validate the model. Close agreement was found between the models in both ASPEN Plus (ASPEN PLUS 2004 ©) and HYSYS (ASPEN HYSYS 2004 ©) and the experimental data for various sectionings of the reactor, ranging from one to four sections. The experimental conversion lies between the one- and four-section predictions, as expected. The model proposed in this work can be used as a framework for developing more complicated models for non-ideal reactors inside process simulators.
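
    The sketch below is a deliberately simplified illustration of the sectioned two-phase structure described above: in each section part of the gas passes through the bubble phase as plug flow and the rest through the perfectly mixed emulsion phase, and the streams remix between sections. First-order kinetics, a fixed phase split, equal section residence times and the absence of interphase mass transfer are all simplifying assumptions of this toy; the actual dynamic two-phase model is considerably richer.

    ```python
    # Sectioned bubble (plug flow) / emulsion (perfectly mixed) bed with remixing.
    import math

    def sectioned_bed(c_in, n_sections, k=2.0, f_bubble=0.6, tau_total=1.0):
        """Outlet reactant concentration after splitting the bed into n_sections."""
        tau = tau_total / n_sections                     # residence time per section
        c = c_in
        for _ in range(n_sections):
            c_bubble = c * math.exp(-k * tau)            # plug flow through the bubble phase
            c_emulsion = c / (1.0 + k * tau)             # perfectly mixed emulsion phase
            c = f_bubble * c_bubble + (1.0 - f_bubble) * c_emulsion   # remix between sections
        return c

    for n in (1, 2, 4):
        print(f"{n} section(s): conversion = {1.0 - sectioned_bed(1.0, n):.3f}")
    ```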

  19. Flight Dynamic Simulation of Fighter In the Asymmetric External Store Release Process

    NASA Astrophysics Data System (ADS)

    Safi’i, Imam; Arifianto, Ony; Nurohman, Chandra

    2018-04-01

    In fighter design, it is important to evaluate and analyze the flight dynamics of the aircraft early in the development process. One such case is the dynamics of the external store release process. A simulation tool can be used to analyze the fighter/external store system's dynamics in the preliminary design stage. This paper reports the flight dynamics of the Jet Fighter Experiment (JF-1 E) in the asymmetric Advanced Medium Range Air-to-Air Missile (AMRAAM) release process through simulations. The JF-1 E and AIM-120 AMRAAM models are built using Advanced Aircraft Analysis (AAA) and Missile Datcom software. With these tools, the aerodynamic stability and control derivatives can be obtained and used to model the dynamic characteristics of the fighter and the external store. The dynamic system is modeled using MATLAB/Simulink software, with which both the fighter/external store integration and the external store release process are simulated, and the dynamics of the system can be analyzed.

  20. In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway

    NASA Astrophysics Data System (ADS)

    Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun

    2016-12-01

    HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.

  1. Numerical simulation study on rolling-chemical milling process of aluminum-lithium alloy skin panel

    NASA Astrophysics Data System (ADS)

    Huang, Z. B.; Sun, Z. G.; Sun, X. F.; Li, X. Q.

    2017-09-01

    Single-curvature parts such as aircraft fuselage skin panels are usually manufactured by a rolling-chemical milling process, which often faces problems of geometric accuracy caused by springback. In most cases, manual adjustment and multiple roll bending are used to control or eliminate the springback. However, these methods increase product cost and cycle time, and lead to material performance degradation. Therefore, it is of significance to precisely control the springback in the rolling-chemical milling process. In this paper, using a combination of experiment and numerical simulation of the rolling-chemical milling process, a simulation model for the rolling-chemical milling of 2060-T8 aluminum-lithium alloy skin was established and validated by comparing numerical simulation and experimental results. Then, based on the numerical simulation model, the technological parameters that influence the curvature of the skin panel were analyzed. Finally, the prediction of springback and its compensation can be realized by controlling the process parameters.

  2. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    NASA Technical Reports Server (NTRS)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.

  3. Applying Multiagent Simulation to Planetary Surface Operations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Sims, Michael H.; Clancey, William J.; Lee, Pascal; Swanson, Keith (Technical Monitor)

    2000-01-01

    This paper describes a multiagent modeling and simulation approach for designing cooperative systems. Issues addressed include the use of multiagent modeling and simulation for the design of human and robotic operations, as a theory for human/robot cooperation on planetary surface missions. We describe a design process for cooperative systems centered around the Brahms modeling and simulation environment being developed at NASA Ames.

  4. Chemsheet as a Simulation Platform for Pyrometallurgical Processes

    NASA Astrophysics Data System (ADS)

    Penttilä, Karri; Salminen, Justin; Tripathi, Nagendra; Koukkari, Pertti

    ChemSheet is thermodynamic multi-phase, multi-component simulation software that is used as an add-in in Microsoft Excel. In ChemSheet, the unique Constrained Gibbs Free Energy method can be used to include dynamic constraints and reaction rates of kinetically slow reactions while retaining full consistency of the multiphase thermodynamic model. With appropriate data, ChemSheet models can be used to simulate reactors and processes in all fields of thermochemistry. The presentation will cover off-line modeling of Cu flash smelters and advanced thermochemical simulation coupled with on-line process control of Cu-Ni smelting. The presentation will describe an off-line model of a Cu smelter based on critically assessed properties of the Al-Ca-Cu-Fe-O-S-Si system (slag, matte and liquid metal) using the quasichemical model. A four-stage reactor model (shaft, settler, uptake and bath) is used for optimizing process parameters and feed particle distribution. As a second example, an advanced thermochemical model of a Ni-Cu sulphide smelting plant will be given. The on-line model covers the operation of treating Ni-Cu-S concentrate via roasters, electric furnace and converters, producing a high-grade Bessemer matte product for further refining. The model integrates the thermochemistry of the roasters and electric furnace, and predicts important process parameters such as the degree of sulphur elimination in the fluid-bed roasters, matte grade, iron metallization, slag losses and the iron-to-silica ratio in the electric furnace slag. Both models can be used to assist process engineers and operators in calculating the addition rates of coke, flux and air for different feed scenarios.
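
    A toy Gibbs-energy minimization for an ideal-gas mixture is sketched below, to illustrate the kind of multiphase/multicomponent equilibrium calculation behind tools like ChemSheet. The Constrained Gibbs Free Energy method additionally imposes constraints for kinetically slow reactions, which this sketch omits; the species set, the dimensionless standard potentials mu0 and the feed are illustrative values, not data from the paper.

    ```python
    # Equilibrium by minimizing the ideal-gas Gibbs energy subject to element balances.
    import numpy as np
    from scipy.optimize import minimize

    species = ["H2", "O2", "H2O"]
    mu0 = np.array([0.0, 0.0, -25.0])                  # mu0/RT, illustrative
    A = np.array([[2.0, 0.0, 2.0],                     # element balance rows: H, O
                  [0.0, 2.0, 1.0]])
    b = A @ np.array([1.0, 0.6, 0.0])                  # feed: 1 mol H2 + 0.6 mol O2

    def gibbs(n):
        """Dimensionless Gibbs energy G/RT of an ideal-gas mixture at reference pressure."""
        n = np.clip(n, 1e-12, None)                    # avoid log(0)
        return float(np.sum(n * (mu0 + np.log(n / n.sum()))))

    res = minimize(gibbs, x0=np.array([0.4, 0.3, 0.3]), method="SLSQP",
                   bounds=[(1e-12, None)] * len(species),
                   constraints=[{"type": "eq", "fun": lambda n: A @ n - b}])
    for name, moles in zip(species, res.x):
        print(f"{name}: {moles:.4f} mol")
    ```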

  5. Application of an interactive water simulation model in urban water management: a case study in Amsterdam.

    PubMed

    Leskens, J G; Brugnach, M; Hoekstra, A Y

    2014-01-01

    Water simulation models are available to support decision-makers in urban water management. To use current water simulation models, special expertise is required. Therefore, model information is prepared prior to work sessions, in which decision-makers weigh different solutions. However, this model information quickly becomes outdated when new suggestions for solutions arise and are therefore limited in use. We suggest that new model techniques, i.e. fast and flexible computation algorithms and realistic visualizations, allow this problem to be solved by using simulation models during work sessions. A new Interactive Water Simulation Model was applied for two case study areas in Amsterdam and was used in two workshops. In these workshops, the Interactive Water Simulation Model was positively received. It included non-specialist participants in the process of suggesting and selecting possible solutions and made them part of the accompanying discussions and negotiations. It also provided the opportunity to evaluate and enhance possible solutions more often within the time horizon of a decision-making process. Several preconditions proved to be important for successfully applying the Interactive Water Simulation Model, such as the willingness of the stakeholders to participate and the preparation of different general main solutions that can be used for further iterations during a work session.

  6. Updates on Modeling the Water Cycle with the NASA Ames Mars Global Climate Model

    NASA Technical Reports Server (NTRS)

    Kahre, M. A.; Haberle, R. M.; Hollingsworth, J. L.; Montmessin, F.; Brecht, A. S.; Urata, R.; Klassen, D. R.; Wolff, M. J.

    2017-01-01

    Global Circulation Models (GCMs) have made steady progress in simulating the current Mars water cycle. It is now widely recognized that clouds are a critical component that can significantly affect the nature of the simulated water cycle. Two processes in particular are key to implementing clouds in a GCM: the microphysical processes of formation and dissipation, and their radiative effects on heating/cooling rates. Together, these processes alter the thermal structure, change the dynamics, and regulate inter-hemispheric transport. We have made considerable progress representing these processes in the NASA Ames GCM, particularly in the presence of radiatively active water ice clouds. We present the current state of our group's water cycle modeling efforts, show results from selected simulations, highlight some of the issues, and discuss avenues for further investigation.

  7. Parallel STEPS: Large Scale Stochastic Spatial Reaction-Diffusion Simulation with High Performance Computers

    PubMed Central

    Chen, Weiliang; De Schutter, Erik

    2017-01-01

    Stochastic, spatial reaction-diffusion simulations have been widely used in systems biology and computational neuroscience. However, the increasing scale and complexity of models and morphologies have exceeded the capacity of any serial implementation. This led to the development of parallel solutions that benefit from the boost in performance of modern supercomputers. In this paper, we describe an MPI-based, parallel operator-splitting implementation for stochastic spatial reaction-diffusion simulations with irregular tetrahedral meshes. The performance of our implementation is first examined and analyzed with simulations of a simple model. We then demonstrate its application to real-world research by simulating the reaction-diffusion components of a published calcium burst model in both Purkinje neuron sub-branch and full dendrite morphologies. Simulation results indicate that our implementation is capable of achieving super-linear speedup for balanced loading simulations with reasonable molecule density and mesh quality. In the best scenario, a parallel simulation with 2,000 processes runs more than 3,600 times faster than its serial SSA counterpart, and achieves more than 20-fold speedup relative to parallel simulation with 100 processes. In a more realistic scenario with dynamic calcium influx and data recording, the parallel simulation with 1,000 processes and no load balancing is still 500 times faster than the conventional serial SSA simulation. PMID:28239346
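
    The operator-splitting idea referred to above can be illustrated, in a much-reduced serial form, by alternating a diffusion sub-step and a reaction sub-step on a 1-D grid; the sketch below uses an explicit finite-difference diffusion update and exact first-order decay, with placeholder values for the grid, diffusion coefficient and rate constant (it is not the STEPS implementation).

      import numpy as np

      nx, dx, dt = 100, 1e-6, 1e-4       # grid cells, cell size (m), time step (s)
      D, k = 1e-9, 5.0                   # diffusion coefficient (m^2/s), decay rate (1/s)
      c = np.zeros(nx)
      c[nx // 2] = 1.0                   # initial concentration spike in the middle cell

      for step in range(1000):
          # diffusion sub-step: explicit finite differences, boundary cells held fixed for simplicity
          lap = np.zeros_like(c)
          lap[1:-1] = (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
          c += dt * D * lap
          # reaction sub-step: first-order decay, solved exactly over dt
          c *= np.exp(-k * dt)

      print("remaining mass fraction:", round(c.sum(), 4))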

  8. Parallel STEPS: Large Scale Stochastic Spatial Reaction-Diffusion Simulation with High Performance Computers.

    PubMed

    Chen, Weiliang; De Schutter, Erik

    2017-01-01

    Stochastic, spatial reaction-diffusion simulations have been widely used in systems biology and computational neuroscience. However, the increasing scale and complexity of models and morphologies have exceeded the capacity of any serial implementation. This led to the development of parallel solutions that benefit from the boost in performance of modern supercomputers. In this paper, we describe an MPI-based, parallel operator-splitting implementation for stochastic spatial reaction-diffusion simulations with irregular tetrahedral meshes. The performance of our implementation is first examined and analyzed with simulations of a simple model. We then demonstrate its application to real-world research by simulating the reaction-diffusion components of a published calcium burst model in both Purkinje neuron sub-branch and full dendrite morphologies. Simulation results indicate that our implementation is capable of achieving super-linear speedup for balanced loading simulations with reasonable molecule density and mesh quality. In the best scenario, a parallel simulation with 2,000 processes runs more than 3,600 times faster than its serial SSA counterpart, and achieves more than 20-fold speedup relative to parallel simulation with 100 processes. In a more realistic scenario with dynamic calcium influx and data recording, the parallel simulation with 1,000 processes and no load balancing is still 500 times faster than the conventional serial SSA simulation.

  9. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    NASA Astrophysics Data System (ADS)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causality relationships and feedback loops from different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  10. Validated simulator for space debris removal with nets and other flexible tethers applications

    NASA Astrophysics Data System (ADS)

    Gołębiowski, Wojciech; Michalczyk, Rafał; Dyrek, Michał; Battista, Umberto; Wormnes, Kjetil

    2016-12-01

    In the context of active debris removal technologies and preparation activities for the e.Deorbit mission, a simulator for the dynamics of net-shaped elastic bodies and their interactions with rigid bodies has been developed. Its main application is to aid net design and to test scenarios for space debris deorbitation. The simulator can model all the phases of the debris capturing process: net launch, flight and wrapping around the target. It handles coupled simulation of rigid and flexible body dynamics. Flexible bodies were implemented using the Cosserat rod model, which allows the simulation of flexible threads or wires with elasticity and damping for stretching, bending and torsion. Threads may be combined into structures of any topology, so the software is able to simulate nets, pure tethers, tether bundles, cages, trusses, etc. Full contact dynamics was implemented. Programmatic interaction with the simulation is possible, e.g. for control implementation. The underlying model has been experimentally validated; because of the significant influence of gravity, the experiment had to be performed in microgravity conditions. The validation experiment, performed during parabolic flights, was a downscaled version of the Envisat capture process: the prepacked net was launched towards the satellite model, expanded, hit the model and wrapped around it. The whole process was recorded with two fast stereographic camera sets for full 3D trajectory reconstruction. The trajectories were used to compare net dynamics to the respective simulations and thereby validate the simulation tool. The experiments were performed on board a Falcon-20 aircraft operated by the National Research Council in Ottawa, Canada. Validation results show that the model reflects the physics of the phenomenon accurately enough that it may be used for scenario evaluation and mission design purposes. The functionalities of the simulator are described in detail in the paper, as well as its underlying model, sample cases and the methodology behind the validation. Results are presented and typical use cases are discussed, showing that the software may be used to design throw-nets for space debris capturing, but also to simulate the deorbitation process, the chaser control system or general interactions between rigid and elastic bodies, all in a convenient and efficient way. The presented work was led by SKA Polska under an ESA contract, within the CleanSpace initiative.

  11. Modelling and simulating reaction-diffusion systems using coloured Petri nets.

    PubMed

    Liu, Fei; Blätke, Mary-Ann; Heiner, Monika; Yang, Ming

    2014-10-01

    Reaction-diffusion systems often play an important role in systems biology when developmental processes are involved. Traditional methods of modelling and simulating such systems require substantial prior knowledge of mathematics and/or simulation algorithms. Such skills may pose a challenge for biologists who are not equally well-trained in mathematics and computer science. Coloured Petri nets, as a high-level and graphical language, offer an attractive alternative that is easily approachable. In this paper, we investigate a coloured Petri net framework integrating deterministic, stochastic and hybrid modelling formalisms and corresponding simulation algorithms for the modelling and simulation of reaction-diffusion processes that may be closely coupled with signalling pathways, metabolic reactions and/or gene expression. Such systems often manifest multiscaleness in time, space and/or concentration. We introduce our approach by means of some basic diffusion scenarios, and test it against an established case study, the Brusselator model. Copyright © 2014 Elsevier Ltd. All rights reserved.
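
    A coloured Petri net description of chemical kinetics is ultimately executed as a stochastic (or deterministic) simulation of the underlying reactions; as a point of reference, a plain Gillespie stochastic simulation of the well-mixed Brusselator test model might look like the sketch below, with illustrative parameter values.

      import numpy as np

      rng = np.random.default_rng(42)
      a, b, V = 2.0, 5.0, 100.0            # Brusselator parameters and system size (assumed)
      X, Y = int(a * V), int(b / a * V)    # start near the deterministic fixed point
      t, t_end, events = 0.0, 10.0, 0

      while t < t_end:
          props = np.array([
              a * V,                       # 0 -> X
              b * X,                       # X -> Y (driven by the constant pool B)
              float(X),                    # X -> 0
              X * (X - 1) * Y / V**2,      # 2X + Y -> 3X
          ])
          total = props.sum()
          if total == 0.0:
              break
          t += rng.exponential(1.0 / total)     # time to the next reaction event
          r = rng.choice(4, p=props / total)    # which reaction fires
          if r == 0:   X += 1
          elif r == 1: X -= 1; Y += 1
          elif r == 2: X -= 1
          else:        X += 1; Y -= 1
          events += 1

      print(f"{events} reaction events, final counts X={X}, Y={Y}")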

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yang; Zhao, Qiangsheng; Mirdamadi, Mansour

    Woven fabric carbon fiber/epoxy composites made through compression molding are one of the promising choices of material for the vehicle light-weighting strategy. Previous studies have shown that the processing conditions can have substantial influence on the performance of this type of material. Therefore the optimization of the compression molding process is of great importance to the manufacturing practice. An efficient way to achieve the optimized design of this process would be through conducting finite element (FE) simulations of compression molding for woven fabric carbon fiber/epoxy composites. However, performing such simulation remains a challenging task for FE as multiple types of physics are involved during the compression molding process, including the epoxy resin curing and the complex mechanical behavior of the woven fabric structure. In the present study, the FE simulation of the compression molding process of resin based woven fabric composites at continuum level is conducted, which is enabled by the implementation of an integrated material modeling methodology in LS-Dyna. Specifically, the chemo-thermo-mechanical problem of compression molding is solved through the coupling of three material models, i.e., one thermal model for temperature history in the resin, one mechanical model to update the curing-dependent properties of the resin and another mechanical model to simulate the behavior of the woven fabric composites. Preliminary simulations of the carbon fiber/epoxy woven fabric composites in LS-Dyna are presented as a demonstration, while validations and models with real part geometry are planned in the future work.
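
    One of the coupled sub-models mentioned above is the resin cure model; a minimal stand-in for such a model is an nth-order Arrhenius cure-kinetics rate integrated at a prescribed mold temperature, as sketched below (all parameter values are illustrative assumptions, not material data from the study).

      import numpy as np

      A_pre, E_a, order = 1.0e5, 6.0e4, 1.5   # pre-exponential (1/s), activation energy (J/mol), reaction order
      R = 8.314                               # gas constant (J/mol/K)
      T_mold = 430.0                          # isothermal mold temperature (K), assumed
      dt, t_end = 0.1, 600.0                  # time step and molding time (s)

      alpha = 0.0                             # degree of cure, 0 (uncured) .. 1 (fully cured)
      for _ in range(int(t_end / dt)):
          rate = A_pre * np.exp(-E_a / (R * T_mold)) * (1.0 - alpha) ** order
          alpha = min(1.0, alpha + dt * rate)

      print(f"degree of cure after {t_end:.0f} s at {T_mold:.0f} K: {alpha:.3f}")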

  13. Collaborative Project: Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.

  14. Final Report Collaborative Project: Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.

  15. Using Dynamic Interface Modeling and Simulation to Develop a Launch and Recovery Flight Simulation for a UH-60A Blackhawk

    NASA Technical Reports Server (NTRS)

    Sweeney, Christopher; Bunnell, John; Chung, William; Giovannetti, Dean; Mikula, Julie; Nicholson, Bob; Roscoe, Mike

    2001-01-01

    Joint Shipboard Helicopter Integration Process (JSHIP) is a Joint Test and Evaluation (JT&E) program sponsored by the Office of the Secretary of Defense (OSD). Under the JSHIP program is a simulation effort referred to as the Dynamic Interface Modeling and Simulation System (DIMSS). The purpose of DIMSS is to develop and test the processes and mechanisms that facilitate ship-helicopter interface testing via man-in-the-loop ground-based flight simulators. Specifically, the DIMSS charter is to develop an accredited process for using a flight simulator to determine the wind-over-the-deck (WOD) launch and recovery flight envelope for the UH-60A ship/helicopter combination. DIMSS is a collaborative effort between the NASA Ames Research Center and OSD. OSD determines the T&E and warfighter training requirements, provides the programmatics and dynamic interface T&E experience, and conducts ship/aircraft interface tests for validating the simulation. NASA provides the research and development element, the simulation facility, and simulation technical experience. This paper will highlight the benefits of the NASA/JSHIP collaboration and detail achievements of the project in terms of modeling and simulation. The Vertical Motion Simulator (VMS) at NASA Ames Research Center offers the capability to simulate a wide range of simulation cueing configurations, including visual, aural, and body-force cueing devices. The system flexibility enables switching configurations to allow back-to-back evaluation and comparison of different levels of cueing fidelity in determining minimum training requirements. The investigation required development and integration of several major simulation systems at the VMS. A new UH-60A Black Hawk interchangeable cab that provides an out-the-window (OTW) field-of-view (FOV) of 220 degrees in azimuth and 70 degrees in elevation was built. Modeling efforts involved integrating Computational Fluid Dynamics (CFD) generated data of an LHA ship airwake and integrating a real-time ship motion model developed from a batch model from the Naval Surface Warfare Center. Engineering development and integration of a three degrees-of-freedom (DOF) dynamic seat, which simulates high-frequency rotor-dynamics-dependent motion cues for use in conjunction with the large motion system, was accomplished. An LHA visual model at several different levels of resolution and an aural cueing system with three selectable fidelity levels were also developed. VMS also integrated a PC-based E&S simFUSION system to investigate cost-effective IG alternatives. The DIMSS project consists of three phases that follow an approved Validation, Verification and Accreditation (VV&A) process. The first phase will support the accreditation of the individual subsystems and models. The second will follow the verification and validation of the integrated subsystems and models, and will address fidelity requirements of the integrated models and subsystems. The third and final phase will allow the verification and validation of the full system integration. This VV&A process will address the utility of the simulated WOD launch and recovery envelope. Simulations supporting the first two stages have been completed, and the data are currently being reviewed and analyzed.

  16. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from uncertainty in the hydrologic model as well as from the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multivariate post-processing method for historical simulations and 2) assess the effect of post-processing on uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.
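
    The paper's transfer function is copula-based and bivariate; as a much-simplified stand-in that only illustrates the general distribution-mapping concept, the sketch below applies univariate quantile mapping to synthetic streamflow data (all distributions and values are made up for illustration).

      import numpy as np

      rng = np.random.default_rng(1)
      obs_hist = rng.gamma(shape=2.0, scale=50.0, size=3000)    # synthetic "observed" flows
      sim_hist = rng.gamma(shape=2.0, scale=70.0, size=3000)    # biased "simulated" flows
      sim_future = rng.gamma(shape=2.0, scale=75.0, size=1000)  # projections to be corrected

      def quantile_map(x, sim_ref, obs_ref):
          # non-exceedance probability of x within the simulated reference period ...
          p = np.searchsorted(np.sort(sim_ref), x) / len(sim_ref)
          # ... read back as the observed value at the same probability
          return np.quantile(obs_ref, np.clip(p, 0.0, 1.0))

      corrected = quantile_map(sim_future, sim_hist, obs_hist)
      print(f"raw mean {sim_future.mean():.1f} -> corrected mean {corrected.mean():.1f} "
            f"(observed mean {obs_hist.mean():.1f})")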

  17. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.
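
    As a toy illustration of the kind of astrophysical signal such an end-to-end simulator injects, the snippet below generates a periodic box-shaped planetary transit superimposed on noisy stellar flux; the cadence, period, depth and noise level are illustrative choices, not ETEM parameters.

      import numpy as np

      rng = np.random.default_rng(7)
      cadence_min = 30.0                               # long-cadence sampling (minutes)
      t = np.arange(0.0, 90 * 24 * 60, cadence_min) / (24 * 60)   # 90 days of time stamps, in days
      period, duration, depth = 10.0, 0.25, 1e-3       # orbital period (d), transit duration (d), depth

      flux = np.ones_like(t)
      in_transit = (t % period) < duration             # box-shaped transit model
      flux[in_transit] -= depth
      flux += rng.normal(0.0, 3e-4, size=t.size)       # white photometric noise

      print("cadences in transit:", int(in_transit.sum()), "of", t.size)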

  18. Exploring the Use of Computer Simulations in Unraveling Research and Development Governance Problems

    NASA Technical Reports Server (NTRS)

    Balaban, Mariusz A.; Hester, Patrick T.

    2012-01-01

    Understanding Research and Development (R&D) enterprise relationships and processes at a governance level is not a simple task, but valuable decision-making insight and evaluation capabilities can be gained from their exploration through computer simulations. This paper discusses current Modeling and Simulation (M&S) methods, addressing their applicability to R&D enterprise governance. Specifically, the authors analyze advantages and disadvantages of the four methodologies used most often by M&S practitioners: System Dynamics (SD), Discrete Event Simulation (DES), Agent Based Modeling (ABM), and formal Analytic Methods (AM) for modeling systems at the governance level. Moreover, the paper describes nesting models using a multi-method approach. Guidance is provided to those seeking to employ modeling techniques in an R&D enterprise for the purposes of understanding enterprise governance. Further, an example is modeled and explored for potential insight. The paper concludes with recommendations regarding opportunities for concentration of future work in modeling and simulating R&D governance relationships and processes.

  19. Comparison of DNDC and RZWQM2 for simulating hydrology and nitrogen dynamics in a corn-soybean system with a winter cover crop

    NASA Astrophysics Data System (ADS)

    Desjardins, R.; Smith, W.; Qi, Z.; Grant, B.; VanderZaag, A.

    2017-12-01

    Biophysical models are needed for assessing science-based mitigation options to improve the efficiency and sustainability of agricultural cropping systems. In order to account for trade-offs between environmental indicators such as GHG emissions, soil C change, and water quality, it is important that models can encapsulate the complex array of interrelated biogeochemical processes controlling water, nutrient and energy flows in the agroecosystem. The Denitrification Decomposition (DNDC) model is one of the most widely used process-based models, and is arguably the most sophisticated for estimating GHG emissions and soil C and N cycling; however, the model simulates only simple cascade water flow. The purpose of this study was to compare the performance of DNDC to a comprehensive water flow model, the Root Zone Water Quality Model (RZWQM2), to determine which processes in DNDC may be limiting and to recommend improvements. Both models were calibrated and validated for simulating crop biomass, soil hydrology, and nitrogen loss to tile drains using detailed observations from a corn-soybean rotation in Iowa, with and without cover crops. Results indicated that crop yields, biomass and the annual estimation of nitrogen and water loss to tile drains were well simulated by both models (NSE > 0.6 in all cases); however, RZWQM2 performed much better for simulating soil water content and the dynamics of daily water flow (DNDC: NSE -0.32 to 0.28; RZWQM2: NSE 0.34 to 0.70) to tile drains. DNDC overestimated soil water content near the soil surface and underestimated it deeper in the profile, which was presumably caused by the lack of a root distribution algorithm, the inability to simulate a heterogeneous profile and the lack of a water table. We recommend these improvements along with the inclusion of enhanced water flow and a mechanistic tile drainage sub-model. The accurate temporal simulation of water and N strongly impacts several biogeochemical processes.
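
    The Nash-Sutcliffe Efficiency (NSE) values quoted above measure how well a simulated series reproduces observations relative to the observed mean; a minimal implementation is sketched below with placeholder arrays standing in for daily drain-flow data.

      import numpy as np

      def nse(obs, sim):
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      obs = np.array([1.2, 3.4, 2.8, 0.9, 4.1])   # placeholder observed flows
      sim = np.array([1.0, 3.0, 3.1, 1.1, 3.8])   # placeholder simulated flows
      print(f"NSE = {nse(obs, sim):.2f}")          # 1.0 is a perfect fit, 0.0 is no better than the mean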

  20. FEA Simulation of Free-Bending - a Preforming Step in the Hydroforming Process Chain

    NASA Astrophysics Data System (ADS)

    Beulich, N.; Craighero, P.; Volk, W.

    2017-09-01

    High-strength steel and aluminum alloys are essential for developing innovative, lightweight space frame concepts. The intended design is built from car body parts with high geometrical complexity and reduced material thickness. Over the past few years, many complex car body parts have been produced using hydroforming. To increase the accuracy of hydroforming for prospective car concepts, the virtual manufacturing of formed parts becomes more important. As a part of process digitalization, it is necessary to develop a simulation model for the hydroforming process chain. The preforming of longitudinally welded tubes is therefore implemented using three-dimensional free-bending. This technique is able to reproduce complex deflection curves in combination with innovative low-thickness material design for hydroforming processes. As a first step toward the complete process simulation, this paper deals with the development of a finite element simulation model for the free-bending process with 6 degrees of freedom. A mandrel built from spherical segments connected by a steel rope is located inside the tube to prevent geometrical instability. Critical parameters for the result of the bending process are evaluated and optimized. The simulation model is verified by surface measurements of a two-dimensional bending test.

  1. Learning the Norm of Internality: NetNorm, a Connectionist Model

    ERIC Educational Resources Information Center

    Thierry, Bollon; Adeline, Paignon; Pascal, Pansu

    2011-01-01

    The objective of the present article is to show that connectionist simulations can be used to model some of the socio-cognitive processes underlying the learning of the norm of internality. For our simulations, we developed a connectionist model which we called NetNorm (based on Dual-Network formalism). This model is capable of simulating the…

  2. Introducing Molecular Life Science Students to Model Building Using Computer Simulations

    ERIC Educational Resources Information Center

    Aegerter-Wilmsen, Tinri; Kettenis, Dik; Sessink, Olivier; Hartog, Rob; Bisseling, Ton; Janssen, Fred

    2006-01-01

    Computer simulations can facilitate the building of models of natural phenomena in research, such as in the molecular life sciences. In order to introduce molecular life science students to the use of computer simulations for model building, a digital case was developed in which students build a model of a pattern formation process in…

  3. Development of a Water Recovery System Resource Tracking Model

    NASA Technical Reports Server (NTRS)

    Chambliss, Joe; Stambaugh, Imelda; Sargusingh, Miriam; Shull, Sarah; Moore, Michael

    2015-01-01

    A simulation model has been developed to track water resources in an exploration vehicle using Regenerative Life Support (RLS) systems. The Resource Tracking Model (RTM) integrates the functions of all the vehicle components that affect the processing and recovery of water during simulated missions. The approach used in developing the RTM enables its use as part of a complete vehicle simulation for real time mission studies. Performance data for the components in the RTM is focused on water processing. The data provided to the model has been based on the most recent information available regarding the technology of the component. This paper will describe the process of defining the RLS system to be modeled, the way the modeling environment was selected, and how the model has been implemented. Results showing how the RLS components exchange water are provided in a set of test cases.
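
    The kind of bookkeeping a resource tracking model performs can be sketched as a simple daily water mass balance over a few RLS components; the component list, per-crew rates and recovery efficiencies below are illustrative assumptions, not values from the RTM or flight hardware.

      crew_size, days = 4, 30
      potable = 100.0                        # kg of potable water in storage
      brine = 0.0                            # kg of unrecovered water/brine
      urine_recovery, condensate_recovery = 0.85, 0.95   # assumed processor efficiencies

      for day in range(days):
          drink = 2.5 * crew_size            # kg/day consumed by the crew
          urine = 1.5 * crew_size            # kg/day routed to the urine processor
          condensate = 1.0 * crew_size       # kg/day of humidity condensate
          potable += urine * urine_recovery + condensate * condensate_recovery - drink
          brine += urine * (1 - urine_recovery) + condensate * (1 - condensate_recovery)

      print(f"potable water after {days} days: {potable:.1f} kg, brine/losses: {brine:.1f} kg")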

  4. Detailed Modeling of Distillation Technologies for Closed-Loop Water Recovery Systems

    NASA Technical Reports Server (NTRS)

    Allada, Rama Kumar; Lange, Kevin E.; Anderson, Molly S.

    2011-01-01

    Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which become extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents efforts to develop chemical process simulations for three technologies: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system and the Wiped-Film Rotating Disk (WFRD), using the Aspen Custom Modeler and Aspen Plus process simulation tools. The paper discusses system design, modeling details, and modeling results for each technology and presents some comparisons between the model results and recent test data. Following these initial comparisons, some general conclusions and forward work are discussed.

  5. Development of a Water Recovery System Resource Tracking Model

    NASA Technical Reports Server (NTRS)

    Chambliss, Joe; Stambaugh, Imelda; Sargusingh, Miriam; Shull, Sarah; Moore, Michael

    2014-01-01

    A simulation model has been developed to track water resources in an exploration vehicle using regenerative life support (RLS) systems. The model integrates the functions of all the vehicle components that affect the processing and recovery of water during simulated missions. The approach used in developing the model results in the RTM being part of a complete vehicle simulation that can be used in real-time mission studies. Performance data for the variety of components in the RTM are focused on water processing and have been defined based on the most recent information available for the technology of each component. This paper will describe the process of defining the RLS system to be modeled, the way the modeling environment was selected, and how the model has been implemented. Results showing how the variety of RLS components exchange water are provided in a set of test cases.

  6. Model and system learners, optimal process constructors and kinetic theory-based goal-oriented design: A new paradigm in materials and processes informatics

    NASA Astrophysics Data System (ADS)

    Abisset-Chavanne, Emmanuelle; Duval, Jean Louis; Cueto, Elias; Chinesta, Francisco

    2018-05-01

    Traditionally, Simulation-Based Engineering Sciences (SBES) has relied on the use of static data inputs (model parameters, initial or boundary conditions, … obtained from adequate experiments) to perform simulations. A new paradigm in the field of Applied Sciences and Engineering has emerged in the last decade. Dynamic Data-Driven Application Systems [9, 10, 11, 12, 22] allow the linkage of simulation tools with measurement devices for real-time control of simulations and applications, entailing the ability to dynamically incorporate additional data into an executing application, and in reverse, the ability of an application to dynamically steer the measurement process. It is in that context that traditional "digital twins" are giving rise to a new generation of goal-oriented data-driven application systems, also known as "hybrid twins", embracing models based on physics and models exclusively based on data adequately collected and assimilated for filling the gap between usual model predictions and measurements. Within this framework, new methodologies based on model learners, machine learning and kinetic goal-oriented design are defining a new paradigm in materials, processes and systems engineering.

  7. Modelling carbon responses of tundra ecosystems to historical and projected climate: A comparison of a plot- and a global-scale ecosystem model to identify process-based uncertainties

    USGS Publications Warehouse

    Clein, Joy S.; Kwiatkowski, B.L.; McGuire, A.D.; Hobbie, J.E.; Rastetter, E.B.; Melillo, J.M.; Kicklighter, D.W.

    2000-01-01

    We are developing a process-based modelling approach to investigate how carbon (C) storage of tundra across the entire Arctic will respond to projected climate change. To implement the approach, the processes that are least understood, and thus have the most uncertainty, need to be identified and studied. In this paper, we identified a key uncertainty by comparing the responses of C storage in tussock tundra at one site between the simulations of two models - one a global-scale ecosystem model (Terrestrial Ecosystem Model, TEM) and one a plot-scale ecosystem model (General Ecosystem Model, GEM). The simulations spanned the historical period (1921-94) and the projected period (1995-2100). In the historical period, the model simulations of net primary production (NPP) differed in their sensitivity to variability in climate. However, the long-term changes in C storage were similar in both simulations, because the dynamics of heterotrophic respiration (RH) were similar in both models. In contrast, the responses of C storage in the two model simulations diverged during the projected period. In the GEM simulation for this period, increases in RH tracked increases in NPP, whereas in the TEM simulation increases in RH lagged increases in NPP. We were able to make the long-term C dynamics of the two simulations agree by parameterizing TEM to the fast soil C pools of GEM. We concluded that the differences between the long-term C dynamics of the two simulations lay in modelling the role of the recalcitrant soil C. These differences, which reflect an incomplete understanding of soil processes, lead to quite different projections of the response of pan-Arctic C storage to global change. For example, the reference parameterization of TEM resulted in an estimate of cumulative C storage of 2032 g C m-2 for moist tundra north of 50°N, which was substantially higher than the 463 g C m-2 estimated for a parameterization of fast soil C dynamics. This uncertainty in the depiction of the role of recalcitrant soil C in long-term ecosystem C dynamics resulted from our incomplete understanding of controls over C and N transformations in Arctic soils. Mechanistic studies of these issues are needed to improve our ability to model the response of Arctic ecosystems to global change.

  8. MJO simulation in CMIP5 climate models: MJO skill metrics and process-oriented diagnosis

    NASA Astrophysics Data System (ADS)

    Ahn, Min-Seop; Kim, Daehyun; Sperber, Kenneth R.; Kang, In-Sik; Maloney, Eric; Waliser, Duane; Hendon, Harry

    2017-12-01

    The Madden-Julian Oscillation (MJO) simulation diagnostics developed by MJO Working Group and the process-oriented MJO simulation diagnostics developed by MJO Task Force are applied to 37 Coupled Model Intercomparison Project phase 5 (CMIP5) models in order to assess model skill in representing amplitude, period, and coherent eastward propagation of the MJO, and to establish a link between MJO simulation skill and parameterized physical processes. Process-oriented diagnostics include the Relative Humidity Composite based on Precipitation (RHCP), Normalized Gross Moist Stability (NGMS), and the Greenhouse Enhancement Factor (GEF). Numerous scalar metrics are developed to quantify the results. Most CMIP5 models underestimate MJO amplitude, especially when outgoing longwave radiation (OLR) is used in the evaluation, and exhibit too fast phase speed while lacking coherence between eastward propagation of precipitation/convection and the wind field. The RHCP-metric, indicative of the sensitivity of simulated convection to low-level environmental moisture, and the NGMS-metric, indicative of the efficiency of a convective atmosphere for exporting moist static energy out of the column, show robust correlations with a large number of MJO skill metrics. The GEF-metric, indicative of the strength of the column-integrated longwave radiative heating due to cloud-radiation interaction, is also correlated with the MJO skill metrics, but shows relatively lower correlations compared to the RHCP- and NGMS-metrics. Our results suggest that modifications to processes associated with moisture-convection coupling and the gross moist stability might be the most fruitful for improving simulations of the MJO. Though the GEF-metric exhibits lower correlations with the MJO skill metrics, the longwave radiation feedback is highly relevant for simulating the weak precipitation anomaly regime that may be important for the establishment of shallow convection and the transition to deep convection.
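
    As an illustration of the idea behind a process-oriented diagnostic such as the RHCP, the sketch below composites a synthetic low-level relative humidity field by precipitation percentile bins; the resulting curve summarizes how sharply moisture increases with rain rate (both the data and the RH-precipitation relation are synthetic, for illustration only).

      import numpy as np

      rng = np.random.default_rng(3)
      precip = rng.gamma(0.5, 4.0, size=20000)                  # synthetic daily rain rate (mm/day)
      rh = 60.0 + 8.0 * np.log1p(precip) + rng.normal(0, 5, precip.size)  # synthetic low-level RH (%)

      edges = np.percentile(precip, np.arange(0, 101, 10))      # decile edges of the rain rate
      bins = np.digitize(precip, edges[1:-1])                   # assign each day to a decile (0..9)
      composite = np.array([rh[bins == b].mean() for b in range(10)])
      print("RH composite by precipitation decile (%):", composite.round(1))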

  9. MJO simulation in CMIP5 climate models: MJO skill metrics and process-oriented diagnosis

    DOE PAGES

    Ahn, Min-Seop; Kim, Daehyun; Sperber, Kenneth R.; ...

    2017-03-23

    The Madden-Julian Oscillation (MJO) simulation diagnostics developed by MJO Working Group and the process-oriented MJO simulation diagnostics developed by MJO Task Force are applied to 37 Coupled Model Intercomparison Project phase 5 (CMIP5) models in order to assess model skill in representing amplitude, period, and coherent eastward propagation of the MJO, and to establish a link between MJO simulation skill and parameterized physical processes. Process-oriented diagnostics include the Relative Humidity Composite based on Precipitation (RHCP), Normalized Gross Moist Stability (NGMS), and the Greenhouse Enhancement Factor (GEF). Numerous scalar metrics are developed to quantify the results. Most CMIP5 models underestimate MJO amplitude, especially when outgoing longwave radiation (OLR) is used in the evaluation, and exhibit too fast phase speed while lacking coherence between eastward propagation of precipitation/convection and the wind field. The RHCP-metric, indicative of the sensitivity of simulated convection to low-level environmental moisture, and the NGMS-metric, indicative of the efficiency of a convective atmosphere for exporting moist static energy out of the column, show robust correlations with a large number of MJO skill metrics. The GEF-metric, indicative of the strength of the column-integrated longwave radiative heating due to cloud-radiation interaction, is also correlated with the MJO skill metrics, but shows relatively lower correlations compared to the RHCP- and NGMS-metrics. Our results suggest that modifications to processes associated with moisture-convection coupling and the gross moist stability might be the most fruitful for improving simulations of the MJO. Though the GEF-metric exhibits lower correlations with the MJO skill metrics, the longwave radiation feedback is highly relevant for simulating the weak precipitation anomaly regime that may be important for the establishment of shallow convection and the transition to deep convection.

  10. MJO simulation in CMIP5 climate models: MJO skill metrics and process-oriented diagnosis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahn, Min-Seop; Kim, Daehyun; Sperber, Kenneth R.

    The Madden-Julian Oscillation (MJO) simulation diagnostics developed by MJO Working Group and the process-oriented MJO simulation diagnostics developed by MJO Task Force are applied to 37 Coupled Model Intercomparison Project phase 5 (CMIP5) models in order to assess model skill in representing amplitude, period, and coherent eastward propagation of the MJO, and to establish a link between MJO simulation skill and parameterized physical processes. Process-oriented diagnostics include the Relative Humidity Composite based on Precipitation (RHCP), Normalized Gross Moist Stability (NGMS), and the Greenhouse Enhancement Factor (GEF). Numerous scalar metrics are developed to quantify the results. Most CMIP5 models underestimate MJO amplitude, especially when outgoing longwave radiation (OLR) is used in the evaluation, and exhibit too fast phase speed while lacking coherence between eastward propagation of precipitation/convection and the wind field. The RHCP-metric, indicative of the sensitivity of simulated convection to low-level environmental moisture, and the NGMS-metric, indicative of the efficiency of a convective atmosphere for exporting moist static energy out of the column, show robust correlations with a large number of MJO skill metrics. The GEF-metric, indicative of the strength of the column-integrated longwave radiative heating due to cloud-radiation interaction, is also correlated with the MJO skill metrics, but shows relatively lower correlations compared to the RHCP- and NGMS-metrics. Our results suggest that modifications to processes associated with moisture-convection coupling and the gross moist stability might be the most fruitful for improving simulations of the MJO. Though the GEF-metric exhibits lower correlations with the MJO skill metrics, the longwave radiation feedback is highly relevant for simulating the weak precipitation anomaly regime that may be important for the establishment of shallow convection and the transition to deep convection.

  11. Computer model to simulate testing at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Owens, Lewis R., Jr.; Wahls, Richard A.; Hannon, Judith A.

    1995-01-01

    A computer model has been developed to simulate the processes involved in the operation of the National Transonic Facility (NTF), a large cryogenic wind tunnel at the Langley Research Center. The simulation was verified by comparing the simulated results with previously acquired data from three experimental wind tunnel test programs in the NTF. The comparisons suggest that the computer model simulates reasonably well the processes that determine the liquid nitrogen (LN2) consumption, electrical consumption, fan-on time, and the test time required to complete a test plan at the NTF. From these limited comparisons, it appears that the results from the simulation model are generally within about 10 percent of the actual NTF test results. The use of actual data acquisition times in the simulation produced better estimates of the LN2 usage, as expected. Additional comparisons are needed to refine the model constants. The model will typically produce optimistic results since the times and rates included in the model are typically the optimum values. Any deviation from the optimum values will lead to longer times or increased LN2 and electrical consumption for the proposed test plan. Computer code operating instructions and listings of sample input and output files have been included.
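
    The bookkeeping performed by such a facility model can be illustrated by accumulating fan-on time, liquid nitrogen and electrical consumption over the polars of a hypothetical test plan; all of the rates and durations below are placeholders, not NTF values.

      polars = 40                        # polars in the hypothetical test plan
      points_per_polar = 25
      sec_per_point = 12.0               # data acquisition time per point (s), assumed
      setup_min_per_polar = 6.0          # configuration/settling time per polar (min), assumed
      ln2_rate_kg_s = 150.0              # assumed LN2 consumption rate while the fan is on (kg/s)
      fan_power_mw = 60.0                # assumed fan drive power (MW)

      fan_on_hr = polars * (points_per_polar * sec_per_point / 3600.0 + setup_min_per_polar / 60.0)
      ln2_tonnes = fan_on_hr * 3600.0 * ln2_rate_kg_s / 1000.0
      energy_mwh = fan_on_hr * fan_power_mw
      print(f"fan-on time {fan_on_hr:.1f} h, LN2 {ln2_tonnes:.0f} t, electrical energy {energy_mwh:.0f} MWh")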

  12. A generic biogeochemical module for Earth system models: Next Generation BioGeoChemical Module (NGBGC), version 1.0

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Huang, M.; Liu, C.; Li, H.; Leung, L. R.

    2013-11-01

    Physical and biogeochemical processes regulate soil carbon dynamics and CO2 flux to and from the atmosphere, influencing global climate changes. Integration of these processes into Earth system models (e.g., community land models (CLMs)), however, currently faces three major challenges: (1) extensive efforts are required to modify modeling structures and to rewrite computer programs to incorporate new or updated processes as new knowledge is being generated, (2) computational cost is prohibitively expensive to simulate biogeochemical processes in land models due to large variations in the rates of biogeochemical processes, and (3) various mathematical representations of biogeochemical processes exist to incorporate different aspects of fundamental mechanisms, but systematic evaluation of the different mathematical representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The new framework consists of a new biogeochemical module, Next Generation BioGeoChemical Module (NGBGC), version 1.0, with a generic algorithm and reaction database so that new and updated processes can be incorporated into land models without the need to manually set up the ordinary differential equations to be solved numerically. The reaction database consists of processes of nutrient flow through the terrestrial ecosystems in plants, litter, and soil. This framework facilitates effective comparison studies of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into CLM. To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems. The method presented here could in theory be applied to simulate biogeochemical cycles in other Earth system models.
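
    The generic-algorithm-plus-reaction-database idea can be sketched as follows: pools and reactions are plain data, and the right-hand side of the ODE system is assembled automatically rather than hand-coded for each new process. The pool names, stoichiometries and rate forms below are illustrative placeholders, not the NGBGC reaction database.

      import numpy as np
      from scipy.integrate import solve_ivp

      pools = ["plant_C", "litter_C", "soil_C"]
      idx = {p: i for i, p in enumerate(pools)}
      # each reaction: (stoichiometry per pool, rate law as a function of the state vector)
      reactions = [
          ({"plant_C": +1.0},                   lambda y: 2.0),          # carbon input (NPP)
          ({"plant_C": -1.0, "litter_C": +1.0}, lambda y: 0.05 * y[0]),  # litterfall
          ({"litter_C": -1.0, "soil_C": +0.4},  lambda y: 0.02 * y[1]),  # decomposition, 60% respired
          ({"soil_C": -1.0},                    lambda y: 0.001 * y[2]), # soil respiration
      ]

      def rhs(t, y):
          dydt = np.zeros_like(y)
          for stoich, rate in reactions:          # generic assembly loop over the reaction "database"
              r = rate(y)
              for pool, coeff in stoich.items():
                  dydt[idx[pool]] += coeff * r
          return dydt

      sol = solve_ivp(rhs, (0.0, 3650.0), [50.0, 20.0, 1000.0])   # ten years with daily rate units
      print("pools after 10 years:", sol.y[:, -1].round(1))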

  13. A Microstructure-Based Constitutive Model for Superplastic Forming

    NASA Astrophysics Data System (ADS)

    Jafari Nedoushan, Reza; Farzin, Mahmoud; Mashayekhi, Mohammad; Banabic, Dorel

    2012-11-01

    A constitutive model is proposed for simulations of hot metal forming processes. This model is constructed based on the dominant mechanisms that take part in hot forming and includes intergranular deformation, grain boundary sliding, and grain boundary diffusion. A Taylor-type polycrystalline model is used to predict intergranular deformation. Previous works on grain boundary sliding and grain boundary diffusion are extended to derive three-dimensional macro stress-strain rate relationships for each mechanism. In these relationships, the effect of grain size is also taken into account. The proposed model is first used to simulate step strain-rate tests and the results are compared with experimental data. It is shown that the model can be used to predict flow stresses for various grain sizes and strain rates. The yield locus is then predicted for multiaxial stress states, and it is observed that it is very close to the von Mises yield criterion. It is also shown that the proposed model can be directly used to simulate hot forming processes. The bulge forming process and gas pressure tray forming are simulated, and the results are compared with experimental data.
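
    The combined grain-size and strain-rate dependence described above is often summarized phenomenologically by a flow law of the form sigma = K * d^p * (strain rate)^m; the sketch below evaluates such a law for a few grain sizes and strain rates with assumed constants, purely to illustrate the trend the full micromechanical model captures.

      import numpy as np

      K, p, m = 500.0, 0.7, 0.5                       # assumed material constants (not fitted to the study)
      grain_sizes_um = [5.0, 10.0, 20.0]              # grain size d (micrometres)
      strain_rates = np.array([1e-4, 1e-3, 1e-2])     # strain rate (1/s)

      for d in grain_sizes_um:
          sigma = K * d**p * strain_rates**m          # flow stress at each strain rate
          print(f"d = {d:4.1f} um -> flow stress (MPa):", sigma.round(2))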

  14. Results from the VALUE perfect predictor experiment: process-based evaluation

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Soares, Pedro; Hertig, Elke; Brands, Swen; Huth, Radan; Cardoso, Rita; Kotlarski, Sven; Casado, Maria; Pongracz, Rita; Bartholy, Judit

    2016-04-01

    Until recently, the evaluation of downscaled climate model simulations has typically been limited to surface climatologies, including long-term means, spatial variability and extremes. But these aspects are often, at least partly, tuned in regional climate models to match observed climate. The tuning issue is of course particularly relevant for bias-corrected regional climate models. In general, a good performance of a model for these aspects in the present climate therefore does not imply a good performance in simulating climate change. It is now widely accepted that, to increase our confidence in climate change simulations, it is necessary to evaluate how climate models simulate relevant underlying processes. In other words, it is important to assess whether downscaling does the right thing for the right reason. Therefore, VALUE has carried out a broad process-based evaluation study based on its perfect predictor experiment simulations: the downscaling methods are driven by ERA-Interim data over the period 1979-2008, and reference observations are given by a network of 85 meteorological stations covering all European climates. More than 30 methods participated in the evaluation. In order to compare statistical and dynamical methods, only variables provided by both types of approaches could be considered. This limited the analysis to conditioning local surface variables on variables from driving processes that are simulated by ERA-Interim. We considered the following types of processes: at the continental scale, we evaluated the performance of downscaling methods for positive and negative North Atlantic Oscillation, Atlantic ridge and blocking situations. At synoptic scales, we considered Lamb weather types for selected European regions such as Scandinavia, the United Kingdom, the Iberian Peninsula or the Alps. At regional scales we considered phenomena such as the Mistral, the Bora or the Iberian coastal jet. Such process-based evaluation helps to attribute biases in surface variables to underlying processes and ultimately to improve climate models.

  15. Numerical simulations of an advection fog event over Shanghai Pudong International Airport with the WRF model

    NASA Astrophysics Data System (ADS)

    Lin, Caiyan; Zhang, Zhongfeng; Pu, Zhaoxia; Wang, Fengyun

    2017-10-01

    A series of numerical simulations is conducted to understand the formation, evolution, and dissipation of an advection fog event over Shanghai Pudong International Airport (ZSPD) with the Weather Research and Forecasting (WRF) model. Using the current operational settings at the Meteorological Center of East China Air Traffic Management Bureau, the WRF model successfully predicts the fog event at ZSPD. Additional numerical experiments are performed to examine the physical processes associated with the fog event. The results indicate that prediction of this particular fog event is sensitive to microphysical schemes for the time of fog dissipation but not for the time of fog onset. The simulated timing of the arrival and dissipation of the fog, as well as the cloud distribution, is substantially sensitive to the planetary boundary layer and radiation (both longwave and shortwave) processes. Moreover, varying forecast lead times also produces different simulation results for the fog event regarding its onset and duration, suggesting a trade-off between more accurate initial conditions and a proper forecast lead time that allows model physical processes to spin up adequately during the fog simulation. The overall outcomes from this study imply that the complexity of physical processes and their interactions within the WRF model during fog evolution and dissipation is a key area of future research.

  16. Improved representation of situational awareness within a dismounted small combat unit constructive simulation

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Colony, Mike

    2011-06-01

    Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for the evaluation of equipment effectiveness in small combat unit operations is currently limited to representation of situation awareness without inclusion of the many uncertainties associated with real-world combat environments. The goal of this research is to provide an ability to model situation awareness and decision process uncertainties in order to improve evaluation of the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides this required uncertainty modeling through the application of two critical techniques that allow Bayesian network technology to be applied to real-time applications: the Object-Oriented Bayesian Network methodology and the Object-Oriented Inference technique. In this research, we implement decision process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including confidence of source, information completeness, and information loss. We also demonstrate that Army-PRIDE improves the realism of the current constructive simulation's decision processes through Monte Carlo simulation.
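
    A minimal illustration of this kind of uncertainty modeling is a two-node Bayesian network in which a report about an enemy position depends on the true situation and on source reliability; the probabilities below are illustrative, and the actual Army-PRIDE system uses far richer object-oriented networks.

      p_threat = 0.2                     # prior probability that an enemy unit is present (assumed)
      reliability = 0.85                 # assumed "confidence of source"
      p_report_given_threat = reliability
      p_report_given_clear = 1.0 - reliability   # false-report probability

      def posterior_threat(report_received: bool) -> float:
          # exact inference by enumeration (Bayes' rule) over the single hidden variable
          if report_received:
              num = p_report_given_threat * p_threat
              den = num + p_report_given_clear * (1.0 - p_threat)
          else:
              num = (1.0 - p_report_given_threat) * p_threat
              den = num + (1.0 - p_report_given_clear) * (1.0 - p_threat)
          return num / den

      print("P(threat | report)    =", round(posterior_threat(True), 3))
      print("P(threat | no report) =", round(posterior_threat(False), 3))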

  17. Comparison of Two Conceptually Different Physically-based Hydrological Models - Looking Beyond Streamflows

    NASA Astrophysics Data System (ADS)

    Rousseau, A. N.; Álvarez; Yu, X.; Savary, S.; Duffy, C.

    2015-12-01

    Most physically-based hydrological models simulate, to various extents, the relevant watershed processes occurring at different spatiotemporal scales. These models use different physical domain representations (e.g., hydrological response units, discretized control volumes) and numerical solution techniques (e.g., finite difference method, finite element method) as well as a variety of approximations for representing the physical processes. Despite the fact that several models have been developed so far, very few inter-comparison studies have been conducted to check, beyond streamflows, whether different modeling approaches could simulate the other processes at the watershed scale in a similar fashion. In this study, PIHM (Qu and Duffy, 2007), a fully coupled, distributed model, and HYDROTEL (Fortin et al., 2001; Turcotte et al., 2003, 2007), a pseudo-coupled, semi-distributed model, were compared to check whether the models could corroborate observed streamflows while equally representing other processes as well, such as evapotranspiration, snow accumulation/melt, infiltration, etc. For this study, the Young Womans Creek watershed, PA, was used to compare: streamflows (channel routing), actual evapotranspiration, snow water equivalent (snow accumulation and melt), infiltration, recharge, shallow water depth above the soil surface (surface flow), lateral flow into the river (surface and subsurface flow) and height of the saturated soil column (subsurface flow). Despite a lack of observed data for contrasting most of the simulated processes, it can be said that the two models can be used as simulation tools for streamflows, actual evapotranspiration, infiltration, lateral flows into the river, and height of the saturated soil column. However, each process presents particular differences as a result of the physical parameters and the modeling approaches used by each model. These differences should be the object of further analyses to definitively confirm or reject modeling hypotheses.

  18. Study on the CFD simulation of refrigerated container

    NASA Astrophysics Data System (ADS)

    Arif Budiyanto, Muhammad; Shinoda, Takeshi; Nasruddin

    2017-10-01

    The objective of this study is to perform a Computational Fluid Dynamics (CFD) simulation of a refrigerated container in a container port. A refrigerated container is a thermal cargo container constructed with insulated walls to carry perishable goods. The CFD simulation was carried out on a cross section of the container walls to predict the surface temperatures of the refrigerated container and to estimate its cooling load. The simulation model is based on the solution of the partial differential equations governing the fluid flow and heat transfer processes. The physical heat-transfer processes considered in this simulation consist of solar radiation from the sun, heat conduction in the container walls, heat convection on the container surfaces and thermal radiation among the solid surfaces. The validity of the simulation model was assessed using surface temperatures at the center points of each container wall obtained from measurements in a previous study. The results show that the surface temperatures of the simulation model are in good agreement with the measurement data for all container walls.
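
    A back-of-the-envelope version of the heat balance the CFD model resolves is steady conduction through one insulated wall with convection on both sides, folding the absorbed solar flux into a sol-air temperature; every value below is an illustrative assumption rather than data from the study.

      area = 2.4 * 12.0                  # one side wall of a large container (m^2), assumed
      k_ins, thickness = 0.025, 0.08     # insulation conductivity (W/m/K) and thickness (m)
      h_out, h_in = 20.0, 7.0            # outside / inside convective coefficients (W/m^2/K)
      t_out, t_in = 35.0, -18.0          # ambient and cargo set-point temperatures (C)
      absorbed_solar = 300.0             # assumed absorbed solar flux on the wall (W/m^2)

      # sol-air temperature folds the absorbed solar flux into an effective outside temperature
      t_solair = t_out + absorbed_solar / h_out
      u_value = 1.0 / (1.0 / h_out + thickness / k_ins + 1.0 / h_in)
      q_wall = u_value * area * (t_solair - t_in)
      print(f"U = {u_value:.2f} W/m2K, heat gain through this wall ~ {q_wall:.0f} W")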

  19. A size-composition resolved aerosol model for simulating the dynamics of externally mixed particles: SCRAM (v 1.0)

    NASA Astrophysics Data System (ADS)

    Zhu, S.; Sartelet, K. N.; Seigneur, C.

    2015-06-01

    The Size-Composition Resolved Aerosol Model (SCRAM) for simulating the dynamics of externally mixed atmospheric particles is presented. This new model classifies aerosols by both composition and size, based on a comprehensive combination of all chemical species and their mass-fraction sections. All three main processes involved in aerosol dynamics (coagulation, condensation/evaporation and nucleation) are included. The model is first validated by comparison with a reference solution and with results of simulations using internally mixed particles. The degree of mixing of particles is investigated in a box model simulation using data representative of air pollution in Greater Paris. The relative influence on the mixing state of the different aerosol processes (condensation/evaporation, coagulation) and of the algorithm used to model condensation/evaporation (bulk equilibrium, dynamic) is studied.
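
    Of the three processes listed above, coagulation is the easiest to sketch: the snippet below advances a discrete Smoluchowski coagulation equation with a constant kernel over a small set of size bins (bin structure, kernel value and initial concentration are illustrative, and composition resolution is omitted entirely).

      import numpy as np

      nbins = 20                         # size classes: bin k holds (k+1)-monomer particles
      n = np.zeros(nbins)
      n[0] = 1.0e5                       # initial monomer number concentration (1/cm^3), assumed
      K = 1.0e-9                         # constant coagulation kernel (cm^3/s), assumed
      dt, steps = 1.0, 3600              # one hour of 1 s steps

      for _ in range(steps):
          gain = np.zeros(nbins)
          for i in range(nbins):
              for j in range(nbins - 1 - i):
                  gain[i + j + 1] += 0.5 * K * n[i] * n[j]   # i- and j-sized particles merge
          loss = K * n * n.sum()                             # loss of each class to any coagulation partner
          n = np.clip(n + dt * (gain - loss), 0.0, None)

      largest = int(np.nonzero(n)[0].max()) + 1
      print(f"total number conc. {n.sum():.3e} 1/cm^3, largest occupied size: {largest} monomers")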

  20. SeSBench - An initiative to benchmark reactive transport models for environmental subsurface processes

    NASA Astrophysics Data System (ADS)

    Jacques, Diederik

    2017-04-01

    As soil functions are governed by a multitude of interacting hydrological, geochemical and biological processes, simulation tools coupling mathematical models of these interacting processes are needed. Coupled reactive transport models are a typical example of such coupled tools, mainly focusing on hydrological and geochemical coupling (see e.g. Steefel et al., 2015). The mathematical and numerical complexity of both the tool itself and the specific conceptual model can increase rapidly. Therefore, numerical verification of this type of model is a prerequisite for guaranteeing reliability and confidence and for qualifying simulation tools and approaches for further model application. In 2011, a first SeSBench (Subsurface Environmental Simulation Benchmarking) workshop was held in Berkeley (USA), followed by four others. The objective is to benchmark subsurface environmental simulation models and methods, with a current focus on reactive transport processes. The outcome was a special issue in Computational Geosciences (2015, issue 3 - Reactive transport benchmarks for subsurface environmental simulation) with a collection of 11 benchmarks. Benchmarks proposed by the workshop participants should be relevant for environmental or geo-engineering applications (the latter mostly related to radioactive waste disposal), excluding benchmarks defined for purely mathematical reasons. Another important feature is the tiered approach within a benchmark, with the definition of a single principal problem and different sub-problems. The latter typically benchmark individual or simplified processes (e.g. inert solute transport, a simplified geochemical conceptual model) or geometries (e.g. batch or one-dimensional, homogeneous). Finally, three codes should be involved in each benchmark. The SeSBench initiative contributes to confidence building for applying reactive transport codes and illustrates the use of this type of model for different environmental and geo-engineering applications. SeSBench will organize new workshops to add new benchmarks in a new special issue. Steefel, C. I., et al. (2015). "Reactive transport codes for subsurface environmental simulation." Computational Geosciences 19: 445-478.

  1. Towards the Integration of APECS with VE-Suite to Create a Comprehensive Virtual Engineering Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCorkle, D.; Yang, C.; Jordan, T.

    2007-06-01

    Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not explore some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus wrapped by the CASI library developed by Reaction Engineering International to run the process simulation and query for unit operation results. This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.

  2. Appendices to the user's manual for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A user's Manual for the Emulation Simulation Computer Model was published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware - SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from the test. In addition, slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system, the second a potential Space Station air revitalization system.

  3. Modelling and simulation techniques for membrane biology.

    PubMed

    Burrage, Kevin; Hancock, John; Leier, André; Nicolau, Dan V

    2007-07-01

    One of the most important aspects of Computational Cell Biology is the understanding of the complicated dynamical processes that take place on plasma membranes. These processes are often so complicated that purely temporal models cannot always adequately capture the dynamics. On the other hand, spatial models can have large computational overheads. In this article, we review some of these issues with respect to chemistry, membrane microdomains and anomalous diffusion and discuss how to select appropriate modelling and simulation paradigms based on some or all the following aspects: discrete, continuous, stochastic, delayed and complex spatial processes.
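
    As one concrete example of the discrete-stochastic paradigm discussed above, the sketch below runs a minimal Gillespie stochastic simulation of a reversible membrane binding reaction A + B <-> C; the species, copy numbers and rate constants are illustrative assumptions only.

        import math, random

        # Minimal Gillespie stochastic simulation of A + B <-> C.
        random.seed(42)
        k_on, k_off = 0.001, 0.1          # association / dissociation rate constants
        A, B, C = 100, 80, 0
        t, t_end = 0.0, 50.0

        while t < t_end:
            a1 = k_on * A * B             # propensity of A + B -> C
            a2 = k_off * C                # propensity of C -> A + B
            a0 = a1 + a2
            if a0 == 0.0:
                break
            t += -math.log(1.0 - random.random()) / a0   # exponential waiting time
            if random.random() * a0 < a1:
                A, B, C = A - 1, B - 1, C + 1
            else:
                A, B, C = A + 1, B + 1, C - 1

        print(f"state at t = {t:.1f}: A = {A}, B = {B}, C = {C}")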

  4. Quantitative modeling of soil genesis processes

    NASA Technical Reports Server (NTRS)

    Levine, E. R.; Knox, R. G.; Kerber, A. G.

    1992-01-01

    For fine spatial scale simulation, a model is being developed to predict changes in properties over short-, meso-, and long-term time scales within horizons of a given soil profile. Processes that control these changes can be grouped into five major process clusters: (1) abiotic chemical reactions; (2) activities of organisms; (3) energy balance and water phase transitions; (4) hydrologic flows; and (5) particle redistribution. Landscape modeling of soil development is possible using digitized soil maps associated with quantitative soil attribute data in a geographic information system (GIS) framework to which simulation models are applied.

  5. Urban Expansion Modeling Approach Based on Multi-Agent System and Cellular Automata

    NASA Astrophysics Data System (ADS)

    Zeng, Y. N.; Yu, M. M.; Li, S. N.

    2018-04-01

    Urban expansion is a land-use change process that transforms non-urban land into urban land. This process results in the loss of natural vegetation and an increase in impervious surfaces. Urban expansion also alters hydrologic cycling, atmospheric circulation, and nutrient cycling processes and generates enormous environmental and social impacts. Urban expansion monitoring and modeling are crucial to understanding the urban expansion process, its mechanisms, and its environmental impacts, and to predicting urban expansion under future scenarios. Therefore, it is important to study urban expansion monitoring and modeling approaches. We propose to simulate urban expansion by combining cellular automata (CA) and multi-agent system (MAS) models. The proposed urban expansion model based on MAS and CA was applied to a case study area of the Changsha-Zhuzhou-Xiangtan urban agglomeration, China. The results show that this model can capture urban expansion with good adaptability. The Kappa coefficient of the simulation results is 0.75, which indicates that the combination of MAS and CA offers a better simulation result.
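
    A hedged sketch of the kind of map agreement measure quoted above: Cohen's kappa computed between a simulated and an observed binary urban/non-urban map. The two small maps below are invented for illustration and are not the study's data.

        import numpy as np

        def kappa(observed, simulated):
            """Cohen's kappa for two binary (urban / non-urban) maps of equal shape."""
            obs = np.asarray(observed).ravel().astype(bool)
            sim = np.asarray(simulated).ravel().astype(bool)
            po = np.mean(obs == sim)                              # observed agreement
            pe = obs.mean() * sim.mean() + (1 - obs.mean()) * (1 - sim.mean())  # chance agreement
            return (po - pe) / (1.0 - pe)

        # Hypothetical 4x4 end-of-period maps (1 = urban).
        obs = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 1, 1, 0], [0, 0, 0, 0]])
        sim = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 1, 1, 1], [0, 0, 0, 0]])
        print(f"kappa = {kappa(obs, sim):.2f}")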

  6. Computational Models of Laryngeal Aerodynamics: Potentials and Numerical Costs.

    PubMed

    Sadeghi, Hossein; Kniesburges, Stefan; Kaltenbacher, Manfred; Schützenberger, Anne; Döllinger, Michael

    2018-02-07

    Human phonation is based on the interaction between tracheal airflow and laryngeal dynamics. This fluid-structure interaction is based on the energy exchange between airflow and vocal folds. Major challenges in analyzing the phonatory process in vivo are the small dimensions and the poor accessibility of the region of interest. For improved analysis of the phonatory process, numerical simulations of the airflow and the vocal fold dynamics have been suggested. Even though most of the models reproduced the phonatory process fairly well, the development of comprehensive larynx models is still a subject of research. In the context of clinical application, physiological accuracy and computational model efficiency are of great interest. In this study, a simple numerical larynx model is introduced that incorporates the laryngeal fluid flow. It is based on a synthetic experimental model with silicone vocal folds. The degree of realism was successively increased in separate computational models and each model was simulated for 10 oscillation cycles. Results show that relevant features of the laryngeal flow field, such as glottal jet deflection, develop even when applying rather simple static models with oscillating flow rates. Including further phonatory components such as vocal fold motion, mucosal wave propagation, and ventricular folds, the simulations show phonatory key features like intraglottal flow separation and increased flow rate in the presence of ventricular folds. The simulation time on 100 CPU cores ranged between 25 and 290 hours, currently restricting clinical application of these models. Nevertheless, the results show the high potential of numerical simulations for a better understanding of the phonatory process.

  7. The development of an industrial-scale fed-batch fermentation simulation.

    PubMed

    Goldrick, Stephen; Ştefan, Andrei; Lovett, David; Montague, Gary; Lennox, Barry

    2015-01-10

    This paper describes a simulation of an industrial-scale fed-batch fermentation that can be used as a benchmark in process systems analysis and control studies. The simulation was developed using a mechanistic model and validated using historical data collected from an industrial-scale penicillin fermentation process. Each batch was carried out in a 100,000 L bioreactor that used an industrial strain of Penicillium chrysogenum. The manipulated variables recorded during each batch were used as inputs to the simulator and the predicted outputs were then compared with the on-line and off-line measurements recorded in the real process. The simulator adapted a previously published structured model to describe the penicillin fermentation and extended it to include the main environmental effects of dissolved oxygen, viscosity, temperature, pH and dissolved carbon dioxide. In addition, the effects of nitrogen and phenylacetic acid concentrations on the biomass and penicillin production rates were also included. The simulated model predictions of all the on-line and off-line process measurements, including the off-gas analysis, were in good agreement with the batch records. The simulator and industrial process data are available to download at www.industrialpenicillinsimulation.com and can be used to evaluate, study and improve on the current control strategy implemented on this facility.
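
    The mechanistic simulator itself is available at the URL above. As a much simpler, hedged sketch of what a fed-batch fermentation model looks like, the code below integrates an unstructured Monod-type model with feed dilution; all parameter values are invented for illustration and are not those of the published structured model.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Simplified, unstructured fed-batch model (Monod growth, non-growth-associated product).
        mu_max, K_s, Y_xs, q_p, S_feed, F = 0.09, 0.15, 0.45, 0.004, 400.0, 50.0
        # units: 1/h, g/L, g/g, g/(g h), g/L, L/h

        def rhs(t, y):
            X, S, P, V = y                              # biomass, substrate, product, volume
            mu = mu_max * S / (K_s + S)
            dX = mu * X - F / V * X                     # growth minus dilution
            dS = -mu * X / Y_xs + F / V * (S_feed - S)  # consumption plus feed
            dP = q_p * X - F / V * P                    # production minus dilution
            dV = F
            return [dX, dS, dP, dV]

        sol = solve_ivp(rhs, (0.0, 150.0), [1.0, 10.0, 0.0, 60000.0], max_step=1.0)
        X, S, P, V = sol.y[:, -1]
        print(f"after {sol.t[-1]:.0f} h: X={X:.1f} g/L, S={S:.2f} g/L, P={P:.2f} g/L, V={V:.0f} L")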

  8. Digital data processing system dynamic loading analysis

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Tucker, A. E.

    1976-01-01

    Simulation and analysis of the Space Shuttle Orbiter Digital Data Processing System (DDPS) are reported. The mated flight and postseparation flight phases of the space shuttle's approach and landing test configuration were modeled utilizing the Information Management System Interpretative Model (IMSIM) in a computerized simulation modeling of the ALT hardware, software, and workload. System requirements simulated for the ALT configuration were defined. Sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and the sensitivity analyses, a test design is described for adapting, parameterizing, and executing the IMSIM. Varying load and stress conditions for the model execution are given. The analyses of the computer simulation runs were documented as results, conclusions, and recommendations for DDPS improvements.

  9. Space shuttle orbiter digital data processing system timing sensitivity analysis OFT ascent phase

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Becker, D. A.

    1977-01-01

    Dynamic loads were investigated to provide simulation and analysis of the space shuttle orbiter digital data processing system (DDPS). Segments of the ascent test (OFT) configuration were modeled utilizing the information management system interpretive model (IMSIM) in a computerized simulation modeling of the OFT hardware and software workload. System requirements for simulation of the OFT configuration were defined, and sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and these sensitivity analyses, a test design was developed for adapting, parameterizing, and executing IMSIM, using varying load and stress conditions for model execution. Analyses of the computer simulation runs are documented, including results, conclusions, and recommendations for DDPS improvements.

  10. Software life cycle dynamic simulation model: The organizational performance submodel

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.

  11. Extending rule-based methods to model molecular geometry and 3D model resolution.

    PubMed

    Hoard, Brittany; Jacobson, Bruna; Manavi, Kasra; Tapia, Lydia

    2016-08-01

    Computational modeling is an important tool for the study of complex biochemical processes associated with cell signaling networks. However, it is challenging to simulate processes that involve hundreds of large molecules due to the high computational cost of such simulations. Rule-based modeling is a method that can be used to simulate these processes with reasonably low computational cost, but traditional rule-based modeling approaches do not include details of molecular geometry. The incorporation of geometry into biochemical models can more accurately capture details of these processes, and may lead to insights into how geometry affects the products that form. Furthermore, geometric rule-based modeling can be used to complement other computational methods that explicitly represent molecular geometry in order to quantify binding site accessibility and steric effects. We propose a novel implementation of rule-based modeling that encodes details of molecular geometry into the rules and binding rates. We demonstrate how rules are constructed according to the molecular curvature. We then perform a study of antigen-antibody aggregation using our proposed method. We simulate the binding of antibody complexes to binding regions of the shrimp allergen Pen a 1 using a previously developed 3D rigid-body Monte Carlo simulation, and we analyze the aggregate sizes. Then, using our novel approach, we optimize a rule-based model according to the geometry of the Pen a 1 molecule and the data from the Monte Carlo simulation. We use the distances between the binding regions of Pen a 1 to optimize the rules and binding rates. We perform this procedure for multiple conformations of Pen a 1 and analyze the impact of conformation and resolution on the optimal rule-based model. We find that the optimized rule-based models provide information about the average steric hindrance between binding regions and the probability that antibodies will bind to these regions. These optimized models quantify the variation in aggregate size that results from differences in molecular geometry and from model resolution.

  12. Simulating Sustainment for an Unmanned Logistics System Concept of Operation in Support of Distributed Operations

    DTIC Science & Technology

    2017-06-01

    ...designed experiment to model and explore a ship-to-shore logistics process supporting dispersed units via three types of ULSs, which vary primarily in... Keywords: systems, simulation, discrete event simulation, design of experiments, data analysis, simplekit, nearly orthogonal and balanced designs.

  13. Airport Landside. Volume II. The Airport Landside Simulation Model (ALSIM) Description and Users Guide.

    DOT National Transportation Integrated Search

    1982-06-01

    This volume provides a general description of the Airport Landside Simulation Model. A summary of simulated passenger and vehicular processing through the landside is presented. Program operating characteristics and assumptions are documented and a c...

  14. Storage and growth of denitrifiers in aerobic granules: part I. model development.

    PubMed

    Ni, Bing-Jie; Yu, Han-Qing

    2008-02-01

    A mathematical model, based on the Activated Sludge Model No. 3 (ASM3), is developed to describe the storage and growth activities of denitrifiers in aerobic granules under anoxic conditions. In this model, mass transfer, hydrolysis, simultaneous anoxic storage and growth, anoxic maintenance, and endogenous decay are all taken into account. The model is implemented in the well-established AQUASIM simulation software. A combination of completely mixed reactor and biofilm reactor compartments provided by AQUASIM is used to simulate the mass transport and conversion processes occurring in both the bulk liquid and the granules. The modeling results explicitly show that the external substrate is immediately utilized for storage and growth in the feast phase. More external substrate is diverted to the storage process than to the primary biomass production process. The model simulation indicates that the nitrate utilization rate (NUR) of the granule-based denitrification process includes four linear phases of nitrate reduction. Furthermore, a methodology for determining the most important parameter in this model, the anoxic reduction factor, is established.

  15. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address these issues. The social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  16. ISPE: A knowledge-based system for fluidization studies. 1990 Annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddy, S.

    1991-01-01

    Chemical engineers use mathematical simulators to design, model, optimize and refine various engineering plants/processes. This procedure requires the following steps: (1) preparing an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine whether all specified goals are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information) and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE to the user, ASPEN, LASER and the C language is shown in Figure 1.

  17. Composite Cure Process Modeling and Simulations using COMPRO(Registered Trademark) and Validation of Residual Strains using Fiber Optics Sensors

    NASA Technical Reports Server (NTRS)

    Sreekantamurthy, Thammaiah; Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.

    2016-01-01

    Composite cure process induced residual strains and warping deformations in composite components present significant challenges in the manufacturing of advanced composite structures. As a part of the Manufacturing Process and Simulation initiative of the NASA Advanced Composite Project (ACP), research is being conducted on the composite cure process by developing an understanding of the fundamental mechanisms by which the process induced factors influence the residual responses. In this regard, analytical studies have been conducted on the cure process modeling of composite structural parts with varied physical, thermal, and resin flow process characteristics. The cure process simulation results were analyzed to interpret the cure response predictions based on the underlying physics incorporated into the modeling tool. In the cure-kinetics analysis, the model predictions of the degree of cure, resin viscosity and modulus were interpreted with reference to the temperature distribution in the composite panel part and tool setup during autoclave or hot-press curing cycles. In the fiber-bed compaction simulation, the pore pressure and resin flow velocity in the porous media models, and the compaction strain responses under applied pressure, were studied to interpret the fiber volume fraction distribution predictions. In the structural simulation, the effect of temperature on the resin and ply modulus, and of thermal coefficient changes during curing on predicted mechanical strains and chemical cure shrinkage strains, were studied to understand the residual strain and stress response predictions. In addition to computational analysis, experimental studies were conducted to measure strains during the curing of laminated panels by means of optical fiber Bragg grating sensors (FBGs) embedded in the resin impregnated panels. The residual strain measurements from laboratory tests were then compared with the analytical model predictions. The paper describes the cure process procedures and residual strain predictions, and discusses pertinent experimental results from the validation studies.
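
    Cure-kinetics analyses of this kind typically integrate an autocatalytic rate law for the degree of cure over the cure cycle. The sketch below uses a generic Kamal-type model with invented kinetic constants and a simple two-step cure cycle; it is not the specific model, resin, or cycle used in this study.

        import numpy as np

        # Autocatalytic (Kamal-type) cure kinetics: d(alpha)/dt = (k1 + k2*alpha^m)*(1-alpha)^n
        A1, E1 = 2.0e5, 7.0e4      # pre-exponential (1/s) and activation energy (J/mol)
        A2, E2 = 4.0e5, 6.5e4
        m, n, R = 0.5, 1.5, 8.314

        def temperature(t):
            """Illustrative cycle: ramp 25 -> 180 degC at 2 K/min, then hold (returns Kelvin)."""
            return min(25.0 + 2.0 * t / 60.0, 180.0) + 273.15

        alpha, dt, t = 1e-4, 1.0, 0.0          # degree of cure, time step (s), time (s)
        while alpha < 0.95 and t < 6 * 3600:
            T = temperature(t)
            k1 = A1 * np.exp(-E1 / (R * T))
            k2 = A2 * np.exp(-E2 / (R * T))
            alpha += dt * (k1 + k2 * alpha**m) * (1.0 - alpha)**n   # explicit Euler step
            t += dt

        print(f"alpha = {alpha:.2f} reached after {t/60:.0f} min")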

  18. Simulation and optimization of a coking wastewater biological treatment process by activated sludge models (ASM).

    PubMed

    Wu, Xiaohui; Yang, Yang; Wu, Gaoming; Mao, Juan; Zhou, Tao

    2016-01-01

    Applications of activated sludge models (ASM) in simulating industrial biological wastewater treatment plants (WWTPs) are still difficult due to refractory and complex components in influents as well as diversity in activated sludges. In this study, an ASM3 modeling study was conducted to simulate and optimize a practical coking wastewater treatment plant (CWTP). First, respirometric characterizations of the coking wastewater and CWTP biomasses were conducted to determine the specific kinetic and stoichiometric model parameters for the consecutive aeration-anoxic-aeration (O-A/O) biological process. All ASM3 parameters were then estimated and calibrated through cross-validation with the dynamic model simulation procedure. Consequently, an ASM3 model was successfully established that accurately simulates the CWTP performance in removing COD and NH4-N. An optimized CWTP operating condition is proposed, reducing the operating cost from 6.2 to 5.5 €/m³ of wastewater. This study is expected to provide a useful reference for the mathematical simulation of practical industrial WWTPs.

  19. Intelligent system of coordination and control for manufacturing

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2016-08-01

    This paper aims to shape an intelligent monitoring and control system that optimizes the material and information flows of a company. The paper presents a model for an intelligent real-time tracking and control system. The production system proposed for simulation analysis provides the ability to track and control the process in real time. Simulation models help in understanding the influence of changes in system structure, the influence of commands on the general condition of the manufacturing process, and the influence of process conditions on the behavior of some system parameters. The practical contribution consists of tracking and real-time control of the technological process. It is based on modular systems analyzed using mathematical models, graphic-analytical sizing, configuration, optimization and simulation.

  20. A systematic petri net approach for multiple-scale modeling and simulation of biochemical processes.

    PubMed

    Chen, Ming; Hu, Minjie; Hofestädt, Ralf

    2011-06-01

    A method to exploit hybrid Petri nets for modeling and simulating biochemical processes in a systematic way is introduced. Both molecular biology and biochemical engineering aspects are handled. With discrete and continuous elements, hybrid Petri nets can easily handle biochemical factors such as metabolite concentrations and kinetic behaviors. It is possible to translate both molecular biological behavior and biochemical process workflows into hybrid Petri nets in a natural manner. As an example, a penicillin production bioprocess is modeled to illustrate the concepts of the methodology. The dynamics of the production parameters in the bioprocess were simulated and displayed diagrammatically. Current problems and post-genomic perspectives are also discussed.

  1. Simulation modeling of high-throughput cryopreservation of aquatic germplasm: a case study of blue catfish sperm processing

    PubMed Central

    Hu, E; Liao, T. W.; Tiersch, T. R.

    2013-01-01

    Emerging commercial-level technology for aquatic sperm cryopreservation has not been modeled by computer simulation. Commercially available software (ARENA, Rockwell Automation, Inc. Milwaukee, WI) was applied to simulate high-throughput sperm cryopreservation of blue catfish (Ictalurus furcatus) based on existing processing capabilities. The goal was to develop a simulation model suitable for production planning and decision making. The objectives were to: 1) predict the maximum output for 8-hr workday; 2) analyze the bottlenecks within the process, and 3) estimate operational costs when run for daily maximum output. High-throughput cryopreservation was divided into six major steps modeled with time, resources and logic structures. The modeled production processed 18 fish and produced 1164 ± 33 (mean ± SD) 0.5-ml straws containing one billion cryopreserved sperm. Two such production lines could support all hybrid catfish production in the US and 15 such lines could support the entire channel catfish industry if it were to adopt artificial spawning techniques. Evaluations were made to improve efficiency, such as increasing scale, optimizing resources, and eliminating underutilized equipment. This model can serve as a template for other aquatic species and assist decision making in industrial application of aquatic germplasm in aquaculture, stock enhancement, conservation, and biomedical model fishes. PMID:25580079
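
    The study used the commercial ARENA package. As an open, hedged sketch of the same discrete-event idea, the code below models a multi-step processing line with limited staff and one freezer using the simpy library; the step durations, resource counts, and logic are invented for illustration and are not the study's model.

        import simpy, random

        # Toy discrete-event sketch: collection/dilution -> straw filling -> freezing.
        random.seed(1)

        def process_fish(env, staff, freezer, done):
            with staff.request() as req:
                yield req
                yield env.timeout(random.uniform(8, 12))    # collection + dilution (min)
            with staff.request() as req:
                yield req
                yield env.timeout(random.uniform(15, 25))   # straw filling (min)
            with freezer.request() as req:
                yield req
                yield env.timeout(30)                       # controlled-rate freezing (min)
            done.append(env.now)

        env = simpy.Environment()
        staff = simpy.Resource(env, capacity=2)
        freezer = simpy.Resource(env, capacity=1)
        done = []
        for _ in range(18):                                 # 18 fish, as in the study
            env.process(process_fish(env, staff, freezer, done))
        env.run(until=8 * 60)                               # one 8-hour workday
        if done:
            print(f"{len(done)} fish fully processed; last finished at {max(done):.0f} min")
        else:
            print("no fish completed within the shift")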

  2. Simulating carbon flows in Amazonian rainforests: how intensive C-cycle data can help to reduce vegetation model uncertainty

    NASA Astrophysics Data System (ADS)

    Galbraith, D.; Levine, N. M.; Christoffersen, B. O.; Imbuzeiro, H. A.; Powell, T.; Costa, M. H.; Saleska, S. R.; Moorcroft, P. R.; Malhi, Y.

    2014-12-01

    The mathematical codes embedded within different vegetation models ultimately represent alternative hypotheses of biosphere functioning. While formulations for some processes (e.g. leaf-level photosynthesis) are often shared across vegetation models, other processes (e.g. carbon allocation) are much more variable in their representation across models. This creates the opportunity for equifinality - models can simulate similar values of key metrics such as NPP or biomass through very different underlying causal pathways. Intensive carbon cycle measurements allow for quantification of a comprehensive suite of carbon fluxes such as the productivity and respiration of leaves, roots and wood, allowing for in-depth assessment of carbon flows within ecosystems. Thus, they provide important information on poorly-constrained C-cycle processes such as allocation. We conducted an in-depth evaluation of the ability of four commonly used dynamic global vegetation models (CLM, ED2, IBIS, JULES) to simulate carbon cycle processes at ten lowland Amazonian rainforest sites where individual C-cycle components have been measured. The rigorous model-data comparison procedure allowed identification of biases which were specific to different models, providing clear avenues for model improvement and allowing determination of internal C-cycling pathways that were better supported by data. Furthermore, the intensive C-cycle data allowed for explicit testing of the validity of a number of assumptions made by specific models in the simulation of carbon allocation and plant respiration. For example, the ED2 model assumes that maintenance respiration of stems is negligible while JULES assumes equivalent allocation of NPP to fine roots and leaves. We argue that field studies focusing on simultaneous measurement of a large number of component fluxes are fundamentally important for reducing uncertainty in vegetation model simulations.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Huan; Cheng, Liang; Chuah, Mooi Choo

    In the generation, transmission, and distribution sectors of the smart grid, intelligence of field devices is realized by programmable logic controllers (PLCs). Many smart-grid subsystems are essentially cyber-physical energy systems (CPES): For instance, the power system process (i.e., the physical part) within a substation is monitored and controlled by a SCADA network with hosts running miscellaneous applications (i.e., the cyber part). To study the interactions between the cyber and physical components of a CPES, several co-simulation platforms have been proposed. However, the network simulators/emulators of these platforms do not include a detailed traffic model that takes into account the impacts of the execution model of PLCs on traffic characteristics. As a result, network traces generated by co-simulation only reveal the impacts of the physical process on the contents of the traffic generated by SCADA hosts, whereas the distinction between PLCs and computing nodes (e.g., a hardened computer running a process visualization application) has been overlooked. To generate realistic network traces using co-simulation for the design and evaluation of applications relying on accurate traffic profiles, it is necessary to establish a traffic model for PLCs. In this work, we propose a parameterized model for PLCs that can be incorporated into existing co-simulation platforms. We focus on the DNP3 subsystem of slave PLCs, which automates the processing of packets from the DNP3 master. To validate our approach, we extract model parameters from both the configuration and network traces of real PLCs. Simulated network traces are generated and compared against those from PLCs. Our evaluation shows that our proposed model captures the essential traffic characteristics of DNP3 slave PLCs, which can be used to extend existing co-simulation platforms and gain further insights into the behaviors of CPES.
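
    A toy sketch of what a parameterized slave-PLC traffic generator can look like: requests from a DNP3 master are serviced only at PLC scan-cycle boundaries, so response latency is quantized by the scan time plus a small processing spread. All parameter values and the timing logic are assumptions for illustration, not the model of the paper.

        import math, random

        random.seed(7)
        scan_time     = 0.010    # assumed PLC scan cycle (s)
        poll_interval = 1.0      # assumed master polling period (s)
        proc_jitter   = 0.002    # assumed per-request processing spread (s)

        records = []
        for k in range(5):
            t_request = 0.1 + k * poll_interval + random.gauss(0, 0.001)  # request arrival
            n_scans   = math.ceil(t_request / scan_time)                  # next scan boundary
            t_response = n_scans * scan_time + random.uniform(0, proc_jitter)
            records.append((t_request, t_response, t_response - t_request))

        for t_req, t_rsp, lat in records:
            print(f"request {t_req:8.4f} s -> response {t_rsp:8.4f} s (latency {lat*1000:5.2f} ms)")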

  4. A method of computer modelling the lithium-ion batteries aging process based on the experimental characteristics

    NASA Astrophysics Data System (ADS)

    Czerepicki, A.; Koniak, M.

    2017-06-01

    The paper presents a method of modelling the aging processes of lithium-ion batteries, its implementation as a computer application, and results for battery state estimation. The authors use a previously developed behavioural battery model, which was built using battery operating characteristics obtained from experiments. This model was implemented in the form of a computer program using a database to store the battery characteristics. The battery aging process is a new, extended functionality of the model. The computer simulation algorithm uses real measurements of battery capacity as a function of the number of charge and discharge cycles. The simulation takes into account incomplete charge or discharge cycles, which are characteristic of transport powered by electricity. The developed model was used to simulate battery state estimation for different load profiles, obtained by measuring the movement of selected means of transport.
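
    A minimal, hedged sketch of the underlying idea: measured capacity-versus-cycle data are interpolated at the number of equivalent full cycles accumulated from incomplete cycles in a usage log. Both the measured points and the depth-of-discharge log below are invented for illustration.

        import numpy as np

        # Illustrative capacity fade data (fraction of nominal capacity vs. full cycles).
        cycles_meas   = np.array([0, 200, 400, 600, 800, 1000])
        capacity_meas = np.array([1.00, 0.97, 0.95, 0.92, 0.88, 0.83])

        # Daily depth-of-discharge log of an electric vehicle (partial cycles).
        dod_log = [0.35, 0.5, 0.2, 0.8, 0.6, 0.4] * 120        # roughly two years of use
        equivalent_full_cycles = sum(dod_log)                   # partial cycles summed

        cap = np.interp(equivalent_full_cycles, cycles_meas, capacity_meas)
        print(f"{equivalent_full_cycles:.0f} equivalent full cycles -> "
              f"estimated capacity {cap*100:.1f} % of nominal")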

  5. Physics-based interactive volume manipulation for sharing surgical process.

    PubMed

    Nakao, Megumi; Minato, Kotaro

    2010-05-01

    This paper presents a new set of techniques by which surgeons can interactively manipulate patient-specific volumetric models for sharing surgical process. To handle physical interaction between the surgical tools and organs, we propose a simple surface-constraint-based manipulation algorithm to consistently simulate common surgical manipulations such as grasping, holding and retraction. Our computation model is capable of simulating soft-tissue deformation and incision in real time. We also present visualization techniques in order to rapidly visualize time-varying, volumetric information on the deformed image. This paper demonstrates the success of the proposed methods in enabling the simulation of surgical processes, and the ways in which this simulation facilitates preoperative planning and rehearsal.

  6. Emergency Management Operations Process Mapping: Public Safety Technical Program Study

    DTIC Science & Technology

    2011-02-01

    ...Enterprise Architectures in industry, and have been successfully applied to assist companies to optimise interdependencies and relationships between... ...model for more in-depth analysis of EM processes, and for use in tandem with other studies that apply modeling and simulation to assess EM operational effectiveness before and after changing elements.

  7. A multiple hypotheses uncertainty analysis in hydrological modelling: about model structure, landscape parameterization, and numerical integration

    NASA Astrophysics Data System (ADS)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2016-04-01

    To date, a large number of competing computer models have been developed to understand hydrological processes and to simulate and predict the streamflow dynamics of rivers. This is primarily the result of the lack of a unified theory in catchment hydrology, due to insufficient process understanding and uncertainties related to model development and application. Therefore, the goal of this study is to analyze the uncertainty structure of a process-based hydrological catchment model employing a multiple-hypotheses approach. The study focuses on three major problems that have received only little attention in previous investigations: first, to estimate the impact of model structural uncertainty by employing several alternative representations for each simulated process; second, to explore the influence of landscape discretization and parameterization from multiple datasets and user decisions; and third, to employ several numerical solvers for the integration of the governing ordinary differential equations and study the effect on simulation results. The generated ensemble of model hypotheses is then analyzed and the three sources of uncertainty compared against each other. To ensure consistency and comparability, all model structures and numerical solvers are implemented within a single simulation environment. First results suggest that the selection of a sophisticated numerical solver for the differential equations positively affects simulation outcomes. However, some simple and easy-to-implement explicit methods already perform surprisingly well and need less computational effort than more advanced but time-consuming implicit techniques. There is general evidence that ambiguous and subjective user decisions form a major source of uncertainty and can greatly influence model development and application at all stages.

  8. Simulating patient-specific heart shape and motion using SPECT perfusion images with the MCAT phantom

    NASA Astrophysics Data System (ADS)

    Faber, Tracy L.; Garcia, Ernest V.; Lalush, David S.; Segars, W. Paul; Tsui, Benjamin M.

    2001-05-01

    The spline-based Mathematical Cardiac Torso (MCAT) phantom is a realistic software simulation designed to simulate single photon emission computed tomographic (SPECT) data. It incorporates a heart model of known size and shape; thus, it is invaluable for measuring the accuracy of acquisition, reconstruction, and post-processing routines. New functionality has been added by replacing the standard heart model with left ventricular (LV) epicardial and endocardial surface points detected from actual patient SPECT perfusion studies. LV surfaces detected from standard post-processing quantitation programs are converted through interpolation in space and time into new B-spline models. Perfusion abnormalities are added to the model based on results of standard perfusion quantification. The new LV is translated and rotated to fit within the existing atria and right ventricular models, which are scaled based on the size of the LV. Simulations were created for five different patients with myocardial infarctions who had undergone SPECT perfusion imaging. Shape, size, and motion of the resulting activity map were compared visually to the original SPECT images. In all cases, the size, shape and motion of the simulated LVs matched well with the original images. Thus, realistic simulations with known physiologic and functional parameters can be created for evaluating the efficacy of processing algorithms.

  9. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    NASA Technical Reports Server (NTRS)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating the numerical approximation error, computational-model-induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that its reliability can be improved.

  10. Computer Simulation of a Hardwood Processing Plant

    Treesearch

    D. Earl Kline; Philip A. Araman

    1990-01-01

    The overall purpose of this paper is to introduce computer simulation as a decision support tool that can be used to provide managers with timely information. A simulation/animation modeling procedure is demonstrated for wood products manufacturing systems. Simulation modeling techniques are used to assist in identifying and solving problems. Animation is used for...

  11. Improved global simulation of groundwater-ecosystem interactions via tight coupling of a dynamic global ecosystem model and a global hydrological model

    NASA Astrophysics Data System (ADS)

    Braakhekke, Maarten; Rebel, Karin; Dekker, Stefan; Smith, Benjamin; Sutanudjaja, Edwin; van Beek, Rens; van Kampenhout, Leo; Wassen, Martin

    2017-04-01

    In up to 30% of the global land surface, ecosystems are potentially influenced by the presence of a shallow groundwater table. In these regions, upward water flux by capillary rise increases soil moisture availability in the root zone, which has a strong effect on evapotranspiration, vegetation dynamics, and fluxes of carbon and nitrogen. Most global hydrological models and several land surface models simulate groundwater table dynamics and their effects on land surface processes. However, these models typically have a relatively simplistic representation of vegetation and do not consider changes in vegetation type and structure. Dynamic global vegetation models (DGVMs) describe the land surface from an ecological perspective, combining detailed descriptions of vegetation dynamics and structure with biogeochemical processes, and are thus more appropriate for simulating the ecological and biogeochemical effects of groundwater interactions. However, currently virtually all DGVMs ignore these effects, assuming that water tables are too deep to affect soil moisture in the root zone. We have implemented a tight coupling between the dynamic global ecosystem model LPJ-GUESS and the global hydrological model PCR-GLOBWB, which explicitly simulates groundwater dynamics. This coupled model allows us to explicitly account for groundwater effects on terrestrial ecosystem processes at the global scale. Results of global simulations indicate that groundwater strongly influences fluxes of water, carbon and nitrogen in many regions, adding up to a considerable effect at the global scale.

  12. Understanding Differences in Upper Stratospheric Ozone Response to Changes in Chlorine and Temperature as Computed Using CCMVal Models

    NASA Technical Reports Server (NTRS)

    Douglass, A. R.; Stolarski, R. S.; Strahan, S. E.; Oman, L. D.

    2012-01-01

    Projections of future ozone levels are made using models that couple a general circulation model with a representation of atmospheric photochemical processes, allowing interactions among photochemical processes, radiation, and dynamics. Such models are known as chemistry and climate models (CCMs). Although developed from common principles and subject to the same boundary conditions, simulated ozone time series vary for projections of changes in ozone depleting substances (ODSs) and greenhouse gases. In the upper stratosphere, photochemical processes control the ozone level, and ozone increases as ODSs decrease and temperature decreases due to the greenhouse gas increase. Simulations agree broadly, but there are quantitative differences in the sensitivity of ozone to chlorine and to temperature. We obtain insight into these differences in sensitivity by examining the relationship between the upper stratospheric annual cycle of ozone and temperature as produced by a suite of models. All simulations conform to expectation in that ozone is less sensitive to temperature when chlorine levels are highest, because chlorine-catalyzed loss is nearly independent of temperature. Differences in sensitivity are traced to differences in simulated temperature, ozone and reactive nitrogen when chlorine levels are close to background. This work shows that differences in the importance of specific processes underlie differences in the simulated sensitivity of ozone to composition change. This suggests that a) the multi-model mean is not a best estimate of the sensitivity of upper stratospheric ozone to changes in ODSs and temperature; and b) the spread of values is not an appropriate measure of uncertainty.

  13. Three dimensional modeling of cirrus during the 1991 FIRE IFO 2: Detailed process study

    NASA Technical Reports Server (NTRS)

    Jensen, Eric J.; Toon, Owen B.; Westphal, Douglas L.

    1993-01-01

    A three-dimensional model of cirrus cloud formation and evolution, including microphysical, dynamical, and radiative processes, was used to simulate cirrus observed in the FIRE Phase 2 Cirrus field program (13 Nov. - 7 Dec. 1991). Sulfate aerosols, solution drops, ice crystals, and water vapor are all treated as interactive elements in the model. Ice crystal size distributions are fully resolved based on calculations of homogeneous freezing of solution drops, growth by water vapor deposition, evaporation, aggregation, and vertical transport. Visible and infrared radiative fluxes, and radiative heating rates are calculated using the two-stream algorithm described by Toon et al. Wind velocities, diffusion coefficients, and temperatures were taken from the MAPS analyses and the MM4 mesoscale model simulations. Within the model, moisture is transported and converted to liquid or vapor by the microphysical processes. The simulated cloud bulk and microphysical properties are shown in detail for the Nov. 26 and Dec. 5 case studies. Comparisons with lidar, radar, and in situ data are used to determine how well the simulations reproduced the observed cirrus. The roles played by various processes in the model are described in detail. The potential modes of nucleation are evaluated, and the importance of small-scale variations in temperature and humidity are discussed. The importance of competing ice crystal growth mechanisms (water vapor deposition and aggregation) are evaluated based on model simulations. Finally, the importance of ice crystal shape for crystal growth and vertical transport of ice are discussed.

  14. Terrestrial ecosystem process model Biome-BGCMuSo v4.0: summary of improvements and new modeling possibilities

    NASA Astrophysics Data System (ADS)

    Hidy, Dóra; Barcza, Zoltán; Marjanović, Hrvoje; Zorana Ostrogović Sever, Maša; Dobor, Laura; Gelybó, Györgyi; Fodor, Nándor; Pintér, Krisztina; Churkina, Galina; Running, Steven; Thornton, Peter; Bellocchi, Gianni; Haszpra, László; Horváth, Ferenc; Suyker, Andrew; Nagy, Zoltán

    2016-12-01

    The process-based biogeochemical model Biome-BGC was enhanced to improve its ability to simulate carbon, nitrogen, and water cycles of various terrestrial ecosystems under contrasting management activities. Biome-BGC version 4.1.1 was used as a base model. Improvements included addition of new modules such as the multilayer soil module, implementation of processes related to soil moisture and nitrogen balance, soil-moisture-related plant senescence, and phenological development. Vegetation management modules with annually varying options were also implemented to simulate management practices of grasslands (mowing, grazing), croplands (ploughing, fertilizer application, planting, harvesting), and forests (thinning). New carbon and nitrogen pools have been defined to simulate yield and soft stem development of herbaceous ecosystems. The model version containing all developments is referred to as Biome-BGCMuSo (Biome-BGC with multilayer soil module; in this paper, Biome-BGCMuSo v4.0 is documented). Case studies on a managed forest, cropland, and grassland are presented to demonstrate the effect of model developments on the simulation of plant growth as well as on carbon and water balance.

  15. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  16. Advancing Nucleosynthesis in Core-Collapse Supernovae Models Using 2D CHIMERA Simulations

    NASA Astrophysics Data System (ADS)

    Harris, J. A.; Hix, W. R.; Chertkow, M. A.; Bruenn, S. W.; Lentz, E. J.; Messer, O. B.; Mezzacappa, A.; Blondin, J. M.; Marronetti, P.; Yakunin, K.

    2014-01-01

    The deaths of massive stars as core-collapse supernovae (CCSN) serve as a crucial link in understanding galactic chemical evolution since the birth of the universe via the Big Bang. We investigate CCSN in polar axisymmetric simulations using the multidimensional radiation hydrodynamics code CHIMERA. Computational costs have traditionally constrained the evolution of the nuclear composition in CCSN models to, at best, a 14-species α-network. However, the limited capacity of the α-network to accurately evolve detailed composition, the neutronization and the nuclear energy generation rate has fettered the ability of prior CCSN simulations to accurately reproduce the chemical abundances and energy distributions as known from observations. These deficits can be partially ameliorated by "post-processing" with a more realistic network. Lagrangian tracer particles placed throughout the star record the temporal evolution of the initial simulation and enable the extension of the nuclear network evolution by incorporating larger systems in post-processing nucleosynthesis calculations. We present post-processing results of the four ab initio axisymmetric CCSN 2D models of Bruenn et al. (2013) evolved with the smaller α-network, and initiated from stellar metallicity, non-rotating progenitors of mass 12, 15, 20, and 25 M⊙ from Woosley & Heger (2007). As a test of the limitations of post-processing, we provide preliminary results from an ongoing simulation of the 15 M⊙ model evolved with a realistic 150 species nuclear reaction network in situ. With more accurate energy generation rates and an improved determination of the thermodynamic trajectories of the tracer particles, we can better unravel the complicated multidimensional "mass-cut" in CCSN simulations and probe for less energetically significant nuclear processes like the νp-process and the r-process, which require still larger networks.

  17. Supporting observation campaigns with high resolution modeling

    NASA Astrophysics Data System (ADS)

    Klocke, Daniel; Brueck, Matthias; Voigt, Aiko

    2017-04-01

    High-resolution simulation in support of measurement campaigns offers a promising and emerging way to create large-scale context for small-scale observations of clouds and precipitation processes. As these simulations include the coupling of measured small-scale processes with the circulation, they also help to integrate the research communities from modeling and observations and allow for detailed model evaluations against dedicated observations. In connection with the measurement campaign NARVAL (August 2016 and December 2013), simulations with a grid spacing of 2.5 km for the tropical Atlantic region (9000x3300 km), with local refinement to 1.2 km for the western part of the domain, were performed using the icosahedral non-hydrostatic (ICON) general circulation model. These simulations are in turn used to drive large-eddy-resolving simulations with the same model for selected days in the High Definition Clouds and Precipitation for Advancing Climate Prediction (HD(CP)2) project. The simulations are presented with a focus on selected results showing the benefit for the scientific communities doing atmospheric measurements and numerical modeling of climate and weather. Additionally, an outlook will be given on how similar simulations will support the NAWDEX measurement campaign in the North Atlantic and the AC3 measurement campaign in the Arctic.

  18. Conceptual Design of Simulation Models in an Early Development Phase of Lunar Spacecraft Simulator Using SMP2 Standard

    NASA Astrophysics Data System (ADS)

    Lee, Hoon Hee; Koo, Cheol Hea; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok

    2013-08-01

    A conceptual study for a Korean lunar orbiter/lander prototype has been performed at the Korea Aerospace Research Institute (KARI). Across diverse space programs in European countries, a variety of simulation applications have been developed using the SMP2 (Simulation Modelling Platform) standard, which addresses the portability and reuse of simulation models by various model users. KARI has not only first-hand experience in the development of an SMP-compatible simulation environment but also an ongoing study to apply the SMP2 simulation model development process to a simulator development project for lunar missions. KARI has tried to extend the coverage of the development domain based on the SMP2 standard across the whole simulation model life cycle, from software design to its validation, through a lunar exploration project. Figure 1 shows a snapshot from a visualization tool for the simulation of lunar lander motion. In reality, a demonstrator prototype shown on the right-hand side of the image was made and tested in 2012. In an early phase of simulator development, prior to a kick-off in the near future, the targeted hardware to be modelled was investigated and identified at the end of 2012. The architectural breakdown of the lunar simulator at the system level was performed and an architecture with a hierarchical tree of models, from the system down to parts at lower levels, has been established. Finally, SMP documents such as Catalogue, Assembly, Schedule and so on were converted using an XML (eXtensible Mark-up Language) converter. To obtain the benefits of the approaches and design mechanisms suggested in the SMP2 standard as far as possible, object-oriented and component-based design concepts were strictly applied throughout the whole model development process.

  19. Snow Physics and Meltwater Hydrology of the SSiB Model Employed for Climate Simulation Studies with GEOS 2 GCM

    NASA Technical Reports Server (NTRS)

    Mocko, David M.; Sud, Y. C.; Einaudi, Franco (Technical Monitor)

    2000-01-01

Present-day climate models produce large climate drifts that interfere with the climate signals simulated in modelling studies. The simplifying assumptions of the physical parameterization of snow and ice processes lead to large biases in the annual cycles of surface temperature, evapotranspiration, and the water budget, which in turn cause erroneous land-atmosphere interactions. Since land processes are vital for climate prediction, and snow and snowmelt processes have been shown to affect Indian monsoons and North American rainfall and hydrology, special attention is now being given to cold land processes and their influence on the simulated annual cycle in GCMs. The snow model of the SSiB land-surface model being used at Goddard has evolved from a unified single snow-soil layer interacting with a deep soil layer through a force-restore procedure to a two-layer snow model atop a ground layer separated by a snow-ground interface. When the snow cover is deep, force-restore occurs within the snow layers. However, several other simplifying assumptions, such as homogeneous snow cover, an empirical depth-related surface albedo, snowmelt and melt-freeze in the diurnal cycles, and neglect of the latent heat of soil freezing and thawing, still remain as nagging problems. Several important influences of these assumptions will be discussed with the goal of improving them to better simulate the snowmelt and meltwater hydrology. Nevertheless, the current snow model (Mocko and Sud, 2000, submitted) better simulates cold land processes than the original SSiB. This was confirmed against observations of soil moisture, runoff, and snow cover in global GSWP (Sud and Mocko, 1999) and point-scale Valdai simulations over seasonal snow regions. New results from the current SSiB snow model from the 10-year PILPS 2e intercomparison in northern Scandinavia will be presented.

  20. Procedural wound geometry and blood flow generation for medical training simulators

    NASA Astrophysics Data System (ADS)

    Aras, Rifat; Shen, Yuzhong; Li, Jiang

    2012-02-01

    Efficient application of wound treatment procedures is vital in both emergency room and battle zone scenes. In order to train first responders for such situations, physical casualty simulation kits, which are composed of tens of individual items, are commonly used. Similar to any other training scenarios, computer simulations can be effective means for wound treatment training purposes. For immersive and high fidelity virtual reality applications, realistic 3D models are key components. However, creation of such models is a labor intensive process. In this paper, we propose a procedural wound geometry generation technique that parameterizes key simulation inputs to establish the variability of the training scenarios without the need of labor intensive remodeling of the 3D geometry. The procedural techniques described in this work are entirely handled by the graphics processing unit (GPU) to enable interactive real-time operation of the simulation and to relieve the CPU for other computational tasks. The visible human dataset is processed and used as a volumetric texture for the internal visualization of the wound geometry. To further enhance the fidelity of the simulation, we also employ a surface flow model for blood visualization. This model is realized as a dynamic texture that is composed of a height field and a normal map and animated at each simulation step on the GPU. The procedural wound geometry and the blood flow model are applied to a thigh model and the efficiency of the technique is demonstrated in a virtual surgery scene.
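    As an illustration of the height-field-based blood visualization described above, the following NumPy sketch computes a normal map from a height field and applies a crude downhill redistribution step. The actual technique runs per frame in GPU shaders; all function names and parameters here are illustrative assumptions rather than the authors' implementation.

    ```python
    # Minimal CPU sketch (NumPy) of the height-field -> normal-map step described above.
    # The paper implements this per frame on the GPU; names and parameters are illustrative.
    import numpy as np

    def normals_from_height(height, cell_size=1.0):
        """Approximate surface normals of a blood height field via central differences."""
        dz_dx = np.gradient(height, cell_size, axis=1)
        dz_dy = np.gradient(height, cell_size, axis=0)
        # Surface normal of z = h(x, y) is proportional to (-dh/dx, -dh/dy, 1).
        n = np.stack([-dz_dx, -dz_dy, np.ones_like(height)], axis=-1)
        return n / np.linalg.norm(n, axis=-1, keepdims=True)

    def advect_height(height, flow_rate=0.05):
        """Crude smoothing/redistribution step standing in for the surface flow update."""
        padded = np.pad(height, 1, mode='edge')
        neighbour_mean = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                          padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        return height + flow_rate * (neighbour_mean - height)

    if __name__ == "__main__":
        h = np.random.rand(64, 64) * 0.01       # initial thin blood film (arbitrary units)
        for _ in range(10):                     # a few animation steps
            h = advect_height(h)
        print(normals_from_height(h).shape)     # (64, 64, 3) normal map
    ```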

  1. Evaluating the Credibility of Transport Processes in Simulations of Ozone Recovery using the Global Modeling Initiative Three-dimensional Model

    NASA Technical Reports Server (NTRS)

    Strahan, Susan E.; Douglass, Anne R.

    2004-01-01

The Global Modeling Initiative (GMI) has integrated two 36-year simulations of an ozone recovery scenario with an offline chemistry and transport model using two different meteorological inputs. Physically based diagnostics, derived from satellite and aircraft data sets, are described and then used to evaluate the realism of temperature and transport processes in the simulations. Processes evaluated include barrier formation in the subtropics and polar regions, and extratropical wave-driven transport. Some diagnostics are especially relevant to simulation of lower stratospheric ozone, but most are applicable to any stratospheric simulation. The global temperature evaluation, which is relevant to gas phase chemical reactions, showed that both sets of meteorological fields have near climatological values at all latitudes and seasons at 30 hPa and below. Both simulations showed weakness in upper stratospheric wave driving. The simulation using input from a general circulation model (GMI(GCM)) showed a very good residual circulation in the tropics and Northern Hemisphere. The simulation with input from a data assimilation system (GMI(DAS)) performed better in the midlatitudes than it did at high latitudes. Neither simulation forms a realistic barrier at the vortex edge, leading to uncertainty in the fate of ozone-depleted vortex air. Overall, tracer transport in the offline GMI(GCM) has greater fidelity throughout the stratosphere than it does in the GMI(DAS).

  2. Simulation of the impact of refractive surgery ablative laser pulses with a flying-spot laser beam on intrasurgery corneal temperature.

    PubMed

    Shraiki, Mario; Arba-Mosquera, Samuel

    2011-06-01

To evaluate ablation algorithms and temperature changes in laser refractive surgery. The model (virtual laser system [VLS]) simulates the different physical effects of an entire surgical process, reproducing the shot-by-shot ablation based on a modeled beam profile. The model is comprehensive and directly considers the applied correction; corneal geometry, including astigmatism; laser beam characteristics; and ablative spot properties. Pulse lists collected from actual treatments were used to simulate the temperature increase during the ablation process. Ablation efficiency reduction in the periphery resulted in a lower peripheral temperature increase. Steep corneas had smaller temperature increases than flat ones. The maximum rise in temperature depends on the spatial density of the ablation pulses. For the same number of ablative pulses, myopic corrections showed the highest temperature increase, followed by myopic astigmatism, mixed astigmatism, phototherapeutic keratectomy (PTK), hyperopic astigmatism, and hyperopic treatments. The proposed model can be used, at relatively low cost, for calibration, verification, and validation of the laser systems used for ablation processes and would directly improve the quality of the results.

  3. Evaluation on Asian Dust Aerosol and Simulated Processes in CanAM4.2 Using Satellite Measurements and Station Data

    NASA Astrophysics Data System (ADS)

    Yiran, P.; Li, J.; von Salzen, K.; Dai, T.; Liu, D.

    2014-12-01

Mineral dust is a significant contributor to the global and Asian aerosol burden. Currently, large uncertainties still exist in the simulated aerosol processes in global climate models (GCMs), which lead to a diversity in dust mass loading and spatial distribution among GCM projections. In this study, satellite measurements from CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization) and observed aerosol data from Asian stations are compared with modelled aerosol in the Canadian Atmospheric Global Climate Model (CanAM4.2). Both seasonal and annual variations in the Asian dust distribution are investigated. The vertical profile of simulated aerosol in the troposphere is evaluated against CALIOP Level 3 products and locally observed extinction for dust and total aerosols. Physical processes in the GCM, such as horizontal advection, vertical mixing, and dry and wet removal, are analyzed on the basis of the model simulation and available aerosol measurements. This work aims to improve current understanding of Asian dust transport and vertical exchange on a large scale, which may help to increase the accuracy of GCM aerosol simulations.

  4. The Water, Energy, and Biogeochemical Model (WEBMOD): A TOPMODEL application developed within the Modular Modeling System

    NASA Astrophysics Data System (ADS)

    Webb, R. M.; Wolock, D. M.; Linard, J. I.; Wieczorek, M. E.

    2004-12-01

Process-based flow and transport simulation models can help increase understanding of how hydrologic flow paths affect biogeochemical mixing and reactions in watersheds. This presentation describes the Water, Energy, and Biogeochemical Model (WEBMOD), a new model designed to simulate water and chemical transport in both pristine and agricultural watersheds. WEBMOD simulates streamflow using TOPMODEL algorithms and also simulates irrigation, canopy interception, snowpack, and tile-drain flow; these are important processes for successful multi-year simulations of agricultural watersheds. In addition, the hydrologic components of the model are linked to the U.S. Geological Survey's (USGS) geochemical model PHREEQC such that solute chemistry for the hillslopes and streams is also computed. Model development, execution, and calibration take place within the USGS Modular Modeling System. WEBMOD is being validated at ten research watersheds. Five of these watersheds are nearly pristine and comprise the USGS Water, Energy, and Biogeochemical Budget (WEBB) Program field sites: Loch Vale, Colorado; Trout Lake, Wisconsin; Sleepers River, Vermont; Panola Mountain, Georgia; and the Luquillo Experimental Forest, Puerto Rico. The remaining five watersheds contain intensely cultivated fields being studied by the USGS National Water Quality Assessment Program: Merced River, California; Granger Drain, Washington; Maple Creek, Nebraska; Sugar Creek, Indiana; and Morgan Creek, Delaware. Model calibration has improved understanding of observed variations in soil moisture, solute concentrations, and stream discharge at the five WEBB watersheds, and the model is now being set up to simulate processes at the five agricultural watersheds, which are ending their first year of data collection.

  5. Simulation of the Onset of the Southeast Asian Monsoon during 1997 and 1998: The Impact of Surface Processes

    NASA Technical Reports Server (NTRS)

    Wang, Yansen; Tao, W.-K.; Lau, K.-M.; Wetzel, Peter J.

    2004-01-01

The onset of the southeast Asian monsoon during 1997 and 1998 was simulated by coupling a mesoscale atmospheric model (MM5) and a detailed land surface model, PLACE (the Parameterization for Land-Atmosphere-Cloud Exchange). The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The control simulation with the PLACE land surface model and variable sea surface temperature captured the basic signatures of the monsoon onset processes and associated rainfall statistics. Sensitivity tests indicated that simulations were significantly improved by including the PLACE land surface model. The mechanisms by which land surface processes affect moisture transport and convection during the onset of the southeast Asian monsoon were analyzed. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indo-China Peninsula and the northern, cold frontal intrusion from southern China. The surface sensible and latent heat fluxes modified the low-level temperature distribution and gradient, and therefore the low-level wind due to the thermal wind effect. The more realistic forcing of the sensible and latent heat fluxes from the detailed land surface model improved the low-level wind simulation and associated moisture transport and convection.

  6. Simulation of the Onset of the Southeast Asian Monsoon during 1997 and 1998: The Impact of Surface Processes

    NASA Technical Reports Server (NTRS)

    Wang, Yansen; Tao, W.-K.; Lau, K.-M.; Wetzel, Peter J.

    2004-01-01

The onset of the southeast Asian monsoon during 1997 and 1998 was simulated by coupling a mesoscale atmospheric model (MM5) and a detailed land surface model, PLACE (the Parameterization for Land-Atmosphere-Cloud Exchange). The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The control simulation with the PLACE land surface model and variable sea surface temperature captured the basic signatures of the monsoon onset processes and associated rainfall statistics. Sensitivity tests indicated that simulations were significantly improved by including the PLACE land surface model. The mechanisms by which land surface processes affect moisture transport and convection during the onset of the southeast Asian monsoon were analyzed. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indo-China Peninsula and the northern, cold frontal intrusion from southern China. The surface sensible and latent heat fluxes modified the low-level temperature distribution and gradient, and therefore the low-level wind due to the thermal wind effect. The more realistic forcing of the sensible and latent heat fluxes from the detailed land surface model improved the low-level wind simulation and associated moisture transport and convection.

  7. Using field observations to inform thermal hydrology models of permafrost dynamics with ATS (v0.83)

    DOE PAGES

    Atchley, Adam L.; Painter, Scott L.; Harp, Dylan R.; ...

    2015-09-01

Climate change is profoundly transforming the carbon-rich Arctic tundra landscape, potentially moving it from a carbon sink to a carbon source by increasing the thickness of soil that thaws on a seasonal basis. However, the modeling capability and the precise parameterizations of the physical characteristics needed to estimate projected active layer thickness (ALT) remain limited in Earth system models (ESMs). In particular, discrepancies in spatial scale between field measurements and Earth system models challenge the validation and parameterization of hydrothermal models. A recently developed surface–subsurface model for permafrost thermal hydrology, the Advanced Terrestrial Simulator (ATS), is used in combination with field measurements to achieve the goals of constructing a process-rich model based on plausible parameters and of identifying fine-scale controls of ALT in ice-wedge polygon tundra in Barrow, Alaska. An iterative model refinement procedure that cycles between borehole temperature and snow cover measurements and simulations is used to evaluate and parameterize the different model processes necessary to simulate freeze–thaw processes and ALT formation. After model refinement and calibration, reasonable matches between simulated and measured soil temperatures are obtained, with the largest errors occurring during early summer above ice wedges (e.g., troughs). The results suggest that properly constructed and calibrated one-dimensional thermal hydrology models have the potential to provide a reasonable representation of the subsurface thermal response and can be used to infer model input parameters and process representations. The models for soil thermal conductivity and snow distribution were found to be the most sensitive process representations. However, information on lateral flow and snowpack evolution might be needed to constrain model representations of surface hydrology and snow depth.

  8. A MODELING AND SIMULATION LANGUAGE FOR BIOLOGICAL CELLS WITH COUPLED MECHANICAL AND CHEMICAL PROCESSES

    PubMed Central

    Somogyi, Endre; Glazier, James A.

    2017-01-01

    Biological cells are the prototypical example of active matter. Cells sense and respond to mechanical, chemical and electrical environmental stimuli with a range of behaviors, including dynamic changes in morphology and mechanical properties, chemical uptake and secretion, cell differentiation, proliferation, death, and migration. Modeling and simulation of such dynamic phenomena poses a number of computational challenges. A modeling language describing cellular dynamics must naturally represent complex intra and extra-cellular spatial structures and coupled mechanical, chemical and electrical processes. Domain experts will find a modeling language most useful when it is based on concepts, terms and principles native to the problem domain. A compiler must then be able to generate an executable model from this physically motivated description. Finally, an executable model must efficiently calculate the time evolution of such dynamic and inhomogeneous phenomena. We present a spatial hybrid systems modeling language, compiler and mesh-free Lagrangian based simulation engine which will enable domain experts to define models using natural, biologically motivated constructs and to simulate time evolution of coupled cellular, mechanical and chemical processes acting on a time varying number of cells and their environment. PMID:29303160

  9. A MODELING AND SIMULATION LANGUAGE FOR BIOLOGICAL CELLS WITH COUPLED MECHANICAL AND CHEMICAL PROCESSES.

    PubMed

    Somogyi, Endre; Glazier, James A

    2017-04-01

    Biological cells are the prototypical example of active matter. Cells sense and respond to mechanical, chemical and electrical environmental stimuli with a range of behaviors, including dynamic changes in morphology and mechanical properties, chemical uptake and secretion, cell differentiation, proliferation, death, and migration. Modeling and simulation of such dynamic phenomena poses a number of computational challenges. A modeling language describing cellular dynamics must naturally represent complex intra and extra-cellular spatial structures and coupled mechanical, chemical and electrical processes. Domain experts will find a modeling language most useful when it is based on concepts, terms and principles native to the problem domain. A compiler must then be able to generate an executable model from this physically motivated description. Finally, an executable model must efficiently calculate the time evolution of such dynamic and inhomogeneous phenomena. We present a spatial hybrid systems modeling language, compiler and mesh-free Lagrangian based simulation engine which will enable domain experts to define models using natural, biologically motivated constructs and to simulate time evolution of coupled cellular, mechanical and chemical processes acting on a time varying number of cells and their environment.

  10. A microphysical pathway analysis to investigate aerosol effects on convective clouds

    NASA Astrophysics Data System (ADS)

    Heikenfeld, Max; White, Bethan; Labbouz, Laurent; Stier, Philip

    2017-04-01

    The impact of aerosols on ice- and mixed-phase processes in convective clouds remains highly uncertain, which has strong implications for estimates of the role of aerosol-cloud interactions in the climate system. The wide range of interacting microphysical processes are still poorly understood and generally not resolved in global climate models. To understand and visualise these processes and to conduct a detailed pathway analysis, we have added diagnostic output of all individual process rates for number and mass mixing ratios to two commonly-used cloud microphysics schemes (Thompson and Morrison) in WRF. This allows us to investigate the response of individual processes to changes in aerosol conditions and the propagation of perturbations throughout the development of convective clouds. Aerosol effects on cloud microphysics could strongly depend on the representation of these interactions in the model. We use different model complexities with regard to aerosol-cloud interactions ranging from simulations with different levels of fixed cloud droplet number concentration (CDNC) as a proxy for aerosol, to prognostic CDNC with fixed modal aerosol distributions. Furthermore, we have implemented the HAM aerosol model in WRF-chem to also perform simulations with a fully interactive aerosol scheme. We employ a hierarchy of simulation types to understand the evolution of cloud microphysical perturbations in atmospheric convection. Idealised supercell simulations are chosen to present and test the analysis methods for a strongly confined and well-studied case. We then extend the analysis to large case study simulations of tropical convection over the Amazon rainforest. For both cases we apply our analyses to individually tracked convective cells. Our results show the impact of model uncertainties on the understanding of aerosol-convection interactions and have implications for improving process representation in models.
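    The per-process diagnostic bookkeeping described above might be organised along the following lines. This is a toy Python sketch with invented process names and rates, not the actual diagnostics added to the Thompson and Morrison schemes in WRF.

    ```python
    # Illustrative sketch of per-process rate bookkeeping for a microphysical pathway
    # analysis. Process names and the toy rates below are placeholders.
    from collections import defaultdict

    class ProcessRateDiagnostics:
        def __init__(self):
            # accumulated mass tendencies [kg/kg] per (source, sink) pathway
            self.pathways = defaultdict(float)

        def record(self, source, sink, rate, dt):
            """Accumulate a conversion rate (e.g. cloud -> rain autoconversion)."""
            self.pathways[(source, sink)] += rate * dt

        def dominant_pathways(self, n=3):
            """Return the n largest accumulated pathways."""
            return sorted(self.pathways.items(), key=lambda kv: -kv[1])[:n]

    diag = ProcessRateDiagnostics()
    dt = 10.0  # model time step [s]
    diag.record("cloud", "rain", 2.0e-7, dt)    # autoconversion (toy value)
    diag.record("rain", "graupel", 5.0e-8, dt)  # collection/riming (toy value)
    diag.record("vapour", "ice", 1.0e-7, dt)    # deposition (toy value)
    print(diag.dominant_pathways())
    ```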

  11. Parameter Sensitivity and Laboratory Benchmarking of a Biogeochemical Process Model for Enhanced Anaerobic Dechlorination

    NASA Astrophysics Data System (ADS)

    Kouznetsova, I.; Gerhard, J. I.; Mao, X.; Barry, D. A.; Robinson, C.; Brovelli, A.; Harkness, M.; Fisher, A.; Mack, E. E.; Payne, J. A.; Dworatzek, S.; Roberts, J.

    2008-12-01

    A detailed model to simulate trichloroethene (TCE) dechlorination in anaerobic groundwater systems has been developed and implemented through PHAST, a robust and flexible geochemical modeling platform. The approach is comprehensive but retains flexibility such that models of varying complexity can be used to simulate TCE biodegradation in the vicinity of nonaqueous phase liquid (NAPL) source zones. The complete model considers a full suite of biological (e.g., dechlorination, fermentation, sulfate and iron reduction, electron donor competition, toxic inhibition, pH inhibition), physical (e.g., flow and mass transfer) and geochemical processes (e.g., pH modulation, gas formation, mineral interactions). Example simulations with the model demonstrated that the feedback between biological, physical, and geochemical processes is critical. Successful simulation of a thirty-two-month column experiment with site soil, complex groundwater chemistry, and exhibiting both anaerobic dechlorination and endogenous respiration, provided confidence in the modeling approach. A comprehensive suite of batch simulations was then conducted to estimate the sensitivity of predicted TCE degradation to the 36 model input parameters. A local sensitivity analysis was first employed to rank the importance of parameters, revealing that 5 parameters consistently dominated model predictions across a range of performance metrics. A global sensitivity analysis was then performed to evaluate the influence of a variety of full parameter data sets available in the literature. The modeling study was performed as part of the SABRE (Source Area BioREmediation) project, a public/private consortium whose charter is to determine if enhanced anaerobic bioremediation can result in effective and quantifiable treatment of chlorinated solvent DNAPL source areas. The modelling conducted has provided valuable insight into the complex interactions between processes in the evolving biogeochemical systems, particularly at the laboratory scale.
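    A minimal sketch of the kind of one-at-a-time local sensitivity ranking described above, assuming a hypothetical scalar performance metric in place of a full PHAST batch simulation; the parameter names and values are invented for illustration.

    ```python
    # One-at-a-time local sensitivity sketch of the kind used to rank model parameters.
    # `degradation_model` is a stand-in for the batch biogeochemical simulation.
    import numpy as np

    def degradation_model(params):
        """Toy performance metric, e.g. fraction of TCE degraded (placeholder)."""
        k_dechlor, k_ferment, biomass0 = params
        return 1.0 - np.exp(-k_dechlor * biomass0 / (1.0 + 1.0 / k_ferment))

    def local_sensitivity(model, base_params, rel_step=0.05):
        """Normalised sensitivity S_i = (dY/Y) / (dp_i/p_i) by central differences."""
        base = np.asarray(base_params, dtype=float)
        y0 = model(base)
        sens = []
        for i, p in enumerate(base):
            up, down = base.copy(), base.copy()
            up[i] = p * (1 + rel_step)
            down[i] = p * (1 - rel_step)
            dy = model(up) - model(down)
            sens.append((dy / y0) / (2 * rel_step))
        return np.array(sens)

    params = [0.8, 0.3, 0.5]   # hypothetical rate constants and initial biomass
    print(local_sensitivity(degradation_model, params))
    ```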

  12. A Mathematical Model for the Middle Ear Ventilation

    NASA Astrophysics Data System (ADS)

    Molnárka, G.; Miletics, E. M.; Fücsek, M.

    2008-09-01

Otitis media is one of the most common illnesses in children; therefore, investigation of human middle ear ventilation is a relevant problem. Earlier investigations, both experimental and theoretical, can be found in [1]-[3]. Here we give a new mathematical and computer model to simulate this ventilation process. This model is able to describe the diffusion and flow processes simultaneously, and therefore it gives more precise results than earlier models did. The article contains the mathematical model and some results of the simulation.

  13. Viscous and thermal modelling of thermoplastic composites forming process

    NASA Astrophysics Data System (ADS)

    Guzman, Eduardo; Liang, Biao; Hamila, Nahiene; Boisse, Philippe

    2016-10-01

Thermoforming thermoplastic prepregs is a fast manufacturing process. It is suitable for the manufacture of automotive composite parts. The simulation of thermoplastic prepreg forming is achieved by alternating thermal and mechanical analyses. The thermal properties are obtained from a mesoscopic analysis and a homogenization procedure. The forming simulation is based on a viscous-hyperelastic approach. The thermal simulations define the temperature-dependent coefficients of the mechanical model. The forming simulations modify the boundary conditions and the internal geometry of the thermal analyses. The comparison of the simulation with an experimental thermoforming of a part representative of automotive applications shows the efficiency of the approach.

  14. Experiences in teaching of modeling and simulation with emphasize on equation-based and acausal modeling techniques.

    PubMed

Kulhánek, Tomáš; Ježek, Filip; Mateják, Marek; Šilar, Jan; Kofránek, Jiří

    2015-08-01

This work introduces experiences of teaching modeling and simulation to graduate students in the field of biomedical engineering. We emphasize the acausal, object-oriented modeling technique and have moved from teaching the block-oriented tool MATLAB Simulink to the acausal, object-oriented Modelica language, which can express the structure of the system rather than a process of computation. However, a block-oriented approach is also possible in Modelica, and students have a tendency to express the process of computation. Using exemplary acausal domains and this approach allows students to understand the modeled problems much more deeply. The causality of the computation is derived automatically by the simulation tool.

  15. A parallel implementation of an off-lattice individual-based model of multicellular populations

    NASA Astrophysics Data System (ADS)

    Harvey, Daniel G.; Fletcher, Alexander G.; Osborne, James M.; Pitt-Francis, Joe

    2015-07-01

    As computational models of multicellular populations include ever more detailed descriptions of biophysical and biochemical processes, the computational cost of simulating such models limits their ability to generate novel scientific hypotheses and testable predictions. While developments in microchip technology continue to increase the power of individual processors, parallel computing offers an immediate increase in available processing power. To make full use of parallel computing technology, it is necessary to develop specialised algorithms. To this end, we present a parallel algorithm for a class of off-lattice individual-based models of multicellular populations. The algorithm divides the spatial domain between computing processes and comprises communication routines that ensure the model is correctly simulated on multiple processors. The parallel algorithm is shown to accurately reproduce the results of a deterministic simulation performed using a pre-existing serial implementation. We test the scaling of computation time, memory use and load balancing as more processes are used to simulate a cell population of fixed size. We find approximate linear scaling of both speed-up and memory consumption on up to 32 processor cores. Dynamic load balancing is shown to provide speed-up for non-regular spatial distributions of cells in the case of a growing population.
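    The strip decomposition and halo (ghost-cell) identification underlying such a parallel algorithm can be illustrated as follows. This NumPy sketch only shows the bookkeeping on a single process; the published implementation performs the actual exchange with message passing, and all names and values here are assumptions.

    ```python
    # Sketch of a spatial (strip) decomposition: each process owns a slab of the domain
    # and must exchange "halo" cells lying within one interaction radius of its boundary.
    import numpy as np

    def partition_cells(positions, domain_width, n_procs, interaction_radius):
        """Assign cells to processes by x-coordinate and list halo cells per process."""
        strip = domain_width / n_procs
        owner = np.minimum((positions[:, 0] // strip).astype(int), n_procs - 1)
        owned, halo = {}, {}
        for p in range(n_procs):
            lo, hi = p * strip, (p + 1) * strip
            owned[p] = np.where(owner == p)[0]
            # cells owned by neighbours but close enough to interact across the boundary
            near = (np.abs(positions[:, 0] - lo) < interaction_radius) | \
                   (np.abs(positions[:, 0] - hi) < interaction_radius)
            halo[p] = np.where(near & (owner != p))[0]
        return owned, halo

    rng = np.random.default_rng(0)
    cells = rng.uniform(0.0, 100.0, size=(1000, 2))   # 1000 cells in a 100x100 domain
    owned, halo = partition_cells(cells, 100.0, n_procs=4, interaction_radius=1.5)
    print([len(owned[p]) for p in range(4)], [len(halo[p]) for p in range(4)])
    ```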

  16. A model of nitrous oxide evolution from soil driven by rainfall events. I - Model structure and sensitivity. II - Model applications

    NASA Technical Reports Server (NTRS)

    Changsheng, LI; Frolking, Steve; Frolking, Tod A.

    1992-01-01

    Simulations of N2O and CO2 emissions from soils were conducted with a rain-event driven, process-oriented model (DNDC) of nitrogen and carbon cycling processes in soils. The magnitude and trends of simulated N2O (or N2O + N2) and CO2 emissions were consistent with the results obtained in field experiments. The successful simulation of these emissions from the range of soil types examined demonstrates that the DNDC will be a useful tool for the study of linkages among climate, soil-atmosphere interactions, land use, and trace gas fluxes.

  17. Computer simulation of the metastatic progression.

    PubMed

    Wedemann, Gero; Bethge, Anja; Haustein, Volker; Schumacher, Udo

    2014-01-01

    A novel computer model based on a discrete event simulation procedure describes quantitatively the processes underlying the metastatic cascade. Analytical functions describe the size of the primary tumor and the metastases, while a rate function models the intravasation events of the primary tumor and metastases. Events describe the behavior of the malignant cells until the formation of new metastases. The results of the computer simulations are in quantitative agreement with clinical data determined from a patient with hepatocellular carcinoma in the liver. The model provides a more detailed view on the process than a conventional mathematical model. In particular, the implications of interventions on metastasis formation can be calculated.
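    A toy discrete-event sketch in the spirit of the model described above. The Gompertz growth law, the intravasation coefficient and the colonisation probability are illustrative assumptions, not the parameters fitted to the clinical data in the paper.

    ```python
    # Minimal discrete-event sketch of a metastatic cascade: tumours grow according to an
    # analytical law, shed cells into the bloodstream at a size-dependent rate, and a small
    # fraction of escapes found new metastases. All parameter values are placeholders.
    import heapq, math, random

    def gompertz(t, n0=1.0, n_max=1e11, alpha=0.006):
        """Tumour cell number at time t (days) under Gompertz growth."""
        return n_max * (n0 / n_max) ** math.exp(-alpha * t)

    def simulate(t_end=2000.0, intravasation_coeff=1e-9, seed_prob=1e-3, seed=1):
        random.seed(seed)
        events = [(0.0, "check", 0.0)]          # (time, kind, tumour_birth_time)
        tumours = [0.0]                         # birth times of primary + metastases
        while events:
            t, kind, birth = heapq.heappop(events)
            if t > t_end:
                break
            # rate of cells entering the bloodstream ~ coeff * tumour size
            rate = intravasation_coeff * gompertz(t - birth)
            if random.random() < 1.0 - math.exp(-rate):       # at least one escape today
                if random.random() < seed_prob:               # successful colonisation
                    tumours.append(t)
                    heapq.heappush(events, (t + 1.0, "check", t))
            heapq.heappush(events, (t + 1.0, "check", birth))  # daily check for this tumour
        return len(tumours) - 1                 # number of metastases formed

    print("metastases after ~5.5 years:", simulate())
    ```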

  18. Conceptual modeling for Prospective Health Technology Assessment.

    PubMed

    Gantner-Bär, Marion; Djanatliev, Anatoli; Prokosch, Hans-Ulrich; Sedlmayr, Martin

    2012-01-01

Prospective Health Technology Assessment (ProHTA) is a new and innovative approach to analyze and assess new technologies, methods, and procedures in health care. Simulation is used to model innovations before the cost-intensive design and development phase. Thus, effects on patient care, the health care system, and health economics can be estimated. A valid information base is necessary to generate simulation models, and conceptual modeling is well suited to provide it. Methods and characteristics of simulation modeling improved specifically for the project are combined in the ProHTA Conceptual Modeling Process and initially implemented for acute ischemic stroke treatment in Germany. The project also aims to simulate other diseases and health care systems. ProHTA is an interdisciplinary research project within the Cluster of Excellence for Medical Technology - Medical Valley European Metropolitan Region Nuremberg (EMN), which is funded by the German Federal Ministry of Education and Research (BMBF), project grant No. 01EX1013B.

  19. A Management Information System Model for Program Management. Ph.D. Thesis - Oklahoma State Univ.; [Computerized Systems Analysis

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1972-01-01

The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions, which are adjusted to simulate the information flow being studied.
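    A compact sketch of the kind of message-queue discrete-event logic described above, written in Python rather than GPSS; the distributions, single-station structure and parameter values are illustrative assumptions, not those of the original model.

    ```python
    # Toy single-station message-flow simulation collecting queue and utilisation statistics.
    import random, statistics

    def simulate(n_messages=500, mean_interarrival=5.0, mean_service=4.0, seed=42):
        """FIFO processing station fed by exponentially distributed message arrivals."""
        random.seed(seed)
        arrival, server_free_at, busy = 0.0, 0.0, 0.0
        waits = []
        for _ in range(n_messages):
            arrival += random.expovariate(1.0 / mean_interarrival)  # next message arrives
            start = max(arrival, server_free_at)                    # wait if station busy
            waits.append(start - arrival)                           # queueing delay
            service = random.expovariate(1.0 / mean_service)        # processing duration
            busy += service
            server_free_at = start + service
        print(f"mean wait: {statistics.mean(waits):.2f}, "
              f"utilisation: {busy / server_free_at:.2f}")

    simulate()
    ```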

  20. Modelling of Sub-daily Hydrological Processes Using Daily Time-Step Models: A Distribution Function Approach to Temporal Scaling

    NASA Astrophysics Data System (ADS)

    Kandel, D. D.; Western, A. W.; Grayson, R. B.

    2004-12-01

    Mismatches in scale between the fundamental processes, the model and supporting data are a major limitation in hydrologic modelling. Surface runoff generation via infiltration excess and the process of soil erosion are fundamentally short time-scale phenomena and their average behaviour is mostly determined by the short time-scale peak intensities of rainfall. Ideally, these processes should be simulated using time-steps of the order of minutes to appropriately resolve the effect of rainfall intensity variations. However, sub-daily data support is often inadequate and the processes are usually simulated by calibrating daily (or even coarser) time-step models. Generally process descriptions are not modified but rather effective parameter values are used to account for the effect of temporal lumping, assuming that the effect of the scale mismatch can be counterbalanced by tuning the parameter values at the model time-step of interest. Often this results in parameter values that are difficult to interpret physically. A similar approach is often taken spatially. This is problematic as these processes generally operate or interact non-linearly. This indicates a need for better techniques to simulate sub-daily processes using daily time-step models while still using widely available daily information. A new method applicable to many rainfall-runoff-erosion models is presented. The method is based on temporal scaling using statistical distributions of rainfall intensity to represent sub-daily intensity variations in a daily time-step model. This allows the effect of short time-scale nonlinear processes to be captured while modelling at a daily time-step, which is often attractive due to the wide availability of daily forcing data. The approach relies on characterising the rainfall intensity variation within a day using a cumulative distribution function (cdf). This cdf is then modified by various linear and nonlinear processes typically represented in hydrological and erosion models. The statistical description of sub-daily variability is thus propagated through the model, allowing the effects of variability to be captured in the simulations. This results in cdfs of various fluxes, the integration of which over a day gives respective daily totals. Using 42-plot-years of surface runoff and soil erosion data from field studies in different environments from Australia and Nepal, simulation results from this cdf approach are compared with the sub-hourly (2-minute for Nepal and 6-minute for Australia) and daily models having similar process descriptions. Significant improvements in the simulation of surface runoff and erosion are achieved, compared with a daily model that uses average daily rainfall intensities. The cdf model compares well with a sub-hourly time-step model. This suggests that the approach captures the important effects of sub-daily variability while utilizing commonly available daily information. It is also found that the model parameters are more robustly defined using the cdf approach compared with the effective values obtained at the daily scale. This suggests that the cdf approach may offer improved model transferability spatially (to other areas) and temporally (to other periods).
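    The core idea, propagating a within-day intensity distribution through a nonlinear runoff process instead of using the mean daily intensity, can be illustrated with the following sketch. The exponential intensity distribution and constant infiltration capacity are simplifying assumptions, not the paper's formulation.

    ```python
    # Distribution-function (cdf) temporal scaling vs. a temporally lumped daily model
    # for infiltration-excess runoff.
    import numpy as np

    def daily_runoff_cdf(daily_rain_mm, rain_hours, infil_capacity_mm_h, n=4000):
        """Integrate infiltration-excess runoff over an exponential intensity distribution."""
        mu = daily_rain_mm / rain_hours                    # mean intensity while raining [mm/h]
        i = np.linspace(0.0, 20.0 * mu, n)                 # intensity grid
        pdf = np.exp(-i / mu) / mu                         # exponential intensity pdf
        excess = np.maximum(i - infil_capacity_mm_h, 0.0)  # runoff-producing intensity
        return rain_hours * np.sum(excess * pdf) * (i[1] - i[0])   # mm/day of runoff

    def daily_runoff_mean(daily_rain_mm, rain_hours, infil_capacity_mm_h):
        """Same process driven by the mean daily intensity (temporally lumped)."""
        mu = daily_rain_mm / rain_hours
        return rain_hours * max(mu - infil_capacity_mm_h, 0.0)

    # 30 mm falling over 6 h against an 8 mm/h infiltration capacity:
    print(daily_runoff_cdf(30.0, 6.0, 8.0))    # ~6 mm: intensity peaks exceed capacity
    print(daily_runoff_mean(30.0, 6.0, 8.0))   # 0 mm: the 5 mm/h mean intensity never does
    ```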

  1. Simulating aerial gravitropism and posture control in plants: what has been done, what is missing

    NASA Astrophysics Data System (ADS)

    Coutand, Catherine; Pot, Guillaume; Bastien, R.; Badel, Eric; Moulia, Bruno

The gravitropic response requires a process of perception of the signal and a motor process to actuate the movements. Different models have been developed; some focus on the perception process and some on the motor process. The kinematics of the gravitropic response will first be detailed to set out the phenomenology of gravi- and auto-tropism. A model of perception (the AC model) will then be presented to demonstrate that sensing inclination is not sufficient to control the gravitropic movement, and that proprioception is also involved. Then, “motor models” will be reviewed. In herbaceous plants, differential growth is the main motor. Modelling tropic movements by simulating elongation raises some difficulties that will be explained. In woody structures the main motor process is the differentiation of reaction wood via cambial growth. We will first present the simplest biomechanical model developed to simulate gravitropism, and its limits will be pointed out. Then a more sophisticated model (TWIG) will be presented, with a special focus on the importance of wood viscoelasticity, the wood maturation process, and its regulation by a mechanosensing process. The presentation will end with a balance sheet of what has been done and what is missing for a complete modelling of gravitropism, and will present the first results of an ongoing project dedicated to obtaining the data required to include phototropism in the current models.

  2. MoSeS: Modelling and Simulation for e-Social Science.

    PubMed

    Townend, Paul; Xu, Jie; Birkin, Mark; Turner, Andy; Wu, Belinda

    2009-07-13

    MoSeS (Modelling and Simulation for e-Social Science) is a research node of the National Centre for e-Social Science. MoSeS uses e-Science techniques to execute an events-driven model that simulates discrete demographic processes; this allows us to project the UK population 25 years into the future. This paper describes the architecture, simulation methodology and latest results obtained by MoSeS.

  3. Simulations of Technology-Induced and Crisis-Led Stochastic and Chaotic Fluctuations in Higher Education Processes: A Model and a Case Study for Performance and Expected Employment

    ERIC Educational Resources Information Center

    Ahmet, Kara

    2015-01-01

    This paper presents a simple model of the provision of higher educational services that considers and exemplifies nonlinear, stochastic, and potentially chaotic processes. I use the methods of system dynamics to simulate these processes in the context of a particular sociologically interesting case, namely that of the Turkish higher education…

  4. Resolution-Adapted All-Atomic and Coarse-Grained Model for Biomolecular Simulations.

    PubMed

    Shen, Lin; Hu, Hao

    2014-06-10

    We develop here an adaptive multiresolution method for the simulation of complex heterogeneous systems such as the protein molecules. The target molecular system is described with the atomistic structure while maintaining concurrently a mapping to the coarse-grained models. The theoretical model, or force field, used to describe the interactions between two sites is automatically adjusted in the simulation processes according to the interaction distance/strength. Therefore, all-atomic, coarse-grained, or mixed all-atomic and coarse-grained models would be used together to describe the interactions between a group of atoms and its surroundings. Because the choice of theory is made on the force field level while the sampling is always carried out in the atomic space, the new adaptive method preserves naturally the atomic structure and thermodynamic properties of the entire system throughout the simulation processes. The new method will be very useful in many biomolecular simulations where atomistic details are critically needed.
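    A toy sketch of distance-based resolution switching in the spirit of the approach described above; the pair potentials, cutoff and parameters are invented for illustration and are not those of the published method.

    ```python
    # Interactions between two atom groups are evaluated atomistically when the groups are
    # close and with a single coarse-grained bead potential when they are far apart.
    import numpy as np

    def lj(r, eps=0.2, sigma=3.4):
        """Lennard-Jones pair energy (toy parameters)."""
        return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

    def group_interaction(atoms_a, atoms_b, switch_dist=12.0):
        """Energy between two atom groups with resolution chosen by centroid distance."""
        com_a, com_b = atoms_a.mean(axis=0), atoms_b.mean(axis=0)
        d = np.linalg.norm(com_a - com_b)
        if d < switch_dist:
            # all-atom resolution: sum over atom pairs
            diff = atoms_a[:, None, :] - atoms_b[None, :, :]
            return lj(np.linalg.norm(diff, axis=-1)).sum(), "all-atom"
        # coarse-grained resolution: one bead-bead term between centroids
        return lj(d, eps=1.0, sigma=6.0), "coarse-grained"

    rng = np.random.default_rng(3)
    a = rng.normal([0.0, 0.0, 0.0], 1.0, size=(10, 3))
    b = rng.normal([8.0, 0.0, 0.0], 1.0, size=(10, 3))
    print(group_interaction(a, b))          # close groups -> atomistic evaluation
    print(group_interaction(a, b + 20.0))   # distant groups -> coarse-grained bead
    ```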

  5. MOtoNMS: A MATLAB toolbox to process motion data for neuromusculoskeletal modeling and simulation.

    PubMed

    Mantoan, Alice; Pizzolato, Claudio; Sartori, Massimo; Sawacha, Zimi; Cobelli, Claudio; Reggiani, Monica

    2015-01-01

Neuromusculoskeletal modeling and simulation enable investigation of the neuromusculoskeletal system and its role in human movement dynamics. These methods are progressively being introduced into daily clinical practice. However, a major factor limiting this translation is the lack of robust tools for the pre-processing of experimental movement data for use in neuromusculoskeletal modeling software. This paper presents MOtoNMS (matlab MOtion data elaboration TOolbox for NeuroMusculoSkeletal applications), a toolbox freely available to the community that aims to fill this gap. MOtoNMS processes experimental data from different motion analysis devices and generates input data for neuromusculoskeletal modeling and simulation software, such as OpenSim and CEINMS (Calibrated EMG-Informed NMS Modelling Toolbox). MOtoNMS implements commonly required processing steps, and its generic architecture simplifies the integration of new user-defined processing components. MOtoNMS allows users to set up their laboratory configurations and processing procedures through user-friendly graphical interfaces, without requiring advanced computer skills. Finally, configuration choices can be stored, enabling full reproduction of the processing steps. MOtoNMS is released under the GNU General Public License and is available at the SimTK website and from the GitHub repository. Motion data collected at four institutions demonstrate that, despite differences in laboratory instrumentation and procedures, MOtoNMS succeeds in processing data and producing consistent inputs for OpenSim and CEINMS. MOtoNMS fills the gap between motion analysis and neuromusculoskeletal modeling and simulation. Its support for several devices, complete implementation of the pre-processing procedures, simple extensibility, available user interfaces, and free availability can boost the translation of neuromusculoskeletal methods into daily clinical practice.

  6. Hydro turbine governor’s power control of hydroelectric unit with sloping ceiling tailrace tunnel

    NASA Astrophysics Data System (ADS)

    Fu, Liang; Wu, Changli; Tang, Weiping

    2018-02-01

The primary frequency regulation and load regulation transient processes of a hydropower unit with a sloping ceiling tailrace tunnel, with the hydro turbine governor under power control mode, are analysed by field test and numerical simulation in this paper. A simulation method based on a “three-zone model” is proposed to simulate the small-fluctuation transient process of the sloping ceiling tailrace tunnel. The simulation model of the hydraulic turbine governor power mode is established through identification of the governor’s PLC program and parameter measurement, and the simulation model is verified by the test. A slow-fast-slow “three-stage regulation” method that can improve the dynamic quality of the hydro turbine governor power mode is proposed. The power regulation strategy and parameters are optimized by numerical simulation, and the performance of the primary frequency regulation and load regulation transient processes under power mode is improved significantly.

  7. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  8. Modelling surface water-groundwater interaction with a conceptual approach: model development and application in New Zealand

    NASA Astrophysics Data System (ADS)

    Yang, J.; Zammit, C.; McMillan, H. K.

    2016-12-01

As in most countries worldwide, water management in lowland areas is a major concern for New Zealand because of the economic importance of water-related human activities. As a result, the estimation of available water resources in these areas (e.g., for irrigation and water supply purposes) is crucial and often requires an understanding of complex hydrological processes, which are often characterized by strong interactions between surface water and groundwater (usually expressed as losing and gaining rivers). These processes are often represented and simulated using integrated, physically based hydrological models. However, models with physically based groundwater modules typically require large amounts of geologic and aquifer information that are not readily available, and they are computationally intensive. Instead, this paper presents a conceptual groundwater model that is fully integrated into New Zealand's national hydrological model TopNet, which is based on TopModel concepts (Beven, 1992). Within this conceptual framework, the integrated model can simulate not only surface processes but also groundwater processes and surface water-groundwater interaction processes (including groundwater flow, river-groundwater interaction, and groundwater interaction with external watersheds). The developed model was applied to two New Zealand catchments with different hydro-geological and climate characteristics (the Pareora catchment in the Canterbury Plains and the Grey catchment on the West Coast). Previous studies have documented strong interactions between the river and groundwater, based on the analysis of a large number of concurrent flow measurements and associated information along the river main stem. Application of the integrated hydrological model indicates that flow simulation during low-flow conditions is significantly improved (compared to the original hydrological model conceptualisation) and that further insights into local river dynamics are gained. Due to its conceptual characteristics and low data requirements, the integrated model could be used at local and national scales to improve the simulation of hydrological processes in non-topographically driven areas (where groundwater processes are important), and to assess the impact of climate change on the integrated hydrological cycle in these areas.
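    A conceptual sketch of a lumped groundwater store exchanging water with a river reach (gaining when storage is high, losing when it is low). The linear-reservoir and exchange formulations below are illustrative assumptions, not the equations actually added to TopNet.

    ```python
    # Toy daily water balance: a groundwater store drains to the river (baseflow) and
    # additionally gains or loses water depending on its level relative to equilibrium.
    def gw_step(storage, recharge, runoff_in, k_gw=0.02, k_exch=0.05, s_eq=50.0):
        """One daily step: returns updated storage [mm] and streamflow [mm/day]."""
        baseflow = k_gw * storage                    # linear-reservoir drainage to the river
        exchange = k_exch * (storage - s_eq)         # + : gaining reach, - : losing reach
        storage = max(storage + recharge - baseflow - exchange, 0.0)
        streamflow = max(runoff_in + baseflow + exchange, 0.0)
        return storage, streamflow

    storage = 60.0                                   # initial groundwater storage [mm]
    forcing = [(5, 2), (0, 1), (0, 1), (10, 4), (0, 1), (0, 1)]   # (recharge, runoff) mm/day
    for day, (recharge, runoff) in enumerate(forcing):
        storage, flow = gw_step(storage, recharge, runoff)
        print(f"day {day}: storage {storage:.1f} mm, streamflow {flow:.2f} mm/day")
    ```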

  9. Simulating mixed-phase Arctic stratus clouds: sensitivity to ice initiation mechanisms

    NASA Astrophysics Data System (ADS)

    Sednev, I.; Menon, S.; McFarquhar, G.

    2008-06-01

    The importance of Arctic mixed-phase clouds on radiation and the Arctic climate is well known. However, the development of mixed-phase cloud parameterization for use in large scale models is limited by lack of both related observations and numerical studies using multidimensional models with advanced microphysics that provide the basis for understanding the relative importance of different microphysical processes that take place in mixed-phase clouds. To improve the representation of mixed-phase cloud processes in the GISS GCM we use the GISS single-column model coupled to a bin resolved microphysics (BRM) scheme that was specially designed to simulate mixed-phase clouds and aerosol-cloud interactions. Using this model with the microphysical measurements obtained from the DOE ARM Mixed-Phase Arctic Cloud Experiment (MPACE) campaign in October 2004 at the North Slope of Alaska, we investigate the effect of ice initiation processes and Bergeron-Findeisen process (BFP) on glaciation time and longevity of single-layer stratiform mixed-phase clouds. We focus on observations taken during 9th-10th October, which indicated the presence of a single-layer mixed-phase clouds. We performed several sets of 12-h simulations to examine model sensitivity to different ice initiation mechanisms and evaluate model output (hydrometeors' concentrations, contents, effective radii, precipitation fluxes, and radar reflectivity) against measurements from the MPACE Intensive Observing Period. Overall, the model qualitatively simulates ice crystal concentration and hydrometeors content, but it fails to predict quantitatively the effective radii of ice particles and their vertical profiles. In particular, the ice effective radii are overestimated by at least 50%. However, using the same definition as used for observations, the effective radii simulated and that observed were more comparable. We find that for the single-layer stratiform mixed-phase clouds simulated, process of ice phase initiation due to freezing of supercooled water in both saturated and undersaturated (w.r.t. water) environments is as important as primary ice crystal origination from water vapor. We also find that the BFP is a process mainly responsible for the rates of glaciation of simulated clouds. These glaciation rates cannot be adequately represented by a water-ice saturation adjustment scheme that only depends on temperature and liquid and solid hydrometeors' contents as is widely used in bulk microphysics schemes and are better represented by processes that also account for supersaturation changes as the hydrometeors grow.

  10. Simulating mixed-phase Arctic stratus clouds: sensitivity to ice initiation mechanisms

    NASA Astrophysics Data System (ADS)

    Sednev, I.; Menon, S.; McFarquhar, G.

    2009-07-01

    The importance of Arctic mixed-phase clouds on radiation and the Arctic climate is well known. However, the development of mixed-phase cloud parameterization for use in large scale models is limited by lack of both related observations and numerical studies using multidimensional models with advanced microphysics that provide the basis for understanding the relative importance of different microphysical processes that take place in mixed-phase clouds. To improve the representation of mixed-phase cloud processes in the GISS GCM we use the GISS single-column model coupled to a bin resolved microphysics (BRM) scheme that was specially designed to simulate mixed-phase clouds and aerosol-cloud interactions. Using this model with the microphysical measurements obtained from the DOE ARM Mixed-Phase Arctic Cloud Experiment (MPACE) campaign in October 2004 at the North Slope of Alaska, we investigate the effect of ice initiation processes and Bergeron-Findeisen process (BFP) on glaciation time and longevity of single-layer stratiform mixed-phase clouds. We focus on observations taken during 9-10 October, which indicated the presence of a single-layer mixed-phase clouds. We performed several sets of 12-h simulations to examine model sensitivity to different ice initiation mechanisms and evaluate model output (hydrometeors' concentrations, contents, effective radii, precipitation fluxes, and radar reflectivity) against measurements from the MPACE Intensive Observing Period. Overall, the model qualitatively simulates ice crystal concentration and hydrometeors content, but it fails to predict quantitatively the effective radii of ice particles and their vertical profiles. In particular, the ice effective radii are overestimated by at least 50%. However, using the same definition as used for observations, the effective radii simulated and that observed were more comparable. We find that for the single-layer stratiform mixed-phase clouds simulated, process of ice phase initiation due to freezing of supercooled water in both saturated and subsaturated (w.r.t. water) environments is as important as primary ice crystal origination from water vapor. We also find that the BFP is a process mainly responsible for the rates of glaciation of simulated clouds. These glaciation rates cannot be adequately represented by a water-ice saturation adjustment scheme that only depends on temperature and liquid and solid hydrometeors' contents as is widely used in bulk microphysics schemes and are better represented by processes that also account for supersaturation changes as the hydrometeors grow.

  11. Consolidation modelling for thermoplastic composites forming simulation

    NASA Astrophysics Data System (ADS)

    Xiong, H.; Rusanov, A.; Hamila, N.; Boisse, P.

    2016-10-01

Pre-impregnated thermoplastic composites are widely used in the aerospace industry for their excellent mechanical properties. Thermoforming thermoplastic prepregs is a fast manufacturing process, and the automotive industry has shown increasing interest in this manufacturing process, in which reconsolidation is an essential stage. The intimate contact model is investigated as the consolidation model; compression experiments were carried out to identify the material parameters, and several numerical tests show the influence of the temperature and pressure applied during processing. Finally, a new solid-shell prismatic element has been presented for the simulation of the consolidation step in the thermoplastic composite forming process.

  12. Process Simulation of Aluminium Sheet Metal Deep Drawing at Elevated Temperatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winklhofer, Johannes; Trattnig, Gernot; Lind, Christoph

Lightweight design is essential for an economic and environmentally friendly vehicle. Aluminium sheet metal is well known for its ability to improve the strength to weight ratio of lightweight structures. One disadvantage of aluminium is that it is less formable than steel. Therefore complex part geometries can only be realized by expensive multi-step production processes. One method for overcoming this disadvantage is deep drawing at elevated temperatures. In this way the formability of aluminium sheet metal can be improved significantly, and the number of necessary production steps can thereby be reduced. This paper introduces deep drawing of aluminium sheet metal at elevated temperatures, a corresponding simulation method, a characteristic process and its optimization. The temperature and strain rate dependent material properties of a 5xxx series alloy and their modelling are discussed. A three dimensional thermomechanically coupled finite element deep drawing simulation model and its validation are presented. Based on the validated simulation model an optimised process strategy regarding formability, time and cost is introduced.

  13. Numerical simulation of heat transfer and phase change during freezing of potatoes with different shapes at the presence or absence of ultrasound irradiation

    NASA Astrophysics Data System (ADS)

    Kiani, Hossein; Sun, Da-Wen

    2018-03-01

As novel processes such as ultrasound-assisted heat transfer emerge, new models and simulations are needed to describe them. In this paper, a numerical model was developed to study the freezing process of potatoes. Different thermal conductivity models were investigated, and the effect of sonication on convective heat transfer in a fluid-to-particle heat transfer system was evaluated. Potato spheres and sticks were the geometries investigated, and the effect of different processing parameters on the results was studied. The numerical model successfully predicted the ultrasound-assisted freezing of various shapes in comparison with experimental data. The model was sensitive to variations in processing parameters (sound intensity, duty cycle, shape, etc.) and could accurately simulate the freezing process. Among the thermal conductivity correlations studied, the de Vries and Maxwell models gave closer estimates. The maximum temperature difference was obtained for the series equation, which underestimated the thermal conductivity. Both numerical and experimental data confirmed that an optimum combination of intensity and duty cycle is needed to reduce the freezing time, since increasing the intensity increased the heat transfer rate and the acoustic heating rate simultaneously, and these act against each other.
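    The simpler thermal conductivity mixture models compared in the paper can be written down compactly, as in the sketch below for a two-phase ice/unfrozen-matrix material; the property values are placeholders, and the de Vries model, which requires particle shape factors, is omitted.

    ```python
    # Series, parallel and Maxwell-Eucken effective thermal conductivity models for a
    # two-phase material (toy property values).
    def series(v, k):            # layers perpendicular to the heat flow (lower bound)
        return 1.0 / sum(vi / ki for vi, ki in zip(v, k))

    def parallel(v, k):          # layers parallel to the heat flow (upper bound)
        return sum(vi * ki for vi, ki in zip(v, k))

    def maxwell_eucken(k_cont, k_disp, v_disp):
        """Dispersed spheres (e.g. ice) in a continuous matrix."""
        num = 2 * k_cont + k_disp - 2 * v_disp * (k_cont - k_disp)
        den = 2 * k_cont + k_disp + v_disp * (k_cont - k_disp)
        return k_cont * num / den

    # Partially frozen tissue: 60% ice (2.2 W/m K) in a 40% unfrozen matrix (0.5 W/m K)
    v, k = [0.6, 0.4], [2.2, 0.5]
    print(series(v, k), parallel(v, k), maxwell_eucken(0.5, 2.2, 0.6))
    ```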

  14. AmapSim: a structural whole-plant simulator based on botanical knowledge and designed to host external functional models.

    PubMed

    Barczi, Jean-François; Rey, Hervé; Caraglio, Yves; de Reffye, Philippe; Barthélémy, Daniel; Dong, Qiao Xue; Fourcaud, Thierry

    2008-05-01

    AmapSim is a tool that implements a structural plant growth model based on a botanical theory and simulates plant morphogenesis to produce accurate, complex and detailed plant architectures. This software is the result of more than a decade of research and development devoted to plant architecture. New advances in the software development have yielded plug-in external functions that open up the simulator to functional processes. The simulation of plant topology is based on the growth of a set of virtual buds whose activity is modelled using stochastic processes. The geometry of the resulting axes is modelled by simple descriptive functions. The potential growth of each bud is represented by means of a numerical value called physiological age, which controls the value for each parameter in the model. The set of possible values for physiological ages is called the reference axis. In order to mimic morphological and architectural metamorphosis, the value allocated for the physiological age of buds evolves along this reference axis according to an oriented finite state automaton whose occupation and transition law follows a semi-Markovian function. Simulations were performed on tomato plants to demonstrate how the AmapSim simulator can interface external modules, e.g. a GREENLAB growth model and a radiosity model. The algorithmic ability provided by AmapSim, e.g. the reference axis, enables unified control to be exercised over plant development parameter values, depending on the biological process target: how to affect the local pertinent process, i.e. the pertinent parameter(s), while keeping the rest unchanged. This opening up to external functions also offers a broadened field of applications and thus allows feedback between plant growth and the physical environment.
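
    As a rough illustration of the bud-level mechanism described above, the following minimal sketch advances a bud's physiological age along a reference axis with a simple Markov chain and draws its growth activity from an age-dependent probability. The transition and growth probabilities are hypothetical and far simpler than AmapSim's semi-Markovian automaton.

    ```python
    # Minimal sketch of stochastic bud growth: a virtual bud has a physiological
    # age that advances along a "reference axis" according to a simple Markov
    # chain, and the probability that it produces a new growth unit depends on
    # that age. All probabilities are hypothetical illustration values.
    import random

    TRANSITION = {1: {1: 0.7, 2: 0.3},      # stay young or mature
                  2: {2: 0.8, 3: 0.2},
                  3: {3: 1.0}}              # terminal physiological age
    GROWTH_PROB = {1: 0.9, 2: 0.6, 3: 0.2}  # bud activity per time step

    def step(phys_age, rng):
        ages, probs = zip(*TRANSITION[phys_age].items())
        new_age = rng.choices(ages, weights=probs)[0]
        grew = rng.random() < GROWTH_PROB[new_age]
        return new_age, grew

    def simulate_axis(n_steps=20, seed=1):
        rng = random.Random(seed)
        age, units = 1, 0
        for _ in range(n_steps):
            age, grew = step(age, rng)
            units += grew
        return age, units

    if __name__ == "__main__":
        print(simulate_axis())
    ```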

  15. AmapSim: A Structural Whole-plant Simulator Based on Botanical Knowledge and Designed to Host External Functional Models

    PubMed Central

    Barczi, Jean-François; Rey, Hervé; Caraglio, Yves; de Reffye, Philippe; Barthélémy, Daniel; Dong, Qiao Xue; Fourcaud, Thierry

    2008-01-01

    Background and Aims AmapSim is a tool that implements a structural plant growth model based on a botanical theory and simulates plant morphogenesis to produce accurate, complex and detailed plant architectures. This software is the result of more than a decade of research and development devoted to plant architecture. New advances in the software development have yielded plug-in external functions that open up the simulator to functional processes. Methods The simulation of plant topology is based on the growth of a set of virtual buds whose activity is modelled using stochastic processes. The geometry of the resulting axes is modelled by simple descriptive functions. The potential growth of each bud is represented by means of a numerical value called physiological age, which controls the value for each parameter in the model. The set of possible values for physiological ages is called the reference axis. In order to mimic morphological and architectural metamorphosis, the value allocated for the physiological age of buds evolves along this reference axis according to an oriented finite state automaton whose occupation and transition law follows a semi-Markovian function. Key Results Simulations were performed on tomato plants to demonstrate how the AmapSim simulator can interface external modules, e.g. a GREENLAB growth model and a radiosity model. Conclusions The algorithmic ability provided by AmapSim, e.g. the reference axis, enables unified control to be exercised over plant development parameter values, depending on the biological process target: how to affect the local pertinent process, i.e. the pertinent parameter(s), while keeping the rest unchanged. This opening up to external functions also offers a broadened field of applications and thus allows feedback between plant growth and the physical environment. PMID:17766310

  16. Design-based research in designing the model for educating simulation facilitators.

    PubMed

    Koivisto, Jaana-Maija; Hannula, Leena; Bøje, Rikke Buus; Prescott, Stephen; Bland, Andrew; Rekola, Leena; Haho, Päivi

    2018-03-01

    The purpose of this article is to introduce the concept of design-based research, its appropriateness in creating education-based models, and to describe the process of developing such a model. The model was designed as part of the Nurse Educator Simulation based learning project, funded by the EU's Lifelong Learning program (2013-1-DK1-LEO05-07053). The project partners were VIA University College, Denmark, the University of Huddersfield, UK and Metropolia University of Applied Sciences, Finland. As an outcome of the development process, "the NESTLED model for educating simulation facilitators" (NESTLED model) was generated. This article also illustrates five design principles that could be applied to other pedagogies. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Translating building information modeling to building energy modeling using model view definition.

    PubMed

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  18. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    PubMed Central

    Kim, Jong Bum; Clayton, Mark J.; Haberl, Jeff S.

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process. PMID:25309954

  19. Integration of communications and tracking data processing simulation for space station

    NASA Technical Reports Server (NTRS)

    Lacovara, Robert C.

    1987-01-01

    A simplified model of the communications network for the Communications and Tracking Data Processing System (CTDP) was developed. It was simulated by use of programs running on several on-site computers. These programs communicate with one another by means of both local area networks and direct serial connections. The domain of the model and its simulation is from Orbital Replaceable Unit (ORU) interface to Data Management Systems (DMS). The simulation was designed to allow status queries from remote entities across the DMS networks to be propagated through the model to several simulated ORUs. The ORU response is then propagated back to the remote entity which originated the request. Response times at the various levels were investigated in a multi-tasking, multi-user operating system environment. Results indicate that the effective bandwidth of the system may be too low to support expected data volume requirements under conventional operating systems. Instead, some form of embedded process control program may be required on the node computers.

  20. Surface-water hydrology and runoff simulations for three basins in Pierce County, Washington

    USGS Publications Warehouse

    Mastin, M.C.

    1996-01-01

    The surface-water hydrology in Clear, Clarks, and Clover Creek Basins in central Pierce County, Washington, is described with a conceptual model of the runoff processes and then simulated with the Hydrological Simulation Program-FORTRAN (HSPF), a continuous, deterministic hydrologic model. The study area is currently undergoing a rapid conversion of rural, undeveloped land to urban and suburban land that often changes the flow characteristics of the streams that drain these lands. The complex interactions of land cover, climate, soils, topography, channel characteristics, and ground- water flow patterns determine the surface-water hydrology of the study area and require a complex numerical model to assess the impact of urbanization on streamflows. The U.S. Geological Survey completed this investigation in cooperation with the Storm Drainage and Surface Water Management Utility within the Pierce County Department of Public Works to describe the important rainfall-runoff processes within the study area and to develop a simulation model to be used as a tool to predict changes in runoff characteristics resulting from changes in land use. The conceptual model, a qualitative representation of the study basins, links the physical characteristics to the runoff process of the study basins. The model incorporates 11 generalizations identified by the investigation, eight of which describe runoff from hillslopes, and three that account for the effects of channel characteristics and ground-water flow patterns on runoff. Stream discharge was measured at 28 sites and precipitation was measured at six sites for 3 years in two overlapping phases during the period of October 1989 through September 1992 to calibrate and validate the simulation model. Comparison of rainfall data from October 1989 through September 1992 shows the data-collection period beginning with 2 wet water years followed by the relatively dry 1992 water year. Runoff was simulated with two basin models-the Clover Creek Basin model and the Clear-Clarks Basin model-by incorporating the generalizations of the conceptual model into the construction of two HSPF numerical models. Initially, the process-related parameters for runoff from glacial-till hillslopes were calibrated with numerical models for three catchment sites and one headwater basin where streamflows were continuously measured and little or no influence from ground water, channel storage, or channel losses affected runoff. At one of the catchments soil moisture was monitored and compared with simulated soil moisture. The values for these parameters were used in the basin models. Basin models were calibrated to the first year of observed streamflow data by adjusting other parameters in the numerical model that simulated channel losses, simulated channel storage in a few of the reaches in the headwaters and in the floodplain of the main stem of Clover Creek, and simulated volume and outflow of the ground-water reservoir representing the regional ground-water aquifers. The models were run for a second year without any adjustments, and simulated results were compared with observed results as a measure of validation of the models. The investigation showed the importance of defining the ground-water flow boundaries and demonstrated a simple method of simulating the influence of the regional ground-water aquifer on streamflows. 
In the Clover Creek Basin model, ground-water flow boundaries were used to define subbasins containing mostly glacial outwash soils and not containing any surface drainage channels. In the Clear-Clarks Basin model, ground-water flow boundaries outlined a recharge area outside the surface-water boundaries of the basin that was incorporated into the model in order to provide sufficient water to balance simulated ground-water outflows to the creeks. A simulated ground-water reservoir used to represent regional ground-water flow processes successfully provided the proper water balance of inflows and outflows.

  1. Modeling and simulating industrial land-use evolution in Shanghai, China

    NASA Astrophysics Data System (ADS)

    Qiu, Rongxu; Xu, Wei; Zhang, John; Staenz, Karl

    2018-01-01

    This study proposes a cellular automata-based Industrial and Residential Land Use Competition Model to simulate the dynamic spatial transformation of industrial land use in Shanghai, China. In the proposed model, land development activities in a city are delineated as competitions among different land-use types. The Hedonic Land Pricing Model is adopted to implement the competition framework. To improve simulation results, the Land Price Agglomeration Model was devised to simulate and adjust classic land price theory. A new evolutionary algorithm-based parameter estimation method was devised in place of traditional methods. Simulation results show that the proposed model closely resembles actual land transformation patterns and the model can not only simulate land development, but also redevelopment processes in metropolitan areas.
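
    To make the competition mechanism concrete, the sketch below performs one cellular-automata update in which each cell is allocated to the land use with the higher hedonic bid; the bid functions (distance to the centre plus a neighbourhood agglomeration term) and their coefficients are illustrative assumptions, not the calibrated Shanghai model.

    ```python
    # Sketch of one cellular-automata update: each cell is assigned to the land
    # use (industrial vs. residential) whose hedonic "bid" is higher. The bid
    # functions below are illustrative assumptions only.
    import numpy as np

    def neighbour_share(grid, use, i, j):
        """Share of cells in the local 3x3 patch already in the given use."""
        patch = grid[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        return (patch == use).sum() / patch.size

    def update(grid, dist_cbd, rng):
        new = grid.copy()
        n, m = grid.shape
        for i in range(n):
            for j in range(m):
                bid_res = 10.0 - 2.0 * dist_cbd[i, j] + 3.0 * neighbour_share(grid, 1, i, j)
                bid_ind = 6.0 - 0.5 * dist_cbd[i, j] + 4.0 * neighbour_share(grid, 2, i, j)
                bid_res += rng.normal(0, 0.5)   # stochastic perturbation
                new[i, j] = 1 if bid_res > bid_ind else 2
        return new

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        grid = rng.integers(1, 3, size=(20, 20))      # 1 = residential, 2 = industrial
        yy, xx = np.mgrid[0:20, 0:20]
        dist = np.hypot(yy - 10, xx - 10) / 10.0      # normalised distance to centre
        for _ in range(5):
            grid = update(grid, dist, rng)
        print((grid == 2).mean(), "fraction industrial")
    ```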

  2. Parallel computing method for simulating hydrological processes of large rivers under climate change

    NASA Astrophysics Data System (ADS)

    Wang, H.; Chen, Y.

    2016-12-01

    Climate change is one of the most widely recognized global environmental problems. It has altered the temporal and spatial distribution of watershed hydrological processes, especially in the world's large rivers. Watershed hydrological process simulation based on physically based distributed hydrological models can give better results than lumped models. However, such simulation involves a large amount of computation, especially for large rivers, and therefore requires huge computing resources that may not be steadily available to researchers or may only be available at high expense, which seriously restricts research and application. Current parallel methods mostly parallelize the computation in the space and time dimensions: they calculate the natural features of the distributed hydrological model in order, grid by grid (unit by unit, basin by basin), from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speed-up ratio and parallel efficiency. It combines the temporal and spatial runoff characteristics of the distributed hydrological model with distributed data storage, an in-memory database, distributed computing, and parallel computing based on computing power units. The method has strong adaptability and extensibility: it can make full use of the available computing and storage resources under the condition of limited computing resources, and the computing efficiency improves linearly with the increase of computing resources. This method can satisfy the parallel computing requirements of hydrological process simulation in small, medium, and large rivers.
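
    A minimal sketch of the upstream-to-downstream parallelism described above: subbasins whose upstream neighbours have already been simulated are dispatched concurrently, level by level. The basin topology and the per-subbasin routine are hypothetical stand-ins for a distributed hydrological model.

    ```python
    # Sketch of upstream-to-downstream parallelism: subbasins with no unresolved
    # upstream dependencies are simulated concurrently, level by level.
    from concurrent.futures import ProcessPoolExecutor

    UPSTREAM = {"A": [], "B": [], "C": ["A", "B"], "D": ["C"], "E": []}

    def simulate_subbasin(name, inflow):
        local_runoff = 10.0                     # placeholder local runoff [m3/s]
        return name, inflow + local_runoff      # route upstream inflow downstream

    def run_basin():
        outflow, remaining = {}, dict(UPSTREAM)
        with ProcessPoolExecutor() as pool:
            while remaining:
                ready = [n for n, ups in remaining.items()
                         if all(u in outflow for u in ups)]
                futures = [pool.submit(simulate_subbasin, n,
                                       sum(outflow[u] for u in remaining[n]))
                           for n in ready]
                for fut in futures:
                    name, q = fut.result()
                    outflow[name] = q
                for n in ready:
                    del remaining[n]
        return outflow

    if __name__ == "__main__":
        print(run_basin())
    ```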

  3. On the limitations of General Circulation Climate Models

    NASA Technical Reports Server (NTRS)

    Stone, Peter H.; Risbey, James S.

    1990-01-01

    General Circulation Models (GCMs) by definition calculate large-scale dynamical and thermodynamical processes and their associated feedbacks from first principles. This aspect of GCMs is widely believed to give them an advantage in simulating global scale climate changes as compared to simpler models which do not calculate the large-scale processes from first principles. However, it is pointed out that the meridional transports of heat simulated by GCMs used in climate change experiments differ from observational analyses and from other GCMs by as much as a factor of two. It is also demonstrated that GCM simulations of the large scale transports of heat are sensitive to the (uncertain) subgrid scale parameterizations. This leads to the question of whether current GCMs are in fact superior to simpler models for simulating temperature changes associated with global scale climate change.

  4. Modeling cell adhesion and proliferation: a cellular-automata based approach.

    PubMed

    Vivas, J; Garzón-Alvarado, D; Cerrolaza, M

    Cell adhesion is a process that involves the interaction between the cell membrane and another surface, either a cell or a substrate. Unlike experimental tests, computer models can simulate processes and study the results of experiments in a shorter time and at lower cost. One of the tools used to simulate biological processes is the cellular automaton, a dynamic system that is discrete in both space and time. This work describes a computer model based on cellular automata for the adhesion and cell proliferation process, intended to predict the behavior of a cell population in suspension and adhered to a substrate. The values of the simulated system were obtained through experimental tests on fibroblast monolayer cultures. The results allow us to estimate the settling time of the cells in culture as well as the adhesion and proliferation times. The change in cell morphology as adhesion progressed over the contact surface was also observed. The formation of the initial link between the cell and the substrate was observed after 100 min, with the cell on the substrate retaining its spherical morphology during the simulation. The cellular automata model developed is, however, a simplified representation of the steps in the adhesion process and the subsequent proliferation. A combined framework of experimental and computational simulation based on cellular automata was proposed to represent fibroblast adhesion on substrates and the macro-scale changes observed in the cell during the adhesion process. The approach proved to be simple and efficient.

  5. USER MANUAL FOR EXPRESS, THE EXAMS-PRZM EXPOSURE SIMULATION SHELL

    EPA Science Inventory

    The Environmental Fate and Effects Division (EFED) of EPA's Office of Pesticide Programs(OPP) uses a suite of ORD simulation models for the exposure analysis portion of regulatory risk assessments. These models (PRZM, EXAMS, AgDisp) are complex, process-based simulation codes tha...

  6. High performance hybrid functional Petri net simulations of biological pathway models on CUDA.

    PubMed

    Chalkidis, Georgios; Nagasaki, Masao; Miyano, Satoru

    2011-01-01

    Hybrid functional Petri nets are a wide-spread tool for representing and simulating biological models. Due to their potential of providing virtual drug testing environments, biological simulations have a growing impact on pharmaceutical research. Continuous research advancements in biology and medicine lead to exponentially increasing simulation times, thus raising the demand for performance accelerations by efficient and inexpensive parallel computation solutions. Recent developments in the field of general-purpose computation on graphics processing units (GPGPU) enabled the scientific community to port a variety of compute intensive algorithms onto the graphics processing unit (GPU). This work presents the first scheme for mapping biological hybrid functional Petri net models, which can handle both discrete and continuous entities, onto compute unified device architecture (CUDA) enabled GPUs. GPU accelerated simulations are observed to run up to 18 times faster than sequential implementations. Simulating the cell boundary formation by Delta-Notch signaling on a CUDA enabled GPU results in a speedup of approximately 7x for a model containing 1,600 cells.
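
    For readers unfamiliar with the formalism, the sketch below takes one time step of a toy hybrid functional Petri net, mixing continuous (rate-based) and discrete (guarded) transitions; a CUDA port would evaluate independent transitions in parallel threads. The two-species model and its rates are hypothetical, not a published pathway.

    ```python
    # Minimal sequential sketch of a hybrid functional Petri net step: continuous
    # transitions change markings by rate*dt, discrete transitions fire when
    # enabled. The toy model below is hypothetical.

    places = {"mRNA": 0.0, "protein": 0.0, "signal": 1}     # continuous + discrete

    continuous = [
        # (rate function, {place: stoichiometry})
        (lambda m: 0.5 * m["signal"], {"mRNA": +1}),        # transcription
        (lambda m: 0.1 * m["mRNA"],   {"mRNA": -1}),        # mRNA decay
        (lambda m: 0.4 * m["mRNA"],   {"protein": +1}),     # translation
    ]
    discrete = [
        # (guard, {place: integer change})
        (lambda m: m["protein"] > 5.0, {"signal": -1}),     # switch off signal
    ]

    def step(m, dt=0.01):
        for rate, stoich in continuous:
            r = rate(m)
            for p, s in stoich.items():
                m[p] = max(m[p] + s * r * dt, 0.0)
        for guard, change in discrete:
            if guard(m) and all(m[p] + c >= 0 for p, c in change.items()):
                for p, c in change.items():
                    m[p] += c
        return m

    if __name__ == "__main__":
        for _ in range(3000):
            step(places)
        print(places)
    ```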

  7. A Study of Umbilical Communication Interface of Simulator Kernel to Enhance Visibility and Controllability

    NASA Astrophysics Data System (ADS)

    Koo, Cheol Hea; Lee, Hoon Hee; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok

    2013-08-01

    In aerospace research and practical development, the use of simulation in software development, component design and system operation has steadily increased, and the rate of increase keeps growing. This trend stems from the ease of handling simulations and the usefulness of their output. Simulation brings many benefits because of the following characteristics: it is easy to handle; it is never broken or damaged by mistake; it never wears out or gets old; and it is cost effective, since once a simulation is built it can be distributed to 100-1000 people. GenSim (Generic Simulator), which is being developed by KARI and is compatible with the ESA SMP standard, provides such a simulation platform to support flight software validation and mission operation verification. The user interface of GenSim is shown in Figure 1 [1,2]. As with most simulation platforms, GenSim has a GRD (Graphical Display) and an AND (Alpha Numeric Display). However, more complex and powerful handling of the simulated data is frequently required for actual system validation, for example mission operation. In Figure 2, a system simulation result for COMS (Communication, Ocean, and Meteorological Satellite, launched on June 28, 2008) is drawn by the Celestia 3D program. In this case, the data needed by Celestia is provided by one of the simulation models resident in the system simulator through a UDP network connection. However, the required display format, data size, and communication rate vary, so the developer has to manage the connection protocol manually for each time and each case. This creates confusion in the design and development of the simulation models and, ultimately, performance problems. A performance problem arises when the required data volume exceeds the capacity of the simulation kernel to process the data safely. The problem is that the data sent to a visualization tool such as Celestia is provided by a simulation model, not by the kernel. Because the simulation model has no way of knowing the load status of the simulation kernel while it processes simulation events, the model sends the data as frequently as needed. This can create many potential problems, such as lack of response, missed deadlines, and data integrity problems with the model data during the simulation. SIMSAT and EuroSim give a warning message if a user-requested event, such as printing a log, cannot be processed as planned or requested. As a consequence, the requested event will be delayed or not processed at all, which means that the planned deadline may be violated. In most soft real-time simulations this can be neglected and causes only minor inconvenience to users, but it should be noted that if user requests are not managed properly in critical situations, the simulation results may end in disorder. Having traced the disadvantages of letting a simulation model service such user requests, we conclude that the simulation model is not the appropriate place to provide this service, and this kind of work should be minimized as much as possible.

  8. Standing wave design and experimental validation of a tandem simulated moving bed process for insulin purification.

    PubMed

    Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda

    2002-01-01

    A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch SEC process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.

  9. A flexible object-oriented software framework for developing complex multimedia simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sydelko, P. J.; Dolph, J. E.; Christiansen, J. H.

    Decision makers involved in brownfields redevelopment and long-term stewardship must consider environmental conditions, future-use potential, site ownership, area infrastructure, funding resources, cost recovery, regulations, risk and liability management, community relations, and expected return on investment in a comprehensive and integrated fashion to achieve desired results. Successful brownfields redevelopment requires the ability to assess the impacts of redevelopment options on multiple interrelated aspects of the ecosystem, both natural and societal. Computer-based tools, such as simulation models, databases, and geographical information systems (GISs) can be used to address brownfields planning and project execution. The transparent integration of these tools into a comprehensive and dynamic decision support system would greatly enhance the brownfields assessment process. Such a system needs to be able to adapt to shifting and expanding analytical requirements and contexts. The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-oriented framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application domains. The modeling domain of a specific DIAS-based simulation is determined by (1) software objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. Models and applications used to express dynamic behaviors can be either internal or external to DIAS, including existing legacy models written in various languages (FORTRAN, C, etc.). The flexible design framework of DIAS makes the objects adjustable to the context of the problem without a great deal of recoding. The DIAS Spatial Data Set facility allows parameters to vary spatially depending on the simulation context according to any of a number of 1-D, 2-D, or 3-D topologies. DIAS is also capable of interacting with other GIS packages and can import many standard spatial data formats. DIAS simulation capabilities can also be extended by including societal process models. Models that implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations allow for interaction and feedback among natural and societal processes. The ability to simulate the complex interplay of multimedia processes makes DIAS a promising tool for constructing applications for comprehensive community planning, including the assessment of multiple development and redevelopment scenarios.

  10. Analysis of sensitivity of simulated recharge to selected parameters for seven watersheds modeled using the precipitation-runoff modeling system

    USGS Publications Warehouse

    Ely, D. Matthew

    2006-01-01

    Recharge is a vital component of the ground-water budget and methods for estimating it range from extremely complex to relatively simple. The most commonly used techniques, however, are limited by the scale of application. One method that can be used to estimate ground-water recharge includes process-based models that compute distributed water budgets on a watershed scale. These models should be evaluated to determine which model parameters are the dominant controls in determining ground-water recharge. Seven existing watershed models from different humid regions of the United States were chosen to analyze the sensitivity of simulated recharge to model parameters. Parameter sensitivities were determined using a nonlinear regression computer program to generate a suite of diagnostic statistics. The statistics identify model parameters that have the greatest effect on simulated ground-water recharge and that compare and contrast the hydrologic system responses to those parameters. Simulated recharge in the Lost River and Big Creek watersheds in Washington State was sensitive to small changes in air temperature. The Hamden watershed model in west-central Minnesota was developed to investigate the relations that wetlands and other landscape features have with runoff processes. Excess soil moisture in the Hamden watershed simulation was preferentially routed to wetlands, instead of to the ground-water system, resulting in little sensitivity of any parameters to recharge. Simulated recharge in the North Fork Pheasant Branch watershed, Wisconsin, demonstrated the greatest sensitivity to parameters related to evapotranspiration. Three watersheds were simulated as part of the Model Parameter Estimation Experiment (MOPEX). Parameter sensitivities for the MOPEX watersheds, Amite River, Louisiana and Mississippi, English River, Iowa, and South Branch Potomac River, West Virginia, were similar and most sensitive to small changes in air temperature and a user-defined flow routing parameter. Although the primary objective of this study was to identify, by geographic region, the importance of the parameter value to the simulation of ground-water recharge, the secondary objectives proved valuable for future modeling efforts. The value of a rigorous sensitivity analysis can (1) make the calibration process more efficient, (2) guide additional data collection, (3) identify model limitations, and (4) explain simulated results.
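
    The sensitivity statistics referred to above can be illustrated with a simple finite-difference calculation of dimensionless scaled sensitivities, (dR/dp)(p/R); the toy recharge function below is a stand-in for a full PRMS watershed run, and its form and parameters are assumptions.

    ```python
    # Sketch of a scaled-sensitivity calculation: perturb each parameter, rerun
    # the (here, toy) recharge model, and form dimensionless scaled sensitivities.
    def recharge_model(params):
        # toy response: recharge decreases with temperature, increases with storage
        return 500.0 - 12.0 * params["air_temp"] + 0.8 * params["soil_moist_max"]

    def scaled_sensitivities(model, params, rel_step=0.01):
        base = model(params)
        sens = {}
        for name, value in params.items():
            dp = rel_step * value if value != 0 else rel_step
            up, dn = dict(params), dict(params)
            up[name] = value + dp
            dn[name] = value - dp
            dRdp = (model(up) - model(dn)) / (2.0 * dp)   # central difference
            sens[name] = dRdp * value / base              # dimensionless
        return sens

    if __name__ == "__main__":
        p = {"air_temp": 10.0, "soil_moist_max": 150.0}
        print(scaled_sensitivities(recharge_model, p))
    ```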

  11. How much expert knowledge is it worth to put in conceptual hydrological models?

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Zappa, Massimiliano

    2017-04-01

    Both modellers and experimentalists agree on using expert knowledge to improve conceptual hydrological simulations on ungauged basins. However, they use expert knowledge differently, both for hydrologically mapping the landscape and for parameterising a given hydrological model. Modellers generally use very simplified (e.g. topography-based) mapping approaches and put most of their knowledge into constraining the model by defining parameter and process relational rules. In contrast, experimentalists tend to invest all their detailed and qualitative knowledge about processes to obtain a spatial distribution of areas with different dominant runoff generation processes (DRPs) that is as realistic as possible, and to define plausible narrow value ranges for each model parameter. Since the modelling goal is most often exclusively to simulate runoff at a specific site, even strongly simplified hydrological classifications can lead to satisfying results because of the equifinality of hydrological models, overfitting problems and the numerous uncertainty sources affecting runoff simulations. Therefore, to test to what extent expert knowledge can improve simulation results under uncertainty, we applied a typical modellers' framework, relying on parameter and process constraints defined on the basis of expert knowledge, to several catchments on the Swiss Plateau. To map the spatial distribution of the DRPs, mapping approaches with increasing involvement of expert knowledge were used. Simulation results highlighted the potential added value of using all the expert knowledge available on a catchment. Combinations of event types and landscapes where even a simplified mapping approach can lead to satisfying results were also identified. Finally, the uncertainty originating from the different mapping approaches was compared with that linked to meteorological input data and catchment initial conditions.

  12. Coupling dry deposition to vegetation phenology in the Community Earth System Model: Implications for the simulation of surface O3

    NASA Astrophysics Data System (ADS)

    Val Martin, M.; Heald, C. L.; Arnold, S. R.

    2014-04-01

    Dry deposition is an important removal process controlling surface ozone. We examine the representation of this ozone loss mechanism in the Community Earth System Model. We first correct the dry deposition parameterization by coupling the leaf and stomatal vegetation resistances to the leaf area index, an omission which has adversely impacted over a decade of ozone simulations using both the Model for Ozone and Related chemical Tracers (MOZART) and Community Atmospheric Model-Chem (CAM-Chem) global models. We show that this correction increases O3 dry deposition velocities over vegetated regions and improves the simulated seasonality in this loss process. This enhanced removal reduces the previously reported bias in summertime surface O3 simulated over eastern U.S. and Europe. We further optimize the parameterization by scaling down the stomatal resistance used in the Community Land Model to observed values. This in turn further improves the simulation of dry deposition velocity of O3, particularly over broadleaf forested regions. The summertime surface O3 bias is reduced from 30 ppb to 14 ppb over eastern U.S. and 13 ppb to 5 ppb over Europe from the standard to the optimized scheme, respectively. O3 deposition processes must therefore be accurately coupled to vegetation phenology within 3-D atmospheric models, as a first step toward improving surface O3 and simulating O3 responses to future and past vegetation changes.
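
    The coupling described above amounts to letting the canopy resistance in a resistance-in-series deposition velocity, v_d = 1/(R_a + R_b + R_c), scale with leaf area index. The sketch below illustrates that scaling with assumed resistance values; it is not the CLM/CAM-Chem parameterization itself.

    ```python
    # Illustrative resistance-in-series calculation of an O3 dry deposition
    # velocity, with the bulk canopy (stomatal + cuticular) resistance scaled by
    # leaf area index (LAI). Resistance values are assumed illustration values.
    def deposition_velocity(ra, rb, r_stom_leaf, r_cut_leaf, lai):
        """v_d = 1 / (Ra + Rb + Rc); canopy resistances scale as leaf value / LAI."""
        if lai <= 0:
            return 0.0
        r_stom_canopy = r_stom_leaf / lai
        r_cut_canopy = r_cut_leaf / lai
        rc = 1.0 / (1.0 / r_stom_canopy + 1.0 / r_cut_canopy)   # parallel canopy paths
        return 1.0 / (ra + rb + rc)                              # m/s if resistances in s/m

    if __name__ == "__main__":
        for lai in (0.5, 2.0, 5.0):                              # seasonal LAI cycle
            vd = deposition_velocity(ra=40.0, rb=20.0,
                                     r_stom_leaf=200.0, r_cut_leaf=2000.0, lai=lai)
            print(f"LAI={lai:.1f}  v_d={vd*100:.2f} cm/s")
    ```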

  13. LPJ-GUESS Simulated North America Vegetation for 21-0 ka Using the TraCE-21ka Climate Simulation

    NASA Astrophysics Data System (ADS)

    Shafer, S. L.; Bartlein, P. J.

    2016-12-01

    Transient climate simulations that span multiple millennia (e.g., TraCE-21ka) have become more common as computing power has increased, allowing climate models to complete long simulations in relatively short periods of time (i.e., months). These climate simulations provide information on the potential rate, variability, and spatial expression of past climate changes. They also can be used as input data for other environmental models to simulate transient changes for different components of paleoenvironmental systems, such as vegetation. Long, transient paleovegetation simulations can provide information on a range of ecological processes, describe the spatial and temporal patterns of changes in species distributions, and identify the potential locations of past species refugia. Paleovegetation simulations also can be used to fill in spatial and temporal gaps in observed paleovegetation data (e.g., pollen records from lake sediments) and to test hypotheses of past vegetation change. We used the TraCE-21ka transient climate simulation for 21-0 ka from CCSM3, a coupled atmosphere-ocean general circulation model. The TraCE-21ka simulated temperature, precipitation, and cloud data were regridded onto a 10-minute grid of North America. These regridded climate data, along with soil data and atmospheric carbon dioxide concentrations, were used as input to LPJ-GUESS, a general ecosystem model, to simulate North America vegetation from 21-0 ka. LPJ-GUESS simulates many of the processes controlling the distribution of vegetation (e.g., competition), although some important processes (e.g., dispersal) are not simulated. We evaluate the LPJ-GUESS-simulated vegetation (in the form of plant functional types and biomes) for key time periods and compare the simulated vegetation with observed paleovegetation data, such as data archived in the Neotoma Paleoecology Database. In general, vegetation simulated by LPJ-GUESS reproduces the major North America vegetation patterns (e.g., forest, grassland) with regional areas of disagreement between simulated and observed vegetation. We describe the regions and time periods with the greatest data-model agreement and disagreement, and discuss some of the strengths and weaknesses of both the simulated climate and simulated vegetation data.

  14. Development of a Scale-up Tool for Pervaporation Processes

    PubMed Central

    Thiess, Holger; Strube, Jochen

    2018-01-01

    In this study, an engineering tool for the design and optimization of pervaporation processes is developed based on physico-chemical modelling coupled with laboratory/mini-plant experiments. The model incorporates the solution-diffusion-mechanism, polarization effects (concentration and temperature), axial dispersion, pressure drop and the temperature drop in the feed channel due to vaporization of the permeating components. The permeance, being the key model parameter, was determined via dehydration experiments on a mini-plant scale for the binary mixtures ethanol/water and ethyl acetate/water. A second set of experimental data was utilized for the validation of the model for two chemical systems. The industrially relevant ternary mixture, ethanol/ethyl acetate/water, was investigated close to its azeotropic point and compared to a simulation conducted with the determined binary permeance data. Experimental and simulation data proved to agree very well for the investigated process conditions. In order to test the scalability of the developed engineering tool, large-scale data from an industrial pervaporation plant used for the dehydration of ethanol was compared to a process simulation conducted with the validated physico-chemical model. Since the membranes employed in both mini-plant and industrial scale were of the same type, the permeance data could be transferred. The comparison of the measured and simulated data proved the scalability of the derived model. PMID:29342956
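
    The solution-diffusion mechanism mentioned above is commonly written as a permeance times a partial-pressure driving force, J_i = Q_i * (x_i * gamma_i * p_sat_i - y_i * p_perm). The sketch below evaluates that expression with illustrative permeances and conditions for an ethanol/water dehydration case; the numbers are assumptions, not the fitted mini-plant data.

    ```python
    # Sketch of the solution-diffusion flux expression used in pervaporation
    # models: permeance times the partial-pressure driving force across the
    # membrane. All values below are illustrative only.
    def pervaporation_flux(permeance, x_feed, gamma, p_sat, y_perm, p_perm):
        """Partial flux [mol/(m2 h)] for one component."""
        driving_force = x_feed * gamma * p_sat - y_perm * p_perm   # mbar
        return permeance * driving_force

    if __name__ == "__main__":
        # water through a hydrophilic membrane, ethanol/water feed (assumed values)
        j_water = pervaporation_flux(permeance=0.05,   # mol/(m2 h mbar)
                                     x_feed=0.10, gamma=2.3, p_sat=200.0,
                                     y_perm=0.95, p_perm=20.0)
        j_etoh = pervaporation_flux(permeance=0.001,
                                    x_feed=0.90, gamma=1.05, p_sat=180.0,
                                    y_perm=0.05, p_perm=20.0)
        print(f"J_water = {j_water:.2f}, J_EtOH = {j_etoh:.3f} mol/(m2 h)")
        print("separation factor vs feed:", (j_water / j_etoh) / (0.10 / 0.90))
    ```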

  15. ICME — A Mere Coupling of Models or a Discipline of Its Own?

    NASA Astrophysics Data System (ADS)

    Bambach, Markus; Schmitz, Georg J.; Prahl, Ulrich

    Technically, ICME — Integrated computational materials engineering — is an approach for solving advanced engineering problems related to the design of new materials and processes by combining individual materials and process models. The combination of models by now is mainly achieved by manual transformation of the output of a simulation to form the input to a subsequent one. This subsequent simulation is either performed at a different length scale or constitutes a subsequent step along the process chain. Is ICME thus just a synonym for the coupling of simulations? In fact, most ICME publications up to now are examples of the joint application of selected models and software codes to a specific problem. However, from a systems point of view, the coupling of individual models and/or software codes across length scales and along material processing chains leads to highly complex meta-models. Their viability has to be ensured by joint efforts from science, industry, software developers and independent organizations. This paper identifies some developments that seem necessary to make future ICME simulations viable, sustainable and broadly accessible and accepted. The main conclusion is that ICME is more than a multi-disciplinary subject but a discipline of its own, for which a generic structural framework has to be elaborated and established.

  16. Cellular automata-based modelling and simulation of biofilm structure on multi-core computers.

    PubMed

    Skoneczny, Szymon

    2015-01-01

    The article presents a mathematical model of biofilm growth for aerobic biodegradation of a toxic carbonaceous substrate. Modelling of biofilm growth has fundamental significance in numerous processes of biotechnology and mathematical modelling of bioreactors. The process following double-substrate kinetics with substrate inhibition proceeding in a biofilm has not been modelled so far by means of cellular automata. Each process in the model proposed, i.e. diffusion of substrates, uptake of substrates, growth and decay of microorganisms and biofilm detachment, is simulated in a discrete manner. It was shown that for flat biofilm of constant thickness, the results of the presented model agree with those of a continuous model. The primary outcome of the study was to propose a mathematical model of biofilm growth; however a considerable amount of focus was also placed on the development of efficient algorithms for its solution. Two parallel algorithms were created, differing in the way computations are distributed. Computer programs were created using OpenMP Application Programming Interface for C++ programming language. Simulations of biofilm growth were performed on three high-performance computers. Speed-up coefficients of computer programs were compared. Both algorithms enabled a significant reduction of computation time. It is important, inter alia, in modelling and simulation of bioreactor dynamics.

  17. Use of high performance networks and supercomputers for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1993-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be consistent in processing time and be completed in as short a time as possible. These operations include simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to the Computer Automated Measurement and Control (CAMAC) technology which resulted in a factor of ten increase in the effective bandwidth and reduced latency of modules necessary for simulator communication. This technology extension is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC are completing the development of the use of supercomputers for mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and development of specialized software and hardware for the simulator network. This paper describes the data acquisition technology and the development of supercomputing for flight simulation.

  18. Modeling Best Management Practices (BMPs) with HSPF

    EPA Science Inventory

    The Hydrological Simulation Program-Fortran (HSPF) is a semi-distributed watershed model, which simulates hydrology and water quality processes at user-specified spatial and temporal scales. Although HSPF is a comprehensive and highly flexible model, a number of investigators not...

  19. Software quality and process improvement in scientific simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosiano, J.; Webster, R.

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and on interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.

  20. The use of mechanistic descriptions of algal growth and zooplankton grazing in an estuarine eutrophication model

    NASA Astrophysics Data System (ADS)

    Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.

    2003-03-01

    A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.
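
    Two of the biomechanical rate terms alluded to above can be written in closed form: diffusion-limited nutrient uptake by a spherical cell (4*pi*D*r*N) and a swept-volume predator-prey encounter kernel (pi*R^2*u*N). The sketch below evaluates both with assumed parameter values; it is an illustration of the kind of mechanistic terms used, not the PPBES or study equations themselves.

    ```python
    # Sketch of two mechanistic rate terms: diffusion-limited nutrient uptake by
    # a spherical phytoplankton cell and a swept-volume encounter rate for a
    # planktonic predator. All parameter values are illustrative assumptions.
    import math

    def diffusion_limited_uptake(D, radius, nutrient_conc):
        """Maximum uptake rate per cell [mol/s] for a spherical cell."""
        return 4.0 * math.pi * D * radius * nutrient_conc

    def encounter_rate(detection_radius, swim_speed, prey_density):
        """Encounters per predator per second (swept-volume kernel)."""
        return math.pi * detection_radius**2 * swim_speed * prey_density

    if __name__ == "__main__":
        uptake = diffusion_limited_uptake(D=1e-9, radius=5e-6,
                                          nutrient_conc=1e-3)      # mol/m3
        enc = encounter_rate(detection_radius=1e-3, swim_speed=1e-3,
                             prey_density=5e6)                      # prey/m3
        print(f"uptake ~ {uptake:.2e} mol/s per cell, encounters ~ {enc:.2e} /s")
    ```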

  1. Virtual welding equipment for simulation of GMAW processes with integration of power source regulation

    NASA Astrophysics Data System (ADS)

    Reisgen, Uwe; Schleser, Markus; Mokrov, Oleg; Zabirov, Alexander

    2011-06-01

    A two-dimensional transient numerical analysis and computational module for the simulation of electrical and thermal characteristics during electrode melting and metal transfer in Gas Metal Arc Welding (GMAW) processes is presented. The solution of the non-linear transient heat transfer equation is carried out using a control volume finite difference technique. The computational module also includes the controlling and regulation algorithms of industrial welding power sources. The simulation results are the current and voltage waveforms, mean voltage drops at different parts of the circuit, total electric power, cathode, anode and arc powers, and arc length. We describe the application of the model to the normal (constant voltage) process and to pulsed processes with U/I and I/I modulation modes. Comparisons with experimental waveforms of current and voltage show that the model predicts current, voltage and electric power with high accuracy. The model is used in the simulation package SimWeld for calculation of the heat flux into the work-piece and the weld seam formation. From the calculated heat flux and weld pool sizes, an equivalent volumetric heat source according to the Goldak model can be generated. The method was implemented and investigated with the simulation software SimWeld developed by the ISF at RWTH Aachen University.
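
    The equivalent volumetric heat source mentioned above is usually the Goldak double-ellipsoid distribution. The sketch below evaluates that distribution for assumed ellipsoid semi-axes and an assumed effective arc power; it illustrates the functional form only and is not SimWeld output.

    ```python
    # Sketch of the Goldak double-ellipsoid volumetric heat source. Q is the
    # effective arc power; a_f, a_r, b, c are ellipsoid semi-axes; f_f + f_r = 2.
    # The example numbers are illustrative assumptions.
    import math

    def goldak_q(x, y, z, Q, a_f, a_r, b, c, f_f=0.6, f_r=1.4):
        """Volumetric heat input density [W/m3] at (x, y, z); x along welding direction."""
        a, f = (a_f, f_f) if x >= 0.0 else (a_r, f_r)
        coeff = 6.0 * math.sqrt(3.0) * f * Q / (a * b * c * math.pi * math.sqrt(math.pi))
        return coeff * math.exp(-3.0 * x**2 / a**2
                                - 3.0 * y**2 / b**2
                                - 3.0 * z**2 / c**2)

    if __name__ == "__main__":
        # effective power Q = eta * U * I, e.g. 0.8 * 25 V * 200 A (assumed)
        Q = 0.8 * 25.0 * 200.0
        print(goldak_q(0.0, 0.0, 0.0, Q, a_f=4e-3, a_r=8e-3, b=3e-3, c=3e-3))
    ```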

  2. Computing the apparent centroid of radar targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C.E.

    1996-12-31

    A high-frequency multibounce radar scattering code was used as a simulation platform for demonstrating an algorithm to compute the ARC (apparent radar centroid) of specific radar targets. To illustrate this simulation process, several target models were used. Simulation results for a sphere model were used to determine the errors of approximation associated with the simulation, verifying the process. The severity of glint-induced tracking errors was also illustrated using a model of an F-15 aircraft. It was shown, in a deterministic manner, that the ARC of a target can fall well outside its physical extent. Finally, the apparent radar centroid simulation based on a ray casting procedure is well suited for use on most massively parallel computing platforms and could lead to the development of a near real-time radar tracking simulation for applications such as endgame fuzing, survivability, and vulnerability analyses using specific radar targets and fuze algorithms.

  3. Development of high resolution simulations of the atmospheric environment using the MASS model

    NASA Technical Reports Server (NTRS)

    Kaplan, Michael L.; Zack, John W.; Karyampudi, V. Mohan

    1989-01-01

    Numerical simulations were performed with a very high resolution (7.25 km) version of the MASS model (Version 4.0) in an effort to diagnose the vertical wind shear and static stability structure during the Shuttle Challenger disaster which occurred on 28 January 1986. These meso-beta scale simulations reveal that the strongest vertical wind shears were concentrated in the 200 to 150 mb layer at 1630 GMT, i.e., at about the time of the disaster. These simulated vertical shears were the result of two primary dynamical processes. The juxtaposition of both of these processes produced a shallow (30 mb deep) region of strong vertical wind shear, and hence, low Richardson number values during the launch time period. Comparisons with the Cape Canaveral (XMR) rawinsonde indicate that the high resolution MASS 4.0 simulation more closely emulated nature than did previous simulations of the same event with the GMASS model.
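
    The stability measure referred to above is the Richardson number, which relates static stability to the square of the vertical wind shear. The sketch below computes a bulk Richardson number between two levels from assumed sounding values; it is illustrative, not the 28 January 1986 analysis.

    ```python
    # Sketch of the bulk Richardson number used to flag layers of strong vertical
    # wind shear and low static stability between two analysis levels:
    #   Ri = (g/theta_mean) * (d_theta/dz) / ((du/dz)^2 + (dv/dz)^2)
    # The sounding values below are illustrative, not the Challenger case.
    G = 9.81  # m/s2

    def bulk_richardson(theta1, theta2, u1, u2, v1, v2, dz):
        dtheta_dz = (theta2 - theta1) / dz
        du_dz = (u2 - u1) / dz
        dv_dz = (v2 - v1) / dz
        shear2 = du_dz**2 + dv_dz**2
        theta_mean = 0.5 * (theta1 + theta2)
        return (G / theta_mean) * dtheta_dz / shear2 if shear2 > 0 else float("inf")

    if __name__ == "__main__":
        # a shallow upper-level layer with strong shear (assumed numbers)
        ri = bulk_richardson(theta1=340.0, theta2=342.0,
                             u1=40.0, u2=55.0, v1=5.0, v2=10.0, dz=800.0)
        print(f"Ri = {ri:.2f}  (Ri < 0.25 suggests turbulence/strong shear)")
    ```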

  4. Experiment Analysis and Modelling of Compaction Behaviour of Ag60Cu30Sn10 Mixed Metal Powders

    NASA Astrophysics Data System (ADS)

    Zhou, Mengcheng; Huang, Shangyu; Liu, Wei; Lei, Yu; Yan, Shiwei

    2018-03-01

    A novel process method combining powder compaction and sintering was employed to fabricate thin sheets of cadmium-free silver-based filler metals, and the compaction densification behaviour of Ag60Cu30Sn10 mixed metal powders was investigated experimentally. Based on the equivalent density method, the density-dependent Drucker-Prager Cap (DPC) model was introduced to model the powder compaction behaviour. Various experimental procedures were completed to determine the model parameters. The friction coefficients in lubricated and unlubricated dies were experimentally determined. The determined material parameters were validated by experiments and by numerical simulation of the powder compaction process using a user subroutine (USDFLD) in ABAQUS/Standard. The good agreement between the simulated and experimental results indicates that the determined model parameters are able to describe the compaction behaviour of the multicomponent mixed metal powders, and they can be further used for process optimization simulations.

  5. Extending the Community Multiscale Air Quality (CMAQ) Modeling System to Hemispheric Scales: Overview of Process Considerations and Initial Applications

    PubMed Central

    Mathur, Rohit; Xing, Jia; Gilliam, Robert; Sarwar, Golam; Hogrefe, Christian; Pleim, Jonathan; Pouliot, George; Roselle, Shawn; Spero, Tanya L.; Wong, David C.; Young, Jeffrey

    2018-01-01

    The Community Multiscale Air Quality (CMAQ) modeling system is extended to simulate ozone, particulate matter, and related precursor distributions throughout the Northern Hemisphere. Modelled processes were examined and enhanced to suitably represent the extended space and time scales for such applications. Hemispheric scale simulations with CMAQ and the Weather Research and Forecasting (WRF) model are performed for multiple years. Model capabilities for a range of applications including episodic long-range pollutant transport, long-term trends in air pollution across the Northern Hemisphere, and air pollution-climate interactions are evaluated through detailed comparison with available surface, aloft, and remotely sensed observations. The expansion of CMAQ to simulate the hemispheric scales provides a framework to examine interactions between atmospheric processes occurring at various spatial and temporal scales with physical, chemical, and dynamical consistency. PMID:29681922

  6. A Simple and Accurate Rate-Driven Infiltration Model

    NASA Astrophysics Data System (ADS)

    Cui, G.; Zhu, J.

    2017-12-01

    In this study, we develop a novel Rate-Driven Infiltration Model (RDIMOD) for simulating infiltration into soils. Unlike traditional methods, RDIMOD avoids numerically solving the highly non-linear Richards equation or simply modeling with empirical parameters. RDIMOD employs the infiltration rate as model input to simulate the one-dimensional infiltration process by solving an ordinary differential equation. The model can simulate the evolution of the wetting front, infiltration rate, and cumulative infiltration on any surface slope, including the vertical and horizontal directions. Compared to the results from the Richards equation for both vertical and horizontal infiltration, RDIMOD simply and accurately predicts infiltration processes for any type of soil and soil hydraulic model without numerical difficulty. Taking into account its accuracy, capability, and computational effectiveness and stability, RDIMOD can be used in large-scale hydrologic and land-atmosphere modeling.
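
    As a hedged illustration of the rate-driven idea (not the exact RDIMOD equations), the sketch below takes a prescribed infiltration-rate series as input, accumulates cumulative infiltration, and advances a sharp wetting front as dZ/dt = i(t)/(theta_s - theta_i); the rate series and moisture contents are assumed.

    ```python
    # Hedged sketch of rate-driven infiltration bookkeeping (not the exact RDIMOD
    # formulation): with infiltration rate i(t) prescribed, cumulative
    # infiltration is the time integral of i and a sharp wetting front advances
    # as dZ/dt = i(t) / (theta_s - theta_i). Parameters are assumed values.
    def simulate_wetting_front(rate_series, dt, theta_s, theta_i):
        d_theta = theta_s - theta_i
        cumulative, front = 0.0, 0.0
        history = []
        for i_t in rate_series:                 # infiltration rate [cm/h]
            cumulative += i_t * dt              # cumulative infiltration [cm]
            front += (i_t / d_theta) * dt       # wetting-front depth [cm]
            history.append((cumulative, front))
        return history

    if __name__ == "__main__":
        rates = [2.0, 1.5, 1.2, 1.0, 0.9, 0.8]  # declining infiltration rate, hourly
        out = simulate_wetting_front(rates, dt=1.0, theta_s=0.45, theta_i=0.15)
        for hour, (I, Z) in enumerate(out, start=1):
            print(f"t={hour} h  I={I:.2f} cm  Z={Z:.1f} cm")
    ```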

  7. Prediction of normalized biodiesel properties by simulation of multiple feedstock blends.

    PubMed

    García, Manuel; Gonzalo, Alberto; Sánchez, José Luis; Arauzo, Jesús; Peña, José Angel

    2010-06-01

    A continuous process for biodiesel production has been simulated using Aspen HYSYS V7.0 software. As fresh feed, feedstocks with a mild acid content have been used. The process flowsheet follows a traditional alkaline transesterification scheme constituted by esterification, transesterification and purification stages. Kinetic models taking into account the concentration of the different species have been employed in order to simulate the behavior of the CSTR reactors and the product distribution within the process. The comparison between experimental data found in literature and the predicted normalized properties, has been discussed. Additionally, a comparison between different thermodynamic packages has been performed. NRTL activity model has been selected as the most reliable of them. The combination of these models allows the prediction of 13 out of 25 parameters included in standard EN-14214:2003, and confers simulators a great value as predictive as well as optimization tool. (c) 2010 Elsevier Ltd. All rights reserved.

  8. Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software

    NASA Astrophysics Data System (ADS)

    Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.

    2017-09-01

    This paper demonstrates a case study on simulation, modelling and analysis for the X1 motorcycle model. In this research, a motorcycle assembly plant was selected as the main place of the research study. Simulation techniques using the Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and find out the significant impacts on the overall performance of the system for future improvement. The process of validation starts when the layout of the assembly line is identified. All components are evaluated to validate whether the data are significant for future improvement. Machine and labor statistics are among the parameters that were evaluated for process improvement. The average total cycle time for given workstations is used as the criterion for comparison of possible variants. From the simulation process, the data used are appropriate and meet the criteria for two-sided assembly line problems.

  9. [The history of development of evolutionary methods in St. Petersburg school of computer simulation in biology].

    PubMed

    Menshutkin, V V; Kazanskiĭ, A B; Levchenko, V F

    2010-01-01

The history of the rise and development of evolutionary methods in the Saint Petersburg school of biological modelling is traced and analyzed. Some pioneering works in the simulation of ecological and evolutionary processes performed in the St. Petersburg school became exemplary for many followers in Russia and abroad. The individual-based approach became the crucial point in the history of the school, as an adequate instrument for constructing models of biological evolution. This approach is natural for simulating the evolution of life-history parameters and adaptive processes in populations and communities. In some cases the simulated evolutionary process was used for solving an inverse problem, i.e., for estimating uncertain life-history parameters of a population. Evolutionary computation is one more application of this approach in a great many fields. The problems and prospects of ecological and evolutionary modelling in general are discussed.

  10. Kinetic Theory and Simulation of Single-Channel Water Transport

    NASA Astrophysics Data System (ADS)

    Tajkhorshid, Emad; Zhu, Fangqiang; Schulten, Klaus

Water translocation between various compartments of a system is a fundamental process in the biology of all living cells and in a wide variety of technological problems. The process is of interest in different fields of physiology, physical chemistry, and physics, and many scientists have tried to describe it through physical models. Owing to advances in computer simulation of molecular processes at the atomic level, water transport has been studied in a variety of molecular systems ranging from biological water channels to artificial nanotubes. While simulations have successfully described various kinetic aspects of water transport, offering a simple, unified model of trans-channel water translocation has turned out to be a nontrivial task.

  11. Understanding price discovery in interconnected markets: Generalized Langevin process approach and simulation

    NASA Astrophysics Data System (ADS)

    Schenck, Natalya A.; Horvath, Philip A.; Sinha, Amit K.

    2018-02-01

While the literature on the price discovery process and information flow between dominant and satellite markets is exhaustive, most studies have applied an approach that can be traced back to Hasbrouck (1995) or Gonzalo and Granger (1995). In this paper, however, we propose a Generalized Langevin process with an asymmetric double-well potential function, with co-integrated time series and interconnected diffusion processes, to model the information flow and price discovery process in two interconnected markets, a dominant and a satellite market. A simulated illustration of the model is also provided.
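    A minimal numerical sketch of the modelling ingredients named in the abstract is given below: two diffusion processes in an asymmetric double-well potential, coupled so that the satellite adjusts toward the dominant market more strongly than vice versa, integrated with an Euler-Maruyama scheme. The potential coefficients, coupling strengths, and noise level are assumptions, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative Euler-Maruyama integration of two coupled overdamped Langevin
# processes in an asymmetric double-well potential V(x) = a*x**4 - b*x**2 + c*x.
# The coupling term kappa*(other - own) mimics information flow between a
# dominant and a satellite market; all parameters are assumed.
a, b, c = 1.0, 2.0, 0.3
kappa_dom, kappa_sat = 0.05, 0.8     # satellite adjusts to the dominant market more strongly
sigma = 0.5
dt, n_steps = 1e-3, 200_000

def dV(x):                            # gradient of the double-well potential
    return 4*a*x**3 - 2*b*x + c

x_dom, x_sat = 1.0, 1.0
path = np.empty((n_steps, 2))
for k in range(n_steps):
    dW = rng.normal(scale=np.sqrt(dt), size=2)
    x_dom += (-dV(x_dom) + kappa_dom*(x_sat - x_dom))*dt + sigma*dW[0]
    x_sat += (-dV(x_sat) + kappa_sat*(x_dom - x_sat))*dt + sigma*dW[1]
    path[k] = x_dom, x_sat

print("sample correlation of the two simulated series:",
      np.corrcoef(path[:, 0], path[:, 1])[0, 1].round(3))
```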

  12. Modeling nuclear processes by Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my

    2015-04-29

Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve these equations analytically or numerically consume time and distract attention from the objectives of the modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
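    For readers who prefer code to block diagrams, the reactor point-kinetics equations with delayed neutron groups mentioned above can also be integrated directly; the sketch below uses six delayed-neutron groups with typical textbook constants and a step reactivity insertion (all values illustrative, not taken from the paper).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Point-kinetics equations with six delayed-neutron groups, integrated in
# Python for illustration.  beta_i, lambda_i and the reactivity step are
# typical textbook values (assumed).
beta_i = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])
lam_i  = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # 1/s
beta, Lambda = beta_i.sum(), 1e-4                                # total beta, generation time (s)
rho = 0.5 * beta                                                 # +0.5 dollar reactivity step

def point_kinetics(t, y):
    n, C = y[0], y[1:]
    dn = (rho - beta) / Lambda * n + np.dot(lam_i, C)
    dC = beta_i / Lambda * n - lam_i * C
    return np.concatenate(([dn], dC))

y0 = np.concatenate(([1.0], beta_i / (lam_i * Lambda)))          # equilibrium precursors
sol = solve_ivp(point_kinetics, (0.0, 10.0), y0, method="LSODA", max_step=0.01)
print(f"relative power after 10 s: {sol.y[0, -1]:.2f}")
```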

  13. Extending BPM Environments of Your Choice with Performance Related Decision Support

    NASA Astrophysics Data System (ADS)

    Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter

What-if simulations have been identified as one solution for business performance related decision support. Such support is especially useful when it can be generated automatically within Business Process Management (BPM) environments from the existing business process models and from performance parameters monitored on the executed business process instances. Currently, some of the available BPM environments offer basic performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at the business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations, or a combination of such solutions into already existing BPM environments. The approach abstracts from process modelling techniques, which enables automatic decision support spanning processes across numerous BPM environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.

  14. Towards process-informed bias correction of climate change simulations

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Shepherd, Theodore G.; Widmann, Martin; Zappa, Giuseppe; Walton, Daniel; Gutiérrez, José M.; Hagemann, Stefan; Richter, Ingo; Soares, Pedro M. M.; Hall, Alex; Mearns, Linda O.

    2017-11-01

    Biases in climate model simulations introduce biases in subsequent impact simulations. Therefore, bias correction methods are operationally used to post-process regional climate projections. However, many problems have been identified, and some researchers question the very basis of the approach. Here we demonstrate that a typical cross-validation is unable to identify improper use of bias correction. Several examples show the limited ability of bias correction to correct and to downscale variability, and demonstrate that bias correction can cause implausible climate change signals. Bias correction cannot overcome major model errors, and naive application might result in ill-informed adaptation decisions. We conclude with a list of recommendations and suggestions for future research to reduce, post-process, and cope with climate model biases.
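    As a concrete reference point for the kind of post-processing being critiqued, the sketch below applies empirical quantile mapping, a common bias correction method, to synthetic data; the gamma-distributed series and quantile grid are illustrative and not drawn from the paper.

```python
import numpy as np

# Minimal empirical quantile-mapping sketch: model values are mapped onto the
# observed distribution by matching quantiles over a calibration period.
# All data here are synthetic.
rng = np.random.default_rng(1)
obs        = rng.gamma(shape=2.0, scale=3.0, size=5000)   # "observed" daily precipitation
model_hist = rng.gamma(shape=2.0, scale=4.5, size=5000)   # biased historical model run
model_fut  = rng.gamma(shape=2.2, scale=4.5, size=5000)   # biased future scenario run

q = np.linspace(0.01, 0.99, 99)
cdf_model = np.quantile(model_hist, q)
cdf_obs   = np.quantile(obs, q)

# Correct the future run by interpolating its values through the two CDFs.
corrected = np.interp(model_fut, cdf_model, cdf_obs)
print("raw future mean:", model_fut.mean().round(2),
      "corrected mean:", corrected.mean().round(2))
```

    Such a mapping adjusts only the marginal distribution: consistent with the authors' argument, it cannot correct errors in variability, circulation, or other major model deficiencies, and cross-validating the mapping itself does not reveal such misuse.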

  15. Dependence of Snowmelt Simulations on Scaling of the Forcing Processes (Invited)

    NASA Astrophysics Data System (ADS)

    Winstral, A. H.; Marks, D. G.; Gurney, R. J.

    2009-12-01

The spatial organization and scaling relationships of snow distribution in mountain environs are ultimately dependent on the controlling processes. These processes include interactions between weather, topography, vegetation, snow state, and seasonally dependent radiation inputs. In large-scale snow modeling it is vital to know these dependencies to obtain accurate predictions while reducing computational costs. This study examined the scaling characteristics of the forcing processes and the dependence of distributed snowmelt simulations on their scaling. A base model simulation characterized these processes at 10 m resolution over a 14.0 km2 basin with an elevation range of 1474-2244 masl. Each of the major processes affecting snow accumulation and melt - precipitation, wind speed, solar radiation, thermal radiation, temperature, and vapor pressure - was independently degraded to 1 km resolution. Seasonal and event-specific results were analyzed. Results indicated that scale effects on melt vary by process and weather conditions. The dependence of melt simulations on the scaling of solar radiation fluxes also had a seasonal component. These process-based scaling characteristics should remain static through time as they are based on physical considerations. As such, these results not only provide guidance for current modeling efforts, but are also well suited to predicting how potential climate changes will affect the heterogeneity of mountain snow distributions.

  16. Simulation of SEU Cross-sections using MRED under Conditions of Limited Device Information

    NASA Technical Reports Server (NTRS)

    Lauenstein, J. M.; Reed, R. A.; Weller, R. A.; Mendenhall, M. H.; Warren, K. M.; Pellish, J. A.; Schrimpf, R. D.; Sierawski, B. D.; Massengill, L. W.; Dodd, P. E.; hide

    2007-01-01

This viewgraph presentation reviews the simulation of Single Event Upset (SEU) cross sections using the Monte Carlo Radiative Energy Deposition (MRED) tool, based on best-guess assumptions about the process and geometry and on direct-ionization, low-energy beam test results. This work will also simulate SEU cross sections including angular and high-energy responses and compare the simulated results with beam test data to validate the model. Using MRED, we produced a reasonably accurate upset response model of a low-critical-charge SRAM without detailed information about the circuit, device geometry, or fabrication process.

  17. Simulating Local Area Network Protocols with the General Purpose Simulation System (GPSS)

    DTIC Science & Technology

    1990-03-01

[No abstract available; the indexed text consists of table-of-contents and list-of-figures fragments covering frame generation and frame delivery models, model artifices and variables, simulation results, external procedures used in the simulation, Token Ring frame generation and delivery processes, mean transfer delay versus mean throughput, and parameter values taken from the ANSI 802.3 standard.]

  18. Simulation of generation of new ideas for new product development and IT services

    NASA Astrophysics Data System (ADS)

    Nasiopoulos, Dimitrios K.; Sakas, Damianos P.; Vlachos, D. S.; Mavrogianni, Amanda

    2015-02-01

This paper describes a dynamic model of the New Product Development (NPD) process. The model emerged from best practices observed in our research, conducted across a range of situations. The model helps to determine and place an IT company's NPD activities within the frame of the overall NPD process [1]. It has been found to be a useful tool for organizing data on an IT company's NPD activities without enforcing an excessively restrictive research methodology on the NPD model. The framework that underpins the model will help to promote research into the methods undertaken within an IT company's NPD process, thus promoting understanding and improvement of the simulation process [2]. IT companies have tested many techniques and several different practices designed to improve the validity and efficacy of their NPD process [3]. Supported by the model, this research examines how widely accepted the stated tactics are and what impact these tactics have on NPD performance. The main assumption of this study is that simulation of the generation of new ideas [4] will lead to greater NPD effectiveness and more successful products in IT companies. In the model implementation, practices concerning the implementation strategies of NPD (product selection, objectives, leadership, marketing strategy and customer satisfaction) are all more widely accepted than best practices related to controlling the application of NPD (process control, measurements, results). In linking simulation with impact, our results state that product success depends on developing strong products and ensuring organizational emphasis through proper project selection. Project activities strengthen both product and project success. The success of IT products and services also depends on monitoring the NPD procedure through project management and ensuring team consistency with group rewards. Sharing experiences between projects can positively influence the NPD process.

  19. Plug-and -Play Model Architecture and Development Environment for Powertrain/Propulsion System - Final CRADA Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rousseau, Aymeric

    2013-02-01

Several tools already exist to develop detailed plant models, including GT-Power, AMESim, CarSim, and SimScape. The objective of Autonomie is not to provide a language for developing detailed models; rather, Autonomie supports the assembly and use of models from design to simulation to analysis with complete plug-and-play capabilities. Autonomie provides a plug-and-play architecture to support this ideal use of modeling and simulation for math-based automotive control system design. Models in the standard format create building blocks, which are assembled at runtime into a simulation model of a vehicle, system, subsystem, or component. All parts of the graphical user interface (GUI) are designed to be flexible to support architectures, systems, components, and processes not yet envisioned. This allows the software to be molded to individual uses, so it can grow as requirements and technical knowledge expand. This flexibility also allows for the implementation of legacy code, including models, controller code, processes, drive cycles, and post-processing equations. A library of useful and tested models and processes is included as part of the software package to immediately support a full range of simulation and analysis tasks. Autonomie also includes a configuration and database management front end to facilitate the storage, versioning, and maintenance of all required files, such as the models themselves, the models' supporting files, test data, and reports. During the course of the CRADA, Argonne worked closely with GM to implement and demonstrate each of their requirements. A use case was developed by GM for every requirement and demonstrated by Argonne. Each of the new features was verified by GM experts through a series of Gate reviews. Once all the requirements were validated, they were presented to the directors as part of the GM Gate process.

  20. Mathematical modelling of disintegration-limited co-digestion of OFMSW and sewage sludge.

    PubMed

    Esposito, G; Frunzo, L; Panico, A; d'Antonio, G

    2008-01-01

This paper presents a mathematical model able to simulate, under dynamic conditions, the physical, chemical and biological processes prevailing in an OFMSW and sewage sludge anaerobic digestion system. The proposed model is based on differential mass balance equations for substrates, products and the bacterial groups involved in the co-digestion process, and includes the biochemical reactions of substrate conversion and the kinetics of microbial growth and decay. The main peculiarity of the model is the surface-based kinetic description of the OFMSW disintegration process, whereas the pH determination is based on a ninth-order polynomial equation derived from acid-base equilibria. The model can be applied to simulate the co-digestion process for several purposes, such as evaluating the optimal process conditions in terms of OFMSW/sewage sludge ratio, temperature, OFMSW particle size, solid mixture retention time, reactor stirring rate, etc. Biogas production and composition can also be evaluated to estimate the potential energy production under different process conditions. In particular, the model simulations reported in this paper show the model's capability to predict the OFMSW amount that can be treated in the digester of an existing MWWTP and to assess the OFMSW particle-size reduction pre-treatment required to increase the rate of the disintegration process, which can otherwise strongly limit the co-digestion system. Copyright IWA Publishing 2008.
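    The surface-based kinetic idea can be summarized in a few lines: the disintegration rate is proportional to the exposed particle surface area, so for spherical particles the radius shrinks at a constant rate and smaller particles disappear much sooner. The disintegration constant, density, and particle sizes below are placeholders, not the calibrated values of the paper.

```python
import numpy as np

# Sketch of surface-based disintegration kinetics: dM/dt = -K_sbk * A, so for
# spherical particles dr/dt = -K_sbk / rho and mass scales as r**3.  Smaller
# OFMSW particles (more area per unit mass) disintegrate much faster.
K_sbk = 0.1        # kg / (m^2 * d), surface-based disintegration constant (assumed)
rho   = 1000.0     # kg/m^3, particle density (assumed)
dt, t_end = 0.01, 15.0   # days

def remaining_mass_fraction(r0):
    """Fraction of initial mass left over time for particles of initial radius r0."""
    r, out = r0, []
    for _ in np.arange(0.0, t_end, dt):
        r = max(r - (K_sbk / rho) * dt, 0.0)   # radius shrinks at a constant rate
        out.append((r / r0) ** 3)              # mass ~ r^3
    return np.array(out)

for r0_mm in (2.0, 10.0):
    frac = remaining_mass_fraction(r0_mm * 1e-3)
    print(f"r0 = {r0_mm} mm: {100 * frac[-1]:.1f}% of mass left after {t_end:.0f} d")
```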

  1. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. By continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been substantially improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the development of flight algorithms. The benefits seen from employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  2. Composite Study Of Aerosol Long-Range Transport Events From East Asia And North America

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Waliser, D. E.; Guan, B.; Xavier, P.; Petch, J.; Klingaman, N. P.; Woolnough, S.

    2011-12-01

    While the Madden-Julian Oscillation (MJO) exerts pronounced influences on global climate and weather systems, current general circulation models (GCMs) exhibit rather limited capability in representing this prominent tropical variability mode. Meanwhile, the fundamental physics of the MJO are still elusive. Given the central role of the diabatic heating for prevailing MJO theories and demands for reducing the model deficiencies in simulating the MJO, a global model inter-comparison project on diabatic processes and vertical heating structure associated with the MJO has been coordinated through a joint effort by the WCRP-WWRP/THORPEX YOTC MJO Task Force and GEWEX GASS Program. In this presentation, progress of this model inter-comparison project will be reported, with main focus on climate simulations from about 27 atmosphere-only and coupled GCMs. Vertical structures of heating and diabatic processes associated with the MJO based on multi-model simulations will be presented along with their reanalysis and satellite estimate counterparts. Key processes possibly responsible for a realistic simulation of the MJO, including moisture-convection interaction, gross moist stability, ocean coupling, and surface heat flux, will be discussed.

  3. A SLAM II simulation model for analyzing space station mission processing requirements

    NASA Technical Reports Server (NTRS)

    Linton, D. G.

    1985-01-01

Space station mission processing is modeled via the SLAM II simulation language on an IBM 4381 mainframe and an IBM PC microcomputer with 620K RAM, two double-sided disk drives and an 8087 coprocessor chip. Using a time-phased mission (payload) schedule and parameters associated with the mission, orbiter (space shuttle) and ground facility databases, estimates of ground facility utilization are computed. Simulation output associated with the science and applications database is used to assess alternative mission schedules.

  4. Random Process Simulation for stochastic fatigue analysis. Ph.D. Thesis - Rice Univ., Houston, Tex.

    NASA Technical Reports Server (NTRS)

    Larsen, Curtis E.

    1988-01-01

A simulation technique is described which directly synthesizes the extrema of a random process and is more efficient than the Gaussian simulation method. Such a technique is particularly useful in stochastic fatigue analysis because the required stress range moment, E[R^m], is a function only of the extrema of the random stress process. The family of autoregressive moving average (ARMA) models is reviewed and an autoregressive model is presented for modeling the extrema of any random process which has a unimodal power spectral density (psd). The proposed autoregressive technique is found to produce rainflow stress range moments which compare favorably with those computed by the Gaussian technique and to run, on average, 11.7 times faster than the Gaussian technique. The autoregressive technique is also adapted for processes having bimodal psds. The adaptation involves using two autoregressive processes to simulate the extrema due to each mode and superposing these two extrema sequences. The proposed autoregressive superposition technique is 9 to 13 times faster than the Gaussian technique and produces comparable values of E[R^m] for bimodal psds having the frequency of one mode at least 2.5 times that of the other mode.
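    The quantity the technique is built around is the stress-range moment E[R^m] computed from successive extrema of the stress process. The paper's contribution is to synthesize those extrema directly with an autoregressive model; the sketch below only illustrates the target quantity by simulating an AR(2) series, extracting its turning points, and averaging the m-th power of the simple ranges (not a full rainflow count). The coefficients and the exponent m are assumed.

```python
import numpy as np

# Illustration of the stress-range moment E[R^m]: simulate an AR(2) series,
# locate its turning points (local maxima/minima), and average the m-th power
# of the simple ranges between successive extrema.  This is not the paper's
# direct-extrema synthesis, only a demonstration of the quantity of interest.
rng = np.random.default_rng(0)
phi1, phi2, n = 1.2, -0.6, 200_000        # AR(2) coefficients (assumed), series length
x = np.zeros(n)
eps = rng.normal(size=n)
for t in range(2, n):
    x[t] = phi1 * x[t-1] + phi2 * x[t-2] + eps[t]

# turning points: a sign change of the first difference marks a local max or min
d = np.diff(x)
turning = np.where(np.sign(d[1:]) != np.sign(d[:-1]))[0] + 1
extrema = x[turning]
ranges = np.abs(np.diff(extrema))          # simple ranges between successive extrema

m = 3                                      # S-N curve exponent (assumed)
print(f"estimated E[R^{m}] = {np.mean(ranges**m):.2f}")
```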

  5. Validation of a multi-phase plant-wide model for the description of the aeration process in a WWTP.

    PubMed

    Lizarralde, I; Fernández-Arévalo, T; Beltrán, S; Ayesa, E; Grau, P

    2018-02-01

    This paper introduces a new mathematical model built under the PC-PWM methodology to describe the aeration process in a full-scale WWTP. This methodology enables a systematic and rigorous incorporation of chemical and physico-chemical transformations into biochemical process models, particularly for the description of liquid-gas transfer to describe the aeration process. The mathematical model constructed is able to reproduce biological COD and nitrogen removal, liquid-gas transfer and chemical reactions. The capability of the model to describe the liquid-gas mass transfer has been tested by comparing simulated and experimental results in a full-scale WWTP. Finally, an exploration by simulation has been undertaken to show the potential of the mathematical model. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Using field observations to inform thermal hydrology models of permafrost dynamics with ATS (v0.83)

    DOE PAGES

    Atchley, A. L.; Painter, S. L.; Harp, D. R.; ...

    2015-04-14

Climate change is profoundly transforming the carbon-rich Arctic tundra landscape, potentially moving it from a carbon sink to a carbon source by increasing the thickness of soil that thaws on a seasonal basis. However, the modeling capability and precise parameterizations of the physical characteristics needed to estimate projected active layer thickness (ALT) are limited in Earth System Models (ESMs). In particular, discrepancies in spatial scale between field measurements and Earth System Models challenge validation and parameterization of hydrothermal models. A recently developed surface/subsurface model for permafrost thermal hydrology, the Advanced Terrestrial Simulator (ATS), is used in combination with field measurements to calibrate and identify fine scale controls of ALT in ice wedge polygon tundra in Barrow, Alaska. An iterative model refinement procedure that cycles between borehole temperature and snow cover measurements and simulations functions to evaluate and parameterize different model processes necessary to simulate freeze/thaw processes and ALT formation. After model refinement and calibration, reasonable matches between simulated and measured soil temperatures are obtained, with the largest errors occurring during early summer above ice wedges (e.g. troughs). The results suggest that properly constructed and calibrated one-dimensional thermal hydrology models have the potential to provide reasonable representation of the subsurface thermal response and can be used to infer model input parameters and process representations. The models for soil thermal conductivity and snow distribution were found to be the most sensitive process representations. However, information on lateral flow and snowpack evolution might be needed to constrain model representations of surface hydrology and snow depth.

  7. A generic biogeochemical module for earth system models

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Huang, M.; Liu, C.; Li, H.-Y.; Leung, L. R.

    2013-06-01

Physical and biogeochemical processes regulate soil carbon dynamics and CO2 flux to and from the atmosphere, influencing global climate change. Integration of these processes into earth system models (e.g., the Community Land Model, CLM), however, currently faces three major challenges: (1) extensive efforts are required to modify modeling structures and to rewrite computer programs to incorporate new or updated processes as new knowledge is being generated, (2) computational cost is prohibitively expensive to simulate biogeochemical processes in land models due to large variations in the rates of biogeochemical processes, and (3) various mathematical representations of biogeochemical processes exist to incorporate different aspects of fundamental mechanisms, but systematic evaluation of the different mathematical representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The new framework consists of a new biogeochemical module with a generic algorithm and reaction database so that new and updated processes can be incorporated into land models without the need to manually set up the ordinary differential equations to be solved numerically. The reaction database consists of processes of nutrient flow through the terrestrial ecosystems in plants, litter and soil. This framework facilitates effective comparison studies of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into the CLM model. To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems.
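    The "generic algorithm plus reaction database" idea can be illustrated compactly: each process contributes a stoichiometry vector and a rate expression, and the ODE right-hand side is assembled by a loop rather than hand-coded, so adding a process (e.g., a phosphorus reaction) does not require rewriting the solver. The pools, stoichiometries, and rate constants below are illustrative, not the CLM implementation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Sketch of a reaction database: each process is a (stoichiometry, rate) pair
# over a list of pools, and the ODE right-hand side is assembled generically.
pools = ["litter_C", "soil_C", "CO2"]
reactions = [
    (np.array([-1.0, 0.6, 0.4]), lambda y: 0.05 * y[0]),    # litter decomposition
    (np.array([0.0, -1.0, 1.0]), lambda y: 0.002 * y[1]),   # soil C mineralization
]

def rhs(t, y):
    dydt = np.zeros_like(y)
    for stoich, rate in reactions:     # generic assembly loop: adding a process
        dydt += stoich * rate(y)       # requires no change to the solver code
    return dydt

y0 = np.array([100.0, 1000.0, 0.0])    # initial pool sizes, gC/m^2 (assumed)
sol = solve_ivp(rhs, (0.0, 365.0), y0, max_step=1.0)
print("pools after one year:", dict(zip(pools, sol.y[:, -1].round(1))))
```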

  8. Modeling DNP3 Traffic Characteristics of Field Devices in SCADA Systems of the Smart Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Huan; Cheng, Liang; Chuah, Mooi Choo

In the generation, transmission, and distribution sectors of the smart grid, intelligence of field devices is realized by programmable logic controllers (PLCs). Many smart-grid subsystems are essentially cyber-physical energy systems (CPES): For instance, the power system process (i.e., the physical part) within a substation is monitored and controlled by a SCADA network with hosts running miscellaneous applications (i.e., the cyber part). To study the interactions between the cyber and physical components of a CPES, several co-simulation platforms have been proposed. However, the network simulators/emulators of these platforms do not include a detailed traffic model that takes into account the impacts of the execution model of PLCs on traffic characteristics. As a result, network traces generated by co-simulation only reveal the impacts of the physical process on the contents of the traffic generated by SCADA hosts, whereas the distinction between PLCs and computing nodes (e.g., a hardened computer running a process visualization application) has been overlooked. To generate realistic network traces using co-simulation for the design and evaluation of applications relying on accurate traffic profiles, it is necessary to establish a traffic model for PLCs. In this work, we propose a parameterized model for PLCs that can be incorporated into existing co-simulation platforms. We focus on the DNP3 subsystem of slave PLCs, which automates the processing of packets from the DNP3 master. To validate our approach, we extract model parameters from both the configuration and network traces of real PLCs. Simulated network traces are generated and compared against those from PLCs. Our evaluation shows that our proposed model captures the essential traffic characteristics of DNP3 slave PLCs, which can be used to extend existing co-simulation platforms and gain further insights into the behaviors of CPES.
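    The core intuition being modelled, that the PLC's cyclic execution quantizes DNP3 response latency, can be sketched in a few lines; the scan period, processing time, and polling interval below are placeholders, not the parameters the authors extracted from real PLC configurations and traces.

```python
import numpy as np

# Toy illustration: a slave PLC handles DNP3 requests inside its cyclic scan,
# so response latency is quantized by the scan period.  All parameters are
# assumed placeholders, not those of the paper's model.
rng = np.random.default_rng(42)
scan_period   = 0.010      # s, PLC scan cycle (assumed)
proc_time     = 0.002      # s, request processing inside one scan (assumed)
poll_interval = 1.0        # s, DNP3 master polling period (assumed)

n_polls = 10_000
arrivals = np.arange(n_polls) * poll_interval + rng.uniform(0, scan_period, n_polls)
# the response leaves at the start of the next scan cycle, plus processing time
next_scan_start = np.ceil(arrivals / scan_period) * scan_period
latencies = next_scan_start - arrivals + proc_time

print(f"mean latency {1e3*latencies.mean():.2f} ms, "
      f"p95 {1e3*np.quantile(latencies, 0.95):.2f} ms")
```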

  9. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose margin related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing, and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based mask MDP verification acceptable.

  10. Using Instrument Simulators and a Satellite Database to Evaluate Microphysical Assumptions in High-Resolution Simulations of Hurricane Rita

    NASA Astrophysics Data System (ADS)

    Hristova-Veleva, S. M.; Chao, Y.; Chau, A. H.; Haddad, Z. S.; Knosp, B.; Lambrigtsen, B.; Li, P.; Martin, J. M.; Poulsen, W. L.; Rodriguez, E.; Stiles, B. W.; Turk, J.; Vu, Q.

    2009-12-01

    Improving forecasting of hurricane intensity remains a significant challenge for the research and operational communities. Many factors determine a tropical cyclone’s intensity. Ultimately, though, intensity is dependent on the magnitude and distribution of the latent heating that accompanies the hydrometeor production during the convective process. Hence, the microphysical processes and their representation in hurricane models are of crucial importance for accurately simulating hurricane intensity and evolution. The accurate modeling of the microphysical processes becomes increasingly important when running high-resolution models that should properly reflect the convective processes in the hurricane eyewall. There are many microphysical parameterizations available today. However, evaluating their performance and selecting the most representative ones remains a challenge. Several field campaigns were focused on collecting in situ microphysical observations to help distinguish between different modeling approaches and improve on the most promising ones. However, these point measurements cannot adequately reflect the space and time correlations characteristic of the convective processes. An alternative approach to evaluating microphysical assumptions is to use multi-parameter remote sensing observations of the 3D storm structure and evolution. In doing so, we could compare modeled to retrieved geophysical parameters. The satellite retrievals, however, carry their own uncertainty. To increase the fidelity of the microphysical evaluation results, we can use instrument simulators to produce satellite observables from the model fields and compare to the observed. This presentation will illustrate how instrument simulators can be used to discriminate between different microphysical assumptions. We will compare and contrast the members of high-resolution ensemble WRF model simulations of Hurricane Rita (2005), each member reflecting different microphysical assumptions. We will use the geophysical model fields as input to instrument simulators to produce microwave brightness temperatures and radar reflectivity at the TRMM (TMI and PR) frequencies and polarizations. We will also simulate the surface backscattering cross-section at the QuikSCAT frequency, polarizations and viewing geometry. We will use satellite observations from TRMM and QuikSCAT to determine those parameterizations that yield a realistic forecast and those parameterizations that do not. To facilitate hurricane research, we have developed the JPL Tropical Cyclone Information System (TCIS), which includes a comprehensive set of multi-sensor observations relevant to large-scale and storm-scale processes in the atmosphere and the ocean. In this presentation, we will illustrate how the TCIS can be used for hurricane research. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  11. Can climate models be tuned to simulate the global mean absolute temperature correctly?

    NASA Astrophysics Data System (ADS)

    Duan, Q.; Shi, Y.; Gong, W.

    2016-12-01

The Intergovernmental Panel on Climate Change (IPCC) has already issued five assessment reports (ARs), which include simulations of the past climate and projections of the future climate under various scenarios. The participating models can simulate reasonably well the trend in global mean temperature change, especially over the last 150 years. However, there is a large, constant discrepancy in the simulated global mean absolute temperature over this period. This discrepancy remained in the same range between IPCC-AR4 and IPCC-AR5, and amounts to about 3 °C between the coldest model and the warmest model. This discrepancy has great implications for land processes, particularly processes related to the cryosphere, and casts doubt on whether land-atmosphere-ocean interactions are correctly represented in those models. This presentation aims to explore whether this discrepancy can be reduced through model tuning. We present an automatic model calibration strategy to tune the parameters of a climate model so that the simulated global mean absolute temperature matches the observed data over the last 150 years. An intermediate complexity model known as LOVECLIM is used in the study. This presentation will show the preliminary results.

  12. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    PubMed

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived commitment, influence and confidence of stakeholders in implementing policy and program decisions identified in the modelling process; and the impact of the process in terms of policy and program change. The study will generate empirical evidence on the feasibility and potential value of simulation modelling to support knowledge mobilisation and consensus building in health settings.

  13. Improving surgeon utilization in an orthopedic department using simulation modeling

    PubMed Central

    Simwita, Yusta W; Helgheim, Berit I

    2016-01-01

Purpose: Worldwide, more than two billion people lack appropriate access to surgical services due to a mismatch between existing human resources and patient demand. Improving the utilization of the existing workforce capacity can reduce the gap between surgical demand and available workforce capacity. In this paper, the authors use discrete event simulation to explore the care process at an orthopedic department. Our main focus is improving the utilization of surgeons while minimizing patient waiting time. Methods: The authors collaborated with orthopedic department personnel to map the current operations of the orthopedic care process in order to identify factors that cause poor surgeon utilization and long patient waiting times. The authors used an observational approach to collect data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. The authors developed a proposal scenario to show how to improve surgeon utilization. Results: The simulation results showed that if ancillary services could be performed before the start of the clinic examination services, the orthopedic care process could be greatly improved, with higher surgeon utilization and reduced patient waiting time. The simulation results demonstrate that, with improved surgeon utilization, up to a 55% increase in future demand can be accommodated without patients exceeding the current waiting time at this clinic, thus improving patient access to health care services. Conclusion: This study shows how simulation modeling can be used to improve health care processes. The study was limited to a single care process; however, the findings can be applied to improve other orthopedic care processes with similar operational characteristics. PMID:29355193
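    A minimal discrete-event sketch of the scenario comparison is shown below using the open-source SimPy library (the paper does not state which simulation tool was used); the service times, arrival rate, and single-surgeon resource are illustrative assumptions, not the observed data from the studied department.

```python
import random
import simpy

# Toy discrete-event comparison: if ancillary services (imaging, tests) are
# completed before the clinic examination, patients reach the surgeon sooner.
# All times are illustrative, in minutes.
random.seed(0)
WAITS = []

def patient(env, surgeon, ancillary_first):
    arrive = env.now
    if not ancillary_first:                               # current process: ancillary during clinic
        yield env.timeout(random.expovariate(1 / 20))     # ancillary service
    with surgeon.request() as req:
        yield req
        WAITS.append(env.now - arrive)                    # time until the examination starts
        yield env.timeout(random.expovariate(1 / 15))     # examination

def generator(env, surgeon, ancillary_first):
    while True:
        yield env.timeout(random.expovariate(1 / 18))     # patient inter-arrival time
        env.process(patient(env, surgeon, ancillary_first))

def run(ancillary_first):
    WAITS.clear()
    env = simpy.Environment()
    surgeon = simpy.Resource(env, capacity=1)
    env.process(generator(env, surgeon, ancillary_first))
    env.run(until=8 * 60)                                 # one 8-hour clinic day
    return sum(WAITS) / len(WAITS)

print("mean wait, ancillary during clinic  :", round(run(False), 1), "min")
print("mean wait, ancillary done beforehand:", round(run(True), 1), "min")
```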

  14. Modified two-layer social force model for emergency earthquake evacuation

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Liu, Hong; Qin, Xin; Liu, Baoxi

    2018-02-01

Studies of crowd behavior, together with related research on computer simulation, provide an effective basis for architectural design and effective crowd management. Based on low-density group organization patterns, a modified two-layer social force model is proposed in this paper to simulate and reproduce the group gathering process. First, this paper studies evacuation videos from the Luan'xian earthquake in 2012 and extends the study of group organization patterns to higher densities. Furthermore, taking full advantage of the strengths of crowd gathering simulations, a new method for grouping and guidance based on crowd dynamics is proposed. Second, a real-life grouping situation in an earthquake evacuation is simulated and reproduced. Compared with the fundamental social force model and an existing guided crowd model, the modified model reduces congestion time and more truly reflects group behaviors. Furthermore, the experimental results also show that a stable group pattern and a suitable leader can decrease collisions and allow a safer evacuation process.
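    For orientation, a minimal single-layer, Helbing-style social force step is sketched below: a driving force toward a goal plus exponential repulsion between pedestrians. The two-layer grouping and guidance extensions of the paper are not reproduced, and the parameter values are standard illustrative ones rather than the calibrated ones.

```python
import numpy as np

# Minimal social force sketch: each pedestrian is driven toward a goal at a
# desired speed and repelled by neighbours.  Parameters are illustrative.
rng = np.random.default_rng(3)
N, dt, steps = 30, 0.05, 400
tau, v0, m = 0.5, 1.3, 80.0            # relaxation time (s), desired speed (m/s), mass (kg)
A, B, radius = 2000.0, 0.08, 0.3       # repulsion strength (N), range (m), body radius (m)
goal = np.array([50.0, 0.0])

pos = rng.uniform(-5, 5, (N, 2))
vel = np.zeros((N, 2))
for _ in range(steps):
    force = np.zeros((N, 2))
    for i in range(N):
        e = goal - pos[i]
        e /= np.linalg.norm(e)
        force[i] += m * (v0 * e - vel[i]) / tau                        # driving force
        for j in range(N):
            if i == j:
                continue
            d_vec = pos[i] - pos[j]
            d = np.linalg.norm(d_vec)
            force[i] += A * np.exp((2 * radius - d) / B) * d_vec / d   # repulsion
    vel += force / m * dt
    pos += vel * dt

print("mean speed after the gathering phase:",
      np.linalg.norm(vel, axis=1).mean().round(2), "m/s")
```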

  15. Stochastic Modelling, Analysis, and Simulations of the Solar Cycle Dynamic Process

    NASA Astrophysics Data System (ADS)

    Turner, Douglas C.; Ladde, Gangaram S.

    2018-03-01

Analytical solutions, discretization schemes and simulation results are presented for the time-delay deterministic differential equation model of the solar dynamo presented by Wilmot-Smith et al. In addition, this model is extended to include stochastic Gaussian white-noise parametric fluctuations. The introduction of stochastic fluctuations incorporates variables affecting the dynamo process in the solar interior, the estimation error of parameters, and the uncertainty of the α-effect mechanism. Simulation results are presented and analyzed to exhibit the effects of stochastic parametric volatility-dependent perturbations. The results generalize and extend the work of Hazra et al. In fact, some of these results exhibit the oscillatory dynamic behavior generated by the stochastic parametric additive perturbations in the absence of time delay. In addition, the simulation results of the modified stochastic models illustrate the change in behavior of the very recently developed stochastic model of Hazra et al.
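    Numerically, the distinctive ingredient is the combination of a time delay with stochastic forcing. The sketch below shows a generic Euler-Maruyama scheme for a scalar delay equation with multiplicative Gaussian white noise, handling the delayed state with a history buffer; the equation and parameters are stand-ins, not the Wilmot-Smith et al. dynamo system or the paper's stochastic extension.

```python
import numpy as np

# Generic Euler-Maruyama scheme for a scalar delay differential equation with
# Gaussian white-noise parametric forcing.  The delayed state x(t - T) is read
# from a history buffer; the feedback function and parameters are stand-ins.
rng = np.random.default_rng(7)
tau, T, a, sigma = 10.0, 25.0, 2.0, 0.05      # decay time, delay, gain, noise level
dt, t_end = 0.01, 2000.0
n, lag = int(t_end / dt), int(T / dt)

x = np.empty(n)
x[: lag + 1] = 1.0                            # constant history on [-T, 0]
for k in range(lag, n - 1):
    f_delayed = x[k - lag] / (1.0 + x[k - lag] ** 2)      # saturating delayed feedback
    drift = -x[k] / tau + a * f_delayed
    x[k + 1] = x[k] + drift * dt + sigma * x[k] * np.sqrt(dt) * rng.normal()

print("late-time mean and std:", x[n // 2:].mean().round(3), x[n // 2:].std().round(3))
```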

  16. Modelling the pelagic nitrogen cycle and vertical particle flux in the Norwegian sea

    NASA Astrophysics Data System (ADS)

    Haupt, Olaf J.; Wolf, Uli; v. Bodungen, Bodo

    1999-02-01

A 1D Eulerian ecosystem model (BIological Ocean Model, BIOM) for the Norwegian Sea was developed to investigate the dynamics of pelagic ecosystems. The BIOM combines six biochemical compartments and simulates the annual nitrogen cycle with a specific focus on production, modification and sedimentation of particles in the water column. The external forcing and physical framework are based on a simulated annual cycle of global radiation and an annual mixed-layer cycle derived from field data. The vertical resolution of the model is given by an exponential grid with 200 depth layers, allowing specific parameterization of various sinking velocities, breakdown of particles and remineralization processes. The aim of the numerical experiments is the simulation of ecosystem dynamics considering the specific biogeochemical properties of the Norwegian Sea, for example the life cycle of the dominant copepod Calanus finmarchicus. The results of the simulations were validated with field data. Model results are in good agreement with field data for the lower trophic levels of the food web. With increasing complexity of the organisms, the differences between simulated processes and field data increase. Results of the numerical simulations suggest that the BIOM is well adapted to investigating a physically controlled ecosystem. The simulation of grazing-controlled pelagic ecosystems, like the Norwegian Sea, requires adaptation of the parameterization to the specific ecosystem features. By seasonally adapting the most sensitive processes, such as the utilization of light by phytoplankton and grazing by zooplankton, results were greatly improved.

  17. Diabatic modification of potential vorticity in extratropical cyclones

    NASA Astrophysics Data System (ADS)

    Chagnon, J.

    2012-12-01

The representation of diabatic processes and their impact on extratropical cyclones is a likely source of skill degradation in operational numerical weather prediction systems. This investigation examines the source, structure, and magnitude of diabatic potential vorticity (PV) anomalies generated by small-scale and parameterized processes in both mesoscale and global model simulations of extratropical cyclones in the North Atlantic. Simulations of several cold-season extratropical storms have been performed using the Met Office Unified Model. Several of the cases simulated were drawn from the DIAbatic influences on Mesoscale structures in ExTratropical cyclones (DIAMET) observational campaign, during which the Natural Environment Research Council (NERC) Facility for Airborne Atmospheric Measurement (FAAM) BAE-146 aircraft was deployed. The influence of specific modelled processes was quantified using a set of tracers, each of which represents a history of the PV contributed by a specific segment of the model (e.g., boundary-layer scheme, cloud microphysics, convection scheme, radiation, etc.). This presentation will highlight several differences and similarities between high- and low-resolution simulations. For example, in high-resolution simulations, tropopause folds are sharpened by a tripolar PV anomaly arising from the convection, boundary-layer, and microphysics schemes; this structure is not present in coarser global model simulations. However, a dipole of PV straddling the tropopause is diagnosed in both coarse- and fine-resolution simulations. The PV dipole, which is strongly influenced by long-wave radiative cooling, increases the gradient of PV near the tropopause and therefore modifies the characteristics of Rossby wave propagation and moist baroclinic wave growth.

  18. Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.

  19. Effect of land model ensemble versus coupled model ensemble on the simulation of precipitation climatology and variability

    NASA Astrophysics Data System (ADS)

    Wei, Jiangfeng; Dirmeyer, Paul A.; Yang, Zong-Liang; Chen, Haishan

    2017-10-01

Through a series of model simulations with an atmospheric general circulation model coupled to three different land surface models, this study investigates the impacts of land model ensembles and the coupled model ensemble on precipitation simulation. It is found that coupling an ensemble of land models to an atmospheric model has a very minor impact on the improvement of precipitation climatology and variability, but a simple ensemble average of the precipitation from the three individually coupled land-atmosphere models produces better results, especially for precipitation variability. The generally weak impact of land processes on precipitation is likely the main reason that the land model ensembles do not improve the precipitation simulation. However, if there are large biases in the land surface model or land surface data set, correcting them could improve the simulated climate, especially for well-constrained regional climate simulations.

  20. Structurally Integrated Coatings for Wear and Corrosion (SICWC): Arc Lamp, InfraRed (IR) Thermal Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mackiewicz-Ludtka, G.; Sebright, J.

    2007-12-15

The primary goal of this Cooperative Research and Development Agreement (CRADA) between UT-Battelle (Contractor) and Caterpillar Inc. (Participant) was to develop the plasma arc lamp (PAL) infrared (IR) thermal processing technology 1) to enhance surface coating performance by improving the interfacial bond strength between selected coatings and substrates; and 2) to extend this technology base for transitioning of the arc lamp processing to the industrial Participant. Completion of the following three key technical tasks (described below) was necessary to accomplish this goal. First, thermophysical property data sets were successfully determined for composite coatings applied to 1010 steel substrates, with a more limited data set successfully measured for free-standing coatings. These data are necessary for the computer modeling simulations and parametric studies to: A) simulate PAL IR processing, facilitating the development of the initial processing parameters; and B) help develop a better understanding of the basic PAL IR fusing process fundamentals, including predicting the influence of melt pool stirring and heat transfer characteristics introduced during plasma arc lamp infrared (IR) processing. Second, a methodology and a set of procedures were successfully developed, and the plasma arc lamp (PAL) power profiles were successfully mapped as a function of PAL power level for the ORNL PAL. The latter data are also necessary input for the computer model to accurately simulate PAL processing during process modeling simulations, and to facilitate a better understanding of the fusing process fundamentals. Third, several computer modeling codes were evaluated as to their capabilities and accuracy in capturing and simulating the convective mixing that may occur during PAL thermal processing. The results from these evaluation efforts are summarized in this report. The intention of this project was to extend the technology base and provide for transitioning of the arc lamp processing to the industrial Participant.

  1. Improving the capability of an integrated CA-Markov model to simulate spatio-temporal urban growth trends using an Analytical Hierarchy Process and Frequency Ratio

    NASA Astrophysics Data System (ADS)

    Aburas, Maher Milad; Ho, Yuek Ming; Ramli, Mohammad Firuz; Ash'aari, Zulfa Hanan

    2017-07-01

The creation of an accurate simulation of future urban growth is considered one of the most important challenges in urban studies that involve spatial modeling. The purpose of this study is to improve the simulation capability of an integrated CA-Markov Chain (CA-MC) model using CA-MC based on the Analytical Hierarchy Process (AHP) and CA-MC based on Frequency Ratio (FR), both applied in Seremban, Malaysia, as well as to compare the performance and accuracy of the traditional and hybrid models. Various physical, socio-economic, utility, and environmental criteria were used as predictors, including elevation, slope, soil texture, population density, distance to commercial areas, distance to educational areas, distance to residential areas, distance to industrial areas, distance to roads, distance to the highway, distance to the railway, distance to power lines, distance to streams, and land cover. For calibration, the three models were applied to simulate urban growth trends in 2010; the actual data for 2010 were used for model validation utilizing the Relative Operating Characteristic (ROC) and Kappa coefficient methods. Consequently, future urban growth maps for 2020 and 2030 were created. The validation findings confirm that integrating the CA-MC model with the FR model and employing the significant driving forces of urban growth in the simulation process have improved the simulation capability of the CA-MC model. This study provides a novel approach for improving the CA-MC model based on FR, which will provide powerful support to planners and decision-makers in the development of future sustainable urban planning.
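    A toy version of one CA-Markov allocation step is sketched below: the Markov transition probability fixes how many cells change class, and a cellular-automata rule allocates that change to the cells with the highest combined suitability and neighbourhood score (the role played in the paper by the AHP- or FR-derived layers). The grid, probabilities, and suitability surface are synthetic, not the Seremban data.

```python
import numpy as np

# Toy CA-Markov step: the Markov chain sets the amount of non-urban -> urban
# change, and a CA rule allocates it to the cells with the highest
# suitability * neighbourhood score.  All inputs are synthetic.
rng = np.random.default_rng(5)
URBAN, NONURBAN = 1, 0
grid = (rng.random((100, 100)) < 0.2).astype(int)        # 20% urban to start
suitability = rng.random((100, 100))                     # e.g. an AHP/FR-derived score

p_nonurban_to_urban = 0.05                               # from the Markov transition matrix
n_new = int(p_nonurban_to_urban * np.sum(grid == NONURBAN))

# neighbourhood pressure: number of urban cells in the 3x3 Moore neighbourhood
padded = np.pad(grid, 1)
neigh = sum(padded[i:i+100, j:j+100] for i in range(3) for j in range(3)) - grid
score = suitability * (neigh / 8.0)
score[grid == URBAN] = -1.0                              # already-urban cells are excluded

best = np.argsort(score, axis=None)[::-1][:n_new]        # highest-scoring candidate cells
grid.flat[best] = URBAN
print("urban share after one CA-Markov step:", round(grid.mean(), 3))
```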

  2. Simulation modeling of forest landscape disturbances: Where do we go from here?

    Treesearch

    Ajith H. Perera; Brian R. Sturtevant; Lisa J. Buse

    2015-01-01

    It was nearly a quarter-century ago when Turner and Gardner (1991) drew attention to methods of quantifying landscape patterns and processes, including simulation modeling. The many authors who contributed to that seminal text collectively signaled the emergence of a new field—spatially explicit simulation modeling of broad-scale ecosystem dynamics. Of particular note...

  3. Simulating hydrological processes of a typical small mountainous catchment in Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Xu, Y. P.; Bai, Z.; Fu, Q.; Pan, S.; Zhu, C.

    2017-12-01

The water cycle of small watersheds with seasonal or permanent frozen soil and snowpack in the Tibetan Plateau is seriously affected by climate change. The objective of this study is to determine how much, and in what way, frozen soil and snowpack influence the hydrology of small mountainous catchments in cold regions, and how the performance of simulations with a distributed hydrological model can be improved. The Dong catchment, a small catchment located in the Tibetan Plateau, is used as a case study. Two measurement stations were set up to collect the basic meteorological and hydrological data for the modeling. Annual and interannual variations of runoff indices are first analyzed based on the historic data series, and the sources of runoff in dry and wet periods are analyzed respectively. Then, a distributed hydrology soil vegetation model (DHSVM) is adopted to simulate the hydrological processes of the Dong catchment based on the limited data set. Global sensitivity analysis is applied to help determine the important processes of the catchment. Based on the sensitivity analysis results, the Epsilon-Dominance Non-Dominated Sorted Genetic Algorithm II (ɛ-NSGAII) is then used to calibrate the hydrological model in a multi-objective way and to analyze the performance of the DHSVM model. The performance of the simulation is evaluated with several evaluation indices. The final results show that frozen soil and snowpack play an important role in hydrological processes in cold mountainous regions, in particular in dry periods without precipitation, while in wet periods precipitation is often the main source of runoff. The results also show that although the DHSVM hydrological model has the potential to model the hydrology of small mountainous catchments with very limited data in the Tibetan Plateau, the simulation of hydrology in dry periods is not very satisfactory due to the model's insufficiency in simulating seasonally frozen soil.

  4. Molecular simulations of self-assembly processes in metal-organic frameworks: Model dependence

    NASA Astrophysics Data System (ADS)

    Biswal, Debasmita; Kusalik, Peter G.

    2017-07-01

    Molecular simulation is a powerful tool for investigating microscopic behavior in various chemical systems, where the use of suitable models is critical to successfully reproduce the structural and dynamic properties of the real systems of interest. In this context, molecular dynamics simulation studies of self-assembly processes in metal-organic frameworks (MOFs), a well-known class of porous materials with interesting chemical and physical properties, are relatively challenging, where a reasonably accurate representation of metal-ligand interactions is anticipated to play an important role. In the current study, we both investigate the performance of some existing models and introduce and test new models to help explore the self-assembly in an archetypal Zn-carboxylate MOF system. To this end, the behavior of six different Zn-ion models, three solvent models, and two ligand models was examined and validated against key experimental structural parameters. To explore longer time scale ordering events during MOF self-assembly via explicit solvent simulations, it is necessary to identify a suitable combination of simplified model components representing metal ions, organic ligands, and solvent molecules. It was observed that an extended cationic dummy atom (ECDA) Zn-ion model combined with an all-atom carboxylate ligand model and a simple dipolar solvent model can reproduce characteristic experimental structures for the archetypal MOF system. The successful use of these models in extensive sets of molecular simulations, which provide key insights into the self-assembly mechanism of this archetypal MOF system occurring during the early stages of this process, has been very recently reported.

  5. Physical Uncertainty Bounds (PUB)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  6. Numerical Simulation of Nonperiodic Rail Operation Diagram Characteristics

    PubMed Central

    Qian, Yongsheng; Wang, Bingbing; Zeng, Junwei; Wang, Xin

    2014-01-01

    This paper utilizes a cellular automaton (CA) model to simulate the process of train operation under a four-aspect color light signaling system and to obtain the nonperiodic operation diagram for mixed passenger and freight tracks. In general, the models simulate how trains are kept from colliding when stopping and restarting, capture the real-time changes in train speeds and displacements, and keep track of the current train states at departure and arrival. Finally, the model produces a train diagram that simulates train operation under different ratios of vans (freight wagons) and analyzes several parameters of the train-running process, such as time, speed, through capacity, interval departing time, and number of departures. PMID:25435863
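
    The following is a minimal cellular-automaton sketch in the spirit of the model described above, not the authors' code: a single track is divided into blocks, a four-aspect rule maps the number of clear blocks ahead to a speed limit, and trains are injected at a fixed headway. Block counts, speeds, and headways are illustrative assumptions.

```python
N_BLOCKS = 120                               # track divided into fixed blocks
ASPECT_SPEED = {0: 0, 1: 1, 2: 2, 3: 3}      # clear blocks ahead -> max blocks per step
HEADWAY = 8                                  # steps between departures
STEPS = 400

def clear_blocks_ahead(pos, occupied):
    """Count consecutive free blocks ahead of `pos`, up to the most permissive aspect."""
    n = 0
    for d in range(1, 4):
        if pos + d >= N_BLOCKS or (pos + d) not in occupied:
            n += 1
        else:
            break
    return n

trains = []                                  # each train: head position, speed, departure step
for step in range(STEPS):
    if step % HEADWAY == 0:
        trains.append({"pos": -1, "speed": 0, "dep": step})   # new train at the entry

    occupied = {t["pos"] for t in trains if 0 <= t["pos"] < N_BLOCKS}
    for t in sorted(trains, key=lambda tr: -tr["pos"]):       # update leader first
        if t["pos"] >= N_BLOCKS:
            continue
        aspect = clear_blocks_ahead(t["pos"], occupied)
        t["speed"] = min(t["speed"] + 1, ASPECT_SPEED[aspect])  # accelerate, capped by signal
        occupied.discard(t["pos"])
        t["pos"] += t["speed"]
        if t["pos"] < N_BLOCKS:
            occupied.add(t["pos"])
        elif "arr" not in t:
            t["arr"] = step                                     # train leaves the section

done = [t for t in trains if "arr" in t]
if done:
    avg = sum(t["arr"] - t["dep"] for t in done) / len(done)
    print(f"{len(done)} trains cleared the section, mean travel time {avg:.1f} steps")
```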

  7. Space-filling designs for computer experiments: A review

    DOE PAGES

    Joseph, V. Roshan

    2016-01-29

    Improving the quality of a product/process using a computer simulator is a much less expensive option than the real physical testing. However, simulation using computationally intensive computer models can be time consuming and therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given for a recently developed space-filling design called maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
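
    To make the space-filling idea concrete, here is a small sketch, assuming nothing about the maximum projection construction itself: a random-search maximin Latin hypercube design in the unit square, which is one of the simpler space-filling criteria such a review compares against.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """One random Latin hypercube sample in [0, 1]^d (one point per stratum per dimension)."""
    strata = np.stack([rng.permutation(n) for _ in range(d)], axis=1)
    return (rng.random((n, d)) + strata) / n

def min_pairwise_distance(x):
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    return dist[np.triu_indices(len(x), k=1)].min()

rng = np.random.default_rng(42)
best, best_score = None, -np.inf
for _ in range(2000):                        # crude random search over Latin hypercubes
    cand = latin_hypercube(n=10, d=2, rng=rng)
    score = min_pairwise_distance(cand)      # maximin space-filling criterion
    if score > best_score:
        best, best_score = cand, score

print(f"best minimum pairwise distance: {best_score:.3f}")
print(np.round(best, 3))
```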

  8. Space-filling designs for computer experiments: A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, V. Roshan

    Improving the quality of a product/process using a computer simulator is a much less expensive option than the real physical testing. However, simulation using computationally intensive computer models can be time consuming and therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given for a recently developed space-filling design called maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.

  9. One-dimensional biomass fast pyrolysis model with reaction kinetics integrated in an Aspen Plus Biorefinery Process Model

    DOE PAGES

    Humbird, David; Trendewicz, Anna; Braun, Robert; ...

    2017-01-12

    A biomass fast pyrolysis reactor model with detailed reaction kinetics and one-dimensional fluid dynamics was implemented in an equation-oriented modeling environment (Aspen Custom Modeler). Portions of this work were detailed in previous publications; further modifications have been made here to improve stability and reduce execution time of the model to make it compatible for use in large process flowsheets. The detailed reactor model was integrated into a larger process simulation in Aspen Plus and was stable for different feedstocks over a range of reactor temperatures. Sample results are presented that indicate general agreement with experimental results, but with higher gas losses caused by stripping of the bio-oil by the fluidizing gas in the simulated absorber/condenser. Lastly, this integrated modeling approach can be extended to other well-defined, predictive reactor models for fast pyrolysis, catalytic fast pyrolysis, as well as other processes.

  10. One-dimensional biomass fast pyrolysis model with reaction kinetics integrated in an Aspen Plus Biorefinery Process Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humbird, David; Trendewicz, Anna; Braun, Robert

    A biomass fast pyrolysis reactor model with detailed reaction kinetics and one-dimensional fluid dynamics was implemented in an equation-oriented modeling environment (Aspen Custom Modeler). Portions of this work were detailed in previous publications; further modifications have been made here to improve stability and reduce execution time of the model to make it compatible for use in large process flowsheets. The detailed reactor model was integrated into a larger process simulation in Aspen Plus and was stable for different feedstocks over a range of reactor temperatures. Sample results are presented that indicate general agreement with experimental results, but with higher gas losses caused by stripping of the bio-oil by the fluidizing gas in the simulated absorber/condenser. Lastly, this integrated modeling approach can be extended to other well-defined, predictive reactor models for fast pyrolysis, catalytic fast pyrolysis, as well as other processes.

  11. Reverse logistics system planning for recycling computers hardware: A case study

    NASA Astrophysics Data System (ADS)

    Januri, Siti Sarah; Zulkipli, Faridah; Zahari, Siti Meriam; Shamsuri, Siti Hajar

    2014-09-01

    This paper describes modeling and simulation of reverse logistics networks for collection of used computers in one of the company in Selangor. The study focuses on design of reverse logistics network for used computers recycling operation. Simulation modeling, presented in this work allows the user to analyze the future performance of the network and to understand the complex relationship between the parties involved. The findings from the simulation suggest that the model calculates processing time and resource utilization in a predictable manner. In this study, the simulation model was developed by using Arena simulation package.
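
    A minimal discrete-event sketch of the kind of collection-and-inspection flow described above, written in plain Python rather than Arena and with made-up arrival and service rates; it reports average processing time and resource utilization, the two quantities the study tracks.

```python
import heapq
import random

random.seed(1)
SIM_TIME = 8 * 60.0        # one working day in minutes (assumed)
MEAN_ARRIVAL = 12.0        # mean minutes between returned computers (assumed)
MEAN_SERVICE = 9.0         # mean dismantling/inspection time per unit (assumed)

events = [(random.expovariate(1 / MEAN_ARRIVAL), "arrival", 0)]   # (time, kind, unit id)
queue, busy_until, next_id = [], 0.0, 1
arrival_time, busy_time, completed, total_flow = {}, 0.0, 0, 0.0

while events:
    t, kind, uid = heapq.heappop(events)
    if kind == "arrival" and t > SIM_TIME:
        continue                                        # stop accepting new returns
    if kind == "arrival":
        arrival_time[uid] = t
        queue.append(uid)
        heapq.heappush(events, (t + random.expovariate(1 / MEAN_ARRIVAL), "arrival", next_id))
        next_id += 1
    else:                                               # "departure": a unit is finished
        completed += 1
        total_flow += t - arrival_time[uid]
    if t >= busy_until and queue:                       # start next service if the server is free
        unit = queue.pop(0)
        service = random.expovariate(1 / MEAN_SERVICE)
        busy_until = t + service
        busy_time += service
        heapq.heappush(events, (busy_until, "departure", unit))

if completed:
    print(f"completed units: {completed}")
    print(f"mean processing (flow) time: {total_flow / completed:.1f} min")
    print(f"resource utilization: {busy_time / max(SIM_TIME, busy_until):.0%}")
```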

  12. Modeling target normal sheath acceleration using handoffs between multiple simulations

    NASA Astrophysics Data System (ADS)

    McMahon, Matthew; Willis, Christopher; Mitchell, Robert; King, Frank; Schumacher, Douglass; Akli, Kramer; Freeman, Richard

    2013-10-01

    We present a technique to model the target normal sheath acceleration (TNSA) process using full-scale LSP PIC simulations. The technique allows for a realistic laser, full size target and pre-plasma, and sufficient propagation length for the accelerated ions and electrons. A first simulation using a 2D Cartesian grid models the laser-plasma interaction (LPI) self-consistently and includes field ionization. Electrons accelerated by the laser are imported into a second simulation using a 2D cylindrical grid optimized for the initial TNSA process and incorporating an equation of state. Finally, all of the particles are imported to a third simulation optimized for the propagation of the accelerated ions and utilizing a static field solver for initialization. We also show use of 3D LPI simulations. Simulation results are compared to recent ion acceleration experiments using the SCARLET laser at The Ohio State University. This work was performed with support from AFOSR under contract # FA9550-12-1-0341, DARPA, and allocations of computing time from the Ohio Supercomputing Center.

  13. Using a Virtual Experiment to Analyze Infiltration Process from Point to Grid-cell Size Scale

    NASA Astrophysics Data System (ADS)

    Barrios, M. I.

    2013-12-01

    The hydrological sciences require the emergence of a consistent theoretical corpus describing the relationships between dominant physical processes at different spatial and temporal scales. However, the strong spatial heterogeneities and non-linearities of these processes make the development of multiscale conceptualizations difficult. Therefore, understanding scaling is a key issue in advancing this science. This work focuses on the use of virtual experiments to address the scaling of vertical infiltration from a physically based model at the point scale to a simplified, physically meaningful modeling approach at the grid-cell scale. Compared with field experimentation, numerical simulations have the advantage of dealing with a wide range of boundary and initial conditions. The aim of the work was to show the utility of numerical simulations for discovering relationships between the hydrological parameters at both scales, and to use this synthetic experience as a medium for teaching the complex nature of this hydrological process. The Green-Ampt model was used to represent vertical infiltration at the point scale, and a conceptual storage model was employed to simulate the infiltration process at the grid-cell scale. Lognormal and beta probability distribution functions were assumed to represent the heterogeneity of soil hydraulic parameters at the point scale. The linkages between point-scale parameters and grid-cell-scale parameters were established by inverse simulations based on the mass balance equation and the averaging of the flow at the point scale. Results have shown numerical stability issues for particular conditions, have revealed the complex nature of the non-linear relationships between the models' parameters at both scales, and indicate that the parameterization of point-scale processes at the coarser scale is governed by the amplification of non-linear effects. The findings of these simulations have been used by the students to identify potential research questions on scale issues. Moreover, the implementation of this virtual lab improved their ability to understand the rationale of these processes and how to transfer the mathematical models to computational representations.
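
    A compact sketch of the virtual-experiment idea, with illustrative soil parameters: Green-Ampt infiltration is computed at many points whose saturated conductivity is lognormally distributed, and the point responses are averaged to a grid-cell response, exposing the non-linearity gap relative to using the mean parameter directly.

```python
import numpy as np

def green_ampt_cumulative(Ks, psi, dtheta, t_end, dt=0.01):
    """Cumulative infiltration F(t) under ponded conditions,
    integrating the Green-Ampt rate f = Ks * (1 + psi*dtheta/F) with a small explicit step."""
    F, t = Ks * dt, dt            # small seed avoids division by zero at F = 0
    while t < t_end:
        f = Ks * (1.0 + psi * dtheta / F)
        F += f * dt
        t += dt
    return F

rng = np.random.default_rng(3)
n_points = 500
psi, dtheta, t_end = 0.11, 0.3, 2.0     # suction head [m], moisture deficit [-], duration [h] (assumed)
Ks_mean, Ks_cv = 0.01, 1.0              # mean saturated conductivity [m/h] and CV (assumed)

# lognormal heterogeneity of Ks across the grid cell
sigma2 = np.log(1.0 + Ks_cv ** 2)
mu = np.log(Ks_mean) - 0.5 * sigma2
Ks_field = rng.lognormal(mu, np.sqrt(sigma2), n_points)

F_points = np.array([green_ampt_cumulative(k, psi, dtheta, t_end) for k in Ks_field])
print(f"grid-cell mean cumulative infiltration: {F_points.mean():.4f} m")
print(f"infiltration using the mean Ks only:    "
      f"{green_ampt_cumulative(Ks_mean, psi, dtheta, t_end):.4f} m  (non-linearity gap)")
```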

  14. Processes influencing formation of low-salinity high-biomass lenses near the edge of the Ross Ice Shelf

    NASA Astrophysics Data System (ADS)

    Li, Yizhen; McGillicuddy, Dennis J.; Dinniman, Michael S.; Klinck, John M.

    2017-02-01

    Both remotely sensed and in situ observations in austral summer of early 2012 in the Ross Sea suggest the presence of cold, low-salinity, and high-biomass eddies along the edge of the Ross Ice Shelf (RIS). Satellite measurements include sea surface temperature and ocean color, and shipboard data sets include hydrographic profiles, towed instrumentation, and underway acoustic Doppler current profilers. Idealized model simulations are utilized to examine the processes responsible for ice shelf eddy formation. 3-D model simulations produce similar cold and fresh eddies, although the simulated vertical lenses are quantitatively thinner than observed. Model sensitivity tests show that both basal melting underneath the ice shelf and irregularity of the ice shelf edge facilitate generation of cold and fresh eddies. 2-D model simulations further suggest that both basal melting and downwelling-favorable winds play crucial roles in forming a thick layer of low-salinity water observed along the edge of the RIS. These properties may have been entrained into the observed eddies, whereas that entrainment process was not captured in the specific eddy formation events studied in our 3-D model, which may explain the discrepancy between the simulated and observed eddies, at least in part. Additional sensitivity experiments imply that uncertainties associated with background stratification and wind stress may also explain why the model underestimates the thickness of the low-salinity lens in the eddy interiors. Our study highlights the importance of incorporating accurate wind forcing, basal melting, and ice shelf irregularity for simulating eddy formation near the RIS edge. The processes responsible for generating the high phytoplankton biomass inside these eddies remain to be elucidated.

  15. Advances in multi-scale modeling of solidification and casting processes

    NASA Astrophysics Data System (ADS)

    Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang

    2011-04-01

    The development of the aviation, energy, and automobile industries requires an advanced integrated product/process R&D system that can optimize both the product and the process design. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make product and process development efficient, economical, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphology of magnesium and aluminum alloys during solidification, simulated using phase-field and cellular automaton methods, mathematical models of segregation in large steel ingots, and microstructure models of unidirectionally solidified turbine blade castings are studied and discussed. In addition, some engineering case studies are discussed, including microstructure simulation of aluminum castings for the automobile industry, segregation of large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry.

  16. Strategy and gaps for modeling, simulation, and control of hybrid systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rabiti, Cristian; Garcia, Humberto E.; Hovsapian, Rob

    2015-04-01

    The purpose of this report is to establish a strategy for modeling and simulation of candidate hybrid energy systems. Modeling and simulation are necessary to design, evaluate, and optimize the system's technical and economic performance. Accordingly, this report first establishes the simulation requirements to analyze candidate hybrid systems. Simulation fidelity levels are established based on the temporal scale, real and synthetic data availability or needs, solution accuracy, and the output parameters needed to evaluate case-specific figures of merit. The associated computational and co-simulation resources needed are then established, including physical models where needed, code assembly and integrated solution platforms, mathematical solvers, and data processing. The report first attempts to describe the figures of merit, system requirements, and constraints that are necessary and sufficient to characterize the grid and hybrid system behavior and market interactions. Loss of Load Probability (LOLP) and Effective Cost of Energy (ECE), as opposed to the standard Levelized Cost of Electricity (LCOE), are introduced as technical and economic indices for integrated energy system evaluations. Financial assessment methods are subsequently introduced for the evaluation of non-traditional, hybrid energy systems. Algorithms for coupled and iterative evaluation of the technical and economic performance are subsequently discussed. The report further defines the modeling objectives, computational tools, solution approaches, and real-time data collection and processing (in some cases using real test units) that will be required to model, co-simulate, and optimize (a) energy system components (e.g., power generation units, chemical processes, electricity management units), (b) system domains (e.g., thermal, electrical, or chemical energy generation, conversion, and transport), and (c) system control modules. Co-simulation of complex, tightly coupled, dynamic energy systems requires multiple simulation tools, potentially developed in several programming languages and resolved on separate time scales. Whereas further investigation and development of hybrid concepts will provide a more complete understanding of the joint computational and physical modeling needs, this report highlights areas in which co-simulation capabilities are warranted. The current development status, quality assurance, availability, and maintainability of simulation tools currently available for hybrid systems modeling are presented. Existing gaps in the modeling and simulation toolsets and development needs are subsequently discussed. This effort will feed into a broader Roadmap activity for designing, developing, and demonstrating hybrid energy systems.
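
    To make the Loss of Load Probability index concrete, here is a hedged Monte Carlo sketch with invented unit sizes, forced outage rates, and a synthetic load series; it is not tied to any specific hybrid configuration discussed in the report.

```python
import numpy as np

rng = np.random.default_rng(7)

# assumed generating fleet: (capacity in MW, forced outage rate)
units = [(400, 0.06), (400, 0.06), (250, 0.08), (150, 0.10), (100, 0.12)]
hours = 8760

# synthetic hourly load: seasonal and daily cycles around an 800 MW mean
t = np.arange(hours)
load = (800
        + 150 * np.sin(2 * np.pi * t / 8760)
        + 120 * np.sin(2 * np.pi * t / 24)
        + rng.normal(0, 40, hours))

n_trials = 200
lolp_samples = []
for _ in range(n_trials):
    # each hour, each unit is independently available with probability (1 - outage rate)
    available = np.zeros(hours)
    for cap, outage_rate in units:
        available += cap * (rng.random(hours) > outage_rate)
    lolp_samples.append(np.mean(available < load))

lolp = float(np.mean(lolp_samples))
print(f"estimated LOLP: {lolp:.4f}  (~{lolp * 8760:.0f} expected hours/year of unserved load)")
```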

  17. A computational modeling of semantic knowledge in reading comprehension: Integrating the landscape model with latent semantic analysis.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2016-09-01

    It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.

  18. Exact Hybrid Particle/Population Simulation of Rule-Based Models of Biochemical Systems

    PubMed Central

    Stover, Lori J.; Nair, Niketh S.; Faeder, James R.

    2014-01-01

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This “network-free” approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of “partial network expansion” into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings can be achieved using the new approach and a monetary cost analysis provides a practical measure of its utility. PMID:24699269
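
    For readers unfamiliar with the network-based baseline mentioned above, here is a minimal Gillespie stochastic simulation of a tiny, fully enumerated reaction network (a reversible binding reaction with invented rate constants); it is unrelated to the BioNetGen/NFsim implementation itself.

```python
import math
import random

random.seed(0)

# fully enumerated network: A + B -> C (rate k1), C -> A + B (rate k2)
k1, k2 = 0.005, 0.2
state = {"A": 100, "B": 80, "C": 0}
t, t_end = 0.0, 50.0

while t < t_end:
    a1 = k1 * state["A"] * state["B"]        # propensity of binding
    a2 = k2 * state["C"]                     # propensity of unbinding
    a0 = a1 + a2
    if a0 == 0.0:
        break
    t += -math.log(1.0 - random.random()) / a0   # exponential waiting time to next event
    if random.random() * a0 < a1:                # choose which reaction fires
        state["A"] -= 1; state["B"] -= 1; state["C"] += 1
    else:
        state["A"] += 1; state["B"] += 1; state["C"] -= 1

print(f"t = {t:.1f}, state = {state}")
```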

  19. Exact hybrid particle/population simulation of rule-based models of biochemical systems.

    PubMed

    Hogg, Justin S; Harris, Leonard A; Stover, Lori J; Nair, Niketh S; Faeder, James R

    2014-04-01

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This "network-free" approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of "partial network expansion" into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings can be achieved using the new approach and a monetary cost analysis provides a practical measure of its utility.

  20. A Cellular Automaton / Finite Element model for predicting grain texture development in galvanized coatings

    NASA Astrophysics Data System (ADS)

    Guillemot, G.; Avettand-Fènoël, M.-N.; Iosta, A.; Foct, J.

    2011-01-01

    The hot-dip galvanizing process is a widely used and efficient way to protect steel from corrosion. We propose to control the microstructure of the zinc grains by investigating the relevant process parameters. In order to improve the texture of this coating, we model the grain nucleation and growth processes and simulate the development of the zinc solid phase. A coupling scheme model has been applied for this purpose. This model improves a previous two-dimensional model of the solidification process. It couples a cellular automaton (CA) approach and a finite element (FE) method. The CA grid and the FE mesh are superimposed on the same domain. The grain development is simulated at the micro-scale on the CA grid. A nucleation law is defined using a Gaussian probability and a random set of nucleating cells. A crystallographic orientation is defined for each nucleating cell by a choice of Euler angles (Ψ, θ, φ). A small growing shape is then associated with each cell in the mushy domain, and a dendrite tip kinetics is defined using the model of Kurz [2]. The six directions of the basal plane and the two perpendicular directions develop in each mushy cell. During each time step, cell temperature and solid fraction are determined at the micro-scale using the enthalpy conservation relation, and the variations are reassigned at the macro-scale. This coupling scheme enables the three-dimensional growth kinetics of the zinc grains to be simulated in a two-dimensional approach. Grain structure evolutions for various cooling times have been simulated. The final grain structure has been compared to EBSD measurements. We show that the preferential growth of dendrite arms in the basal plane of zinc grains is correctly predicted. The described coupling scheme model could be applied to simulate other products or manufacturing processes. It constitutes an approach gathering both micro- and macro-scale models.
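
    A toy sketch of the nucleation step described above, with invented undercooling parameters: cells on a CA grid nucleate according to a Gaussian nucleation law and each nucleus receives random Euler angles; grain growth and the FE coupling are left out.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(11)

NX, NY = 200, 200                 # CA grid over the coating surface
dT_mean, dT_sigma = 2.5, 0.8      # Gaussian nucleation law: mean/std undercooling [K] (assumed)
n_max = 5e-4                      # maximum nucleation probability per cell (assumed)
undercooling = 3.0                # current undercooling [K] (assumed)

# fraction of potential nucleation sites activated at this undercooling
activated = 0.5 * (1.0 + erf((undercooling - dT_mean) / (dT_sigma * sqrt(2.0))))
p_cell = n_max * activated        # per-cell nucleation probability this time step

grain_id = np.zeros((NX, NY), dtype=int)   # 0 = still liquid
euler_angles = {}                          # grain id -> (psi, theta, phi)

nucleating = rng.random((NX, NY)) < p_cell
ids = np.arange(1, int(nucleating.sum()) + 1)
grain_id[nucleating] = ids
for gid in ids:
    euler_angles[int(gid)] = tuple(rng.uniform(0.0, 2.0 * np.pi, 3))  # random orientation

print(f"nucleated {len(ids)} grains on a {NX}x{NY} grid at an undercooling of {undercooling} K")
```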

  1. Measurement with microscopic MRI and simulation of flow in different aneurysm models.

    PubMed

    Edelhoff, Daniel; Walczak, Lars; Frank, Frauke; Heil, Marvin; Schmitz, Inge; Weichert, Frank; Suter, Dieter

    2015-10-01

    The impact and the development of aneurysms depend to a significant degree on the exchange of liquid between the regular vessel and the pathological extension. A better understanding of this process will lead to improved prediction capabilities. The aim of the current study was to investigate fluid-exchange in aneurysm models of different complexities by combining microscopic magnetic resonance measurements with numerical simulations. In order to evaluate the accuracy and applicability of these methods, the fluid-exchange process between the unaltered vessel lumen and the aneurysm phantoms was analyzed quantitatively using high spatial resolution. Magnetic resonance flow imaging was used to visualize fluid-exchange in two different models produced with a 3D printer. One model of an aneurysm was based on histological findings. The flow distribution in the different models was measured on a microscopic scale using time of flight magnetic resonance imaging. The whole experiment was simulated using fast graphics processing unit-based numerical simulations. The obtained simulation results were compared qualitatively and quantitatively with the magnetic resonance imaging measurements, taking into account flow and spin-lattice relaxation. The results of both presented methods compared well for the used aneurysm models and the chosen flow distributions. The results from the fluid-exchange analysis showed comparable characteristics concerning measurement and simulation. Similar symmetry behavior was observed. Based on these results, the amount of fluid-exchange was calculated. Depending on the geometry of the models, 7% to 45% of the liquid was exchanged per second. The result of the numerical simulations coincides well with the experimentally determined velocity field. The rate of fluid-exchange between vessel and aneurysm was well-predicted. Hence, the results obtained by simulation could be validated by the experiment. The observed deviations can be caused by the noise in the measurement and by the limited resolution of the simulation. The resulting differences are small enough to allow reliable predictions of the flow distribution in vessels with stents and for pulsed blood flow.

  2. VS2DRTI: Simulating Heat and Reactive Solute Transport in Variably Saturated Porous Media.

    PubMed

    Healy, Richard W; Haile, Sosina S; Parkhurst, David L; Charlton, Scott R

    2018-01-29

    Variably saturated groundwater flow, heat transport, and solute transport are important processes in environmental phenomena, such as the natural evolution of water chemistry of aquifers and streams, the storage of radioactive waste in a geologic repository, the contamination of water resources from acid-rock drainage, and the geologic sequestration of carbon dioxide. Up to now, our ability to simulate these processes simultaneously with fully coupled reactive transport models has been limited to complex and often difficult-to-use models. To address the need for a simple and easy-to-use model, the VS2DRTI software package has been developed for simulating water flow, heat transport, and reactive solute transport through variably saturated porous media. The underlying numerical model, VS2DRT, was created by coupling the flow and transport capabilities of the VS2DT and VS2DH models with the equilibrium and kinetic reaction capabilities of PhreeqcRM. Flow capabilities include two-dimensional, constant-density, variably saturated flow; transport capabilities include both heat and multicomponent solute transport; and the reaction capabilities are a complete implementation of geochemical reactions of PHREEQC. The graphical user interface includes a preprocessor for building simulations and a postprocessor for visual display of simulation results. To demonstrate the simulation of multiple processes, the model is applied to a hypothetical example of injection of heated waste water to an aquifer with temperature-dependent cation exchange. VS2DRTI is freely available public domain software. © 2018, National Ground Water Association.

  3. A novel method of multi-scale simulation of macro-scale deformation and microstructure evolution on metal forming

    NASA Astrophysics Data System (ADS)

    Huang, Shiquan; Yi, Youping; Li, Pengchuan

    2011-05-01

    In recent years, multi-scale simulation techniques for metal forming have been gaining significant attention for predicting the whole deformation process and the microstructure evolution of the product. The advances of numerical simulation at the macro-scale level for metal forming are remarkable, and commercial FEM software, such as Deform2D/3D, has found wide application in the field of metal forming. However, multi-scale simulation methods have found little application due to the non-linearity of microstructure evolution during forming and the difficulty of modeling at the micro-scale level. This work deals with the modeling of microstructure evolution and a new method of multi-scale simulation of the forging process. The aviation material 7050 aluminum alloy has been used as an example for modeling microstructure evolution. The corresponding thermal simulation experiments have been performed on a Gleeble 1500 machine. The tested specimens have been analyzed for modeling of dislocation density and of the nucleation and growth of dynamic recrystallization (DRX). A source program using the cellular automaton (CA) method has been developed to simulate grain nucleation and growth, in which the change of grain topology caused by the metal deformation was considered. The physical fields at the macro-scale level, such as the temperature field and the stress and strain fields, which can be obtained with the commercial software Deform 3D, are coupled with the deformation storage energy at the micro-scale level through a dislocation model to realize the multi-scale simulation. This method is illustrated by the simulation of an aircraft wheel hub forging process. By coupling the Deform 3D results with the CA results, the forging deformation progress and the microstructure evolution at any point of the forging could be simulated. To verify the efficiency of the simulation, experiments on aircraft wheel hub forging have been carried out in the laboratory, and the comparison of simulation and experimental results is discussed in detail.

  4. Numerical Simulation and Optimization of Directional Solidification Process of Single Crystal Superalloy Casting

    PubMed Central

    Zhang, Hang; Xu, Qingyan; Liu, Baicheng

    2014-01-01

    The rapid development of numerical modeling techniques has led to more accurate results in modeling metal solidification processes. In this study, the cellular automaton-finite difference (CA-FD) method was used to simulate the directional solidification (DS) process of single crystal (SX) superalloy blade samples. Experiments were carried out to validate the simulation results. Meanwhile, an intelligent model based on fuzzy control theory was built to optimize the complicated DS process. Several key parameters, such as the mushy zone width and the temperature difference at the cast-mold interface, were chosen as the input variables. The input variables were processed by a multivariable fuzzy rule to obtain the output adjustment of the withdrawal rate (v), a key technological parameter. The multivariable fuzzy rule was built based on the structural features of the casting, such as the relationship between the section area and the delay time of the temperature-change response to a change in v, as well as the professional experience of the operator. The fuzzy control model coupled with the CA-FD method could then be used to optimize v in real time during the manufacturing process. The optimized process was proven to be more flexible and adaptive for a steady and stray-grain-free DS process. PMID:28788535
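
    A minimal sketch of a two-input fuzzy adjustment rule of the kind described above, with invented membership functions and rule table; it is not the authors' controller, only an illustration of how mushy-zone width and interface temperature difference could be mapped to a withdrawal-rate adjustment.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_withdrawal_adjustment(mushy_width_mm, interface_dT_K):
    """Return a withdrawal-rate adjustment in mm/min (illustrative ranges and rules)."""
    width = {"narrow": tri(mushy_width_mm, -5, 0, 15),
             "normal": tri(mushy_width_mm, 5, 15, 25),
             "wide":   tri(mushy_width_mm, 15, 30, 60)}
    dT    = {"small":  tri(interface_dT_K, -20, 0, 40),
             "medium": tri(interface_dT_K, 20, 50, 80),
             "large":  tri(interface_dT_K, 60, 100, 160)}

    # rule table: (width term, dT term) -> rate adjustment [mm/min]
    rules = {("narrow", "large"): +0.6, ("narrow", "medium"): +0.3, ("narrow", "small"): +0.1,
             ("normal", "large"): +0.2, ("normal", "medium"):  0.0, ("normal", "small"): -0.1,
             ("wide",   "large"):  0.0, ("wide",   "medium"): -0.3, ("wide",   "small"): -0.6}

    # weighted-average (Sugeno-style) defuzzification
    num = den = 0.0
    for (w_term, d_term), out in rules.items():
        strength = min(width[w_term], dT[d_term])
        num += strength * out
        den += strength
    return num / den if den > 0 else 0.0

print(f"adjustment for 8 mm mushy zone, 70 K interface dT: "
      f"{fuzzy_withdrawal_adjustment(8, 70):+.2f} mm/min")
```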

  5. Modeling nitrate-nitrogen removal process in first-flush reactor for stormwater treatment.

    PubMed

    Deng, Zhiqiang; Sun, Shaowei; Gang, Daniel Dianchen

    2012-08-01

    Stormwater runoff is one of the most common non-point sources of water pollution to rivers, lakes, estuaries, and coastal beaches. While most pollutants and nutrients, including nitrate-nitrogen, in stormwater are discharged into receiving waters during the first-flush period, no existing best management practices (BMPs) are specifically designed to capture and treat the first-flush portion of urban stormwater runoff. This paper presents a novel BMP device for highway and urban stormwater treatment, with emphasis on numerical modeling of the new BMP, called the first-flush reactor (FFR). A new model, called the VART-DN model, for simulation of the denitrification process in the designed first-flush reactor was developed using the variable residence time (VART) model. The VART-DN model is capable of simulating various processes and mechanisms responsible for denitrification in the FFR. Based on sensitivity analysis results of the model parameters, the denitrification process is sensitive to the temperature correction factor (b), the maximum nitrate-nitrogen decay rate (K_max), the actual varying residence time (T_v), the constant decay rate of denitrifying bacteria (v_dec), temperature (T), the biomass inhibition constant (K_b), the maximum growth rate of denitrifying bacteria (v_max), the denitrifying bacteria concentration (X), the longitudinal dispersion coefficient (K_s), and the half-saturation constant of dissolved carbon for biomass (K_Car-X); a 10% increase in the model parameter values causes a change in the model root mean square error (RMSE) of -28.02, -16.16, -12.35, 11.44, -9.68, 10.61, -16.30, -9.27, 6.58, and 3.89%, respectively. The VART-DN model was tested using data from laboratory experiments conducted with highway stormwater and secondary wastewater. Model results for the denitrification process of highway stormwater showed good agreement with observed data, and the simulation error was less than 9.0%. The RMSE and the coefficient of determination for simulating the denitrification process of wastewater were 0.5167 and 0.6912, respectively, demonstrating the efficacy of the VART-DN model.
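
    To illustrate the role of the temperature correction factor b and the maximum decay rate K_max listed above, the following is a hedged sketch of a generic temperature-corrected first-order nitrate decay, not the VART-DN formulation itself; all numbers are illustrative.

```python
import numpy as np

def nitrate_decay(c0, k_max, b, T, residence_time, n_steps=1000):
    """First-order nitrate decay with the common temperature correction
    k(T) = k_max * b**(T - 20), evaluated over the residence time."""
    k = k_max * b ** (T - 20.0)
    t = np.linspace(0.0, residence_time, n_steps)
    return c0 * np.exp(-k * t)

c0 = 2.5        # inflow nitrate-N concentration [mg/L] (assumed)
k_max = 0.35    # maximum decay rate [1/h] (assumed)
b = 1.07        # temperature correction factor (assumed)

for T in (10.0, 20.0, 30.0):
    c = nitrate_decay(c0, k_max, b, T, residence_time=6.0)
    print(f"T = {T:4.1f} C  ->  outlet nitrate-N = {c[-1]:.2f} mg/L")
```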

  6. Identifying Hydrologic Processes in Agricultural Watersheds Using Precipitation-Runoff Models

    USGS Publications Warehouse

    Linard, Joshua I.; Wolock, David M.; Webb, Richard M.T.; Wieczorek, Michael

    2009-01-01

    Understanding the fate and transport of agricultural chemicals applied to agricultural fields will assist in designing the most effective strategies to prevent water-quality impairments. At a watershed scale, the processes controlling the fate and transport of agricultural chemicals are generally understood only conceptually. To examine the applicability of conceptual models to the processes actually occurring, two precipitation-runoff models - the Soil and Water Assessment Tool (SWAT) and the Water, Energy, and Biogeochemical Model (WEBMOD) - were applied in different agricultural settings of the contiguous United States. Each model, through different physical processes, simulated the transport of water to a stream from the surface, the unsaturated zone, and the saturated zone. Models were calibrated for watersheds in Maryland, Indiana, and Nebraska. The calibrated sets of input parameters for each model at each watershed are discussed, and the criteria used to validate the models are explained. The SWAT and WEBMOD model results at each watershed conformed to each other and to the processes identified in each watershed's conceptual hydrology. In Maryland the conceptual understanding of the hydrology indicated groundwater flow was the largest annual source of streamflow; the simulation results for the validation period confirm this. The dominant source of water to the Indiana watershed was thought to be tile drains. Although tile drains were not explicitly simulated in the SWAT model, a large component of streamflow was received from lateral flow, which could be attributed to tile drains. Being able to explicitly account for tile drains, WEBMOD indicated water from tile drains constituted most of the annual streamflow in the Indiana watershed. The Nebraska models indicated annual streamflow was composed primarily of perennial groundwater flow and infiltration-excess runoff, which conformed to the conceptual hydrology developed for that watershed. The hydrologic processes represented in the parameter sets resulting from each model were comparable at individual watersheds, but varied between watersheds. The models were unable to show, however, whether hydrologic processes other than those included in the original conceptual models were major contributors to streamflow. Supplemental simulations of agricultural chemical transport could improve the ability to assess conceptual models.

  7. Evaluation of mean climate in a chemistry-climate model simulation

    NASA Astrophysics Data System (ADS)

    Hong, S.; Park, H.; Wie, J.; Park, R.; Lee, S.; Moon, B. K.

    2017-12-01

    Incorporation of interactive chemistry is essential for understanding chemistry-climate interactions and feedback processes in climate models. Here we assess a newly developed chemistry-climate model (GRIMs-Chem), which is based on the Global/Regional Integrated Model system (GRIMs) and includes the aerosol direct effect as well as stratospheric linearized ozone chemistry (LINOZ). We ran GRIMs-Chem with observed sea surface temperature over the period 1979-2010 and compared the simulation results with observations and with CMIP models. To measure the relative performance of our model, we define a quantitative performance metric based on the Taylor diagram. This metric allows us to assess overall features in simulating multiple variables. Overall, our model better reproduces the zonal mean spatial patterns of temperature, horizontal wind, vertical motion, and relative humidity relative to other models. However, the model did not produce good simulations in the upper troposphere (200 hPa). It is currently unclear which model processes are responsible for this. Acknowledgements: This research was supported by the Korea Ministry of Environment (MOE) as the "Climate Change Correspondence Program."
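
    A short sketch of one common Taylor-diagram-based skill score (following Taylor, 2001) that such a performance metric could be built from; the exact metric used for GRIMs-Chem is not specified here, and the data below are synthetic.

```python
import numpy as np

def taylor_skill(model, obs, r0=1.0):
    """One common form of the Taylor skill score:
    S = 4(1+R) / ((sigma_hat + 1/sigma_hat)^2 * (1+R0)),
    where sigma_hat is the model/obs standard-deviation ratio and R the correlation."""
    r = np.corrcoef(model, obs)[0, 1]
    sigma_hat = model.std() / obs.std()
    return 4.0 * (1.0 + r) / ((sigma_hat + 1.0 / sigma_hat) ** 2 * (1.0 + r0))

rng = np.random.default_rng(5)
obs = np.sin(np.linspace(0, 4 * np.pi, 200)) + 0.1 * rng.normal(size=200)
good_model = obs + 0.2 * rng.normal(size=200)                 # close to observations
poor_model = 0.5 * np.roll(obs, 30) + 0.5 * rng.normal(size=200)

print(f"good model skill: {taylor_skill(good_model, obs):.2f}")
print(f"poor model skill: {taylor_skill(poor_model, obs):.2f}")
```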

  8. 10 CFR 434.517 - HVAC systems and equipment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... simulation, except that excess capacity provided to meet process loads need not be modeled unless the process... Reference Buildings. The zones in the simulation shall correspond to the zones provided by the controls in... simulation. Table 517.4.1—HVAC System Description for Prototype and Reference Buildings 1,2 HVAC component...

  9. Introduction to Stochastic Simulations for Chemical and Physical Processes: Principles and Applications

    ERIC Educational Resources Information Center

    Weiss, Charles J.

    2017-01-01

    An introduction to digital stochastic simulations for modeling a variety of physical and chemical processes is presented. Despite the importance of stochastic simulations in chemistry, the prevalence of turn-key software solutions can impose a layer of abstraction between the user and the underlying approach obscuring the methodology being…
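
    In the spirit of the article, here is a minimal stochastic simulation of first-order decay written from scratch rather than with turn-key software, so the underlying algorithm stays visible; the decay probability and initial population are arbitrary.

```python
import random

random.seed(2)
k = 0.05            # decay probability per particle per time step (assumed)
n0 = 5000           # initial number of particles (assumed)
steps = 100

population = [n0]
for _ in range(steps):
    survivors = sum(1 for _ in range(population[-1]) if random.random() > k)
    population.append(survivors)

# compare with the exact expectation N(t) = N0 * (1 - k)**t of the discrete process
for t in (0, 25, 50, 100):
    expected = n0 * (1 - k) ** t
    print(f"t={t:3d}  stochastic={population[t]:5d}  expected={expected:7.1f}")
```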

  10. Three-Dimensional Numerical Simulation of Water Quality and Sediment-Associated Processes with Application to a Mississippi Delta Lake

    USDA-ARS?s Scientific Manuscript database

    A three-dimensional water quality model was developed for simulating temporal and spatial variations of phytoplankton, nutrients, and dissolved oxygen in freshwater bodies. Effects of suspended and bed sediment on the water quality processes were simulated. A formula was generated from field measure...

  11. A Comparison of Three Approaches to Model Human Behavior

    NASA Astrophysics Data System (ADS)

    Palmius, Joel; Persson-Slumpi, Thomas

    2010-11-01

    One way of studying social processes is through the use of simulations. The use of simulations for this purpose has been established as its own field, social simulation, and has been used to study a variety of phenomena. A simulation of a social setting can serve as an aid for thinking about that social setting and for experimenting with different parameters and studying the outcomes they cause. When using a simulation as an aid for thinking and experimenting, the chosen simulation approach will implicitly steer the simulationist towards thinking in a certain fashion in order to fit the model. To study the implications of model choice on the understanding of a setting where human anticipation comes into play, a simulation scenario of a coffee room was constructed using three different simulation approaches: Cellular Automata, System Dynamics, and Agent-based modeling. The practical implementations of the models were done in three different simulation packages: Stella for System Dynamics, CaFun for Cellular Automata, and SeSAm for Agent-based modeling. The models were evaluated both using Randers' criteria for model evaluation and through introspection, where the authors reflected upon how their understanding of the scenario was steered by the model choice. Furthermore, the software used for implementing the simulation models was evaluated, and practical considerations for the choice of software package are listed. It is concluded that the models have very different strengths. The Agent-based modeling approach offers the most intuitive support for thinking about and modeling a social setting where the behavior of the individual is in focus. The System Dynamics model would be preferable in situations where populations and large groups are studied as wholes, but where individual behavior is of less concern. The Cellular Automata models would be preferable where processes need to be studied from the basis of a small set of very simple rules. It is further concluded that in most social simulation settings the Agent-based modeling approach would be the probable choice, since the other models do not offer much support for modeling the anticipatory behavior of humans acting in an organization.
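
    A tiny agent-based sketch of a coffee-room scenario, under stated assumptions (arbitrary thirst dynamics and a single coffee machine), to illustrate why individual-level rules are natural in the agent-based approach.

```python
import random

random.seed(4)

class Worker:
    def __init__(self, name):
        self.name = name
        self.thirst = random.random()
        self.cups = 0

    def step(self, machine_free):
        self.thirst += random.uniform(0.05, 0.15)      # thirst grows while working
        if self.thirst > 1.0 and machine_free:         # simple individual rule
            self.thirst = 0.0
            self.cups += 1
            return True                                # the machine is now in use
        return False

workers = [Worker(f"w{i}") for i in range(8)]
visits = 0
for _ in range(60):                                    # one simulated hour in minutes
    machine_free = True
    for w in random.sample(workers, len(workers)):     # random activation order
        if w.step(machine_free):
            machine_free = False                       # one cup per minute at the machine
            visits += 1

print("cups per worker:", {w.name: w.cups for w in workers})
print("total coffee-room visits:", visits)
```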

  12. COMPUTERIZED TRAINING OF CRYOSURGERY – A SYSTEM APPROACH

    PubMed Central

    Keelan, Robert; Yamakawa, Soji; Shimada, Kenji; Rabin, Yoed

    2014-01-01

    The objective of the current study is to provide the foundation for a computerized training platform for cryosurgery. Consistent with clinical practice, the training process targets the correlation of the frozen region contour with the target region shape, using medical imaging and accepted criteria for clinical success. The current study focuses on system design considerations, including a bioheat transfer model, simulation techniques, optimal cryoprobe layout strategy, and a simulation core framework. Two fundamentally different approaches were considered for the development of a cryosurgery simulator, based on a finite-elements (FE) commercial code (ANSYS) and a proprietary finite-difference (FD) code. Results of this study demonstrate that the FE simulator is superior in terms of geometric modeling, while the FD simulator is superior in terms of runtime. Benchmarking results further indicate that the FD simulator is superior in terms of usage of memory resources, pre-processing, parallel processing, and post-processing. It is envisioned that future integration of a human-interface module and clinical data into the proposed computer framework will make computerized training of cryosurgery a practical reality. PMID:23995400
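
    A hedged 1-D finite-difference sketch of the Pennes bioheat equation with a cryoprobe held at a fixed temperature at one boundary; material properties are textbook-order values, and latent-heat effects of freezing (a central difficulty in cryosurgery simulation) are deliberately ignored.

```python
import numpy as np

# illustrative tissue properties (latent heat of freezing is neglected here)
k = 0.5            # thermal conductivity [W/m/K]
rho_c = 3.6e6      # volumetric heat capacity [J/m^3/K]
w_b_cb = 4.0e4     # blood perfusion term w_b*rho_b*c_b [W/m^3/K]
T_art = 37.0       # arterial temperature [C]
T_probe = -145.0   # cryoprobe surface temperature [C]

L, n = 0.05, 101                  # 5 cm domain, number of grid points
dx = L / (n - 1)
alpha = k / rho_c
dt = 0.4 * dx**2 / alpha          # explicit stability limit with margin
T = np.full(n, 37.0)

t_end, t = 180.0, 0.0             # simulate 3 minutes of freezing
while t < t_end:
    T[0] = T_probe                                 # probe boundary
    T[-1] = T_art                                  # far boundary at body temperature
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    perf = w_b_cb * (T_art - T[1:-1])              # Pennes perfusion source term
    T[1:-1] += dt * (alpha * lap + perf / rho_c)
    t += dt

frozen = np.where(T <= 0.0)[0]
print(f"0 C isotherm reaches ~{frozen[-1] * dx * 1000:.1f} mm from the probe after {t_end:.0f} s")
```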

  13. On the modeling of separation foils in thermoforming simulations

    NASA Astrophysics Data System (ADS)

    Margossian, Alexane; Bel, Sylvain; Hinterhölzl, Roland

    2016-10-01

    Composite forming simulations consist of modelling the forming process of composite components to anticipate the occurrence of potential flaws such as out-of-plane wrinkles and fibre re-orientation. Forming methods often consist of automated processes in which flat composite blanks are forced to comply with tool geometries. Although finite element forming simulations require the modelling of all components involved (blankholder, tooling, and composite blank), consumables such as separation films are often not considered. Used in thermoforming processes, these films are placed between the tooling and the composite to ease part removal after forming. These films are also used to decrease tool/ply friction and thus enhance forming quality. This work presents thermoforming simulations of pre-impregnated carbon fibre thermoplastic blanks in which the separation films are modelled in the same manner as the composite layers, i.e. by a layer of shell elements. The mechanical properties of the films are also characterised at the temperature at which forming occurs. The proposed approach is finally compared to the current modelling method, in which separation films are not modelled as such and their influence is only accounted for within the friction coefficient between tooling and blank.

  14. Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks

    PubMed Central

    Walpole, J.; Chappell, J.C.; Cluceru, J.G.; Mac Gabhann, F.; Bautch, V.L.; Peirce, S. M.

    2015-01-01

    Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods. PMID:26158406

  15. Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks.

    PubMed

    Walpole, J; Chappell, J C; Cluceru, J G; Mac Gabhann, F; Bautch, V L; Peirce, S M

    2015-09-01

    Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods.

  16. Baseline process description for simulating plutonium oxide production for precalc project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pike, J. A.

    Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective of studying the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process, since the facility is operational and significant model validation data can be obtained. The process boundary, as well as the process and facility design details necessary for multi-scale, multi-physics models, are provided.

  17. Effects of heat exchanger tubes on hydrodynamics and CO2 capture of a sorbent-based fluidized bed reactor

    DOE PAGES

    Lai, Canhai; Xu, Zhijie; Li, Tingwen; ...

    2017-08-05

    In virtual design and scale-up of pilot-scale carbon capture systems, the coupled reactive multiphase flow problem must be solved to predict the adsorber's performance and capture efficiency under various operating conditions. This paper focuses on the detailed computational fluid dynamics (CFD) modeling of a pilot-scale fluidized bed adsorber equipped with vertical cooling tubes. Multiphase Flow with Interphase eXchanges (MFiX), an open-source multiphase flow CFD solver, is used for the simulations, with custom code to simulate the chemical reactions and filtered sub-grid models to capture the effect of the unresolved details in the coarser mesh, enabling simulations with reasonable accuracy and manageable computational effort. Previously developed filtered models for horizontal cylinder drag, heat transfer, and reaction kinetics have been modified to derive 2D filtered models representing vertical cylinders in the coarse-grid CFD simulations. The effects of the heat exchanger configurations (i.e., horizontal or vertical tubes) on the adsorber's hydrodynamics and CO2 capture performance are then examined. A one-dimensional three-region process model is briefly introduced for comparison purposes. The CFD model matches reasonably well with the process model while providing additional information about the flow field that is not available from the process model.

  18. Enhanced modeling and simulation of EO/IR sensor systems

    NASA Astrophysics Data System (ADS)

    Hixson, Jonathan G.; Miller, Brian; May, Christopher

    2015-05-01

    The testing and evaluation process developed by the Night Vision and Electronic Sensors Directorate (NVESD) Modeling and Simulation Division (MSD) provides end to end systems evaluation, testing, and training of EO/IR sensors. By combining NV-LabCap, the Night Vision Integrated Performance Model (NV-IPM), One Semi-Automated Forces (OneSAF) input sensor file generation, and the Night Vision Image Generator (NVIG) capabilities, NVESD provides confidence to the M&S community that EO/IR sensor developmental and operational testing and evaluation are accurately represented throughout the lifecycle of an EO/IR system. This new process allows for both theoretical and actual sensor testing. A sensor can be theoretically designed in NV-IPM, modeled in NV-IPM, and then seamlessly input into the wargames for operational analysis. After theoretical design, prototype sensors can be measured by using NV-LabCap, then modeled in NV-IPM and input into wargames for further evaluation. The measurement process to high fidelity modeling and simulation can then be repeated again and again throughout the entire life cycle of an EO/IR sensor as needed, to include LRIP, full rate production, and even after Depot Level Maintenance. This is a prototypical example of how an engineering level model and higher level simulations can share models to mutual benefit.

  19. Benchmark simulation Model no 2 in Matlab-simulink: towards plant-wide WWTP control strategy evaluation.

    PubMed

    Vreck, D; Gernaey, K V; Rosen, C; Jeppsson, U

    2006-01-01

    In this paper, the implementation of the Benchmark Simulation Model No 2 (BSM2) within Matlab-Simulink is presented. The BSM2 is developed for plant-wide WWTP control strategy evaluation on a long-term basis. It consists of a pre-treatment process, an activated sludge process, and sludge treatment processes. Extended evaluation criteria are proposed for plant-wide control strategy assessment. Default open-loop and closed-loop strategies are also proposed to be used as references with which to compare other control strategies. Simulations indicate that the BSM2 is an appropriate tool for plant-wide control strategy evaluation.

  20. Computer modeling and simulators as part of university training for NPP operating personnel

    NASA Astrophysics Data System (ADS)

    Volman, M.

    2017-01-01

    This paper considers aspects of a program for training future nuclear power plant personnel developed by the NPP Department of Ivanovo State Power Engineering University. Computer modeling is used for numerical experiments on the kinetics of nuclear reactors in Mathcad. Simulation modeling is carried out on computer-based and full-scale simulators of a water-cooled power reactor to simulate neutron-physical reactor measurements and the start-up and shutdown processes.
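
    A small numerical experiment in the spirit described above, written in Python rather than Mathcad: the one-delayed-group point kinetics equations integrated for a small step reactivity insertion. All constants are typical textbook values, not those of any specific reactor.

```python
import numpy as np

beta = 0.0065        # delayed neutron fraction (typical value)
lam = 0.08           # effective precursor decay constant [1/s]
Lambda = 1.0e-4      # prompt neutron generation time [s]
rho = 0.2 * beta     # step reactivity insertion of +0.2 dollars

def rhs(y):
    """Point kinetics: dn/dt = (rho-beta)/Lambda*n + lam*C,  dC/dt = beta/Lambda*n - lam*C."""
    n, C = y
    dn = (rho - beta) / Lambda * n + lam * C
    dC = beta / Lambda * n - lam * C
    return np.array([dn, dC])

# start from the critical steady state: C0 = beta * n0 / (lam * Lambda)
y = np.array([1.0, beta / (lam * Lambda)])
dt, t_end = 1.0e-4, 5.0          # small step because the prompt term is fast
t, out = 0.0, [(0.0, y[0])]
while t < t_end:
    k1 = rhs(y)
    k2 = rhs(y + 0.5 * dt * k1)   # midpoint (RK2) step
    y = y + dt * k2
    t += dt
    if len(out) < 50 and t - out[-1][0] > 0.1:
        out.append((t, y[0]))

for ti, ni in out[::10] + [out[-1]]:
    print(f"t = {ti:5.2f} s   relative power = {ni:7.3f}")
```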

  1. An integrated 3D log processing optimization system for small sawmills in central Appalachia

    Treesearch

    Wenshu Lin; Jingxin Wang

    2013-01-01

    An integrated 3D log processing optimization system was developed to perform 3D log generation, opening face determination, headrig log sawing simulation, flitch edging and trimming simulation, cant resawing, and lumber grading. A circular cross-section model, together with 3D modeling techniques, was used to reconstruct 3D virtual logs. Internal log defects (knots)...

  2. Simulating soil moisture change in a semiarid rangeland watershed with a process-based water-balance model

    Treesearch

    Howard Evan Canfield; Vicente L. Lopes

    2000-01-01

    A process-based, simulation model for evaporation, soil water and streamflow (BROOK903) was used to estimate soil moisture change on a semiarid rangeland watershed in southeastern Arizona. A sensitivity analysis was performed to select parameters affecting ET and soil moisture for calibration. Automatic parameter calibration was performed using a procedure based on a...

  3. Simulation Based Low-Cost Composite Process Development at the US Air Force Research Laboratory

    NASA Technical Reports Server (NTRS)

    Rice, Brian P.; Lee, C. William; Curliss, David B.

    2003-01-01

    Low-cost composite research in the US Air Force Research Laboratory, Materials and Manufacturing Directorate, Organic Matrix Composites Branch has focused on the theme of affordable performance. Practically, this means that we take a very broad view when considering the affordability of composites. Factors such as material costs, labor costs, and recurring and nonrecurring manufacturing costs are balanced against performance to arrive at the relative affordability versus performance measure of merit. The research efforts discussed here are two projects focused on affordable processing of composites. The first topic is the use of a neural network scheme to model cure reaction kinetics and then use the kinetics, coupled with simple heat transport models, to predict, in real time, future exotherms and control them. The neural network scheme is demonstrated to be very robust and a much more efficient method than the mechanistic cure modeling approach. This enables very practical low-cost processing of thick composite parts. The second project is liquid composite molding (LCM) process simulation. LCM processing of large 3D integrated composite parts has been demonstrated to be a very cost-effective way to produce large integrated aerospace components; specific examples of LCM processes are resin transfer molding (RTM), vacuum assisted resin transfer molding (VARTM), and other similar approaches. LCM process simulation is a critical part of developing an LCM process approach. Flow simulation enables the development of the most robust approach to introducing resin into complex preforms. Furthermore, LCM simulation can be used in conjunction with flow front sensors to control the LCM process in real time to account for preform or resin variability.

  4. Development of a global aerosol model using a two-dimensional sectional method: 1. Model design

    NASA Astrophysics Data System (ADS)

    Matsui, H.

    2017-08-01

    This study develops an aerosol module, the Aerosol Two-dimensional bin module for foRmation and Aging Simulation version 2 (ATRAS2), and implements the module into a global climate model, Community Atmosphere Model. The ATRAS2 module uses a two-dimensional (2-D) sectional representation with 12 size bins for particles from 1 nm to 10 μm in dry diameter and 8 black carbon (BC) mixing state bins. The module can explicitly calculate the enhancement of absorption and cloud condensation nuclei activity of BC-containing particles by aging processes. The ATRAS2 module is an extension of a 2-D sectional aerosol module ATRAS used in our previous studies within a framework of a regional three-dimensional model. Compared with ATRAS, the computational cost of the aerosol module is reduced by more than a factor of 10 by simplifying the treatment of aerosol processes and 2-D sectional representation, while maintaining good accuracy of aerosol parameters in the simulations. Aerosol processes are simplified for condensation of sulfate, ammonium, and nitrate, organic aerosol formation, coagulation, and new particle formation processes, and box model simulations show that these simplifications do not substantially change the predicted aerosol number and mass concentrations and their mixing states. The 2-D sectional representation is simplified (the number of advected species is reduced) primarily by the treatment of chemical compositions using two interactive bin representations. The simplifications do not change the accuracy of global aerosol simulations. In part 2, comparisons with measurements and the results focused on aerosol processes such as BC aging processes are shown.

  5. Mesoscale Convective Systems in SCSMEX: Simulated by a Regional Climate Model and a Cloud Resolving Model

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Wang, Y.; Qian, I.; Lau, W.; Shie, C.-L.; Starr, David (Technical Monitor)

    2002-01-01

    A Regional Land-Atmosphere Climate Simulation (RELACS) System is being developed and implemented at NASA Goddard Space Flight Center. One of the major goals of RELACS is to use a regional scale model with improved physical processes, in particular land-related processes, to understand the role of the land surface and its interaction with convection and radiation as well as the water and energy cycles in Indo-China/South China Sea (SCS)/China, N. America and S. America. The Penn State/NCAR MM5 atmospheric modeling system, a state-of-the-art atmospheric numerical model designed to simulate regional weather and climate, has been successfully coupled to the Goddard Parameterization for Land-Atmosphere-Cloud Exchange (PLACE) land surface model. PLACE allows for the effects of vegetation, and thus important physical processes such as evapotranspiration and interception are included. The PLACE model incorporates vegetation type and has been shown in international comparisons to accurately predict evapotranspiration and runoff over a wide variety of land surfaces. The coupling of MM5 and PLACE creates a numerical modeling system with the potential to more realistically simulate the atmosphere and land surface processes including land-sea interaction, regional circulations such as monsoons, and flash flood events. RELACS has been used to simulate the onset of the South China Sea Monsoon in 1986, 1997 and 1998. Sensitivity tests on various land surface models, cumulus parameterization schemes (CPSs), sea surface temperature (SST) variations and midlatitude influences have been performed. These tests have indicated that the land surface model has a major impact on the circulation over the S. China Sea. CPSs can affect the precipitation pattern while SST variation can affect the precipitation amounts over both land and ocean. RELACS has also been used to understand the soil-precipitation interaction and feedback associated with a flood event that occurred in and around China's Yangtze River during 1998. The exact location (region) of the flooding can be affected by the soil-rainfall feedback. Also, the Goddard Cumulus Ensemble (GCE) model, which allows for realistic moist processes as well as explicit interactions between cloud and radiation, and cloud and surface processes, will be used to simulate convective systems associated with the onset of the South China Sea Monsoon in 1998. The GCE model also includes the same PLACE and radiation scheme used in RELACS. A detailed comparison between the results from the GCE model and RELACS will be performed.

  6. Mesoscale Convective Systems in SCSMEX: Simulated by a Regional Climate Model and a Cloud Resolving Model

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Wang, Y.; Lau, W.; Jia, Y.; Johnson, D.; Shie, C.-L.; Einaudi, Franco (Technical Monitor)

    2001-01-01

    A Regional Land-Atmosphere Climate Simulation (RELACS) System is being developed and implemented at NASA Goddard Space Flight Center. One of the major goals of RELACS is to use a regional scale model with improved physical processes, in particular land-related processes, to understand the role of the land surface and its interaction with convection and radiation as well as the water and energy cycles in Indo-China/South China Sea (SCS)/China, North America and South America. The Penn State/NCAR MM5 atmospheric modeling system, a state-of-the-art atmospheric numerical model designed to simulate regional weather and climate, has been successfully coupled to the Goddard Parameterization for Land-Atmosphere-Cloud Exchange (PLACE) land surface model. PLACE allows for the effects of vegetation, and thus important physical processes such as evapotranspiration and interception are included. The PLACE model incorporates vegetation type and has been shown in international comparisons to accurately predict evapotranspiration and runoff over a wide variety of land surfaces. The coupling of MM5 and PLACE creates a numerical modeling system with the potential to more realistically simulate the atmosphere and land surface processes including land-sea interaction, regional circulations such as monsoons, and flash flood events. RELACS has been used to simulate the onset of the South China Sea Monsoon in 1986, 1991 and 1998. Sensitivity tests on various land surface models, cumulus parameterization schemes (CPSs), sea surface temperature (SST) variations and midlatitude influences have been performed. These tests have indicated that the land surface model has a major impact on the circulation over the South China Sea. CPSs can affect the precipitation pattern while SST variation can affect the precipitation amounts over both land and ocean. RELACS has also been used to understand the soil-precipitation interaction and feedback associated with a flood event that occurred in and around China's Yangtze River during 1998. The exact location (region) of the flooding can be affected by the soil-rainfall feedback. Also, the Goddard Cumulus Ensemble (GCE) model, which allows for realistic moist processes as well as explicit interactions between cloud and radiation, and cloud and surface processes, will be used to simulate convective systems associated with the onset of the South China Sea Monsoon in 1998. The GCE model also includes the same PLACE and radiation scheme used in RELACS. A detailed comparison between the results from the GCE model and RELACS will be performed.

  7. Application of Compressible Volume of Fluid Model in Simulating the Impact and Solidification of Hollow Spherical ZrO2 Droplet on a Surface

    NASA Astrophysics Data System (ADS)

    Safaei, Hadi; Emami, Mohsen Davazdah; Jazi, Hamidreza Salimi; Mostaghimi, Javad

    2017-12-01

    Applications of hollow spherical particles in the thermal spraying process have been developed in recent years, accompanied by experimental and numerical studies aimed at better understanding the impact of a hollow droplet on a surface. During such a process, the volume and density of the gas trapped inside the droplet change. The numerical models should be able to simulate such changes and their consequent effects. The aim of this study is to numerically simulate the impact of a hollow ZrO2 droplet on a flat surface using the volume of fluid technique for compressible flows. An open-source, finite-volume-based CFD code was used to perform the simulations, where appropriate subprograms were added to handle the studied cases. Simulation results were compared with the available experimental data. Results showed that at high impact velocities (U0 > 100 m/s), the compression of the trapped gas inside the droplet played a significant role in the impact dynamics. At such velocities, the droplet splashed explosively. Compressibility effects result in a more porous splat, compared to the corresponding incompressible model. Moreover, the compressible model predicted a higher spread factor than the incompressible model, due to the planetary structure of the splat.

  8. Evolution dynamics modeling and simulation of logistics enterprise's core competence based on service innovation

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Tong, Yuting

    2017-04-01

    With the rapid development of the economy, logistics enterprises in China face a considerable challenge: they generally lack core competitiveness, and their awareness of service innovation is weak. Studies of the core competence of logistics enterprises have mainly taken a static perspective rather than examining its dynamic evolution. The authors therefore analyze the influencing factors and the evolution process of the core competence of logistics enterprises, use system dynamics to study the causal structure of this evolution, and construct a system dynamics model of the evolution of core competence that can be simulated in Vensim PLE. Effectiveness and sensitivity analyses indicate that the model fits the evolution process of the core competence of logistics enterprises, reveals its underlying process and mechanism, and provides management strategies for improving core competence. The construction and operation of the computer simulation model offers an effective method for studying the evolution of logistics enterprise core competence.

  9. Microphysics in Multi-scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2012-01-01

    Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactive processes are applied across this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance for the multi-scale modeling system will be presented.

  10. The use of discrete-event simulation modeling to compare handwritten and electronic prescribing systems.

    PubMed

    Ghany, Ahmad; Vassanji, Karim; Kuziemsky, Craig; Keshavjee, Karim

    2013-01-01

    Electronic prescribing (e-prescribing) is expected to bring many benefits to Canadian healthcare, such as a reduction in errors and adverse drug reactions. As there currently is no functioning e-prescribing system in Canada that is completely electronic, we are unable to evaluate the performance of a live system. An alternative approach is to use simulation modeling for evaluation. We developed two discrete-event simulation models, one of the current handwritten prescribing system and one of a proposed e-prescribing system, to compare the performance of these two systems. We were able to compare the number of processes in each model, workflow efficiency, and the distribution of patients or prescriptions. Although we were able to compare these models to each other, using discrete-event simulation software was challenging. We were limited in the number of variables we could measure. We discovered non-linear processes and feedback loops in both models that could not be adequately represented using discrete-event simulation software. Finally, interactions between entities in both models could not be modeled using this type of software. We have come to the conclusion that a more appropriate approach to modeling both the handwritten and electronic prescribing systems would be to use a complex adaptive systems approach using agent-based modeling or systems-based modeling.
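
    As a hedged illustration of the discrete-event technique discussed (not the authors' models), the sketch below uses the SimPy library to simulate a simple prescribing workflow in which patients queue for a prescriber and prescriptions then queue at a pharmacy. The arrival and service-time distributions are invented for illustration.

```python
import random
import simpy

random.seed(1)
WAITS = []   # total time each patient spends in the system, minutes

def patient(env, prescriber, pharmacy):
    arrival = env.now
    with prescriber.request() as req:                     # wait for the prescriber
        yield req
        yield env.timeout(random.expovariate(1 / 5.0))    # prescription writing, min
    with pharmacy.request() as req:                       # prescription goes to pharmacy
        yield req
        yield env.timeout(random.expovariate(1 / 8.0))    # dispensing time, min
    WAITS.append(env.now - arrival)

def arrivals(env, prescriber, pharmacy):
    while True:
        yield env.timeout(random.expovariate(1 / 6.0))    # mean inter-arrival 6 min
        env.process(patient(env, prescriber, pharmacy))

env = simpy.Environment()
prescriber = simpy.Resource(env, capacity=1)
pharmacy = simpy.Resource(env, capacity=2)
env.process(arrivals(env, prescriber, pharmacy))
env.run(until=8 * 60)                                     # one 8-hour clinic day

print(f"patients completed: {len(WAITS)}")
print(f"mean time in system: {sum(WAITS) / len(WAITS):.1f} min")
```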

  11. Automatic mathematical modeling for real time simulation system

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Purinton, Steve

    1988-01-01

    A methodology for automatic mathematical modeling and generating simulation models is described. The models will be verified by running in a test environment using standard profiles, with the results compared against known results. The major objective is to create a user-friendly environment for engineers to design, maintain, and verify their model and also automatically convert the mathematical model into conventional code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine Simulation. It is written in LISP and MACSYMA and runs on a Symbolics 3670 Lisp Machine. The program provides a very friendly and well-organized environment for engineers to build a knowledge base for base equations and general information. It contains an initial set of component process elements for the Space Shuttle Main Engine Simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. The system is then able to automatically generate the model and FORTRAN code. The future goal, currently under development, is to download the FORTRAN code to a VAX/VMS system for conventional computation. The SSME mathematical model will be verified in a test environment and the solution compared with the real data profile. The use of artificial intelligence techniques has shown that the process of simulation modeling can be simplified.

  12. Controlling Ethylene for Extended Preservation of Fresh Fruits and Vegetables

    DTIC Science & Technology

    2008-12-01

    into a process simulation to determine the effects of key design parameters on the overall performance of the system. Integrating process simulation... [garbled excerpt of a table rating produce by ethylene sensitivity and decay, e.g., Asian pears, avocados, bananas, cantaloupe, cherimoya] ...ozonolysis. Process simulation was subsequently used to understand the effect of key system parameters on EEU performance. Using this modeling work

  13. Mesoscale Convective Systems During SCSMEX: Simulations with a Regional Climate Model and a Cloud-Resolving Model

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Wang, Y.; Qian, J.-H.; Shie, C.-L.; Lau, W. K.-M.; Kakar, R.; Starr, David (Technical Monitor)

    2002-01-01

    The South China Sea Monsoon Experiment (SCSMEX) was conducted in May-June 1998. One of its major objectives is to better understand the key physical processes for the onset and evolution of the summer monsoon over Southeast Asia and southern China. Multiple observation platforms (e.g., upper-air soundings, Doppler radar, ships, wind profilers, radiometers, etc.) during SCSMEX provided a first attempt at investigating the detailed characteristics of convection and circulation changes associated with monsoons over the South China Sea region. SCSMEX also provided precipitation estimates derived from atmospheric budgets and a comparison to those obtained from the Tropical Rainfall Measuring Mission (TRMM). In this paper, a regional scale model (with grid size of 20 km) and the Goddard Cumulus Ensemble (GCE) model (with 1 km grid size) are used to perform multi-day integrations to understand the precipitation processes associated with the summer monsoon over Southeast Asia and southern China. The regional climate model is used to understand the soil-precipitation interaction and feedback associated with a flood event that occurred in and around China's Yangtze River during SCSMEX. Sensitivity tests on various land surface models, sea surface temperature (SST) variations, and cloud processes are performed to understand the precipitation processes associated with the onset of the monsoon over the S. China Sea during SCSMEX. These tests have indicated that the land surface model has a major impact on the circulation over the S. China Sea. Cloud processes can affect the precipitation pattern while SST variation can affect the precipitation amounts over both land and ocean. The exact location (region) of the flooding can be affected by the soil-rainfall feedback. The GCE-model results captured many observed precipitation characteristics because it used a fine grid size. For example, the model-simulated rainfall temporal variation compared quite well to the sounding-estimated rainfall. The results show there are more latent heat fluxes prior to the onset of the monsoon. However, more rainfall was simulated after the onset of the monsoon. This modeling study indicates the latent heat fluxes (or evaporation) have more of an impact on precipitation processes and rainfall in the regional climate model simulations than in the cloud-resolving model simulations. Research is underway to determine if the difference in the grid sizes or the moist processes used in these two models is responsible for the differing influence of surface fluxes on precipitation processes.

  14. Business intelligence modeling in launch operations

    NASA Astrophysics Data System (ADS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-05-01

    The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. Emerging patterns of sets of processes, rather than organizational units, leading to end-to-end automation are becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems. This technology project is to advance an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long term sustainability. A planning and analysis test bed is needed for evaluation of enterprise level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation-based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root cause from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems.

  15. Business Intelligence Modeling in Launch Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-01-01

    This technology project is to advance an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long term sustainability. A planning and analysis test bed is needed for evaluation of enterprise level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation-based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root cause from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems. The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. Emerging patterns of sets of processes, rather than organizational units, leading to end-to-end automation are becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems.

  16. Kinetic Monte Carlo Simulations of Scintillation Processes in NaI(Tl)

    NASA Astrophysics Data System (ADS)

    Kerisit, Sebastien; Wang, Zhiguo; Williams, Richard T.; Grim, Joel Q.; Gao, Fei

    2014-04-01

    Developing a comprehensive understanding of the processes that govern the scintillation behavior of inorganic scintillators provides a pathway to optimize current scintillators and allows for the science-driven search for new scintillator materials. Recent experimental data on the excitation density dependence of the light yield of inorganic scintillators presents an opportunity to incorporate parameterized interactions between excitations in scintillation models and thus enable more realistic simulations of the nonproportionality of inorganic scintillators. Therefore, a kinetic Monte Carlo (KMC) model of elementary scintillation processes in NaI(Tl) is developed in this paper to simulate the kinetics of scintillation for a range of temperatures and Tl concentrations as well as the scintillation efficiency as a function of excitation density. The ability of the KMC model to reproduce available experimental data allows for elucidating the elementary processes that give rise to the kinetics and efficiency of scintillation observed experimentally for a range of conditions.
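
    The kinetic Monte Carlo technique itself can be sketched with a Gillespie-style toy model: a lumped population of excitations decays through radiative, trapping, and second-order quenching channels, and the light yield is computed as a function of the initial excitation density. The channels and rate constants below are assumptions for illustration, not the NaI(Tl) parameterization of the paper.

```python
import math
import random

random.seed(42)

def scintillation_yield(n0, k_rad=1.0, k_loss=0.3, k_ann=0.002):
    """Gillespie-style KMC for a lumped excitation population; returns photons per excitation."""
    n, photons = n0, 0
    while n > 0:
        r_rad = k_rad * n                 # radiative emission
        r_loss = k_loss * n               # non-radiative loss (e.g. deep traps)
        r_ann = k_ann * n * (n - 1)       # second-order quenching (annihilation)
        r_tot = r_rad + r_loss + r_ann
        _dt = -math.log(random.random()) / r_tot       # exponential waiting time (unused here)
        u = random.random() * r_tot                    # pick a channel in proportion to its rate
        if u < r_rad:
            n -= 1; photons += 1
        elif u < r_rad + r_loss:
            n -= 1
        else:
            n -= 2                                     # two excitations quenched
    return photons / n0

# Light yield versus excitation density: higher density -> stronger quenching
for n0 in (10, 100, 1000, 5000):
    runs = [scintillation_yield(n0) for _ in range(20)]
    print(f"N0 = {n0:5d}   mean yield = {sum(runs) / len(runs):.3f}")
```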

  17. Aircraft Flight Modeling During the Optimization of Gas Turbine Engine Working Process

    NASA Astrophysics Data System (ADS)

    Tkachenko, A. Yu; Kuz'michev, V. S.; Krupenich, I. N.

    2018-01-01

    The article describes a method for simulating the flight of an aircraft along a predetermined path, establishing a functional connection between the parameters of the working process of the gas turbine engine and the efficiency criteria of the aircraft. This connection is necessary for solving the optimization tasks of the conceptual design stage of the engine according to the systems approach. The engine thrust level, in turn, influences the operation of the aircraft, thus making accurate simulation of the aircraft behavior during flight necessary for obtaining the correct solution. The described mathematical model of aircraft flight provides the functional connection between the airframe characteristics, the working process of the gas turbine engines (propulsion system), ambient and flight conditions, and flight profile features. This model provides accurate results of flight simulation and the resulting aircraft efficiency criteria, required for optimization of the working process and control function of a gas turbine engine.
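
    A crude sketch of the kind of coupling described, in which an engine working-process characteristic (here a thrust-specific fuel consumption value) enters a point-mass flight model and produces an aircraft-level efficiency criterion (fuel burned over a cruise segment). All aircraft, engine, and atmosphere numbers are assumptions for illustration, not values from the article.

```python
import math

# Assumed aircraft and engine characteristics (illustrative only)
MASS0 = 70_000.0       # kg, mass at the start of the segment
S_WING = 120.0         # m^2, wing reference area
CD0, K = 0.020, 0.045  # parabolic drag polar coefficients
RHO = 0.38             # kg/m^3, air density at cruise altitude
V = 230.0              # m/s, cruise speed
TSFC = 1.6e-5          # kg/(N*s), thrust-specific fuel consumption (engine model)
G = 9.81

def cruise_fuel(distance_m, dt=10.0):
    """Integrate fuel burn for level, constant-speed cruise (thrust equals drag)."""
    mass, x, fuel = MASS0, 0.0, 0.0
    q = 0.5 * RHO * V ** 2                    # dynamic pressure
    while x < distance_m:
        cl = mass * G / (q * S_WING)          # lift coefficient for level flight
        cd = CD0 + K * cl ** 2                # drag polar
        thrust = q * S_WING * cd              # required thrust
        dm = TSFC * thrust * dt               # fuel flow from the engine characteristic
        mass -= dm; fuel += dm; x += V * dt
    return fuel

print(f"fuel for a 1500 km cruise segment: {cruise_fuel(1.5e6):,.0f} kg")
```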

  18. AgMIP: Next Generation Models and Assessments

    NASA Astrophysics Data System (ADS)

    Rosenzweig, C.

    2014-12-01

    Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6 that involves the key modeling groups from around the world including North America, Europe, South America, Sub-Saharan Africa, South Asia, East Asia, and Australia and Oceania. This community process will lead to mutually agreed protocols for coordinated global and regional assessments.

  19. Estimation of in-situ bioremediation system cost using a hybrid Extreme Learning Machine (ELM)-particle swarm optimization approach

    NASA Astrophysics Data System (ADS)

    Yadav, Basant; Ch, Sudheer; Mathur, Shashi; Adamowski, Jan

    2016-12-01

    In-situ bioremediation is the most common groundwater remediation procedure used for treating organically contaminated sites. A simulation-optimization approach, which incorporates a simulation model for groundwater flow and transport processes within an optimization program, could help engineers in designing a remediation system that best satisfies management objectives as well as regulatory constraints. In-situ bioremediation is a highly complex, non-linear process and the modelling of such a complex system requires significant computational exertion. Soft computing techniques have a flexible mathematical structure which can generalize complex nonlinear processes. In in-situ bioremediation management, a physically-based model is used for the simulation and the simulated data is utilized by the optimization model to optimize the remediation cost. Repeatedly calling the simulator to satisfy the constraints is an extremely tedious and time-consuming process, and thus there is a need for a simulator which can reduce the computational burden. This study presents a simulation-optimization approach to achieve an accurate and cost effective in-situ bioremediation system design for groundwater contaminated with BTEX (Benzene, Toluene, Ethylbenzene, and Xylenes) compounds. In this study, the Extreme Learning Machine (ELM) is used as a proxy simulator to replace BIOPLUME III for the simulation. The selection of ELM is done by a comparative analysis with Artificial Neural Network (ANN) and Support Vector Machine (SVM) as they were successfully used in previous studies of in-situ bioremediation system design. Further, a single-objective optimization problem is solved by a coupled Extreme Learning Machine (ELM)-Particle Swarm Optimization (PSO) technique to achieve the minimum cost for the in-situ bioremediation system design. The results indicate that ELM is a faster and more accurate proxy simulator than ANN and SVM. The total cost obtained by the ELM-PSO approach is held to a minimum while successfully satisfying all the regulatory constraints of the contaminated site.
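
    The Extreme Learning Machine at the core of the proxy-simulator idea is simple enough to sketch: a random hidden layer followed by a least-squares solve for the output weights. In the sketch below a toy analytic function stands in for the BIOPLUME III flow-and-transport responses; the function, input dimensions, and hidden-layer size are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

class ELM:
    """Minimal Extreme Learning Machine: random hidden layer + least-squares output weights."""
    def __init__(self, n_hidden=50):
        self.n_hidden = n_hidden

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = rng.normal(size=(n_features, self.n_hidden))   # random input weights
        self.b = rng.normal(size=self.n_hidden)                  # random biases
        H = np.tanh(X @ self.W + self.b)                         # hidden-layer activations
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)        # analytic output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Toy stand-in for the flow-and-transport simulator response (assumed function)
X = rng.uniform(0, 1, size=(400, 3))        # e.g. three pumping/injection rates
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 - 0.5 * X[:, 2]

elm = ELM(n_hidden=80).fit(X[:300], y[:300])
err = np.sqrt(np.mean((elm.predict(X[300:]) - y[300:]) ** 2))
print(f"hold-out RMSE of the ELM proxy: {err:.3f}")
```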

  20. Efficient SRAM yield optimization with mixture surrogate modeling

    NASA Astrophysics Data System (ADS)

    Zhongjian, Jiang; Zuochang, Ye; Yan, Wang

    2016-12-01

    Cells that are repeated in large numbers, such as SRAM cells, usually require an extremely low failure rate to ensure a moderate chip yield. Though fast Monte Carlo methods such as importance sampling and its variants can be used for yield estimation, they are still very expensive if one needs to perform optimization based on such estimations. Typically, yield calculation requires a large number of SPICE simulations, and this circuit simulation accounts for the largest proportion of the time spent in yield calculation. In this paper, a new method is proposed to address this issue. The key idea is to establish an efficient mixture surrogate model based on the design variables and process variables. The model is constructed from a set of sample points obtained by SPICE simulation, which are used to train the mixture surrogate model with the lasso algorithm. Experimental results show that the proposed model is able to calculate the yield accurately and brings significant speed-ups to the calculation of the failure rate. Based on the model, a further accelerated algorithm was developed to enhance the speed of the yield calculation. It is suitable for high-dimensional process variables and multi-performance applications.
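
    A hedged sketch of the surrogate-based yield idea (a plain lasso fit rather than the paper's mixture model): a modest number of "expensive" samples of a performance metric over process variations trains the surrogate, which is then reused for a cheap Monte Carlo estimate of the failure rate. The toy metric, variation spread, and failure threshold are assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)

# Toy stand-in for an expensive SPICE-derived metric (e.g. a read margin); assumed
def spice_metric(v):                       # v: matrix of process-variation samples
    return 0.12 - 0.8 * v[:, 0] + 0.5 * v[:, 1] ** 2 - 0.3 * v[:, 0] * v[:, 2]

# "Expensive" training set: a few hundred simulated samples of 4 process variables
X_train = rng.normal(0.0, 0.05, size=(300, 4))
y_train = spice_metric(X_train)

poly = PolynomialFeatures(degree=2, include_bias=False)
surrogate = Lasso(alpha=1e-4, max_iter=50_000).fit(poly.fit_transform(X_train), y_train)

# Cheap surrogate-based Monte Carlo of the failure rate (metric < 0 means failure)
X_mc = rng.normal(0.0, 0.05, size=(500_000, 4))
fail_rate = np.mean(surrogate.predict(poly.transform(X_mc)) < 0.0)
print(f"estimated cell failure rate from the surrogate: {fail_rate:.2e}")
```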

  1. Machine learning in sentiment reconstruction of the simulated stock market

    NASA Astrophysics Data System (ADS)

    Goykhman, Mikhail; Teimouri, Ali

    2018-02-01

    In this paper we continue the study of the simulated stock market framework defined by the driving sentiment processes. We focus on the market environment driven by the buy/sell trading sentiment process of the Markov chain type. We apply the methodology of the Hidden Markov Models and the Recurrent Neural Networks to reconstruct the transition probabilities matrix of the Markov sentiment process and recover the underlying sentiment states from the observed stock price behavior. We demonstrate that the Hidden Markov Model can successfully recover the transition probabilities matrix for the hidden sentiment process of the Markov Chain type. We also demonstrate that the Recurrent Neural Network can successfully recover the hidden sentiment states from the observed simulated stock price time series.
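
    A minimal sketch of the Hidden Markov Model step described: a two-state Markov "sentiment" chain generates synthetic returns, and a Gaussian HMM is fit to the observations to recover the transition matrix and decode the hidden states. The use of the hmmlearn library and the toy return distributions are assumptions of this sketch, not the authors' setup.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

# Synthetic two-state sentiment chain (bullish / bearish) with assumed parameters
TRANS = np.array([[0.95, 0.05],
                  [0.10, 0.90]])
MEANS, STDS = [0.001, -0.001], [0.01, 0.02]

states, s = [], 0
for _ in range(5000):
    states.append(s)
    s = rng.choice(2, p=TRANS[s])
states = np.array(states)
returns = rng.normal(np.take(MEANS, states), np.take(STDS, states)).reshape(-1, 1)

# Fit a 2-state Gaussian HMM to the observed "price returns"
hmm = GaussianHMM(n_components=2, covariance_type="diag", n_iter=200, random_state=0)
hmm.fit(returns)

print("estimated transition matrix:\n", np.round(hmm.transmat_, 3))
print("estimated state means:", np.round(hmm.means_.ravel(), 4))
hidden = hmm.predict(returns)          # Viterbi decoding of the sentiment path
print("fraction of time in state 0:", np.round(np.mean(hidden == 0), 3))
```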

  2. Splitting algorithm for numerical simulation of Li-ion battery electrochemical processes

    NASA Astrophysics Data System (ADS)

    Iliev, Oleg; Nikiforova, Marina A.; Semenov, Yuri V.; Zakharov, Petr E.

    2017-11-01

    In this paper we present a splitting algorithm for the numerical simulation of Li-ion battery electrochemical processes. A Li-ion battery consists of three domains: anode, cathode and electrolyte. The mathematical model of the electrochemical processes is described on a microscopic scale and contains nonlinear equations for concentration and potential in each domain. At the electrode-electrolyte interfaces, lithium-ion intercalation and deintercalation take place and are described by the nonlinear Butler-Volmer equation. For the spatial approximation we use finite element methods with discontinuous Galerkin elements. To simplify the numerical simulations we develop a splitting algorithm that splits the original problem into three independent subproblems. We investigate the numerical convergence of the algorithm on a 2D model problem.
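
    For reference, the interfacial condition mentioned here is the Butler-Volmer relation between interfacial current density and overpotential; the short sketch below evaluates it with assumed, illustrative values of the exchange current density and transfer coefficients (in the full battery model the overpotential itself depends on the local concentrations and potentials).

```python
import numpy as np

F = 96485.0      # C/mol, Faraday constant
R = 8.314        # J/(mol K), gas constant
T = 298.15       # K, temperature

def butler_volmer(eta, j0=1.0, alpha_a=0.5, alpha_c=0.5):
    """Interfacial current density [A/m^2] as a function of overpotential eta [V]."""
    return j0 * (np.exp(alpha_a * F * eta / (R * T))
                 - np.exp(-alpha_c * F * eta / (R * T)))

# Example: sweep the overpotential across the electrode-electrolyte interface
for eta in (-0.05, -0.01, 0.0, 0.01, 0.05):
    print(f"eta = {eta:+.2f} V   j = {butler_volmer(eta):+10.3f} A/m^2")
```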

  3. [Kinetic model and simulation of the adsorption-biofilm theory for the process of biopurifying VOC waste gases].

    PubMed

    Sun, Peishi; Huang, Bing; Huang, Ruohua; Yang, Ping

    2002-05-01

    For the process of biopurifying waste gas containing low-concentration VOCs in a biological trickling filter, the kinetic model and simulation of the new adsorption-biofilm theory were investigated in this study. Comparison and validation against laboratory and industrial test data indicated that the model has good applicability for describing the practical bio-purification of VOC waste gas. In the simulation study of the main influencing factors, such as the toluene concentration in the inlet gas, the gas flow rate and the height of the biofilm packing, good agreement was shown between calculated and test data, with correlation coefficients in the range 0.80-0.97.

  4. Efficient Numerical Simulation of Aerothermoelastic Hypersonic Vehicles

    NASA Astrophysics Data System (ADS)

    Klock, Ryan J.

    Hypersonic vehicles operate in a high-energy flight environment characterized by high dynamic pressures, high thermal loads, and non-equilibrium flow dynamics. This environment induces strong fluid, thermal, and structural dynamics interactions that are unique to this flight regime. If these vehicles are to be effectively designed and controlled, then a robust and intuitive understanding of each of these disciplines must be developed not only in isolation, but also when coupled. Limitations on scaling and the availability of adequate test facilities mean that physical investigation is infeasible. Ever growing computational power offers the ability to perform elaborate numerical simulations, but also has its own limitations. The state of the art in numerical simulation is either to create ever more high-fidelity physics models that do not couple well and require too much processing power to consider more than a few seconds of flight, or to use low-fidelity analytical models that can be tightly coupled and processed quickly, but do not represent realistic systems due to their simplifying assumptions. Reduced-order models offer a middle ground by distilling the dominant trends of high-fidelity training solutions into a form that can be quickly processed and more tightly coupled. This thesis presents a variably coupled, variable-fidelity, aerothermoelastic framework for the simulation and analysis of high-speed vehicle systems using analytical, reduced-order, and surrogate modeling techniques. Full launch-to-landing flights of complete vehicles are considered and used to define flight envelopes with aeroelastic, aerothermal, and thermoelastic limits, tune in-the-loop flight controllers, and inform future design considerations. A partitioned approach to vehicle simulation is considered in which regions dominated by particular combinations of processes are made separate from the overall solution and simulated by a specialized set of models to improve overall processing speed and overall solution fidelity. A number of enhancements to this framework are made through 1. the implementation of a publish-subscribe code architecture for rapid prototyping of physics and process models. 2. the implementation of a selection of linearization and model identification methods including high-order pseudo-time forward difference, complex-step, and direct identification from ordinary differential equation inspection. 3. improvements to the aeroheating and thermal models with non-equilibrium gas dynamics and generalized temperature dependent material thermal properties. A variety of model reduction and surrogate model techniques are applied to a representative hypersonic vehicle on a terminal trajectory to enable complete aerothermoelastic flight simulations. Multiple terminal trajectories of various starting altitudes and Mach numbers are optimized to maximize final kinetic energy of the vehicle upon reaching the surface. Surrogate models are compared to represent the variation of material thermal properties with temperature. A new method is developed and shown to be both accurate and computationally efficient. While the numerically efficient simulation of high-speed vehicles is developed within the presented framework, the goal of real time simulation is hampered by the necessity of multiple nested convergence loops. An alternative all-in-one surrogate model method is developed based on singular-value decomposition and regression that is near real time. 
Finally, the aeroelastic stability of pressurized cylindrical shells is investigated in the context of a maneuvering axisymmetric high-speed vehicle. Moderate internal pressurization is numerically shown to decrease stability, an effect shown experimentally in the literature but not well reproduced analytically. Insights are drawn from time-simulation results and used to inform approaches for future vehicle model development.

  5. A Standard Kinematic Model for Flight Simulation at NASA Ames

    NASA Technical Reports Server (NTRS)

    Mcfarland, R. E.

    1975-01-01

    A standard kinematic model for aircraft simulation exists at NASA-Ames on a variety of computer systems, one of which is used to control the flight simulator for advanced aircraft (FSAA). The derivation of the kinematic model is given and various mathematical relationships are presented as a guide. These include descriptions of standardized simulation subsystems such as the atmospheric turbulence model and the generalized six-degrees-of-freedom trim routine, as well as an introduction to the emulative batch-processing system which enables this facility to optimize its real-time environment.

  6. Global Sensitivity Analysis for Process Identification under Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.

    2015-12-01

    The environmental system consists of various physical, chemical, and biological processes, and environmental models are always built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, process identification is always based on a deterministic process conceptualization that uses a single model to represent a process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring the model uncertainty in process identification may lead to biased identification, in that processes identified as important may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concept of Sobol sensitivity analysis and model averaging. Similar to Sobol sensitivity analysis for identifying important parameters, our new method evaluates the variance change when a process is fixed at its different conceptualizations. The variance considers both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic study of groundwater modeling that considers a recharge process and a parameterization process. Each process has two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general, and can be applied to a wide range of environmental problems.
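
    A toy sketch of the concept (not the paper's groundwater application): each of two processes has two alternative conceptualizations sampled with equal model-averaging weight, and a Sobol-like first-order index over the model choice is computed from the variance of the model-conditional means. The models, parameters, and output below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Random parameters for two processes (toy example)
p = rng.uniform(0.0, 1.0, N)        # recharge parameter
k = rng.uniform(0.0, 1.0, N)        # hydraulic-conductivity parameter
mr = rng.integers(0, 2, N)          # recharge-model choice (2 alternatives, equal weight)
mg = rng.integers(0, 2, N)          # parameterization-model choice (2 alternatives)

# Two alternative conceptualizations per process (invented)
recharge = np.where(mr == 0, 0.5 + 2.0 * p, 1.5 * np.sqrt(p))
conduct = np.where(mg == 0, 10.0 ** k, 5.0 + 3.0 * k)
y = 2.0 * recharge + 0.3 * conduct          # toy system response

var_total = y.var()
# Sobol-like process index: variance of the means conditioned on the model choice
for name, choice in (("recharge process", mr), ("parameterization process", mg)):
    cond_means = np.array([y[choice == m].mean() for m in (0, 1)])
    print(f"{name:25s}: index over model choice = {cond_means.var() / var_total:.3f}")
```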

  7. A Distributed Snow Evolution Modeling System (SnowModel)

    NASA Astrophysics Data System (ADS)

    Liston, G. E.; Elder, K.

    2004-12-01

    A spatially distributed snow-evolution modeling system (SnowModel) has been specifically designed to be applicable over a wide range of snow landscapes, climates, and conditions. To reach this goal, SnowModel is composed of four sub-models: MicroMet defines the meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowMass simulates snow depth and water-equivalent evolution, and SnowTran-3D accounts for snow redistribution by wind. While other distributed snow models exist, SnowModel is unique in that it includes a well-tested blowing-snow sub-model (SnowTran-3D) for application in windy arctic, alpine, and prairie environments where snowdrifts are common. These environments comprise 68% of the seasonally snow-covered Northern Hemisphere land surface. SnowModel also accounts for snow processes occurring in forested environments (e.g., canopy interception related processes). SnowModel is designed to simulate snow-related physical processes occurring at spatial scales of 5-m and greater, and temporal scales of 1-hour and greater. These include: accumulation from precipitation; wind redistribution and sublimation; loading, unloading, and sublimation within forest canopies; snow-density evolution; and snowpack ripening and melt. To enhance its wide applicability, SnowModel includes the physical calculations required to simulate snow evolution within each of the global snow classes defined by Sturm et al. (1995), e.g., tundra, taiga, alpine, prairie, maritime, and ephemeral snow covers. The three, 25-km by 25-km, Cold Land Processes Experiment (CLPX) mesoscale study areas (MSAs: Fraser, North Park, and Rabbit Ears) are used as SnowModel simulation examples to highlight model strengths, weaknesses, and features in forested, semi-forested, alpine, and shrubland environments.

  8. Technology for Transient Simulation of Vibration during Combustion Process in Rocket Thruster

    NASA Astrophysics Data System (ADS)

    Zubanov, V. M.; Stepanov, D. V.; Shabliy, L. S.

    2018-01-01

    The article describes a technology for the simulation of transient combustion processes in a rocket thruster, used to determine the vibration frequencies that occur during combustion. The engine operates on gaseous propellants: oxygen and hydrogen. Combustion simulation was performed using the ANSYS CFX software. Three reaction mechanisms for the stationary mode were considered and described in detail. A way to obtain quick CFD results with intermediate combustion components using an EDM model was found. A way to generate the Flamelet library with CFX-RIF was described. A technique for modeling transient combustion processes in the rocket thruster was proposed based on the Flamelet library. A cyclic irregularity of the temperature field, resembling vortex core precession, was detected in the chamber. The frequency of flame precession was obtained with the proposed simulation technique.

  9. WEST-3 wind turbine simulator development

    NASA Technical Reports Server (NTRS)

    Hoffman, J. A.; Sridhar, S.

    1985-01-01

    The software developed for WEST-3, a new, all-digital, and fully programmable wind turbine simulator, is presented. The process of wind turbine simulation on WEST-3 is described in detail. The major steps are the processing of the mathematical models, the preparation of the constant data, and the use of system-software-generated executable code for running on WEST-3. The mechanics of reformulation, normalization, and scaling of the mathematical models are discussed in detail, in particular the significance of reformulation, which leads to accurate simulations. Descriptions are given for the preprocessor computer programs which are used to prepare the constant data needed in the simulation. These programs, in addition to scaling and normalizing all the constants, relieve the user from having to generate a large number of constants used in the simulation. Also given are brief descriptions of the components of the WEST-3 system software: Translator, Assembler, Linker, and Loader. Also included are details of the aeroelastic rotor analysis, which is the center of a wind turbine simulation model; an analysis of the gimbal subsystem; and listings of the variables, constants, and equations used in the simulation.

  10. Modifier mass transfer kinetic effect in the performance of solvent gradient simulated moving bed (SG-SMB) process

    NASA Astrophysics Data System (ADS)

    Câmara, L. D. T.

    2015-09-01

    The solvent-gradient simulated moving bed (SG-SMB) process is the new trend for performance improvement compared to traditional isocratic solvent conditions. In the SG-SMB separation process, modulation of the solvent strength leads to a significant increase in purities and productivity, together with a reduction in solvent consumption. A stepwise modelling approach was used to represent the interconnected chromatographic columns of the system, combined with lumped mass transfer models between the solid and liquid phases. The influence of the solvent modifier was considered by applying the Abel model, which takes into account the effect of the modifier volume fraction on the partition coefficient. The modelling and simulations were carried out and compared to the experimental SG-SMB separation of the amino acids phenylalanine and tryptophan. A lumped mass transfer kinetic model was applied to both the modifier (ethanol) and the solutes. The simulation results showed that such simple, global mass transfer models are enough to represent the mass transfer effects between the solid adsorbent and the liquid phase. The separation performance can be improved by reducing the interaction, or mass transfer kinetic effect, between the solid adsorbent phase and the modifier. The simulations fitted the experimental amino acid concentrations in both the extract and the raffinate with good agreement.
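
    A hedged sketch of the lumped mass-transfer idea for a single chromatographic column (not the interconnected SG-SMB column network): the solute follows a linear-driving-force rate law toward an equilibrium set by a partition coefficient that decreases with the modifier volume fraction, standing in for the Abel correlation. All rate constants, the column discretization, and the modifier dependence are assumptions.

```python
import numpy as np

# Single-column, N-cell lumped model (a sketch, not the full SG-SMB loop)
N_CELL, EPS = 30, 0.4          # number of cells, bed porosity
V_CELL, Q_FLOW = 1.0, 0.5      # cell volume [mL], flow rate [mL/s]
K_LDF = 0.05                   # lumped mass-transfer (LDF) coefficient [1/s]

def partition(phi, K0=5.0, a=8.0):
    """Assumed modifier dependence of the partition coefficient (stands in for the Abel model)."""
    return K0 / (1.0 + a * phi)

c = np.zeros(N_CELL)           # solute concentration in the liquid phase
q = np.zeros(N_CELL)           # solute loading on the adsorbent
dt, t_end = 0.5, 4000.0
feed_c, feed_phi = 1.0, 0.05   # feed solute concentration and modifier volume fraction

t = 0.0
while t < t_end:
    phi = feed_phi if t < 2000.0 else 0.25          # step gradient in the modifier
    c_in = feed_c if t < 500.0 else 0.0             # finite feed pulse
    dq = K_LDF * (partition(phi) * c - q)           # linear driving force toward equilibrium
    conv = Q_FLOW / (EPS * V_CELL) * (np.concatenate(([c_in], c[:-1])) - c)
    c += dt * (conv - (1 - EPS) / EPS * dq)
    q += dt * dq
    t += dt

print(f"solute remaining on the column: {q.sum():.3f} (arbitrary units)")
```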

  11. Carbon stocks and fluxes in the high latitudes: using site-level data to evaluate Earth system models

    NASA Astrophysics Data System (ADS)

    Chadburn, Sarah E.; Krinner, Gerhard; Porada, Philipp; Bartsch, Annett; Beer, Christian; Belelli Marchesini, Luca; Boike, Julia; Ekici, Altug; Elberling, Bo; Friborg, Thomas; Hugelius, Gustaf; Johansson, Margareta; Kuhry, Peter; Kutzbach, Lars; Langer, Moritz; Lund, Magnus; Parmentier, Frans-Jan W.; Peng, Shushi; Van Huissteden, Ko; Wang, Tao; Westermann, Sebastian; Zhu, Dan; Burke, Eleanor J.

    2017-11-01

    It is important that climate models can accurately simulate the terrestrial carbon cycle in the Arctic due to the large and potentially labile carbon stocks found in permafrost-affected environments, which can lead to a positive climate feedback, along with the possibility of future carbon sinks from northward expansion of vegetation under climate warming. Here we evaluate the simulation of tundra carbon stocks and fluxes in three land surface schemes that each form part of major Earth system models (JSBACH, Germany; JULES, UK; ORCHIDEE, France). We use a site-level approach in which comprehensive, high-frequency datasets allow us to disentangle the importance of different processes. The models have improved physical permafrost processes and there is a reasonable correspondence between the simulated and measured physical variables, including soil temperature, soil moisture and snow. We show that if the models simulate the correct leaf area index (LAI), the standard C3 photosynthesis schemes produce the correct order of magnitude of carbon fluxes. Therefore, simulating the correct LAI is one of the first priorities. LAI depends quite strongly on climatic variables alone, as we see by the fact that the dynamic vegetation model can simulate most of the differences in LAI between sites, based almost entirely on climate inputs. However, we also identify an influence from nutrient limitation as the LAI becomes too large at some of the more nutrient-limited sites. We conclude that including moss as well as vascular plants is of primary importance to the carbon budget, as moss contributes a large fraction to the seasonal CO2 flux in nutrient-limited conditions. Moss photosynthetic activity can be strongly influenced by the moisture content of moss, and the carbon uptake can be significantly different from vascular plants with a similar LAI. The soil carbon stocks depend strongly on the rate of input of carbon from the vegetation to the soil, and our analysis suggests that an improved simulation of photosynthesis would also lead to an improved simulation of soil carbon stocks. However, the stocks are also influenced by soil carbon burial (e.g. through cryoturbation) and the rate of heterotrophic respiration, which depends on the soil physical state. More detailed below-ground measurements are needed to fully evaluate biological and physical soil processes. Furthermore, even if these processes are well modelled, the soil carbon profiles cannot resemble peat layers as peat accumulation processes are not represented in the models. Thus, we identify three priority areas for model development: (1) dynamic vegetation including (a) climate and (b) nutrient limitation effects; (2) adding moss as a plant functional type; and an (3) improved vertical profile of soil carbon including peat processes.

  12. Theory, modeling, and simulation of structural and functional materials: Micromechanics, microstructures, and properties

    NASA Astrophysics Data System (ADS)

    Jin, Yongmei

    In recent years, theoretical modeling and computational simulation of microstructure evolution and materials properties have been attracting much attention. While significant advances have been made, two major challenges remain. One is the integration of multiple physical phenomena for simulation of complex materials behavior; the other is bridging over multiple length and time scales in materials modeling and simulation. The research presented in this Thesis is focused mainly on tackling the first major challenge. In this Thesis, a unified Phase Field Microelasticity (PFM) approach is developed. This approach is an advanced version of the phase field method that takes into account the exact elasticity of arbitrarily anisotropic, elastically and structurally inhomogeneous systems. The proposed theory and models are applicable to infinite solids, elastic half-space, and finite bodies with arbitrary-shaped free surfaces, which may undergo various concomitant physical processes. The Phase Field Microelasticity approach is employed to formulate the theories and models of martensitic transformation, dislocation dynamics, and crack evolution in single crystal and polycrystalline solids. It is also used to study strain relaxation in heteroepitaxial thin films through misfit dislocation and surface roughening. Magnetic domain evolution in nanocrystalline thin films is also investigated. Numerous simulation studies are performed. Comparisons with analytical predictions and experimental observations are presented. The agreement verifies the theory and models as realistic simulation tools for computational materials science and engineering. The same Phase Field Microelasticity formalism of individual models of different physical phenomena makes it easy to integrate multiple physical processes into one unified simulation model, where multiple phenomena are treated as various relaxation modes that together act as one common cooperative phenomenon. The model does not impose a priori constraints on possible microstructure evolution paths. This gives the model predictive power, where the material system itself "chooses" the optimal path for multiple processes. The advances made in this Thesis present a significant step forward to overcome the first challenge, mesoscale multi-physics modeling and simulation of materials. At the end of this Thesis, the way to tackle the second challenge, bridging over multiple length and time scales in materials modeling and simulation, is discussed based on the connection between mesoscale Phase Field Microelasticity modeling and microscopic atomistic calculation as well as macroscopic continuum theory.

  13. Strategies for Large Scale Implementation of a Multiscale, Multiprocess Integrated Hydrologic Model

    NASA Astrophysics Data System (ADS)

    Kumar, M.; Duffy, C.

    2006-05-01

    Distributed models simulate hydrologic state variables in space and time while taking into account the heterogeneities in terrain, surface, subsurface properties and meteorological forcings. The computational cost and complexity associated with these models increase with their tendency to accurately simulate the large number of interacting physical processes at fine spatio-temporal resolution in a large basin. A hydrologic model run on a coarse spatial discretization of the watershed with a limited number of physical processes needs a smaller computational load. But this negatively affects the accuracy of model results and restricts physical realization of the problem. So it is imperative to have an integrated modeling strategy (a) which can be universally applied at various scales in order to study the tradeoffs between computational complexity (determined by spatio-temporal resolution), accuracy and predictive uncertainty in relation to various approximations of physical processes, (b) which can be applied at adaptively different spatial scales in the same domain by taking into account the local heterogeneity of topography and hydrogeologic variables, and (c) which is flexible enough to incorporate different numbers and approximations of process equations depending on model purpose and computational constraints. An efficient implementation of this strategy becomes all the more important for the Great Salt Lake river basin, which is relatively large (~89000 sq. km) and complex in terms of hydrologic and geomorphic conditions. Also, the types and the time scales of hydrologic processes which are dominant in different parts of the basin are different. Part of the snowmelt runoff generated in the Uinta Mountains infiltrates and contributes as base flow to the Great Salt Lake over a time scale of decades to centuries. The adaptive strategy helps capture the steep topographic and climatic gradient along the Wasatch front. Here we present the aforesaid modeling strategy along with an associated hydrologic modeling framework which facilitates a seamless, computationally efficient and accurate integration of the process model with the data model. The flexibility of this framework leads to implementation of multiscale, multiresolution, adaptive refinement/de-refinement and nested modeling simulations with the least computational burden. However, performing these simulations and the related calibration of these models over a large basin at higher spatio-temporal resolutions is computationally intensive and requires the use of increasing computing power. With the advent of parallel processing architectures, high computing performance can be achieved by parallelization of existing serial integrated-hydrologic-model code. This translates to running the same model simulation on a network of a large number of processors, thereby reducing the time needed to obtain a solution. The paper also discusses the implementation of the integrated model on parallel processors. Also discussed are the mapping of the problem onto a multi-processor environment, methods to incorporate coupling between hydrologic processes using interprocessor communication models, the model data structure, and parallel numerical algorithms to obtain high performance.

  14. Development and testing of a fast conceptual river water quality model.

    PubMed

    Keupers, Ingrid; Willems, Patrick

    2017-04-15

    Modern, model-based river quality management relies strongly on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too computationally demanding, especially when simulating the long time periods needed for statistical analysis of the results, or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method to set up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactors (PFR) and Continuously Stirred Tank Reactors (CSTR) to describe the advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with adjusted residence times based on the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation time (a factor of 10^5) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs. Copyright © 2017 Elsevier Ltd. All rights reserved.
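
    A minimal sketch of the conceptual building block described above, assuming a chain of CSTRs with a single first-order transformation; the residence time tau, the decay rate, and the boundary pulse are illustrative placeholders rather than the calibrated values derived from the hydrodynamic model.

      import numpy as np

      def simulate_cstr_chain(c_in, tau, k_decay, dt, nsteps, n_tanks=3):
          """Outlet concentration of a cascade of CSTRs with first-order decay."""
          c = np.zeros(n_tanks)                  # concentration in each conceptual reservoir
          outlet = []
          for step in range(nsteps):
              upstream = c_in(step * dt)
              for i in range(n_tanks):
                  inflow_c = upstream if i == 0 else c[i - 1]
                  # dC/dt = (C_in - C)/tau - k*C  (dilution/advection plus transformation)
                  c[i] += dt * ((inflow_c - c[i]) / tau - k_decay * c[i])
              outlet.append(c[-1])
          return outlet

      # usage: a 2-hour pollutant pulse entering the upstream boundary (mg/L)
      pulse = lambda t: 5.0 if t < 7200.0 else 0.0
      series = simulate_cstr_chain(pulse, tau=3600.0, k_decay=1e-4, dt=60.0, nsteps=600)
      print(f"peak outlet concentration: {max(series):.2f} mg/L")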

  15. A parallel computational model for GATE simulations.

    PubMed

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently to Positron Emission Tomography (PET) experiments, because it requires centralized coincidence processing and incurs large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing while maintaining a centralized event and time coordinator. The model is implemented through a new set of factory classes that allow the same executable to run in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model, in terms of the number of tallies, is statistically equivalent (but not identical) to its sequential counterpart. A computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
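
    The following hedged sketch (plain Python, not GATE/Geant4 code) illustrates the stated division of labour: workers generate time-stamped singles independently, while a single coordinator merges the streams and performs coincidence processing. The detector layout, event counts, and coincidence window are invented for the example.

      import random
      from multiprocessing import Pool

      def generate_singles(seed, n=200000, t_max=1.0):
          """Decentralised event generation: time-stamped singles from one worker."""
          rng = random.Random(seed)
          return sorted((rng.uniform(0.0, t_max), rng.randrange(64)) for _ in range(n))

      def find_coincidences(singles, window=10e-9):
          """Centralised coincidence processing on the merged, time-ordered stream."""
          pairs = []
          for a, b in zip(singles, singles[1:]):
              if b[0] - a[0] <= window and a[1] != b[1]:
                  pairs.append((a, b))
          return pairs

      if __name__ == "__main__":
          with Pool(4) as pool:
              per_worker = pool.map(generate_singles, range(4))   # parallel generation
          merged = sorted(s for worker in per_worker for s in worker)
          print(len(find_coincidences(merged)), "prompt coincidences")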

  16. Effects of in-sewer processes: a stochastic model approach.

    PubMed

    Vollertsen, J; Nielsen, A H; Yang, W; Hvitved-Jacobsen, T

    2005-01-01

    Transformations of organic matter, nitrogen and sulfur in sewers can be simulated by taking into account the relevant transformation and transport processes. One objective of such simulations is the assessment and management of hydrogen sulfide formation and corrosion. Sulfide is formed in the biofilms and sediments of the water phase, but corrosion occurs on the moist surfaces of the sewer gas phase; consequently, both phases and the transport of volatile substances between them must be included. Furthermore, wastewater composition and transformations in sewers are complex and subject to high natural variability. This paper presents the latest developments of the WATS model concept, allowing integrated aerobic, anoxic and anaerobic simulation of the water phase as well as of gas phase processes. The resulting model is complex and subject to high parameter variability. An example applying stochastic modeling shows how this complexity and variability can be taken into account.
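
    As a hedged illustration of the stochastic treatment of parameter variability, the sketch below samples an assumed sulfide formation rate, biofilm area-to-volume ratio, and residence time from plausible distributions and propagates them to a distribution of sulfide build-up; the rate law and parameter ranges are placeholders, not the calibrated WATS formulation.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 5000
      # assumed (not calibrated) parameter distributions
      k_sulfide = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)  # g S/(m2 h), biofilm rate
      area_per_volume = rng.uniform(2.0, 4.0, size=n)                 # m2 biofilm per m3 wastewater
      residence_h = rng.uniform(1.0, 6.0, size=n)                     # travel time in the sewer, h

      sulfide = k_sulfide * area_per_volume * residence_h             # g S/m3 formed in transit
      print(f"median {np.median(sulfide):.1f} g S/m3, "
            f"95th percentile {np.percentile(sulfide, 95):.1f} g S/m3")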

  17. Fermi-level effects in semiconductor processing: A modeling scheme for atomistic kinetic Monte Carlo simulators

    NASA Astrophysics Data System (ADS)

    Martin-Bragado, I.; Castrillo, P.; Jaraiz, M.; Pinacho, R.; Rubio, J. E.; Barbolla, J.; Moroz, V.

    2005-09-01

    Atomistic process simulation is expected to play an important role in the development of the next generations of integrated circuits. This work describes an approach for modeling electric charge effects in a three-dimensional atomistic kinetic Monte Carlo process simulator. The proposed model has been applied to the diffusion of electrically active boron and arsenic atoms in silicon. Several key aspects of the underlying physical mechanisms are discussed: (i) the use of the local Debye length to smooth out the atomistic point-charge distribution, (ii) algorithms to correctly update the charge state in a physically accurate and computationally efficient way, and (iii) an efficient implementation of the drift of charged particles in an electric field. High-concentration effects such as band-gap narrowing and degenerate statistics are also taken into account. The efficiency, accuracy, and relevance of the model are discussed.
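
    A much-simplified, hedged sketch of two of the ingredients named above: selecting a defect charge state from the local Fermi level, and selecting the next event in a kinetic Monte Carlo step. The acceptor level, attempt frequency, and migration barriers are placeholder numbers, and degenerate statistics and band-gap narrowing are ignored.

      import math
      import random

      kT = 0.0259  # eV at 300 K

      def p_negative(e_fermi, acceptor_level=0.45):
          """Occupation probability of a single (hypothetical) acceptor level,
          i.e. the chance the defect is negatively charged at this Fermi level."""
          return 1.0 / (1.0 + math.exp((acceptor_level - e_fermi) / kT))

      def kmc_select(rates, rng=random):
          """BKL selection: pick an event with probability proportional to its rate
          and draw the corresponding time increment."""
          total = sum(rates)
          r = rng.random() * total
          acc = 0.0
          for i, rate in enumerate(rates):
              acc += rate
              if r <= acc:
                  return i, -math.log(rng.random()) / total
          return len(rates) - 1, -math.log(rng.random()) / total

      # placeholder attempt frequency and migration barriers (eV)
      hop_rates = [1e13 * math.exp(-barrier / kT) for barrier in (0.60, 0.70, 0.65)]
      print(p_negative(0.55), kmc_select(hop_rates))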

  18. Atmospheric Modeling And Sensor Simulation (AMASS) study

    NASA Technical Reports Server (NTRS)

    Parker, K. G.

    1984-01-01

    The capabilities of the atmospheric modeling and sensor simulation (AMASS) system were studied with a view to enhancing them. This system is used to process atmospheric measurements that are utilized in evaluating sensor performance, conducting design-concept simulation studies, and modeling the physical and dynamical nature of atmospheric processes. The study enumerates the tasks proposed both to enhance utilization of the AMASS system and to integrate it with other existing equipment so as to facilitate the analysis of data for modeling and image processing. The following array processors were evaluated for anticipated effectiveness and/or improvements in throughput when attached to the P-e: (1) Floating Point Systems AP-120B; (2) Floating Point Systems 5000; (3) CSP, Inc. MAP-400; (4) Analogic AP500; (5) Numerix MARS-432; and (6) Star Technologies, Inc. ST-100.

  19. (abstract) Generic Modeling of a Life Support System for Process Technology Comparisons

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.

  20. Multislice spiral CT simulator for dynamic cardiopulmonary studies

    NASA Astrophysics Data System (ADS)

    De Francesco, Silvia; Ferreira da Silva, Augusto M.

    2002-04-01

    We have developed a multi-slice spiral CT simulator that models the acquisition process of a real tomograph over a 4-dimensional phantom (4D MCAT) of the human thorax. The simulator allows us to visually characterize artifacts due to insufficient temporal sampling and to evaluate a priori the quality of the images obtained in cardio-pulmonary studies (with both single-/multi-slice and ECG-gated acquisition processes). The simulating environment allows for both conventional and spiral scanning modes and includes a model of noise in the acquisition process. In the case of spiral scanning, the reconstruction facilities include longitudinal interpolation methods (360LI and 180LI, both for single and multi-slice); the section is then reconstructed through filtered back-projection (FBP). The reconstructed images/volumes are affected by distortion due to insufficient temporal sampling of the moving object. The developed simulating environment allows us to investigate the nature of this distortion and to characterize it qualitatively and quantitatively (using, for example, Herman's measures). Much of our work is focused on the determination of adequate temporal sampling and sinogram regularization techniques. At the moment, the simulator is limited to the multi-slice tomograph case; extension to cone-beam or area detectors is planned as the next development step.
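
    As a hedged sketch of the 360LI longitudinal interpolation mentioned above (single-slice case, not the simulator's code), the fragment below interpolates, for each projection angle, between the two spiral readings one rotation apart that bracket the desired slice position; the pitch, detector count, and synthetic spiral data are assumptions for the example.

      import numpy as np

      n_rot, n_angles, n_det = 8, 360, 128
      pitch = 5.0                               # table feed per rotation (mm), assumed
      # synthetic spiral data: shape (rotation, angle, detector),
      # with z = (rotation + angle/n_angles) * pitch for each reading
      spiral = np.random.rand(n_rot, n_angles, n_det)

      def interp_360li(spiral, z_slice, pitch):
          """360LI: per angle, linearly interpolate between the two spiral readings
          one rotation apart that bracket the requested slice position."""
          n_rot, n_angles, n_det = spiral.shape
          sino = np.empty((n_angles, n_det))
          for a in range(n_angles):
              z_a = (np.arange(n_rot) + a / n_angles) * pitch   # z of this angle per rotation
              k = np.clip(np.searchsorted(z_a, z_slice) - 1, 0, n_rot - 2)
              w = (z_slice - z_a[k]) / (z_a[k + 1] - z_a[k])    # linear weight
              sino[a] = (1.0 - w) * spiral[k, a] + w * spiral[k + 1, a]
          return sino                                           # ready for FBP

      sino = interp_360li(spiral, z_slice=20.0, pitch=pitch)
      print(sino.shape)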
