Hydrological and water quality processes simulation by the integrated MOHID model
NASA Astrophysics Data System (ADS)
Epelde, Ane; Antiguedad, Iñaki; Brito, David; Eduardo, Jauch; Neves, Ramiro; Sauvage, Sabine; Sánchez-Pérez, José Miguel
2016-04-01
Different modelling approaches have been used in recent decades to study water quality degradation caused by non-point source pollution. In this study, the MOHID fully distributed and physics-based model was employed to simulate hydrological processes and nitrogen dynamics in a nitrate vulnerable zone: the Alegria River watershed (Basque Country, Northern Spain). The results indicate that the MOHID code is suitable for simulating hydrological processes at the watershed scale, as the model shows satisfactory performance in simulating discharge (NSE: 0.74 and 0.76 during the calibration and validation periods, respectively). The agronomical component of the code allowed the simulation of agricultural practices, which led to adequate crop yield simulation. Furthermore, nitrogen export also shows satisfactory performance (NSE: 0.64 and 0.69 during the calibration and validation periods, respectively). While the lack of field measurements does not allow an in-depth evaluation of the nutrient cycling processes, the MOHID model simulates annual denitrification within the general ranges established for agricultural watersheds (in this study, 9 kg N ha-1 year-1). In addition, the model coherently simulated the spatial distribution of the denitrification process, which is directly linked to the simulated hydrological conditions: the highest rates were located near the discharge zone of the aquifer and where the aquifer thickness is low. These results demonstrate the strength of this model for simulating watershed-scale hydrological processes as well as crop production and the water quality degradation derived from agricultural activity (considering both nutrient export and nutrient cycling processes).
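The discharge and nitrogen-export skill reported above is quantified with the Nash-Sutcliffe Efficiency (NSE). As a point of reference, here is a minimal sketch of how NSE is typically computed from paired observed and simulated series; the series below are illustrative values, not data from the Alegria watershed study.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Illustrative daily discharge values (m^3/s); not taken from the study.
obs = np.array([1.2, 1.5, 2.1, 3.4, 2.8, 2.0, 1.7])
sim = np.array([1.1, 1.6, 2.3, 3.1, 2.9, 2.2, 1.6])
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")  # values near 1 indicate good agreement
```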
A framework of knowledge creation processes in participatory simulation of hospital work systems.
Andersen, Simone Nyholm; Broberg, Ole
2017-04-01
Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the hidden steps of PS that are essential when planning and facilitating PS that aims at designing work systems. Practitioner Summary: When facilitating participatory simulation (PS) in work system design, achieving an understanding of the PS process is essential. By applying a knowledge creation perspective and process mining, we investigated the knowledge-creating activities constituting the PS process. The analysis resulted in a framework of the knowledge-creating process in PS.
Process simulation and dynamic control for marine oily wastewater treatment using UV irradiation.
Jing, Liang; Chen, Bing; Zhang, Baiyu; Li, Pu
2015-09-15
UV irradiation and advanced oxidation processes have recently been regarded as promising solutions for removing polycyclic aromatic hydrocarbons (PAHs) from marine oily wastewater. However, such treatment methods are generally not sufficiently understood in terms of reaction mechanisms, process simulation, and process control. These deficiencies can drastically hinder their application in the shipping and offshore petroleum industries, which produce bilge/ballast water and produced water as the main streams of marine oily wastewater. In this study, a factorial design of experiments was carried out to investigate the degradation mechanism of a typical PAH, namely naphthalene, under UV irradiation in seawater. Based on the experimental results, a three-layer feed-forward artificial neural network simulation model was developed to simulate the treatment process and to forecast removal performance. A simulation-based dynamic mixed integer nonlinear programming (SDMINP) approach was then proposed to intelligently control the treatment process by integrating the developed simulation model, a genetic algorithm, and multi-stage programming. The applicability and effectiveness of the developed approach were further tested through a case study. The experimental results showed that the influences of fluence rate and temperature on the removal of naphthalene were greater than those of salinity and initial concentration. The developed simulation model could well predict the UV-induced removal process under varying conditions. The case study suggested that the SDMINP approach, with the aid of the multi-stage control strategy, was able to significantly reduce treatment cost compared to traditional single-stage process optimization. The developed approach and its concept/framework have high potential for application in other environmental fields where a treatment process is involved and experimentation and modeling are used for process simulation and control. Copyright © 2015 Elsevier Ltd. All rights reserved.
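The surrogate model described above is a three-layer feed-forward network mapping operating conditions (fluence rate, temperature, salinity, initial concentration) to naphthalene removal. Below is a minimal sketch of such a surrogate using scikit-learn; the network size, training data, and units are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Columns: fluence rate (mW/cm^2), temperature (C), salinity (%), initial conc. (mg/L).
# The numbers are synthetic placeholders for illustration only.
X = np.array([
    [1.0, 10, 2.0, 0.5],
    [2.0, 20, 3.0, 1.0],
    [3.0, 25, 3.5, 1.5],
    [1.5, 15, 2.5, 0.8],
    [2.5, 30, 3.0, 1.2],
])
y = np.array([0.45, 0.70, 0.85, 0.55, 0.80])  # fractional naphthalene removal

# Three-layer network: input layer -> one hidden layer -> output.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                 max_iter=5000, random_state=0),
)
model.fit(X, y)
print(model.predict([[2.2, 22, 3.0, 1.0]]))  # predicted removal for a new condition
```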
Software quality and process improvement in scientific simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, J.; Webster, R.
1997-11-01
This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.
ERIC Educational Resources Information Center
Neely, Pat; Tucker, Jan
2013-01-01
Purpose: Simulations are designed as activities which imitate real world scenarios and are often used to teach and enhance skill building. The purpose of this case study is to examine the decision making process and outcomes of a faculty committee tasked with examining simulations in the marketplace to determine if the simulations could be used as…
Shim, Sung J; Kumar, Arun; Jiao, Roger
2016-01-01
A hospital is considering deploying a radiofrequency identification (RFID) system and setting up a new "discharge lounge" to improve the patient discharge process. This study uses computer simulation to model and compare the current process and the new process, and it assesses the impact of the RFID system and the discharge lounge on the process in terms of resource utilization and time taken in the process. The simulation results regarding resource utilization suggest that the RFID system can slightly relieve the burden on all resources, whereas the RFID system and the discharge lounge together can significantly mitigate the nurses' tasks. The simulation results in terms of the time taken demonstrate that the RFID system can shorten patient wait times, staff busy times, and bed occupation times. The results of the study could prove helpful to others who are considering the use of an RFID system in the patient discharge process in hospitals or similar processes.
Quantitative computer simulations of extraterrestrial processing operations
NASA Technical Reports Server (NTRS)
Vincent, T. L.; Nikravesh, P. E.
1989-01-01
The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.
Simulation of beam-induced plasma in gas-filled rf cavities
Yu, Kwangmin; Samulyak, Roman; Yonehara, Katsuya; ...
2017-03-07
Processes occurring in a radio-frequency (rf) cavity, filled with high-pressure gas and interacting with proton beams, have been studied via advanced numerical simulations. The simulations support the experimental program on the hydrogen gas-filled rf cavity in the Mucool Test Area (MTA) at Fermilab, and broader research on the design of muon cooling devices. SPACE, a 3D electromagnetic particle-in-cell (EM-PIC) code with atomic physics support, was used in the simulation studies. Plasma dynamics in the rf cavity, including the process of neutral gas ionization by proton beams, plasma loading of the rf cavity, and atomic processes in plasma such as electron-ion and ion-ion recombination and electron attachment to dopant molecules, have been studied. Here, through comparison with experiments in the MTA, simulations quantified several uncertain plasma properties such as effective recombination rates and the attachment time of electrons to dopant molecules. Simulations have achieved very good agreement with experiments on plasma loading and related processes. Lastly, the experimentally validated code SPACE is capable of predictive simulations of muon cooling devices.
NASA Astrophysics Data System (ADS)
Singh, Swadesh Kumar; Kumar, D. Ravi
2005-08-01
Hydro-mechanical deep drawing is a process for producing cup-shaped parts with the assistance of a pressurized fluid. In the present work, numerical simulation of the conventional and counter-pressure deep drawing processes has been carried out using finite element method-based software. Simulation results were analyzed to study the improvement in drawability obtained by using hydro-mechanical processes. The thickness variations in the drawn cups were analyzed, and the effect of counter pressure and oil gap on the thickness distribution was studied. Numerical simulations were also used for the die design, which combines both drawing and ironing processes in a single operation. This modification in the die provides high drawability, facilitates smooth material flow, gives a more uniform thickness distribution, and corrects the shape distortion.
Wu, Kuo-Tsai; Hwang, Sheng-Jye; Lee, Huei-Huang
2017-01-01
Although wafer-level camera lenses are a very promising technology, problems such as warpage over time and non-uniform product thickness still exist. In this study, finite element simulation was performed to simulate the compression molding process, acquiring the pressure distribution on the product at the completion of the process and predicting the deformation with respect to that pressure distribution. Results show that the single-gate compression molding process significantly increases the pressure at the center of the product, whereas the multi-gate compression molding process can effectively distribute the pressure. This study evaluated the non-uniform thickness of the product and changes in the process parameters through computer simulations, which could help to improve the compression molding process. PMID:28617315
NASA Astrophysics Data System (ADS)
Abustan, M. S.; Rahman, N. A.; Gotoh, H.; Harada, E.; Talib, S. H. A.
2016-07-01
In Malaysia, little research on crowd evacuation simulation has been reported. Hence, the development of a numerical crowd evacuation process that takes into account people's behavioral patterns and psychological characteristics is crucial in Malaysia. Meanwhile, tsunami disasters began to gain the attention of Malaysian citizens after the 2004 Indian Ocean Tsunami, which demanded a quick evacuation process. In relation to the above circumstances, we have conducted simulations of the tsunami evacuation process at the Miami Beach of Penang Island using a Distinct Element Method (DEM)-based crowd behavior simulator. The main objectives are to investigate and reproduce the current conditions of the evacuation process at the said location under different hypothetical scenarios in order to study the efficiency of the evacuation. Sim-1 is the initial evacuation plan, while sim-2 is an improved evacuation plan with an additional evacuation area. From the simulation results, sim-2 has a shorter evacuation time than sim-1; the evacuation time was reduced by 53 seconds. The effect of the additional evacuation place is confirmed by the decrease in the evacuation completion time. At the same time, the numerical simulation may be promoted as an effective tool for studying the crowd evacuation process.
Persson, Johanna; Dalholm, Elisabeth Hornyánszky; Johansson, Gerd
2014-01-01
To demonstrate the use of visualization and simulation tools in order to involve stakeholders and inform the process in hospital change processes, illustrated by an empirical study from a children's emergency clinic. Reorganization and redevelopment of a hospital is a complex activity that involves many stakeholders and demands. Visualization and simulation tools have proven useful for involving practitioners and eliciting relevant knowledge. More knowledge is desired about how these tools can be implemented in practice for hospital planning processes. A participatory planning process including practitioners and researchers was executed over a 3-year period to evaluate a combination of visualization and simulation tools to involve stakeholders in the planning process and to elicit knowledge about needs and requirements. The initial clinic proposal from the architect was discarded as a result of the empirical study. Much general knowledge about the needs of the organization was extracted by means of the adopted tools. Some of the tools proved to be more accessible than others for the practitioners participating in the study. The combination of tools added value to the process by presenting information in alternative ways and eliciting questions from different angles. Visualization and simulation tools inform a planning process (or other types of change processes) by providing the means to see beyond present demands and current work structures. Long-term involvement in combination with accessible tools is central for creating a participatory setting where the practitioners' knowledge guides the process. © 2014 Vendome Group, LLC.
State of the evidence on simulation-based training for laparoscopic surgery: a systematic review.
Zendejas, Benjamin; Brydges, Ryan; Hamstra, Stanley J; Cook, David A
2013-04-01
Summarize the outcomes and best practices of simulation training for laparoscopic surgery. Simulation-based training for laparoscopic surgery has become a mainstay of surgical training, and much new evidence has accrued since previous reviews were published. We systematically searched the literature through May 2011 for studies evaluating simulation, in comparison with no intervention or an alternate training activity, for training health professionals in laparoscopic surgery. Outcomes were classified as satisfaction, knowledge, skills in a test setting (time to perform the task, process [eg, performance rating], and product [eg, knot strength]), and behaviors when caring for patients. We used random effects to pool effect sizes. From 10,903 articles screened, we identified 219 eligible studies enrolling 7138 trainees, including 91 (42%) randomized trials. For comparisons with no intervention (n = 151 studies), pooled effect size (ES) favored simulation for outcomes of knowledge (1.18; N = 9 studies), skills time (1.13; N = 89), skills process (1.23; N = 114), skills product (1.09; N = 7), behavior time (1.15; N = 7), behavior process (1.22; N = 15), and patient effects (1.28; N = 1), all P < 0.05. When compared with nonsimulation instruction (n = 3 studies), results significantly favored simulation for outcomes of skills time (ES, 0.75) and skills process (ES, 0.54). Comparisons between different simulation interventions (n = 79 studies) clarified best practices. For example, in comparison with virtual reality, box trainers have similar effects for process skills outcomes and seem to be superior for outcomes of satisfaction and skills time. Simulation-based laparoscopic surgery training of health professionals has large benefits when compared with no intervention and is moderately more effective than nonsimulation instruction.
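The pooled effect sizes above come from a random-effects meta-analysis. The sketch below shows the standard DerSimonian-Laird pooling of per-study effect sizes and variances; the numeric inputs are hypothetical and are not the review's data.

```python
import numpy as np

def random_effects_pool(effect_sizes, variances):
    """DerSimonian-Laird random-effects pooled effect size and its standard error."""
    y = np.asarray(effect_sizes, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)            # heterogeneity statistic
    k = len(y)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)            # between-study variance estimate
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se

# Illustrative standardized mean differences and variances from hypothetical studies.
pooled, se = random_effects_pool([1.3, 0.9, 1.1, 1.5], [0.10, 0.08, 0.12, 0.15])
print(f"pooled ES = {pooled:.2f} +/- {1.96 * se:.2f} (95% CI half-width)")
```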
ERIC Educational Resources Information Center
Duffy, Melissa C.; Azevedo, Roger; Sun, Ning-Zi; Griscom, Sophia E.; Stead, Victoria; Crelinsten, Linda; Wiseman, Jeffrey; Maniatis, Thomas; Lachapelle, Kevin
2015-01-01
This study examined the nature of cognitive, metacognitive, and affective processes among a medical team experiencing difficulty managing a challenging simulated medical emergency case by conducting in-depth analysis of process data. Medical residents participated in a simulation exercise designed to help trainees to develop medical expertise,…
Computerized Training of Cryosurgery – A System Approach
Keelan, Robert; Yamakawa, Soji; Shimada, Kenji; Rabin, Yoed
2014-01-01
The objective of the current study is to provide the foundation for a computerized training platform for cryosurgery. Consistent with clinical practice, the training process targets the correlation of the frozen region contour with the target region shape, using medical imaging and accepted criteria for clinical success. The current study focuses on system design considerations, including a bioheat transfer model, simulation techniques, an optimal cryoprobe layout strategy, and a simulation core framework. Two fundamentally different approaches were considered for the development of a cryosurgery simulator, one based on a commercial finite-element (FE) code (ANSYS) and the other on a proprietary finite-difference (FD) code. Results of this study demonstrate that the FE simulator is superior in terms of geometric modeling, while the FD simulator is superior in terms of runtime. Benchmarking results further indicate that the FD simulator is superior in terms of memory usage, pre-processing, parallel processing, and post-processing. It is envisioned that future integration of a human-interface module and clinical data into the proposed computer framework will make computerized training of cryosurgery a practical reality. PMID:23995400
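The simulators compared above solve a bioheat transfer model with either finite elements or finite differences. For background, the sketch below advances a one-dimensional Pennes bioheat equation with an explicit finite-difference step; the tissue properties, probe temperature, and boundary treatment are illustrative assumptions (and the latent heat of freezing is ignored), so this is not the benchmarked FD code.

```python
import numpy as np

# Illustrative tissue properties (rough physiological orders of magnitude).
k = 0.5        # thermal conductivity, W/(m K)
rho_c = 3.6e6  # volumetric heat capacity, J/(m^3 K)
w_b = 2.0e3    # perfusion term rho_b*c_b*omega, W/(m^3 K)
q_m = 400.0    # metabolic heat generation, W/m^3
T_a = 37.0     # arterial temperature, C

L, n = 0.05, 101                   # 5 cm domain, grid points
dx = L / (n - 1)
dt = 0.4 * rho_c * dx**2 / (2 * k) # explicit stability-limited time step

T = np.full(n, 37.0)               # initial tissue temperature
T[0] = -150.0                      # cryoprobe surface boundary condition

for _ in range(2000):              # ~12 minutes of simulated freezing
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    # Pennes: rho*c*dT/dt = k*d2T/dx2 + w_b*(T_a - T) + q_m (no phase change here)
    T[1:-1] += dt / rho_c * (k * lap + w_b * (T_a - T[1:-1]) + q_m)
    T[-1] = 37.0                   # far boundary held at body temperature

x = np.linspace(0.0, L, n)
print(f"0 C isotherm reached at ~{np.interp(0.0, T, x) * 1000:.1f} mm from the probe")
```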
Business process study simulation for resource management in an emergency department.
Poomkothammal, Velusamy
2006-01-01
Alexandra Hospital conducted a business process reengineering exercise for all its main processes in order to further improve their efficiency, with the ultimate aim of providing a higher level of service to patients. As a start, the Department of Emergency Medicine (DEM) studied its AS-IS process and designed and implemented a new TO-BE process. The goal of the DEM is to manage an anticipated increase in the volume of patients without much increase in resources. As part of this continuous improvement effort, staff from Nanyang Polytechnic (NYP) were assigned the task of applying engineering and analytical techniques to simulate the new process. The simulations were conducted to inform process management and resource planning.
Study on wet scavenging of atmospheric pollutants in south Brazil
NASA Astrophysics Data System (ADS)
Wiegand, Flavio; Pereira, Felipe Norte; Teixeira, Elba Calesso
2011-09-01
The present paper studies the in-cloud and below-cloud scavenging processes of SO₂ and SO₄²⁻ by applying numerical models to the Candiota region, located in the state of Rio Grande do Sul, South Brazil. The BRAMS (Brazilian Regional Atmospheric Modeling System) model was applied to simulate the vertical structure of the clouds, and the B.V.2 (Below-Cloud Beheng Version 2) scavenging model was applied to simulate in-cloud and below-cloud scavenging of the pollutants SO₂ and SO₄²⁻. Five events in 2004 were selected for this study and were sampled at the Candiota Airport station. The concentrations of SO₂ and SO₄²⁻ sampled in the air and the simulated meteorological parameters of the rainfall episodes were used as input data for B.V.2, which simulates the raindrop interactions associated with the scavenging process. Results for the Candiota region showed that in-cloud scavenging was more significant than below-cloud scavenging for two of the five events studied, contributing approximately 90-100% of the SO₂ and SO₄²⁻ concentrations in rainwater. A few adjustments to the original version of B.V.2 were made to allow simulation of scavenging processes in several types of clouds, not only cumulus humilis and cumulus congestus.
An application of sedimentation simulation in Tahe oilfield
NASA Astrophysics Data System (ADS)
Tingting, He; Lei, Zhao; Xin, Tan; Dongxu, He
2017-12-01
The braided river delta developed in the Triassic low oil formation in block 9 of the Tahe oilfield, but its sedimentary evolution process is unclear. Using sedimentation simulation technology, the sedimentation process and distribution of the braided river delta are studied based on geological parameters including sequence stratigraphic division, the initial sedimentation environment, relative lake level change and accommodation change, source supply, and the sedimentary transport pattern. The simulation results show that the error between the simulated and actual strata thickness is small, and the single-well analysis of the simulation is highly consistent with the actual analysis, which demonstrates that the model is reliable. The study area underwent a retrogradational evolution of the braided river delta, which provides a favorable basis for fine reservoir description and prediction.
Reduced order model based on principal component analysis for process simulation and optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lang, Y.; Malacina, A.; Biegler, L.
2009-01-01
It is well-known that distributed parameter computational fluid dynamics (CFD) models provide more accurate results than the conventional, lumped-parameter unit operation models used in process simulation. Consequently, the use of CFD models in process/equipment co-simulation offers the potential to optimize overall plant performance with respect to complex thermal and fluid flow phenomena. Because solving CFD models is time-consuming compared to the overall process simulation, we consider the development of fast reduced order models (ROMs) based on CFD results to closely approximate the high-fidelity equipment models in the co-simulation. Considering process equipment items with complicated geometries and detailed thermodynamic property models, this study proposes a strategy to develop ROMs based on principal component analysis (PCA). Taking advantage of commercial process simulation and CFD software (for example, Aspen Plus and FLUENT), we are able to develop systematic CFD-based ROMs for equipment models in an efficient manner. In particular, we show that the validity of the ROM is more robust within a well-sampled input domain and that the CPU time is significantly reduced: it typically takes at most several CPU seconds to evaluate the ROM, compared to several CPU hours or more to solve the CFD model. Two case studies, involving two power plant equipment examples, are described and demonstrate the benefits of using the proposed ROM methodology for process simulation and optimization.
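A minimal sketch of the general idea behind a PCA-based reduced order model follows: field snapshots from a detailed solver at sampled operating points are compressed with PCA, and a fast regression maps process inputs to the retained PCA coefficients. The synthetic snapshot generator, component count, and linear regressor below are illustrative assumptions, not the APECS/Aspen Plus-FLUENT implementation described in the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic "CFD" snapshots: each row is a flattened field (e.g., temperature on a mesh)
# computed at one sampled operating point. A cheap analytic stand-in replaces real CFD runs.
inputs = rng.uniform(low=[0.5, 300.0], high=[2.0, 600.0], size=(40, 2))  # [flow, T_inlet]
mesh = np.linspace(0.0, 1.0, 500)
snapshots = np.array([T0 * np.exp(-q * mesh) + 10.0 * np.sin(np.pi * mesh)
                      for q, T0 in inputs])

# Offline stage: compress the snapshots and learn an input -> PCA-coefficient map.
pca = PCA(n_components=3)
coeffs = pca.fit_transform(snapshots)
regressor = LinearRegression().fit(inputs, coeffs)

# Online stage: evaluate the ROM at a new operating point (fractions of a second).
new_input = np.array([[1.2, 450.0]])
field_rom = pca.inverse_transform(regressor.predict(new_input))
print(field_rom.shape)  # reconstructed full field, shape (1, 500)
```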
Simulation Learning: PC-Screen Based (PCSB) versus High Fidelity Simulation (HFS)
2012-08-01
methods for the use of simulation for teaching clinical skills to military and civilian clinicians. High fidelity simulation is an expensive method of… [remainder of record: fragmentary IRB amendment language and a C-Collar spinal stabilization simulation pathway legend]
Darkwah, Kwabena; Nokes, Sue E; Seay, Jeffrey R; Knutson, Barbara L
2018-05-22
Process simulations of batch fermentations with in situ product separation traditionally decouple these interdependent steps by simulating separate "steady state" continuous fermentation and separation units. In this study, an integrated batch fermentation and separation process was simulated for a model system of acetone-butanol-ethanol (ABE) fermentation with in situ gas stripping, such that the fermentation kinetics are linked in real time to the gas stripping process. Time-dependent cell growth, substrate utilization, and product production are translated to an Aspen Plus batch reactor. This approach capitalizes on the phase equilibria calculations of Aspen Plus to predict the effect of stripping on the ABE fermentation kinetics. The product profiles of the integrated fermentation and separation are shown to be sensitive to gas flow rate, unlike separate steady state fermentation and separation simulations. This study demonstrates the importance of coupled fermentation and separation simulation approaches for the systematic analysis of unsteady state processes.
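The coupling described above links fermentation kinetics in real time to gas stripping. Outside of Aspen Plus, the same concept can be sketched as an ODE model in which the butanol balance carries an extra first-order stripping term proportional to gas flow rate; all kinetic parameters and the stripping coefficient below are illustrative assumptions, not the study's values.

```python
from scipy.integrate import solve_ivp

# Illustrative Monod-type kinetics with product inhibition and in situ gas stripping.
mu_max, K_s = 0.25, 2.0       # 1/h, g/L
Y_xs, Y_ps = 0.15, 0.30       # biomass and butanol yields, g/g
P_max = 18.0                  # inhibitory butanol level, g/L
k_strip_per_flow = 0.004      # stripping coefficient per unit gas flow, 1/h per (L/min)

def abe_with_stripping(t, y, gas_flow):
    X, S, P = y                                   # biomass, substrate, butanol (g/L)
    S = max(S, 0.0)                               # guard against slight overshoot
    mu = mu_max * S / (K_s + S) * max(0.0, 1.0 - P / P_max)
    dX = mu * X
    dS = -dX / Y_xs
    dP = Y_ps * (-dS) - k_strip_per_flow * gas_flow * P   # production minus stripping
    return [dX, dS, dP]

for gas_flow in (0.0, 2.0, 5.0):                  # L/min, illustrative sweep
    sol = solve_ivp(abe_with_stripping, (0.0, 72.0), [0.1, 60.0, 0.0],
                    args=(gas_flow,), max_step=0.5)
    X, S, P = sol.y[:, -1]
    print(f"gas flow {gas_flow:.1f} L/min -> biomass {X:.1f} g/L, "
          f"residual substrate {S:.1f} g/L, broth butanol {P:.1f} g/L")
```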
ERIC Educational Resources Information Center
Fang, N.; Tajvidi, M.
2018-01-01
This study focuses on the investigation of the effects of computer simulation and animation (CSA) on students' cognitive processes in an undergraduate engineering course. The revised Bloom's taxonomy, which consists of six categories in the cognitive process domain, was employed in this study. Five of the six categories were investigated,…
NASA Astrophysics Data System (ADS)
Wu, Longtao; Wong, Sun; Wang, Tao; Huffman, George J.
2018-01-01
Simulation of moist convective processes is critical for accurately representing the interaction among tropical wave activities, atmospheric water vapor transport, and clouds associated with the Indian monsoon Intraseasonal Oscillation (ISO). In this study, we apply the Weather Research and Forecasting (WRF) model to simulate the Indian monsoon ISO with three different treatments of moist convective processes: (1) the Betts-Miller-Janjić (BMJ) adjustment cumulus scheme without explicit simulation of moist convective processes; (2) the New Simplified Arakawa-Schubert (NSAS) mass-flux scheme with simplified moist convective processes; and (3) explicit simulation of moist convective processes at convection-permitting scale (Nest). Results show that the BMJ experiment is unable to properly reproduce the equatorial Rossby wave activities and the corresponding phase relationship between moisture advection and dynamical convergence during the ISO. These features associated with the ISO are approximately captured in the NSAS experiment. The simulation with resolved moist convective processes significantly improves the representation of the ISO evolution and shows good agreement with the observations. This study features the first attempt to investigate the Indian monsoon at convection-permitting scale.
Taplay, Karyn; Jack, Susan M; Baxter, Pamela; Eva, Kevin; Martin, Lynn
2015-01-01
The aim of this study is to explain the process of adopting and incorporating simulation as a teaching strategy in undergraduate nursing programs, define uptake, and discuss potential outcomes. In many countries, simulation is increasingly adopted as a common teaching strategy; however, there is a dearth of knowledge related to the process of adoption and incorporation. We used an interpretive, constructivist approach to grounded theory to guide this research study. The study was conducted in Ontario, Canada, during 2011-2012. Multiple data sources informed the development of this theory, including in-depth interviews (n = 43) and a review of key organizational documents, such as mission and vision statements (n = 67), from multiple nursing programs (n = 13). The adoption and uptake of mid- to high-fidelity simulation equipment is a multistep iterative process involving various organizational levels within the institution that entails a seven-phase process: (a) securing resources, (b) nursing leaders working in tandem, (c) getting it out of the box, (d) learning about simulation and its potential for teaching, (e) finding a fit, (f) trialing the equipment, and (g) integrating it into the curriculum. These findings could assist nursing programs in Canada and internationally that wish to adopt or further incorporate simulation into their curricula, and they highlight potential organizational and program-level outcomes. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
A Simplified Finite Element Simulation for Straightening Process of Thin-Walled Tube
NASA Astrophysics Data System (ADS)
Zhang, Ziqian; Yang, Huilin
2017-12-01
Finite element simulation is an effective way to study thin-walled tubes in the two-cross-roll straightening process. To determine the accurate radius of curvature of the roll profile more efficiently, a simplified finite element model, based on the technical parameters of an actual two-cross-roll straightening machine, was developed to simulate the complex straightening process. A dynamic simulation was then carried out using the ANSYS LS-DYNA program. The results imply that the simplified finite element model is reasonable for simulating the two-cross-roll straightening process and can be used to obtain the radius of curvature of the roll profile for a tube straightness of 2 mm/m.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Rui, E-mail: Sunsr@hit.edu.cn; Ismail, Tamer M., E-mail: temoil@aucegypt.edu; Ren, Xiaohan
Highlights: • The effects of moisture content on the burning process of MSW are investigated. • A two-dimensional mathematical model was built to simulate the combustion process. • Temperature distributions, process rates, and gas species were measured and simulated. • The conversion ratios of C/CO and N/NO in MSW are inverse to moisture content. - Abstract: In order to reveal the features of the combustion process in the porous bed of a waste incinerator, a two-dimensional unsteady state model and an experimental study were employed to investigate the effect of the moisture content of municipal solid waste (MSW) on the combustion process in a fixed bed reactor. Conservation equations for the waste bed were implemented to describe the incineration process. The gas phase turbulence was modeled using the k–ε turbulence model and the particle phase was modeled using the kinetic theory of granular flow. The rates of moisture evaporation, devolatilization, and char burnout were calculated according to the waste property characteristics. The simulation results were then compared with experimental data for different moisture contents of MSW, which shows that the incineration process of waste in the fixed bed is reasonably simulated. The simulated solid temperature, gas species, and process rates in the bed accord with the experimental data. Due to the high moisture content of the fuel, moisture evaporation consumes a vast amount of heat and takes up most of the combustion time (about 2/3 of the whole combustion process). The overall bed combustion process slows greatly as the MSW moisture content increases. The experimental and simulation results provide direction for the design and optimization of fixed beds for MSW.
Kim, Sunghee; Shin, Gisoo
2016-02-01
Since previous studies on simulation-based education have been focused on fundamental nursing skills for nursing students in South Korea, there is little research available that focuses on clinical nurses in simulation-based training. Further, there is a paucity of research literature related to the integration of the nursing process into simulation training particularly in the emergency nursing care of high-risk maternal and neonatal patients. The purpose of this study was to identify the effects of nursing process-based simulation on knowledge, attitudes, and skills for maternal and child emergency nursing care in clinical nurses in South Korea. Data were collected from 49 nurses, 25 in the experimental group and 24 in the control group, from August 13 to 14, 2013. This study was an equivalent control group pre- and post-test experimental design to compare the differences in knowledge, attitudes, and skills for maternal and child emergency nursing care between the experimental group and the control group. The experimental group was trained by the nursing process-based simulation training program, while the control group received traditional methods of training for maternal and child emergency nursing care. The experimental group was more likely to improve knowledge, attitudes, and skills required for clinical judgment about maternal and child emergency nursing care than the control group. Among five stages of nursing process in simulation, the experimental group was more likely to improve clinical skills required for nursing diagnosis and nursing evaluation than the control group. These results will provide valuable information on developing nursing process-based simulation training to improve clinical competency in nurses. Further research should be conducted to verify the effectiveness of nursing process-based simulation with more diverse nurse groups on more diverse subjects in the future. Copyright © 2015 Elsevier Ltd. All rights reserved.
Virtual milk for modelling and simulation of dairy processes.
Munir, M T; Zhang, Y; Yu, W; Wilson, D I; Young, B R
2016-05-01
The modeling of dairy processing using a generic process simulator suffers from shortcomings, given that many simulators do not contain milk components in their component libraries. Recently, pseudo-milk components for a commercial process simulator were proposed for simulation and the current work extends this pseudo-milk concept by studying the effect of both total milk solids and temperature on key physical properties such as thermal conductivity, density, viscosity, and heat capacity. This paper also uses expanded fluid and power law models to predict milk viscosity over the temperature range from 4 to 75°C and develops a succinct regressed model for heat capacity as a function of temperature and fat composition. The pseudo-milk was validated by comparing the simulated and actual values of the physical properties of milk. The milk thermal conductivity, density, viscosity, and heat capacity showed differences of less than 2, 4, 3, and 1.5%, respectively, between the simulated results and actual values. This work extends the capabilities of the previously proposed pseudo-milk and of a process simulator to model dairy processes, processing different types of milk (e.g., whole milk, skim milk, and concentrated milk) with different intrinsic compositions, and to predict correct material and energy balances for dairy processes. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
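The abstract mentions a succinct regressed model for milk heat capacity as a function of temperature and fat composition. A minimal sketch of fitting such a correlation by least squares is shown below; the data points and the linear form cp ≈ a + b·T + c·fat are illustrative assumptions, not the published regression.

```python
import numpy as np

# Illustrative (temperature [C], fat fraction [-], heat capacity [kJ/(kg K)]) triples.
# These numbers are placeholders, not measured dairy data.
T   = np.array([ 4.0, 20.0, 40.0, 60.0, 75.0,  4.0, 40.0, 75.0])
fat = np.array([0.01, 0.01, 0.01, 0.01, 0.01, 0.04, 0.04, 0.04])
cp  = np.array([3.95, 3.97, 4.00, 4.02, 4.04, 3.85, 3.91, 3.96])

# Fit cp ~ a + b*T + c*fat with ordinary least squares.
A = np.column_stack([np.ones_like(T), T, fat])
(a, b, c), *_ = np.linalg.lstsq(A, cp, rcond=None)
print(f"cp = {a:.3f} + {b:.5f}*T + {c:.3f}*fat  [kJ/(kg K)]")
print(f"predicted cp at 50 C, 3% fat: {a + b * 50 + c * 0.03:.3f} kJ/(kg K)")
```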
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-10
... qualification process as an important tool for the assessment of vehicle performance. These simulations are... qualification process, simulations would be conducted using both a measured track geometry segment... on the results of simulation studies designed to identify track geometry irregularities associated...
Exact simulation of max-stable processes.
Dombry, Clément; Engelke, Sebastian; Oesting, Marco
2016-06-01
Max-stable processes play an important role as models for spatial extreme events. Their complex structure as the pointwise maximum over an infinite number of random functions makes their simulation difficult. Algorithms based on finite approximations are often inexact and computationally inefficient. We present a new algorithm for exact simulation of a max-stable process at a finite number of locations. It relies on the idea of simulating only the extremal functions, that is, those functions in the construction of a max-stable process that effectively contribute to the pointwise maximum. We further generalize the algorithm by Dieker & Mikosch (2015) for Brown-Resnick processes and use it for exact simulation via the spectral measure. We study the complexity of both algorithms, prove that our new approach via extremal functions is always more efficient, and provide closed-form expressions for their implementation that cover most popular models for max-stable processes and multivariate extreme value distributions. For simulation on dense grids, an adaptive design of the extremal function algorithm is proposed.
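For context on what the exact algorithm improves upon, the sketch below implements the naive finite approximation of a max-stable process via the spectral construction Z(x) = max_i Y_i(x)/Γ_i, using rescaled absolute Gaussian fields for Y (a Schlather-type choice). This is deliberately the inexact, truncated approach that the abstract contrasts with exact simulation; the number of spectral functions and the covariance are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulation sites and a Gaussian covariance for the spectral functions.
sites = np.linspace(0.0, 10.0, 50)
cov = np.exp(-0.5 * (sites[:, None] - sites[None, :]) ** 2)
cov += 1e-8 * np.eye(len(sites))                 # numerical nugget for stability

def spectral_function():
    """One spectral function Y with E[Y(x)] = 1 (scaled |Gaussian|, E|N(0,1)| = sqrt(2/pi))."""
    g = rng.multivariate_normal(np.zeros(len(sites)), cov)
    return np.abs(g) / np.sqrt(2.0 / np.pi)

def max_stable_approx(n_functions=1000):
    """Truncated de Haan construction: Z(x) = max_i Y_i(x) / Gamma_i."""
    Z = np.zeros(len(sites))
    gamma = 0.0
    for _ in range(n_functions):
        gamma += rng.exponential()               # Poisson arrival times Gamma_i
        Z = np.maximum(Z, spectral_function() / gamma)
    return Z                                     # approximately unit-Frechet margins

Z = max_stable_approx()
print(Z.min(), Z.max())
```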
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.A.; Sepehrnoori, K.
1994-09-01
The objective of this research is to develop cost-effective surfactant flooding technology by using surfactant simulation studies to evaluate and optimize alternative design strategies, taking into account reservoir characteristics, process chemistry, and process design options such as horizontal wells. Task 1 is the development of an improved numerical method for our simulator that will enable us to solve a wider class of these difficult simulation problems accurately and affordably. Task 2 is the application of this simulator to the optimization of surfactant flooding to reduce its risk and cost. The goal of Task 2 is to understand and generalize the impact of both process and reservoir characteristics on the optimal design of surfactant flooding. We have studied the effect of process parameters such as salinity gradient, surfactant adsorption, surfactant concentration, surfactant slug size, pH, polymer concentration and well constraints on surfactant floods. In this report, we show three-dimensional field-scale simulation results to illustrate the impact of one important design parameter, the salinity gradient. Although the use of a salinity gradient to improve the efficiency and robustness of surfactant flooding has been studied and applied for many years, this is the first time that we have evaluated it using stochastic simulations rather than simulations using the traditional layered reservoir description. The surfactant flooding simulations were performed using The University of Texas chemical flooding simulator, UTCHEM.
Collaborative simulation method with spatiotemporal synchronization process control
NASA Astrophysics Data System (ADS)
Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian
2016-10-01
When designing a complex mechatronics system, such as a high speed train, it is relatively difficult to effectively simulate the entire system's dynamic behavior because it involves multi-disciplinary subsystems. Currently, the most practical approach for multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among the multi-directional coupling simulations of the subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for the coupled simulation of a given complex mechatronics system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism to define the interfacing and interaction mechanisms among subsystems, and 2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can certainly be used to simulate the subsystem interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high speed train design and development processes, demonstrating that it can be applied to a wide range of engineering system design and simulation tasks with improved efficiency and effectiveness.
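The coupler described above synchronizes subsystem simulators in both space (interface variables) and time (a shared clock). A minimal sketch of the time-synchronization idea is the lock-step co-simulation loop below, in which two solvers exchange interface variables once per macro step; the two toy subsystems (a 1-DOF mechanical model and a proportional controller) are illustrative stand-ins, not the train subsystems of the study.

```python
from dataclasses import dataclass

@dataclass
class Vehicle:                     # toy mechanical subsystem: m*dv/dt = F - c*v
    v: float = 0.0
    m: float = 1000.0
    c: float = 50.0
    def step(self, force: float, dt: float) -> float:
        self.v += dt * (force - self.c * self.v) / self.m
        return self.v              # interface output: velocity

@dataclass
class Controller:                  # toy control subsystem: proportional speed control
    v_target: float = 20.0
    k_p: float = 400.0
    def step(self, v_measured: float, dt: float) -> float:
        return self.k_p * (self.v_target - v_measured)   # interface output: force

def co_simulate(t_end=30.0, macro_dt=0.1):
    """Lock-step coupling: both subsystems advance one macro step, then exchange data."""
    veh, ctl = Vehicle(), Controller()
    force, t = 0.0, 0.0
    while t < t_end:
        v = veh.step(force, macro_dt)     # subsystem 1 advances with last known force
        force = ctl.step(v, macro_dt)     # subsystem 2 advances with fresh velocity
        t += macro_dt                     # shared simulation clock keeps them in sync
    return v

print(f"velocity after co-simulation: {co_simulate():.2f} m/s")
```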
Numerical Simulation of Cast Distortion in Gas Turbine Engine Components
NASA Astrophysics Data System (ADS)
Inozemtsev, A. A.; Dubrovskaya, A. S.; Dongauser, K. A.; Trufanov, N. A.
2015-06-01
In this paper, the manufacturing of multiple airfoil vanes through investment casting is considered. A mathematical model of the full contact problem is built to determine the stress-strain state in the casting during solidification. The studies are carried out in a viscoelastoplastic formulation. Numerical simulation of the process is implemented with the ProCAST software package. The results of the simulation are compared with the real production process. By means of computer analysis, the technological process parameters are optimized in order to eliminate the defect of casting wall thickness variation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.; McCorkle, D.; Yang, C.
Process modeling and simulation tools are widely used for the design and operation of advanced power generation systems. These tools enable engineers to solve the critical process systems engineering problems that arise throughout the lifecycle of a power plant, such as designing a new process, troubleshooting a process unit or optimizing operations of the full process. To analyze the impact of complex thermal and fluid flow phenomena on overall power plant performance, the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) has developed the Advanced Process Engineering Co-Simulator (APECS). The APECS system is an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulations such as those based on computational fluid dynamics (CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper we discuss the initial phases of the integration of the APECS system with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite uses the ActiveX (OLE Automation) controls in the Aspen Plus process simulator wrapped by the CASI library developed by Reaction Engineering International to run process/CFD co-simulations and query for results. This integration represents a necessary step in the development of virtual power plant co-simulations that will ultimately reduce the time, cost, and technical risk of developing advanced power generation systems.
NASA Astrophysics Data System (ADS)
Lee, C. H.; Yang, D. Y.; Lee, S. R.; Chang, I. G.; Lee, T. W.
2011-08-01
The shielded slot plate, which has a sheared corrugated trapezoidal pattern, is a component of the metallic bipolar plate for the molten carbonate fuel cell (MCFC). In order to increase the efficiency of the fuel cell, the unit cell of the shielded slot plate should have a relatively large upper area, and defects from the forming process should be minimized. In order to simulate the slitting process, whereby sheared corrugated patterns are formed, ductile fracture criteria based on the histories of stress and strain are employed. The user material subroutine VUMAT is used to implement the material model and ductile fracture criteria in the commercial FEM software ABAQUS. The variables of the ductile fracture criteria were determined by comparing the simulation results and the experimental results of the tension test and the shearing test. Parametric studies were conducted to determine the critical value of the ductile fracture criterion. Employing these ductile fracture criteria, the three-dimensional forming process of the shielded slot plate was numerically simulated. The effects of the slitting process on the forming process of the shielded slot plate were analyzed through FEM simulation and experimental studies. Finally, experiments involving microscopic and macroscopic observations were conducted to verify the numerical simulations of the 3-step forming process.
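The slitting simulation above relies on ductile fracture criteria accumulated over stress and strain histories. As an illustration of how such criteria are evaluated, the sketch below accumulates the Cockcroft-Latham integral over a stress-strain history; this particular criterion, the critical value, and the toy history are assumptions chosen for illustration and are not necessarily those used by the authors.

```python
import numpy as np

def cockcroft_latham_damage(max_principal_stress, eq_plastic_strain, critical_value):
    """Accumulate C = integral of max(sigma_1, 0) d(eps_p); failure when C >= C_crit."""
    sigma1 = np.maximum(np.asarray(max_principal_stress, dtype=float), 0.0)
    eps = np.asarray(eq_plastic_strain, dtype=float)
    increments = 0.5 * (sigma1[1:] + sigma1[:-1]) * np.diff(eps)   # trapezoidal rule
    C = np.concatenate([[0.0], np.cumsum(increments)])
    return C / critical_value          # normalized damage; element fails at >= 1.0

# Toy history for one integration point during shearing (MPa, -); illustrative only.
sigma1_history = [0.0, 150.0, 280.0, 350.0, 360.0, 340.0]
eps_p_history  = [0.0,  0.05,  0.15,  0.30,  0.50,  0.70]
damage = cockcroft_latham_damage(sigma1_history, eps_p_history, critical_value=150.0)
print(np.round(damage, 2))             # element would be deleted once this reaches 1.0
```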
NASA Astrophysics Data System (ADS)
Li, Gen; Tang, Chun-An; Liang, Zheng-Zhao
2017-01-01
Multi-scale high-resolution modeling of rock failure process is a powerful means in modern rock mechanics studies to reveal the complex failure mechanism and to evaluate engineering risks. However, multi-scale continuous modeling of rock, from deformation, damage to failure, has raised high requirements on the design, implementation scheme and computation capacity of the numerical software system. This study is aimed at developing the parallel finite element procedure, a parallel rock failure process analysis (RFPA) simulator that is capable of modeling the whole trans-scale failure process of rock. Based on the statistical meso-damage mechanical method, the RFPA simulator is able to construct heterogeneous rock models with multiple mechanical properties, deal with and represent the trans-scale propagation of cracks, in which the stress and strain fields are solved for the damage evolution analysis of representative volume element by the parallel finite element method (FEM) solver. This paper describes the theoretical basis of the approach and provides the details of the parallel implementation on a Windows - Linux interactive platform. A numerical model is built to test the parallel performance of FEM solver. Numerical simulations are then carried out on a laboratory-scale uniaxial compression test, and field-scale net fracture spacing and engineering-scale rock slope examples, respectively. The simulation results indicate that relatively high speedup and computation efficiency can be achieved by the parallel FEM solver with a reasonable boot process. In laboratory-scale simulation, the well-known physical phenomena, such as the macroscopic fracture pattern and stress-strain responses, can be reproduced. In field-scale simulation, the formation process of net fracture spacing from initiation, propagation to saturation can be revealed completely. In engineering-scale simulation, the whole progressive failure process of the rock slope can be well modeled. It is shown that the parallel FE simulator developed in this study is an efficient tool for modeling the whole trans-scale failure process of rock from meso- to engineering-scale.
ERIC Educational Resources Information Center
Keskitalo, Tuulikki
2012-01-01
Expectations for simulations in healthcare education are high; however, little is known about healthcare students' expectations of the learning process in virtual reality (VR) and simulation-based learning environments (SBLEs). This research aims to describe first-year healthcare students' (N=97) expectations regarding teaching, studying, and…
ISPE: A knowledge-based system for fluidization studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, S.
1991-01-01
Chemical engineers use mathematical simulators to design, model, optimize and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine if all specified "goals" are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information) and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE to the user, ASPEN, LASER and the C language is shown in Figure 1.
GPU-based efficient realistic techniques for bleeding and smoke generation in surgical simulators.
Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu
2010-12-01
In actual surgery, smoke and bleeding due to cauterization provide important visual cues to the surgeon, and they have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated the effects of bleeding and smoke generation, these effects are not realistic due to the requirement of real-time performance. To be interactive, the visual display must be updated at at least 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Simulation of smoke and bleeding is, therefore, either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available Central Processing Unit (CPU) resources. In this study we developed a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators, which outsources the computations to the graphics processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed with 20 subjects to determine the visual quality of the simulations compared to real surgical videos. The smoke and bleeding simulations were implemented as part of a laparoscopic adjustable gastric banding (LAGB) simulator. For the bleeding simulation, the original implementation using the shader did not incur noticeable overhead. However, for smoke generation, an input/output (I/O) bottleneck was observed and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale). Based on the performance results and the subject study, both the bleeding and smoke simulations were concluded to be efficient, highly realistic and well suited to VR-based surgical simulators. Copyright © 2010 John Wiley & Sons, Ltd.
The Use of Particle/Substrate Material Models in Simulation of Cold-Gas Dynamic-Spray Process
NASA Astrophysics Data System (ADS)
Rahmati, Saeed; Ghaei, Abbas
2014-02-01
Cold spray is a coating deposition method in which solid particles are accelerated toward the substrate by a low-temperature supersonic gas flow. Many numerical studies have been carried out in the literature in order to study this process in more depth. Despite the inability of the Johnson-Cook plasticity model to predict material behavior at high strain rates, it is the model most frequently used in simulations of cold spray. Therefore, this research was devoted to comparing the performance of different material models in the simulation of the cold spray process. Six different material models, appropriate for high strain-rate plasticity, were employed in finite element simulations of the cold spray process for copper. The results showed that the material model has a considerable effect on the predicted deformed shapes.
Dynamic simulation of Static Var Compensators in distribution systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koessler, R.J.
1992-08-01
This paper is a system study guide for the correction of voltage dips due to large motor startups using Static Var Compensators (SVCs). The method utilizes time simulations, which are an important aid in equipment design and specification. The paper illustrates the process of setting up a computer model and performing time simulations. The study process is demonstrated through an example, the Shawnee feeder in the Niagara Mohawk Power Corporation service area.
Improving surgeon utilization in an orthopedic department using simulation modeling
Simwita, Yusta W; Helgheim, Berit I
2016-01-01
Purpose Worldwide, more than two billion people lack appropriate access to surgical services due to the mismatch between the existing human resources and patient demand. Improving the utilization of the existing workforce capacity can reduce the gap between surgical demand and available workforce capacity. In this paper, the authors use discrete event simulation to explore the care process at an orthopedic department. Our main focus is improving the utilization of surgeons while minimizing patient wait time. Methods The authors collaborated with orthopedic department personnel to map the current operations of the orthopedic care process in order to identify factors that lead to poor surgeon utilization and high patient waiting time. The authors used an observational approach to collect data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. The authors developed a proposal scenario to show how to improve surgeon utilization. Results The simulation results showed that if ancillary services could be performed before the start of clinic examination services, the orthopedic care process could be greatly improved, that is, with improved surgeon utilization and reduced patient waiting time. The simulation results demonstrate that with improved surgeon utilization, up to a 55% increase in future demand can be accommodated without patients exceeding the current waiting time at this clinic, thus improving patient access to health care services. Conclusion This study shows how simulation modeling can be used to improve health care processes. The study was limited to a single care process; however, the findings can be applied to improve other orthopedic care processes with similar operational characteristics. PMID:29355193
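A care process like the one above can be sketched compactly with a process-interaction discrete event simulation library (SimPy here). The toy model below routes patients through ancillary services before the surgeon consultation, mirroring the proposal scenario's idea of completing ancillary work before the clinic examination; all arrival rates, service times, and staffing levels are illustrative assumptions, not the studied clinic's data.

```python
import random
import simpy

SIM_MINUTES = 8 * 60                         # one 8-hour clinic day
wait_times = []                              # arrival -> start of consultation, per patient

def patient(env, ancillary, surgeon):
    arrival = env.now
    with ancillary.request() as req:         # ancillary services done before the surgeon
        yield req
        yield env.timeout(random.expovariate(1 / 15))   # ~15 min ancillary service
    with surgeon.request() as req:
        yield req
        wait_times.append(env.now - arrival)
        yield env.timeout(random.expovariate(1 / 20))   # ~20 min consultation

def arrivals(env, ancillary, surgeon):
    while True:
        yield env.timeout(random.expovariate(1 / 25))   # a patient every ~25 min
        env.process(patient(env, ancillary, surgeon))

random.seed(42)
env = simpy.Environment()
ancillary = simpy.Resource(env, capacity=2)  # two ancillary service points
surgeon = simpy.Resource(env, capacity=1)    # one surgeon
env.process(arrivals(env, ancillary, surgeon))
env.run(until=SIM_MINUTES)

print(f"patients reaching consultation: {len(wait_times)}, "
      f"mean time from arrival to consultation start: "
      f"{sum(wait_times) / len(wait_times):.1f} min")
```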
Designing a SCADA system simulator for fast breeder reactor
NASA Astrophysics Data System (ADS)
Nugraha, E.; Abdullah, A. G.; Hakim, D. L.
2016-04-01
A SCADA (Supervisory Control and Data Acquisition) system simulator is Human Machine Interface-based software that is able to visualize the processes of a plant. This study describes the results of designing a SCADA system simulator that aims to facilitate the operator in monitoring, controlling, handling alarms, and accessing historical data and historical trends in a Nuclear Power Plant (NPP) of the Fast Breeder Reactor (FBR) type. This research simulated the FBR-type NPP at Kalpakkam in India. The simulator was developed using Wonderware InTouch 10 software and is equipped with a main menu, plant overview, area graphics, control display, set point display, alarm system, real-time trending, historical trending, and a security system. The simulator can properly simulate the principles of energy flow and the energy conversion process in an FBR-type NPP. This SCADA system simulator can be used as a training medium for prospective operators of FBR-type NPPs.
Pennaforte, Thomas; Moussa, Ahmed; Loye, Nathalie; Charlin, Bernard; Audétat, Marie-Claude
2016-02-17
Helping trainees develop appropriate clinical reasoning abilities is a challenging goal in an environment where clinical situations are marked by high levels of complexity and unpredictability. The benefit of simulation-based education for assessing clinical reasoning skills has rarely been reported. More specifically, it is unclear whether clinical reasoning is better acquired if the instructor's input occurs entirely after the scenario or is integrated during it. Based on educational principles of the dual-process theory of clinical reasoning, a new simulation approach called simulation with iterative discussions (SID) is introduced. The instructor interrupts the flow of the scenario at three key moments of the reasoning process (data gathering, integration, and confirmation). After each stop, the scenario is continued where it was interrupted. Finally, a brief general debriefing ends the session. The System-1 process of clinical reasoning is assessed by verbalization during management of the case, and System-2 during the iterative discussions, without providing feedback. The aim of this study is to evaluate the effectiveness of Simulation with Iterative Discussions versus the classical approach to simulation in developing the reasoning skills of General Pediatrics and Neonatal-Perinatal Medicine residents. This will be a prospective exploratory, randomized study conducted at Sainte-Justine Hospital in Montreal, Quebec, between January and March 2016. All postgraduate year (PGY) 1 to 6 residents will be invited to complete one 30-minute, audio-video-recorded, complex high-fidelity simulation, either SID or classical, covering a similar neonatology topic. Pre- and post-simulation questionnaires will be completed and a semistructured interview will be conducted after each simulation. Data analyses will use the SPSS and NVivo software packages. This study is in its preliminary stages and the results are expected to be made available by April 2016. This will be the first study to explore a new simulation approach designed to enhance clinical reasoning. By assessing reasoning processes more closely throughout a simulation session, we believe that Simulation with Iterative Discussions will be an interesting and more effective approach for students. The findings of the study will benefit medical educators, education programs, and medical students.
DDS: The Dental Diagnostic Simulation System.
ERIC Educational Resources Information Center
Tira, Daniel E.
The Dental Diagnostic Simulation (DDS) System provides an alternative to simulation systems which represent diagnostic case studies of relatively limited scope. It may be used to generate simulated case studies in all of the dental specialty areas with case materials progressing through the gamut of the diagnostic process. The generation of a…
Erdemir, Ahmet; Guess, Trent M.; Halloran, Jason P.; Modenese, Luca; Reinbolt, Jeffrey A.; Thelen, Darryl G.; Umberger, Brian R.
2016-01-01
Objective The overall goal of this document is to demonstrate that dissemination of models and analyses for assessing the reproducibility of simulation results can be incorporated in the scientific review process in biomechanics. Methods As part of a special issue on model sharing and reproducibility in IEEE Transactions on Biomedical Engineering, two manuscripts on computational biomechanics were submitted: A. Rajagopal et al., IEEE Trans. Biomed. Eng., 2016 and A. Schmitz and D. Piovesan, IEEE Trans. Biomed. Eng., 2016. Models used in these studies were shared with the scientific reviewers and the public. In addition to the standard review of the manuscripts, the reviewers downloaded the models and performed simulations that reproduced results reported in the studies. Results There was general agreement between the simulation results of the authors and those of the reviewers. Discrepancies were resolved during the necessary revisions. The manuscripts and instructions for download and simulation were updated in response to the reviewers' feedback; these changes may otherwise have been missed if explicit model sharing and simulation reproducibility analysis had not been conducted in the review process. An increased burden on the authors and the reviewers, to facilitate model sharing and to repeat simulations, was noted. Conclusion When the authors of computational biomechanics studies provide access to models and data, the scientific reviewers can download and thoroughly explore the model, perform simulations, and evaluate simulation reproducibility beyond the traditional manuscript-only review process. Significance Model sharing and reproducibility analysis in scholarly publishing will result in a more rigorous review process, which will enhance the quality of modeling and simulation studies and inform future users of computational models. PMID:28072567
Simulation of Magnetic Field Assisted Finishing (MFAF) Process Utilizing Smart MR Polishing Tool
NASA Astrophysics Data System (ADS)
Barman, Anwesa; Das, Manas
2017-02-01
Magnetic field assisted finishing is an advanced finishing process capable of producing nanometer-level surface finish. In this process, a magnetic field is applied to control the finishing forces through a magnetorheological (MR) polishing medium. In the current study, a permanent magnet is used to provide the required magnetic field in the finishing zone. The working gap between the workpiece and the magnet is filled with MR fluid, which acts as the polishing brush that removes surface undulations from the top surface of the workpiece. In this paper, the distribution of magnetic flux density on the workpiece surface and the behaviour of the MR polishing medium during finishing are analyzed using commercial finite element packages (Ansys Maxwell® and Comsol®). The role of the magnetic force in the indentation of abrasive particles into the workpiece surface is studied. A two-dimensional simulation of the steady, laminar, and incompressible MR fluid flow during the finishing process is carried out, and material removal and surface roughness models of the finishing process are also presented. The indentation force exerted by a single active abrasive particle on the workpiece surface is modelled in the simulation. The velocity profile of the MR fluid with and without application of a magnetic field is plotted; the fluid exhibits non-Newtonian behaviour without application of the magnetic field. The total material displacement due to one abrasive particle is then plotted. The simulated roughness profile is in good agreement with the experimental results. The study helps in understanding the fluid behaviour and the material removal mechanism during the finishing process, and the modelling and simulation of the process will help in achieving better finishing performance.
Traversari, Roberto; Goedhart, Rien; Schraagen, Jan Maarten
2013-01-01
The objective is the evaluation of a traditionally designed operating room using simulation of various surgical workflows. A literature search showed that there is no evidence for an optimal operating room layout regarding the position and size of an ultraclean ventilation (UCV) canopy with a separate preparation room for laying out instruments and in which patients are induced in the operating room itself. Neither was literature found reporting on process simulation being used for this application. Many technical guidelines and designs have mainly evolved over time, and there is no evidence on whether the proposed measures are also effective for the optimization of the layout for workflows. The study was conducted by applying observational techniques to simulated typical surgical procedures. Process simulations that included complete surgical teams and the equipment required for the intervention were carried out for four typical interventions. Four observers used a form to record conflicts with the clean area boundaries and the height of the supply bridge. Preferences for particular layouts were discussed with the surgical team after each simulated procedure. We established that a clean area measuring 3 × 3 m and a supply bridge height of 2.05 m was satisfactory for most situations, provided a movable operating table is used. The only cases in which conflicts with the supply bridge were observed were during the use of a surgical robot (Da Vinci) and a surgical microscope. During multiple trauma interventions, bottlenecks regarding the dimensions of the clean area will probably arise. The process simulation of four typical interventions has led to significantly different operating room layouts than were arrived at through the traditional design process. Keywords: evidence-based design, human factors, work environment, operating room, traditional design, process simulation, surgical workflows. Preferred citation: Traversari, R., Goedhart, R., & Schraagen, J. M. (2013). Process simulation during the design process makes the difference: Process simulations applied to a traditional design. Health Environments Research & Design Journal, 6(2), 58-76.
Simulation analysis of resource flexibility on healthcare processes
Simwita, Yusta W; Helgheim, Berit I
2016-01-01
Purpose This paper uses discrete event simulation to explore the best resource flexibility scenario and examine the effect of implementing resource flexibility on different stages of the patient treatment process. Specifically, we investigate the effect of resource flexibility on patient waiting time and throughput in an orthopedic care process. We further explore how the implementation of resource flexibility in patient treatment processes affects patient access to healthcare services. We focus on two resources, namely, the orthopedic surgeon and the operating room. Methods An observational approach was used to collect process data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. We developed different scenarios to identify the best resource flexibility scenario and explore the effect of resource flexibility on patient waiting time, throughput, and future changes in demand. The developed scenarios focused on creating flexibility in the service capacity of this care process by altering the amount of additional human resource capacity at different stages of the patient care process and extending the use of operating room capacity. Results The study found that resource flexibility can improve responsiveness to patient demand in the treatment process. Testing different scenarios showed that the introduction of resource flexibility reduces patient waiting time and improves throughput. The simulation results show that patient access to health services can be improved by implementing resource flexibility at different stages of the patient treatment process. Conclusion This study contributes to the current health care literature by explaining how implementing resource flexibility at different stages of patient care processes can improve the ability to respond to increasing patient demands. This study was limited to a single patient process; studies focusing on additional processes are recommended. PMID:27785046
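As a rough illustration of the kind of discrete-event model described above, the sketch below builds a minimal orthopedic patient flow in Python with the simpy library. The arrival rate, service times, resource capacities and the "flexible" scenario are invented placeholders for illustration only; they are not the validated parameters of the study.

```python
# Minimal discrete-event sketch of an orthopedic care process (illustrative only).
# Arrival rate, service times and capacities are placeholders, not study data.
import random
import simpy

SIM_HOURS = 24 * 30            # simulate one month of operation
waits, completed = [], 0

def patient(env, surgeon, theatre):
    global completed
    arrival = env.now
    with surgeon.request() as req:                       # surgical assessment stage
        yield req
        yield env.timeout(random.expovariate(1 / 1.0))   # ~1 h consultation
    with theatre.request() as req:                       # operating room stage
        yield req
        waits.append(env.now - arrival)                  # wait until surgery starts
        yield env.timeout(random.expovariate(1 / 2.5))   # ~2.5 h procedure
    completed += 1

def arrivals(env, surgeon, theatre):
    while True:
        yield env.timeout(random.expovariate(1 / 3.0))   # a referral every ~3 h
        env.process(patient(env, surgeon, theatre))

def run(n_surgeons, n_theatres):
    global waits, completed
    random.seed(42)
    waits, completed = [], 0
    env = simpy.Environment()
    surgeon = simpy.Resource(env, capacity=n_surgeons)
    theatre = simpy.Resource(env, capacity=n_theatres)
    env.process(arrivals(env, surgeon, theatre))
    env.run(until=SIM_HOURS)
    return sum(waits) / len(waits), completed

# Baseline versus a "flexible" scenario that adds surgeon capacity.
for label, surgeons, theatres in [("baseline", 1, 1), ("flexible", 2, 1)]:
    mean_wait, throughput = run(surgeons, theatres)
    print(f"{label}: mean wait {mean_wait:.1f} h, throughput {throughput} patients")
```

Comparing the two runs shows the general pattern reported in the paper: adding capacity at a bottleneck stage reduces waiting time and raises throughput.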
NASA Astrophysics Data System (ADS)
Justus, Christopher
2005-04-01
In this study, we simulated top-antitop (tt-bar) quark events at the Compact Muon Solenoid (CMS), an experiment presently being constructed at the Large Hadron Collider in Geneva, Switzerland. The tt-bar process is an important background for Higgs events. We used a chain of software to simulate and reconstruct processes that will occur inside the detector. CMKIN was used to generate and store Monte Carlo Events. OSCAR, a GEANT4 based CMS detector simulator, was used to simulate the CMS detector and how particles would interact with the detector. Next, we used ORCA to simulate the response of the readout electronics at CMS. Last, we used the Jet/MET Root maker to create root files of jets and missing energy. We are now using this software analysis chain to complete a systematic study of initial state radiation at hadron colliders. This study is essential because tt-bar is the main background for the Higgs boson and these processes are extremely sensitive to initial state radiation. Results of our initial state radiation study will be presented. We started this study at the new LHC Physics Center (LPC) located at Fermi National Accelerator Laboratory, and we are now completing the study at the University of Rochester.
Mattsson, Sofia; Sjöström, Hans-Erik; Englund, Claire
2016-06-25
Objective. To develop and implement a virtual tablet machine simulation to aid distance students' understanding of the processes involved in tablet production. Design. A tablet simulation was created enabling students to study the effects different parameters have on the properties of the tablet. Once results were generated, students interpreted and explained them on the basis of current theory. Assessment. The simulation was evaluated using written questionnaires and focus group interviews. Students appreciated the exercise and considered it to be motivational. Students commented that the simulation, together with the online seminar and the writing of the report, was beneficial to their learning process. Conclusion. According to students' perceptions, the use of the tablet simulation contributed to their understanding of the compaction process.
Dynamic Simulation of a Helium Liquefier
NASA Astrophysics Data System (ADS)
Maekawa, R.; Ooba, K.; Nobutoki, M.; Mito, T.
2004-06-01
The dynamic behavior of a helium liquefier has been studied in detail with the Cryogenic Process REal-time SimulaTor (C-PREST) at the National Institute for Fusion Science (NIFS). C-PREST is being developed to integrate large-scale helium cryogenic plant design, operation and maintenance for optimum process establishment. As a first step, a simulation of cooldown to 4.5 K with the helium liquefier model is conducted, which provides a plant-process validation platform. The helium liquefier consists of seven heat exchangers, a liquid-nitrogen (LN2) precooler, two expansion turbines and a liquid-helium (LHe) reservoir. Process simulations are performed with sequence programs, implemented in C-PREST based on an existing liquefier operation. The interactions of a JT valve, a JT-bypass valve and a reservoir-return valve have been dynamically simulated. The paper discusses various aspects of refrigeration process simulation, including difficulties such as balancing the complexity of the adopted models against CPU time.
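The trade-off between model complexity and CPU time mentioned above is easiest to see at the simplest end of the spectrum. The sketch below is a minimal lumped mass-balance of an LHe reservoir filling through a ramped JT valve; the valve constant, liquid yield and demand are invented placeholders, and this is in no way the C-PREST model, which resolves heat exchangers, turbines and valve sequences.

```python
# Minimal lumped sketch: LHe reservoir level responding to a JT valve opening.
# All coefficients (valve constant, liquid yield, demand) are invented placeholders.
dt = 1.0                      # s, explicit Euler step
t_end = 3600.0                # simulate one hour
rho_lhe = 125.0               # kg/m^3, approximate liquid helium density
volume = 2.0                  # m^3 reservoir volume
kv = 0.010                    # kg/s of JT flow per unit valve opening (placeholder)
liquid_yield = 0.35           # fraction of JT flow collected as liquid (placeholder)
demand = 0.002                # kg/s liquid drawn by the load (placeholder)

mass = 0.0                    # kg of liquid currently in the reservoir
level = 0.0
for step in range(int(t_end / dt)):
    t = step * dt
    opening = min(1.0, t / 600.0)            # ramp the JT valve open over 10 min
    m_dot_in = liquid_yield * kv * opening   # liquid production rate
    m_dot_out = demand if mass > 0 else 0.0
    mass = max(0.0, mass + dt * (m_dot_in - m_dot_out))
    level = 100.0 * mass / (rho_lhe * volume)   # reservoir level in % full

print(f"level after 1 h: {level:.2f} % full")
```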
ERIC Educational Resources Information Center
Peng, Jacob; Abdullah, Ira
2018-01-01
The emphases of student involvement and meaningful engagement in the learner-centered education model have created a new paradigm in an effort to generate a more engaging learning environment. This study examines the success of using different simulation platforms in creating a market simulation to teach business processes in the accounting…
Artistic understanding as embodied simulation.
Gibbs, Raymond W
2013-04-01
Bullot & Reber (B&R) correctly include historical perspectives into the scientific study of art appreciation. But artistic understanding always emerges from embodied simulation processes that incorporate the ongoing dynamics of brains, bodies, and world interactions. There may not be separate modes of artistic understanding, but a continuum of processes that provide imaginative simulations of the artworks we see or hear.
Process-Oriented Diagnostics of Tropical Cyclones in Global Climate Models
NASA Astrophysics Data System (ADS)
Moon, Y.; Kim, D.; Camargo, S. J.; Wing, A. A.; Sobel, A. H.; Bosilovich, M. G.; Murakami, H.; Reed, K. A.; Vecchi, G. A.; Wehner, M. F.; Zarzycki, C. M.; Zhao, M.
2017-12-01
Simulating tropical cyclone (TC) activity with global climate models (GCMs) remains a challenging problem. While some GCMs are able to simulate TC activity that is in good agreement with the observations, many other models exhibit strong biases. Decreasing the horizontal grid spacing of the GCM simulations tends to improve the characteristics of simulated TCs, but this enhancement alone does not necessarily lead to greater skill in simulating TC activity. This study uses process-based diagnostics to identify model characteristics that could explain why some GCM simulations are able to produce more realistic TC activity than others. The diagnostics examine how convection, moisture, clouds and related processes are coupled at individual grid points, which yields useful insight into how convective parameterizations interact with resolved model dynamics. These diagnostics share similarities with those originally developed to examine the Madden-Julian Oscillation in climate models. This study will examine TCs in eight different GCM simulations performed at NOAA/GFDL, NCAR and NASA that have different horizontal resolutions and ocean coupling. Preliminary results suggest that stronger TCs are closely associated with greater rainfall - and thus greater diabatic heating - in the inner-core regions of the storms, which is consistent with previous theoretical studies. Other storm characteristics that can be used to infer why GCM simulations with comparable horizontal grid spacings produce different TC activity will also be examined.
The development of an industrial-scale fed-batch fermentation simulation.
Goldrick, Stephen; Ştefan, Andrei; Lovett, David; Montague, Gary; Lennox, Barry
2015-01-10
This paper describes a simulation of an industrial-scale fed-batch fermentation that can be used as a benchmark in process systems analysis and control studies. The simulation was developed using a mechanistic model and validated using historical data collected from an industrial-scale penicillin fermentation process. Each batch was carried out in a 100,000 L bioreactor that used an industrial strain of Penicillium chrysogenum. The manipulated variables recorded during each batch were used as inputs to the simulator and the predicted outputs were then compared with the on-line and off-line measurements recorded in the real process. The simulator adapted a previously published structured model to describe the penicillin fermentation and extended it to include the main environmental effects of dissolved oxygen, viscosity, temperature, pH and dissolved carbon dioxide. In addition the effects of nitrogen and phenylacetic acid concentrations on the biomass and penicillin production rates were also included. The simulated model predictions of all the on-line and off-line process measurements, including the off-gas analysis, were in good agreement with the batch records. The simulator and industrial process data are available to download at www.industrialpenicillinsimulation.com and can be used to evaluate, study and improve on the current control strategy implemented on this facility. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
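The structured industrial model referenced above is far richer than can be shown here, but the sketch below integrates a classical unstructured fed-batch penicillin model of the same family (Contois-type growth, maintenance and growth-linked substrate use, product formation with hydrolysis, and dilution by the feed). Every parameter value is a textbook-style placeholder, not a value fitted to the 100,000 L process or available from the simulator website.

```python
# Minimal unstructured fed-batch penicillin sketch (Bajpai/Reuss-style kinetics).
# Parameter values are illustrative placeholders, not fitted industrial values.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, K_x = 0.11, 0.006      # 1/h growth rate and Contois constant
q_p_max, K_p = 0.004, 0.0001   # specific production rate and saturation constant
Y_xs, Y_ps, m_s = 0.47, 0.90, 0.014
K_deg = 0.01                   # 1/h penicillin hydrolysis
F, S_f = 0.05, 500.0           # feed rate (m^3/h) and feed sugar concentration (g/L)

def rhs(t, y):
    X, S, P, V = y
    mu = mu_max * S / (K_x * X + S)          # Contois growth kinetics
    q_p = q_p_max * S / (K_p + S)            # substrate-limited production
    D = F / V                                # dilution rate from the feed
    dX = mu * X - D * X
    dS = -mu * X / Y_xs - m_s * X - q_p * X / Y_ps + D * (S_f - S)
    dP = q_p * X - K_deg * P - D * P
    dV = F
    return [dX, dS, dP, dV]

y0 = [1.0, 15.0, 0.0, 60.0]                  # X (g/L), S (g/L), P (g/L), V (m^3)
sol = solve_ivp(rhs, (0.0, 200.0), y0, t_eval=np.linspace(0, 200, 201))
print(f"final biomass {sol.y[0, -1]:.1f} g/L, final penicillin {sol.y[2, -1]:.2f} g/L")
```

A mechanistic simulator of the published kind wraps this type of kinetic core with the environmental effects listed in the abstract (dissolved oxygen, viscosity, temperature, pH, dissolved CO2) and with the recorded manipulated variables as inputs.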
Characterizing the role of the hippocampus during episodic simulation and encoding.
Thakral, Preston P; Benoit, Roland G; Schacter, Daniel L
2017-12-01
The hippocampus has been consistently associated with episodic simulation (i.e., the mental construction of a possible future episode). In a recent study, we identified an anterior-posterior temporal dissociation within the hippocampus during simulation. Specifically, transient simulation-related activity occurred in relatively posterior portions of the hippocampus and sustained activity occurred in anterior portions. In line with previous theoretical proposals of hippocampal function during simulation, the posterior hippocampal activity was interpreted as reflecting a transient retrieval process for the episodic details necessary to construct an episode. In contrast, the sustained anterior hippocampal activity was interpreted as reflecting the continual recruitment of encoding and/or relational processing associated with a simulation. In the present study, we provide a direct test of these interpretations by conducting a subsequent memory analysis of our previously published data to assess whether successful encoding during episodic simulation is associated with the anterior hippocampus. Analyses revealed a subsequent memory effect (i.e., later remembered > later forgotten simulations) in the anterior hippocampus. The subsequent memory effect was transient and not sustained. Taken together, the current findings provide further support for a component process model of hippocampal function during simulation. That is, unique regions of the hippocampus support dissociable processes during simulation, which include the transient retrieval of episodic information, the sustained binding of such information into a coherent episode, and the transient encoding of that episode for later retrieval. © 2017 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Profumieri, A.; Bonell, C.; Catalfamo, P.; Cherniz, A.
2016-04-01
Virtual reality has been proposed for different applications, including the evaluation of new control strategies and training protocols for upper limb prostheses and the study of new rehabilitation programs. In this study, a lower limb simulation environment commanded by surface electromyography signals is evaluated. The time delays generated by the acquisition and processing stages for the signals that would command the knee joint were measured, and different acquisition windows were analysed. The subjective perception of the quality of the simulation was also evaluated when extra delays were added to the process. The results showed that the acquisition window is responsible for the longest delay. Also, the basic implemented processes allowed for the acquisition of three signal channels for commanding the simulation. Finally, the communication between the different applications is arguably efficient, although it depends on the amount of data to be sent.
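To make the window-delay argument concrete, the sketch below computes a moving-RMS envelope of a synthetic EMG channel and reports the latency contributed by the window (roughly half its length) together with the measured processing time. The sampling rate, window sizes and envelope method are assumptions for illustration, not the settings used in the study.

```python
# Illustrative estimate of the delay introduced by an EMG acquisition window.
# Sampling rate and window lengths are assumptions, not the study's settings.
import time
import numpy as np

fs = 1000                                   # Hz, assumed sampling rate
t = np.arange(0, 5.0, 1.0 / fs)
emg = np.random.randn(t.size) * (1.0 + np.sign(np.sin(2 * np.pi * 0.5 * t)))

def moving_rms(x, win):
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(x ** 2, kernel, mode="same"))

for win_ms in (50, 100, 200, 300):
    win = int(fs * win_ms / 1000)
    t0 = time.perf_counter()
    envelope = moving_rms(emg, win)
    proc_ms = (time.perf_counter() - t0) * 1000
    window_delay_ms = win_ms / 2            # the window itself adds ~win/2 of lag
    print(f"{win_ms:3d} ms window: ~{window_delay_ms:5.1f} ms window delay "
          f"+ {proc_ms:5.2f} ms processing")
```

Even this toy example shows the pattern reported above: the window length dominates the total latency, while the per-window processing cost is comparatively small.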
NASA Astrophysics Data System (ADS)
Dwivany, Fenny Martha; Esyanti, Rizkita R.; Prapaisie, Adeline; Puspa Kirana, Listya; Latief, Chunaeni; Ginaldi, Ari
2016-11-01
The objective of the research was to determine the effect of microgravity simulation by a 3D clinostat on the Cavendish banana (Musa acuminata, AAA group) ripening process. In this study, physical and physiological changes as well as gene expression were analysed. The results showed that under simulated microgravity the ripening process in banana was delayed and the expression of the MaACO1, MaACS1 and MaACS5 genes was affected.
Meaningful Use of Simulation as an Educational Method in Nursing Programs
ERIC Educational Resources Information Center
Thompson, Teri L.
2011-01-01
The purpose of this descriptive study was to examine the use of simulation technology within nursing programs leading to licensure as registered nurses. In preparation for this study, the Use of Simulation Technology Inventory (USTI) was developed, based on the structure-processes-outcomes model and the current literature on simulation. The…
ERIC Educational Resources Information Center
Dieckmann, Peter; Friis, Susanne Molin; Lippert, Anne; Ostergaard, Doris
2012-01-01
Introduction: This study describes (a) process goals, (b) success factors, and (c) barriers for optimizing simulation-based learning environments within the simulation setting model developed by Dieckmann. Methods: Seven simulation educators of different experience levels were interviewed using the Critical Incident Technique. Results: (a) The…
NASA Astrophysics Data System (ADS)
Chen, Sisi; Yau, Man-Kong; Bartello, Peter; Xue, Lulin
2018-05-01
In most previous direct numerical simulation (DNS) studies on droplet growth in turbulence, condensational growth and collisional growth were treated separately. Studies in recent decades have postulated that small-scale turbulence may accelerate droplet collisions while droplets are still small and condensational growth is effective. This implies that both processes should be considered simultaneously to unveil the full history of droplet growth and rain formation. This paper introduces the first direct numerical simulation approach to explicitly study continuous droplet growth by condensation and collisions inside an adiabatically ascending cloud parcel. Results from the condensation-only, collision-only, and condensation-collision experiments are compared to examine the contribution to the broadening of the droplet size distribution (DSD) by the individual processes and by the combined processes. Simulations of different turbulent intensities are conducted to investigate the impact of turbulence on each process and on condensation-induced collisions. The results show that the condensational process promotes collisions in a turbulent environment and reduces collisions in still air, indicating a positive impact of condensation on turbulent collisions. This work suggests the necessity of including both processes simultaneously when studying droplet-turbulence interaction to quantify the turbulence effect on the evolution of the cloud droplet spectrum and rain formation.
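For readers unfamiliar with the condensation side of this problem, the sketch below integrates only the diffusional growth law, r dr/dt = G s, for a population of droplets. It ignores turbulence and collisions entirely, and the growth parameter and supersaturation are illustrative constants rather than DNS values; it serves only to show why condensation alone narrows the radius distribution.

```python
# Condensation-only sketch: r dr/dt = G * s narrows a droplet size distribution,
# because small droplets grow faster in radius than large ones.
# G and s are illustrative constants, not values from the DNS study.
import numpy as np

G = 1.0e-10        # m^2/s, combined diffusion/thermodynamic growth parameter (assumed)
s = 0.005          # supersaturation (0.5 %), held constant for simplicity
dt = 0.1           # s
t_end = 600.0      # 10 minutes in the rising parcel

rng = np.random.default_rng(0)
r = rng.normal(10e-6, 2e-6, size=10000)     # initial radii ~ 10 +/- 2 micrometres
r = np.clip(r, 1e-6, None)

print(f"initial      mean {r.mean()*1e6:5.2f} um  std {r.std()*1e6:4.2f} um")
for _ in range(int(t_end / dt)):
    # integrate d(r^2)/dt = 2 G s exactly over one time step
    r = np.sqrt(r ** 2 + 2.0 * G * s * dt)
print(f"after {t_end:.0f} s mean {r.mean()*1e6:5.2f} um  std {r.std()*1e6:4.2f} um")
```

The spread in radius shrinks even as the mean grows, which is why collisional (or turbulence-assisted) processes are needed to explain the broad spectra required for rain formation.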
NASA Astrophysics Data System (ADS)
Fedulov, Boris N.; Safonov, Alexander A.; Sergeichev, Ivan V.; Ushakov, Andrey E.; Klenin, Yuri G.; Makarenko, Irina V.
2016-10-01
The application of composites in the construction of subway brackets is a very effective approach to extending their lifetime. However, this approach requires preventing process-induced distortions of the bracket caused by thermal deformation and chemical shrinkage. In the present study, a process simulation has been carried out to support the design of the production tooling. The simulation was based on a viscoelastic model of the resin. Simulation results were verified by comparison with the results of manufacturing experiments. To optimize the bracket structure, a strength analysis was carried out as well.
Guinet, Roland; Berthoumieu, Nicole; Dutot, Philippe; Triquet, Julien; Ratajczak, Medhi; Thibaudon, Michel; Bechaud, Philippe; Arliaud, Christophe; Miclet, Edith; Giordano, Florine; Larcon, Marjorie; Arthaud, Catherine
Environmental monitoring and aseptic process simulations represent an integral part of the microbiological quality control system of sterile pharmaceutical products manufacturing operations. However, guidance documents and manufacturers' practices differ regarding recommendations for incubation time and incubation temperature, and, consequently, the environmental monitoring and aseptic process simulation incubation strategy should be supported by validation data. To avoid any bias coming from in vitro studies or from single-site manufacturing in situ studies, we performed a collaborative study at four manufacturing sites with four samples at each location. The environmental monitoring study was performed with tryptic soy agar settle plates and contact plates, and the aseptic process simulation study was performed with tryptic soy broth and thioglycolate broth. The highest recovery rate was obtained with settle plates (97.7%) followed by contact plates (65.4%) and was less than 20% for liquid media (tryptic soy broth 19% and thioglycolate broth 17%). Gram-positive cocci and non-spore-forming Gram-positive rods were largely predominant, with more than 95% of growth, and were recovered best at 32.5 °C. The highest recovery of molds was obtained at 22.5 °C, alone or as the first incubation temperature. Strict anaerobes were not recovered. At the end of the five days of incubation, no significant statistical difference was obtained between the four conditions. Based on these data, a single incubation temperature of 32.5 °C could be recommended for these four manufacturing sites for both environmental monitoring and aseptic process simulation, and a second plate could be used, periodically incubated at 22.5 °C. Similar studies should be considered for all manufacturing facilities in order to determine the optimal incubation temperature regime for both viable environmental monitoring and aseptic process simulation. Microbiological environmental monitoring and aseptic process simulation confirm that pharmaceutical cleanrooms are in an appropriate hygienic condition for the manufacturing of sterile drug products. Guidance documents from different health authorities or expert groups differ regarding the recommended incubation time and incubation temperature, leading to variable manufacturers' practices. Some recent publications have demonstrated that laboratory studies are not suitable for determining the best incubation regime and that in situ manufacturing site studies should be used. To avoid any possible bias coming from laboratory studies or single-site in situ studies, we conducted a multicenter study at four manufacturing sites with a significant number of real environmental monitoring samples collected directly from the environment in pharmaceutical production during manufacturing operations, using four solid and liquid nutrient media. These samples were then incubated under four different conditions suggested in the guidance documents. We believe that the results of our multicenter study, confirming other recent single-site in situ studies, could be the basis of the strategy to determine the best incubation regime for both viable environmental monitoring and aseptic process simulation in any manufacturing facility. © PDA, Inc. 2017.
Bencala, Kenneth E.
1984-01-01
Solute transport in streams is determined by the interaction of physical and chemical processes. Data from an injection experiment for chloride and several cations indicate significant influence of solute-streambed processes on transport in a mountain stream. These data are interpreted in terms of transient storage processes for all tracers and sorption processes for the cations. Process parameter values are estimated with simulations based on coupled quasi-two-dimensional transport and first-order mass transfer sorption. Comparative simulations demonstrate the relative roles of the physical and chemical processes in determining solute transport. During the first 24 hours of the experiment, chloride concentrations were attenuated relative to expected plateau levels. Additional attenuation occurred for the sorbing cation strontium. The simulations account for these storage processes. Parameter values determined by calibration compare favorably with estimates from other studies in mountain streams. Without further calibration, the transport of potassium and lithium is adequately simulated using parameters determined in the chloride-strontium simulation and with measured cation distribution coefficients.
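A minimal finite-difference sketch of the transient storage idea is shown below: a stream channel advects a tracer while exchanging mass with a stagnant storage zone at a first-order rate. Dispersion and sorption are omitted for brevity, and the reach geometry and exchange coefficient are placeholder values, not the calibrated parameters of the study.

```python
# Upwind finite-difference sketch of stream transport with a transient storage zone
# (dispersion and sorption omitted). Parameter values are illustrative placeholders.
import numpy as np

L, nx = 500.0, 200            # reach length (m) and number of grid cells
dx = L / nx
u = 0.2                       # m/s mean stream velocity
alpha = 2.0e-4                # 1/s stream <-> storage exchange coefficient
A, As = 0.4, 0.2              # m^2 channel and storage cross-sections
dt = 0.5 * dx / u             # CFL-limited time step

C = np.zeros(nx)              # tracer concentration in the stream
Cs = np.zeros(nx)             # tracer concentration in the storage zone

def step(C, Cs, C_in):
    upstream = np.concatenate(([C_in], C[:-1]))
    adv = -u * (C - upstream) / dx                 # first-order upwind advection
    exch = alpha * (Cs - C)                        # exchange with storage zone
    C_new = C + dt * (adv + exch)
    Cs_new = Cs + dt * alpha * (A / As) * (C - Cs)
    return C_new, Cs_new

t, t_end = 0.0, 1.5 * 3600.0
while t < t_end:
    C, Cs = step(C, Cs, 1.0)   # continuous step injection at the upstream end
    t += dt

print(f"downstream concentration after 1.5 h of injection: {C[-1]:.2f} "
      "(attenuated below the injected 1.0 by exchange with the storage zone)")
```

The attenuation of the downstream plateau relative to the injected concentration is the signature described in the abstract; adding a linear distribution coefficient to the storage term would mimic the extra attenuation of the sorbing cations.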
A parallel algorithm for switch-level timing simulation on a hypercube multiprocessor
NASA Technical Reports Server (NTRS)
Rao, Hariprasad Nannapaneni
1989-01-01
The parallel approach to speeding up simulation is studied, specifically the simulation of digital LSI MOS circuitry on the Intel iPSC/2 hypercube. The simulation algorithm is based on RSIM, an event driven switch-level simulator that incorporates a linear transistor model for simulating digital MOS circuits. Parallel processing techniques based on the concepts of Virtual Time and rollback are utilized so that portions of the circuit may be simulated on separate processors, in parallel for as large an increase in speed as possible. A partitioning algorithm is also developed in order to subdivide the circuit for parallel processing.
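The Virtual Time / rollback idea referenced above can be sketched in a few lines: each logical process executes events optimistically, checkpoints its state, and rolls back to the last checkpoint at or before a straggler's timestamp before re-executing. The sketch below shows only that mechanism in plain Python, with no anti-messages and no real switch-level circuit model, so it is a pedagogical skeleton rather than the RSIM-based simulator.

```python
# Skeleton of optimistic (Time Warp style) event processing with rollback.
# Pedagogical only: no anti-messages and no real switch-level circuit model.
import bisect

class LogicalProcess:
    def __init__(self, name):
        self.name = name
        self.lvt = 0                    # local virtual time
        self.state = 0                  # toy state: count of processed events
        self.checkpoints = [(0, 0)]     # (virtual time, saved state)
        self.processed = []             # timestamps already executed
        self.pending_redo = []          # events to re-execute after a rollback

    def execute(self, timestamp):
        if timestamp < self.lvt:        # straggler arrived in the "past"
            self.rollback(timestamp)
        self.state += 1                 # toy event effect
        self.lvt = timestamp
        self.processed.append(timestamp)
        self.checkpoints.append((self.lvt, self.state))

    def rollback(self, timestamp):
        # restore the last checkpoint taken at or before the straggler's time
        idx = bisect.bisect_right([t for t, _ in self.checkpoints], timestamp) - 1
        self.lvt, self.state = self.checkpoints[idx]
        self.checkpoints = self.checkpoints[: idx + 1]
        # events later than the straggler must be re-executed
        self.pending_redo = sorted(t for t in self.processed if t > timestamp)
        self.processed = [t for t in self.processed if t <= timestamp]

lp = LogicalProcess("gate_block_0")
for ts in (5, 10, 20, 30):
    lp.execute(ts)
lp.execute(15)                           # straggler: forces a rollback to t <= 15
for ts in lp.pending_redo:               # re-execute the rolled-back events
    lp.execute(ts)
print(f"LVT={lp.lvt}, events processed={lp.state}")
```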
ISPE: A knowledge-based system for fluidization studies. 1990 Annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, S.
1991-01-01
Chemical engineers use mathematical simulators to design, model, optimize and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine if all "specified goals" are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information) and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE with the user, ASPEN, LASER and the C language is shown in Figure 1.
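The three-step loop described above (prepare the input deck, run the simulator, check the results against the goals, and repeat) is exactly what a knowledge-based assistant of this kind automates. The sketch below is a generic Python rendering of that loop; the toy "simulator", goal check and adjustment rule are placeholders and do not represent ASPEN's actual input format or LASER's rule engine.

```python
# Generic prepare -> simulate -> analyze -> adjust loop of the kind IPSE automates.
# The "simulator" and the adjustment rule are placeholders, not ASPEN or LASER.
def write_input_file(params):
    return "\n".join(f"{key} = {value}" for key, value in params.items())

def run_simulator(input_deck):
    # Stand-in for launching the external process simulator on the input deck.
    temperature = float(input_deck.split("reactor_temp = ")[1].splitlines()[0])
    conversion = min(0.99, 0.002 * (temperature - 600.0))   # toy response surface
    return {"conversion": conversion}

def goals_satisfied(results, goals):
    return all(results[name] >= target for name, target in goals.items())

def adjust(params, results, goals):
    # Simple rule: raise the temperature while conversion falls short of the goal.
    if results["conversion"] < goals["conversion"]:
        params["reactor_temp"] += 10.0
    return params

params = {"reactor_temp": 650.0, "pressure": 2.0}
goals = {"conversion": 0.85}
for iteration in range(1, 100):
    results = run_simulator(write_input_file(params))
    print(f"iter {iteration}: T={params['reactor_temp']:.0f} K, "
          f"conversion={results['conversion']:.2f}")
    if goals_satisfied(results, goals):
        break
    params = adjust(params, results, goals)
```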
Simulating Data for Clinical Research: A Tutorial
ERIC Educational Resources Information Center
Beaujean, A. Alexander
2018-01-01
Simulation studies use computer-generated data to examine questions of interest; they have traditionally been used to study the properties of statistics and estimating algorithms. With the recent advent of powerful processing capabilities in affordable computers, along with readily usable software, it is now feasible to use a simulation study to aid in…
Numerical simulation study on rolling-chemical milling process of aluminum-lithium alloy skin panel
NASA Astrophysics Data System (ADS)
Huang, Z. B.; Sun, Z. G.; Sun, X. F.; Li, X. Q.
2017-09-01
Single-curvature parts such as aircraft fuselage skin panels are usually manufactured by a rolling-chemical milling process, which often faces problems of geometric accuracy caused by springback. In most cases, manual adjustment and multiple roll bending are used to control or eliminate the springback. However, these methods increase product cost and cycle time and can lead to material performance degradation. Therefore, it is important to precisely control the springback in the rolling-chemical milling process. In this paper, using experiments and numerical simulation of the rolling-chemical milling process, a simulation model for the rolling-chemical milling of 2060-T8 aluminum-lithium alloy skin was established, and its validity was tested by comparing the numerical simulation and experimental results. Then, based on the numerical simulation model, the technological parameters that influence the curvature of the skin panel were analyzed. Finally, the prediction and compensation of springback can be realized by controlling the process parameters.
The use of discrete-event simulation modelling to improve radiation therapy planning processes.
Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven
2009-07-01
The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
Aviation Human-in-the-Loop Simulation Studies: Experimental Planning, Design, and Data Management
Williams, Kevin W.; Christopher, Bonny; Gena...
2014-01-01
This report describes the process by which the authors designed their human-in-the-loop (HITL) simulation study and the methodology used to collect and analyze the results.
USDA-ARS?s Scientific Manuscript database
Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...
NASA Technical Reports Server (NTRS)
Sreekantamurthy, Thammaiah; Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.
2016-01-01
Composite cure process-induced residual strains and warping deformations in composite components present significant challenges in the manufacturing of advanced composite structures. As a part of the Manufacturing Process and Simulation initiative of the NASA Advanced Composite Project (ACP), research is being conducted on the composite cure process by developing an understanding of the fundamental mechanisms by which process-induced factors influence the residual responses. In this regard, analytical studies have been conducted on cure process modeling of composite structural parts with varied physical, thermal, and resin flow process characteristics. The cure process simulation results were analyzed to interpret the cure response predictions based on the underlying physics incorporated into the modeling tool. In the cure-kinetic analysis, the model predictions of the degree of cure, resin viscosity and modulus were interpreted with reference to the temperature distribution in the composite panel part and tool setup during autoclave or hot-press curing cycles. In the fiber-bed compaction simulation, the pore pressure and resin flow velocity in the porous media models, and the compaction strain responses under applied pressure, were studied to interpret the fiber volume fraction distribution predictions. In the structural simulation, the effect of temperature on the resin and ply modulus, and of thermal coefficient changes during curing, on the predicted mechanical strains and chemical cure shrinkage strains was studied to understand the residual strain and stress response predictions. In addition to the computational analysis, experimental studies were conducted to measure strains during the curing of laminated panels by means of optical fiber Bragg grating sensors (FBGs) embedded in the resin-impregnated panels. The residual strain measurements from the laboratory tests were then compared with the analytical model predictions. The paper describes the cure process procedures and residual strain predictions, and discusses pertinent experimental results from the validation studies.
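For readers unfamiliar with the cure-kinetic step mentioned above, the sketch below integrates a generic autocatalytic (Kamal-Sourour-type) cure model through a simple two-ramp autoclave temperature cycle and evaluates a Castro-Macosko-style viscosity factor. All constants are illustrative placeholders, not the resin parameters used in the NASA study.

```python
# Generic autocatalytic cure-kinetics sketch (Kamal-Sourour form) driven by a
# two-step autoclave temperature cycle. All constants are illustrative only.
import numpy as np

R = 8.314
A1, E1 = 2.0e5, 7.0e4          # pre-exponential (1/s) and activation energy (J/mol)
m, n = 0.5, 1.5                # autocatalytic exponents
eta0, E_eta = 1.0e-6, 3.0e4    # viscosity prefactor (Pa s) and activation energy
alpha_gel = 0.6                # gel point used in the Castro-Macosko factor

def cycle_temperature(t):
    """Ramp 25->120 C, hold, ramp to 180 C, hold (t in seconds, returns kelvin)."""
    minutes = t / 60.0
    if minutes < 30:
        temp_c = 25 + (120 - 25) * minutes / 30
    elif minutes < 90:
        temp_c = 120
    elif minutes < 120:
        temp_c = 120 + (180 - 120) * (minutes - 90) / 30
    else:
        temp_c = 180
    return temp_c + 273.15

dt, t_end = 1.0, 240 * 60.0
alpha = 1.0e-3
for step in range(int(t_end / dt)):
    T = cycle_temperature(step * dt)
    k = A1 * np.exp(-E1 / (R * T))
    dalpha = k * (alpha ** m) * ((1.0 - alpha) ** n)
    alpha = min(1.0, alpha + dt * dalpha)
    if alpha < alpha_gel:
        eta = eta0 * np.exp(E_eta / (R * T)) * (alpha_gel / (alpha_gel - alpha)) ** 2
    else:
        eta = float("inf")      # resin has gelled
print(f"degree of cure after the cycle: {alpha:.3f}")
```

In a full cure-process simulation this kinetic/viscosity core is coupled to the heat conduction, fiber-bed compaction and structural modules described in the abstract.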
NASA Astrophysics Data System (ADS)
Rock, Gilles; Fischer, Kim; Schlerf, Martin; Gerhards, Max; Udelhoven, Thomas
2017-04-01
The development and optimization of image processing algorithms requires the availability of datasets depicting every step from the earth's surface to the sensor's detector. The lack of ground truth data makes it necessary to develop algorithms on simulated data. The simulation of hyperspectral remote sensing data is a useful tool for a variety of tasks such as the design of systems, the understanding of the image formation process, and the development and validation of data processing algorithms. An end-to-end simulator has been set up consisting of a forward simulator, a backward simulator and a validation module. The forward simulator derives radiance datasets based on laboratory sample spectra, applies atmospheric contributions using radiative transfer equations, and simulates the instrument response using configurable sensor models. This is followed by the backward simulation branch, consisting of an atmospheric correction (AC), a temperature and emissivity separation (TES) or a hybrid AC and TES algorithm. An independent validation module allows the comparison between input and output datasets and the benchmarking of different processing algorithms. In this study, hyperspectral thermal infrared scenes of a variety of surfaces have been simulated to analyze existing AC and TES algorithms. The ARTEMISS algorithm was optimized and benchmarked against the original implementations. The errors in TES were found to be related to incorrect water vapor retrieval. The atmospheric characterization could be optimized, resulting in increased accuracy in temperature and emissivity retrieval. Airborne datasets of different spectral resolutions were simulated from terrestrial HyperCam-LW measurements. The simulated airborne radiance spectra were subjected to atmospheric correction and TES and further used for a plant species classification study analyzing effects related to noise and mixed pixels.
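To illustrate the forward and backward steps of such an end-to-end chain, the sketch below propagates a surface emissivity spectrum to at-sensor radiance with the standard single-layer thermal-infrared equation, L_sensor = tau * (eps * B(T) + (1 - eps) * L_down) + L_up, and then inverts it for emissivity assuming the atmosphere and the surface temperature are known. The atmospheric terms and emissivity shape are invented constants, not radiative-transfer output, and no real TES (unknown temperature) step is attempted.

```python
# Toy thermal-infrared forward model and emissivity inversion:
#   L_sensor = tau * (eps * B(T) + (1 - eps) * L_down) + L_up
# Atmospheric terms here are invented constants, not radiative-transfer output.
import numpy as np

h, c, k_b = 6.626e-34, 2.998e8, 1.381e-23

def planck(wavelength_um, temperature_k):
    """Spectral radiance in W m-2 sr-1 um-1."""
    lam = wavelength_um * 1e-6
    rad = 2 * h * c**2 / (lam**5 * (np.exp(h * c / (lam * k_b * temperature_k)) - 1))
    return rad * 1e-6          # per metre -> per micrometre

wavelengths = np.linspace(8.0, 12.0, 41)           # um, LWIR window
eps_true = 0.96 - 0.03 * np.exp(-((wavelengths - 9.1) ** 2) / 0.3)  # quartz-like dip
T_surface = 300.0                                  # K

# Invented, spectrally flat atmosphere (forward-simulation step).
tau, L_up, L_down = 0.85, 0.6, 1.2                 # transmittance, path and sky radiance
L_sensor = tau * (eps_true * planck(wavelengths, T_surface)
                  + (1 - eps_true) * L_down) + L_up

# Backward step: atmospheric correction, then emissivity retrieval at known T.
L_surface = (L_sensor - L_up) / tau
eps_retrieved = (L_surface - L_down) / (planck(wavelengths, T_surface) - L_down)

print(f"max emissivity retrieval error: {np.max(np.abs(eps_retrieved - eps_true)):.2e}")
```

Errors in a real chain arise precisely where this toy is idealized: imperfect knowledge of tau, L_up and L_down (e.g. from water vapor) propagates directly into the retrieved temperature and emissivity.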
Study on the CFD simulation of refrigerated container
NASA Astrophysics Data System (ADS)
Arif Budiyanto, Muhammad; Shinoda, Takeshi; Nasruddin
2017-10-01
The objective of this study is to perform a Computational Fluid Dynamics (CFD) simulation of a refrigerated container in a container port. A refrigerated container is a thermal cargo container constructed with insulated walls to carry perishable goods. The CFD simulation used a cross section of the container walls to predict the surface temperatures of the refrigerated container and to estimate its cooling load. The simulation model is based on the solution of the partial differential equations governing the fluid flow and heat transfer processes. The physical heat-transfer processes considered in this simulation are solar radiation from the sun, heat conduction through the container walls, heat convection at the container surfaces, and thermal radiation among the solid surfaces. The simulation model was validated using surface temperatures at the centre points of each container wall obtained from measurements in a previous study. The results show that the surface temperatures of the simulation model are in good agreement with the measurement data on all container walls.
NASA Astrophysics Data System (ADS)
Kunkel, D.; Hoor, P. M.; Wirth, V.
2016-12-01
Recent studies revealed the existence of a quasi-permanent layer of enhanced static stability above the thermal tropopause. This so-called tropopause inversion layer (TIL) is evident in adiabatic baroclinic life cycles suggesting that dry dynamics contribute to its formation. However, compared to observations the TIL in these life cycles is too weak, indicating that other contributions from diabatic processes are relevant. Such processes could be related to moisture or radiation, or other non-linear, subgrid-scale processes such as gravity wave breaking. Moreover, whether there is a causal relation between the occurrence of the TIL and stratosphere-troposphere exchange (STE) is still under debate. In this study various types of baroclinic life cycles are simulated using a non-hydrostatic model in an idealized mid-latitude channel configuration. A simulation using only the dynamical core of the model serves as base simulation, which is modified subsequently by adding different processes. First, these processes such as vertical turbulence, cloud microphysics, radiation as well as surface fluxes for heat and momentum are added individually. In a second set of simulations combinations of these processes are studied to assess the relative importance of the individual processes in the formation of the TIL. Finally, the static stability is analyzed in regions of STE. These regions are identified with the help of passive tracer as well as a Lagrangian trajectory analysis.
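For reference, the TIL is usually diagnosed from profiles of the buoyancy frequency N^2 = (g / theta) * d(theta)/dz relative to the thermal tropopause. The sketch below computes N^2 for an idealized temperature profile with a sharpened warm layer just above the tropopause; the profile is synthetic and only illustrates the diagnostic, not any model output from the study.

```python
# Diagnostic sketch: buoyancy frequency N^2 = (g / theta) * d(theta)/dz for an
# idealized profile with enhanced static stability just above the tropopause.
# The profile is synthetic; it only illustrates how a TIL appears in N^2.
import numpy as np

g, cp, Rd, p0 = 9.81, 1004.0, 287.0, 1000.0
z = np.arange(0.0, 20000.0, 100.0)              # m
z_tp = 11000.0                                  # assumed thermal tropopause height

# Piecewise temperature: -6.5 K/km troposphere, warm layer above, then isothermal.
T = np.where(z <= z_tp, 288.0 - 0.0065 * z,
             288.0 - 0.0065 * z_tp + 0.004 * np.minimum(z - z_tp, 1500.0))
p = p0 * np.exp(-z / 7000.0)                    # hPa, simple scale-height pressure
theta = T * (p0 / p) ** (Rd / cp)               # potential temperature

n2 = g / theta[:-1] * np.diff(theta) / np.diff(z)
z_mid = 0.5 * (z[:-1] + z[1:])

below = n2[(z_mid > z_tp - 2000.0) & (z_mid < z_tp)].mean()
within = n2[(z_mid > z_tp) & (z_mid < z_tp + 1500.0)].mean()
print(f"mean N^2 below tropopause: {below:.2e} s^-2, in the TIL layer: {within:.2e} s^-2")
```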
Political Simulations Using Excel
ERIC Educational Resources Information Center
Jackson, Steven F.
2013-01-01
Simulations have received considerable attention as a tool to promote problem-solving skills, intense involvement, and high-order thinking among students. Whether semester-long exercises or a single-class session, simulations are often used in areas of conflict studies, diplomatic studies, trade disputes, electoral processes, and policy and legal…
Emergency Management Operations Process Mapping: Public Safety Technical Program Study
2011-02-01
Enterprise Architectures in industry have been successfully applied to assist companies to optimise interdependencies and relationships between… The study offers a model for more in-depth analysis of EM processes, for use in tandem with other studies that apply modeling and simulation to assess EM operational effectiveness before and after changing elements.
Atomistic simulations of dislocation pileup: Grain boundaries interaction
Wang, Jian
2015-05-27
Here, using molecular dynamics (MD) simulations, we studied dislocation pileup–grain boundary (GB) interactions. Two Σ11 asymmetrical tilt grain boundaries in Al are studied to explore the influence of orientation relationship and interface structure on dislocation activities at grain boundaries. To mimic the reality of a dislocation pileup in a coarse-grained polycrystal, we optimized the dislocation population in the MD simulations and developed a predict-correct method to create a dislocation pileup in the MD simulations. The MD simulations explored several kinetic processes of dislocation–GB reactions: grain boundary sliding, grain boundary migration, slip transmission, dislocation reflection, reconstruction of the grain boundary, and the correlation of these kinetic processes with the available slip systems across the GB and the atomic structures of the GB.
Lee, Ju-Young; Lee, Soon Hee; Kim, Jung-Hee
2018-05-01
Despite the increase in simulators at nursing schools and the high expectations regarding simulation for nursing education, the unique features of integrating simulation-based education into the curriculum are unclear. The purpose of this study was to assess the curriculum development process of simulation-based educational interventions in nursing in Korea. An integrative review of the literature was used. Korean Studies Information Services System (KISS), Korean Medical Database (KMbase), KoreaMed, Research Information Sharing Service (RISS), and National Digital Library (NDL). Comprehensive databases were searched for records without a time limit (until December 2016), using terms such as "nursing," "simulation," and "education." A total of 1006 studies were screened. According to the model for simulation-based curriculum development (Khamis et al., 2016), the quality of reporting on curriculum development was reviewed. A total of 125 papers were included in this review. In three studies, simulation scenarios progressed from easy to difficult levels, and none of the studies presented the level of learners' proficiency. Only 17.6% of the studies reported faculty development or preparation. Inter-rater reliability for performance tests was presented by 24 studies, and two studies evaluated the long-term effects of simulation education, although there was no statistically significant change by publication year. These findings suggest that educators and researchers should pay more attention to the educational strategies needed to integrate simulation into nursing education. This could contribute to guiding educators and researchers in developing a simulation-based curriculum and improve the quality of nursing education research. Copyright © 2018 Elsevier Ltd. All rights reserved.
Mota, J.P.B.; Esteves, I.A.A.C.; Rostam-Abadi, M.
2004-01-01
A computational fluid dynamics (CFD) software package has been coupled with the dynamic process simulator of an adsorption storage tank for methane fuelled vehicles. The two solvers run as independent processes and handle non-overlapping portions of the computational domain. The codes exchange data on the boundary interface of the two domains to ensure continuity of the solution and of its gradient. A software interface was developed to dynamically suspend and activate each process as necessary, and be responsible for data exchange and process synchronization. This hybrid computational tool has been successfully employed to accurately simulate the discharge of a new tank design and evaluate its performance. The case study presented here shows that CFD and process simulation are highly complementary computational tools, and that there are clear benefits to be gained from a close integration of the two. ?? 2004 Elsevier Ltd. All rights reserved.
Evaluating the Pros and Cons of Different Peer Review Policies via Simulation.
Zhu, Jia; Fung, Gabriel; Wong, Wai Hung; Li, Zhixu; Xu, Chuanhua
2016-08-01
In the academic world, peer review is one of the major processes for evaluating a scholar's contribution. In this study, we are interested in quantifying the merits of different policies in a peer review process, such as single-blind review, double-blind review, and obtaining authors' feedback. Currently, insufficient work has been undertaken to evaluate the benefits of different peer review policies. One of the major reasons for this situation is the inability to conduct any empirical study because data are presently unavailable. In this case, a computer simulation is one of the best ways to conduct a study. We perform a series of simulations to study the effects of different policies on a peer review process. In this study, we focus on the peer review process of a typical computer science conference. Our results point to the crucial role of program chairs in determining the quality and diversity of the articles to be accepted for publication. We demonstrate the importance of discussion among reviewers, suggest circumstances in which the double-blind review policy should be adopted, and question the credibility of the authors' feedback mechanism. Finally, we stress that randomness plays an important role in the peer review process, and this role cannot be eliminated. Although our model may not capture every component of a peer review process, it covers some of the most essential elements. Thus, even though the simulation results clearly cannot be taken as literal descriptions of an actual peer review process, we can at least still use them to identify alternative directions for future study.
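A Monte Carlo model in the spirit of the one described can be set up in a few lines: papers have a latent quality, reviewers observe it with noise, and single-blind review adds a reputation bias that double-blind review removes. The sketch below is a generic illustration with invented effect sizes and decision rules; it is not the authors' model.

```python
# Toy Monte Carlo comparison of single-blind vs double-blind review.
# Effect sizes (noise level, reputation bias) are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_papers, n_reviewers, accept_rate = 500, 3, 0.25

quality = rng.normal(0.0, 1.0, n_papers)            # latent paper quality
famous = rng.random(n_papers) < 0.2                 # 20% of papers from "famous" authors

def simulate(reputation_bias):
    scores = quality + rng.normal(0.0, 0.8, (n_reviewers, n_papers))
    scores += reputation_bias * famous               # bias visible only if not blinded
    mean_score = scores.mean(axis=0)
    n_accept = int(accept_rate * n_papers)
    accepted = np.argsort(mean_score)[-n_accept:]
    best = np.argsort(quality)[-n_accept:]           # ideal, quality-only decision
    return np.intersect1d(accepted, best).size / n_accept

print(f"double-blind overlap with ideal decision: {simulate(0.0):.2f}")
print(f"single-blind overlap with ideal decision: {simulate(0.5):.2f}")
```

The overlap with the quality-only decision drops when the reputation bias is visible, which is the basic mechanism such simulations use to compare blinding policies; reviewer noise keeps the overlap well below 1.0 even in the unbiased case, echoing the paper's point about irreducible randomness.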
Lee, Cheng-Kuang; Pao, Chun-Wei
2016-08-17
Solution-processed small-molecule organic solar cells are a promising renewable energy source because of their low production cost, mechanical flexibility, and light weight relative to their pure inorganic counterparts. In this work, we developed a coarse-grained (CG) Gay-Berne ellipsoid molecular simulation model based on atomistic trajectories from all-atom molecular dynamics simulations of smaller system sizes to systematically study the nanomorphology of the SMDPPEH/PCBM/solvent ternary blend during solution processing, including the blade-coating process by applying external shear to the solution. With the significantly reduced overall system degrees of freedom and computational acceleration from GPU, we were able to go well beyond the limitation of conventional all-atom molecular simulations with a system size on the order of hundreds of nanometers with mesoscale molecular detail. Our simulations indicate that, similar to polymer solar cells, the optimal blending ratio in small-molecule organic solar cells must provide the highest specific interfacial area for efficient exciton dissociation, while retaining balanced hole/electron transport pathway percolation. We also reveal that blade-coating processes have a significant impact on nanomorphology. For given donor/acceptor blending ratios, applying an external shear force can effectively promote donor/acceptor phase segregation and stacking in the SMDPPEH domains. The present study demonstrated the capability of an ellipsoid-based coarse-grained model for studying the nanomorphology evolution of small-molecule organic solar cells during solution processing/blade-coating and provided links between fabrication protocols and device nanomorphologies.
NASA Astrophysics Data System (ADS)
Amran, M. A. M.; Idayu, N.; Faizal, K. M.; Sanusi, M.; Izamshah, R.; Shahir, M.
2016-11-01
In this study, the main objective is to determine the percentage difference in part weight between experimental and simulation work. The effect of process parameters on the weight of the plastic part is also investigated. The process parameters involved were mould temperature, melt temperature, injection time and cooling time. Autodesk Simulation Moldflow software was used to run the simulation of the plastic part, and the Taguchi method was selected as the Design of Experiments approach to conduct the experiment. The simulation results were then validated against the experimental results. It was found that the minimum and maximum percentage differences in part weight between simulation and experiment are 0.35% and 1.43%, respectively. In addition, the most significant parameter affecting part weight is the mould temperature, followed by melt temperature, injection time and cooling time.
Simulation technology for resuscitation training: a systematic review and meta-analysis.
Mundell, William C; Kennedy, Cassie C; Szostek, Jason H; Cook, David A
2013-09-01
To summarize currently available data on simulation-based training in resuscitation for health care professionals. MEDLINE, EMBASE, CINAHL, PsycINFO, ERIC, Web of Science, Scopus and reference lists of published reviews. Published studies of any language or date that enrolled health professions' learners to investigate the use of technology-enhanced simulation to teach resuscitation in comparison with no intervention or alternative training. Data were abstracted in duplicate. We identified themes examining different approaches to curriculum design. We pooled results using random-effects meta-analysis. 182 studies were identified involving 16,636 participants. Overall, simulation-based training of resuscitation skills, in comparison to no intervention, appears effective regardless of assessed outcome, level of learner, study design, or specific task trained. In comparison to no intervention, simulation training improved outcomes of knowledge (Hedges' g = 1.05; 95% confidence interval, 0.81-1.29), process skill (1.13; 0.99-1.27), product skill (1.92; 1.26-2.60), time skill (1.77; 1.13-2.42) and patient outcomes (0.26; 0.047-0.48). In comparison with non-simulation intervention, learner satisfaction (0.79; 0.27-1.31) and process skill (0.35; 0.12-0.59) outcomes favored simulation. Studies investigating how to optimize simulation training found higher process skill outcomes in courses employing "booster" practice (0.13; 0.03-0.22), team/group dynamics (0.51; 0.06-0.97), distraction (1.76; 1.02-2.50) and integrated feedback (0.49; 0.17-0.80) compared to courses without these features. Most analyses reflected high between-study inconsistency (I² values >50%). Simulation-based training for resuscitation is highly effective. Design features of "booster" practice, team/group dynamics, distraction and integrated feedback improve effectiveness. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
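As background on the pooling step, a random-effects meta-analysis of standardized effects such as Hedges' g is typically performed with the DerSimonian-Laird estimator; the sketch below shows the computation of the pooled effect, its confidence interval, and the I² inconsistency statistic. The input arrays are illustrative stand-ins, not the study's data.

    import numpy as np

    def random_effects_pool(g, v):
        """DerSimonian-Laird random-effects pooling.
        g: per-study Hedges' g values; v: their sampling variances."""
        g, v = np.asarray(g, float), np.asarray(v, float)
        w = 1.0 / v                                    # fixed-effect weights
        g_fixed = np.sum(w * g) / np.sum(w)
        Q = np.sum(w * (g - g_fixed) ** 2)             # Cochran's Q
        k = len(g)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (k - 1)) / c)             # between-study variance
        w_star = 1.0 / (v + tau2)                      # random-effects weights
        g_pooled = np.sum(w_star * g) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        i2 = max(0.0, (Q - (k - 1)) / Q) * 100 if Q > 0 else 0.0
        return g_pooled, (g_pooled - 1.96 * se, g_pooled + 1.96 * se), i2

    # Illustrative values only:
    print(random_effects_pool([0.9, 1.2, 1.1, 0.7], [0.04, 0.06, 0.05, 0.09]))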
The VIIRS Ocean Data Simulator Enhancements and Results
NASA Technical Reports Server (NTRS)
Robinson, Wayne D.; Patt, Fredrick S.; Franz, Bryan A.; Turpie, Kevin R.; McClain, Charles R.
2011-01-01
The VIIRS Ocean Science Team (VOST) has been developing an Ocean Data Simulator to create realistic VIIRS SDR datasets based on MODIS water-leaving radiances. The simulator is helping to assess instrument performance and scientific processing algorithms. Several changes were made in the last two years to complete the simulator and broaden its usefulness. The simulator is now fully functional and includes all sensor characteristics measured during prelaunch testing, including electronic and optical crosstalk influences, polarization sensitivity, and relative spectral response. Also included is the simulation of cloud and land radiances to make more realistic data sets and to understand their important influence on nearby ocean color data. The atmospheric tables used in the processing, including aerosol and Rayleigh reflectance coefficients, have been modeled using VIIRS relative spectral responses. The capabilities of the simulator were expanded to work in an unaggregated sample mode and to produce scans with additional samples beyond the standard scan. These features improve the capability to realistically add artifacts which act upon individual instrument samples prior to aggregation and which may originate from beyond the actual scan boundaries. The simulator was expanded to simulate all 16 M-bands and the EDR processing was improved to use these bands to make an SST product. The simulator is being used to generate global VIIRS data from and in parallel with the MODIS Aqua data stream. Studies have been conducted using the simulator to investigate the impact of instrument artifacts. This paper discusses the simulator improvements and results from the artifact impact studies.
The VIIRS ocean data simulator enhancements and results
NASA Astrophysics Data System (ADS)
Robinson, Wayne D.; Patt, Frederick S.; Franz, Bryan A.; Turpie, Kevin R.; McClain, Charles R.
2011-10-01
The VIIRS Ocean Science Team (VOST) has been developing an Ocean Data Simulator to create realistic VIIRS SDR datasets based on MODIS water-leaving radiances. The simulator is helping to assess instrument performance and scientific processing algorithms. Several changes were made in the last two years to complete the simulator and broaden its usefulness. The simulator is now fully functional and includes all sensor characteristics measured during prelaunch testing, including electronic and optical crosstalk influences, polarization sensitivity, and relative spectral response. Also included is the simulation of cloud and land radiances to make more realistic data sets and to understand their important influence on nearby ocean color data. The atmospheric tables used in the processing, including aerosol and Rayleigh reflectance coefficients, have been modeled using VIIRS relative spectral responses. The capabilities of the simulator were expanded to work in an unaggregated sample mode and to produce scans with additional samples beyond the standard scan. These features improve the capability to realistically add artifacts which act upon individual instrument samples prior to aggregation and which may originate from beyond the actual scan boundaries. The simulator was expanded to simulate all 16 M-bands and the EDR processing was improved to use these bands to make an SST product. The simulator is being used to generate global VIIRS data from and in parallel with the MODIS Aqua data stream. Studies have been conducted using the simulator to investigate the impact of instrument artifacts. This paper discusses the simulator improvements and results from the artifact impact studies.
Conforti, Patrick F; Prasad, Manish; Garrison, Barbara J
2008-08-01
Laser ablation harnesses photon energy to remove material from a surface. Although applications such as laser-assisted in situ keratomileusis (LASIK) surgery, lithography, and nanoscale device fabrication take advantage of this process, a better understanding of the underlying mechanism of ablation in polymeric materials remains much sought after. Molecular simulation is a particularly attractive technique to study the basic aspects of ablation because it allows control over specific process parameters and enables observation of microscopic mechanistic details. This Account describes a hybrid molecular dynamics-Monte Carlo technique to simulate laser ablation in poly(methyl methacrylate) (PMMA). It also discusses the impact of thermal and chemical excitation on the ensuing ejection processes. We used molecular dynamics simulation to study the molecular interactions in a coarse-grained PMMA substrate following photon absorption. To ascertain the role of chemistry in initiating ablation, we embedded a Monte Carlo protocol within the simulation framework. These calculations permit chemical reactions to occur probabilistically during the molecular dynamics calculation using predetermined reaction pathways and Arrhenius rates. With this hybrid scheme, we can examine thermal and chemical pathways of decomposition separately. In the simulations, we observed distinct mechanisms of ablation for each type of photoexcitation pathway. Ablation via thermal processes is governed by a critical number of bond breaks following the deposition of energy. For the case in which an absorbed photon directly causes a bond scission, ablation occurs following the rapid chemical decomposition of material. A detailed analysis of the processes shows that a critical energy for ablation can describe this complex series of events. The simulations show a decrease in the critical energy with a greater amount of photochemistry. Additionally, the simulations demonstrate the effects of the energy deposition rate on the ejection mechanism. When the energy is deposited rapidly, not allowing for mechanical relaxation of the sample, the formation of a pressure wave and subsequent tensile wave dominates the ejection process. This study provides insight into the influence of thermal, chemical, and mechanical processes in PMMA and facilitates greater understanding of the complex nature of polymer ablation. These simulations complement experiments that have used chemical design to harness the photochemical properties of materials to enhance laser ablation. We successfully fit the results of the simulations to established analytical models of both photothermal and photochemical ablation and demonstrate their relevance. Although the simulations are for PMMA, the mechanistic concepts are applicable to a large range of systems and provide a conceptual foundation for interpretation of experimental data.
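The embedded Monte Carlo step can be pictured as follows: at each molecular dynamics time step, every candidate bond is tested for reaction with a probability derived from an Arrhenius rate at the local temperature. The sketch below is a generic illustration of that probabilistic acceptance rule; the pre-exponential factor and activation energy are placeholders rather than the pathways used in the Account.

    import numpy as np

    KB = 8.617e-5  # Boltzmann constant, eV/K

    def reaction_events(n_bonds, T, dt, A=1.0e13, Ea=2.0, rng=None):
        """Return a boolean mask of bonds that react during one MD step.
        A (1/s) and Ea (eV) are placeholder Arrhenius parameters."""
        if rng is None:
            rng = np.random.default_rng()
        k = A * np.exp(-Ea / (KB * T))        # first-order rate constant
        p = 1.0 - np.exp(-k * dt)             # probability of reacting within dt
        return rng.random(n_bonds) < p

    # e.g. 1e5 candidate bonds, local temperature 1500 K, 1 fs MD step
    mask = reaction_events(100_000, T=1500.0, dt=1e-15)
    print(mask.sum(), "bond scissions this step")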
Sun, Rui; Ismail, Tamer M; Ren, Xiaohan; Abd El-Salam, M
2015-05-01
In order to reveal the features of the combustion process in the porous bed of a waste incinerator, a two-dimensional unsteady-state model and an experimental study were employed to investigate the combustion process of municipal solid waste (MSW) in a fixed bed reactor. Conservation equations for the waste bed were implemented to describe the incineration process. The gas-phase turbulence was modeled using the k-ε turbulence model and the particle phase was modeled using the kinetic theory of granular flow. The rates of moisture evaporation, devolatilization, and char burnout were calculated according to the waste properties. The simulation results were then compared with experimental data for different moisture contents of MSW, which shows that the incineration process of waste in the fixed bed is reasonably simulated. The simulated solid temperature, gas species and process rates in the bed are in accordance with the experimental data. Due to the high moisture content of the fuel, moisture evaporation consumes a vast amount of heat, and the evaporation takes up most of the combustion time (about 2/3 of the whole combustion process). The whole bed combustion process slows down greatly as the MSW moisture content increases. The experimental and simulation results provide direction for the design and optimization of fixed beds for MSW. Copyright © 2015 Elsevier Ltd. All rights reserved.
A framework for service enterprise workflow simulation with multi-agents cooperation
NASA Astrophysics Data System (ADS)
Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun
2013-11-01
Process dynamic modelling for service business is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used to analyse service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. The social rationality of agents is introduced into the proposed framework. By adopting rationality as a social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
Computer Simulation of Biological Ageing-A Bird's-Eye View
NASA Astrophysics Data System (ADS)
Dasgupta, Subinay
For living organisms, the process of ageing consists of acquiring good and bad genetic mutations, which respectively increase and decrease the survival probability. When a child is born, the hereditary mutations of the parents are transmitted to the offspring. Such stochastic processes seem to be amenable to computer simulation. Over the last 10 years, simulation studies of this sort have been carried out in different parts of the globe to explain ageing. The objective of these studies has been to explain demographic data and natural phenomena such as nature's preference for sexual reproduction over asexual reproduction. Here we briefly discuss the principles and the results of these works, with an emphasis on the so-called Penna bit-string model.
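A minimal sketch of the asexual Penna bit-string model referred to above, under common textbook assumptions: bit i of a genome marks a deleterious mutation that becomes active at age i, an individual dies once T mutations are active, a Verhulst factor limits the population to N_MAX, and each offspring inherits the parent genome with M new mutations. All parameter values are illustrative.

    import random

    GENOME_BITS, T, R, B, M = 64, 3, 8, 1, 1   # illustrative parameters
    N_MAX = 10_000

    def mutate(genome):
        """Copy the parent genome and set M random deleterious-mutation bits."""
        for _ in range(M):
            genome |= 1 << random.randrange(GENOME_BITS)
        return genome

    def step(population):
        """One Penna time step: ageing, deaths, Verhulst factor, births."""
        survivors, births = [], []
        for age, genome in population:
            age += 1
            active = bin(genome & ((1 << age) - 1)).count("1")
            if active >= T or random.random() < len(population) / N_MAX:
                continue                      # death by mutations or crowding
            survivors.append((age, genome))
            if age >= R:                      # reproduction above age R
                births += [(0, mutate(genome)) for _ in range(B)]
        return survivors + births

    pop = [(0, 0)] * 1000                     # initial mutation-free cohort
    for year in range(300):
        pop = step(pop)
    print(len(pop), "individuals after 300 steps")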
J. A. Mardini; A. S. Lavine; V. K. Dhir
1996-01-01
An experimental and analytical study of heat and mass transfer in wooden dowels during a simulated fire is presented in this paper. The goal of this study is to understand the processes of heat and mass transfer in wood during wildland fires. A mathematical model is developed to describe the processes of heating, drying and pyrolysis of wood until ignition...
Simulation Games: Practical References, Potential Use, Selected Bibliography.
ERIC Educational Resources Information Center
Kidder, Steven J.
Several recently published books on simulation and games are briefly discussed. Selected research studies and demonstration projects are examined to show the potential of simulation and gaming for teaching and training and for the study of social and psychological processes. The bibliography lists 113 publications which should lead the reader to…
An effective online data monitoring and saving strategy for large-scale climate simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin
Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.
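One simple way to picture such an online strategy, assuming the goal is reduced to keeping only the most extreme values per grid cell under a fixed memory budget (an illustrative simplification, not the authors' algorithm), is a bounded min-heap maintained as each simulation time step streams past:

    import heapq
    import random

    def stream_extremes(time_steps, budget=10):
        """Keep the `budget` largest values seen at each grid cell.
        `time_steps` yields dicts mapping cell id -> value for one step."""
        kept = {}                                   # cell -> min-heap of (value, t)
        for t, snapshot in enumerate(time_steps):
            for cell, value in snapshot.items():
                heap = kept.setdefault(cell, [])
                if len(heap) < budget:
                    heapq.heappush(heap, (value, t))
                elif value > heap[0][0]:            # more extreme than current minimum
                    heapq.heapreplace(heap, (value, t))
        return kept

    # Illustrative stream: 1000 time steps over 3 grid cells with synthetic values
    steps = ({c: random.gauss(0, 1) for c in range(3)} for _ in range(1000))
    print({c: sorted(v, reverse=True)[:3] for c, v in stream_extremes(steps).items()})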
NASA Astrophysics Data System (ADS)
Mani, N. J.; Waliser, D. E.; Jiang, X.
2014-12-01
While the boreal summer monsoon intraseasonal variability (BSISV) exerts a profound influence on the south Asian monsoon, the capability of present-day dynamical models in simulating and predicting the BSISV is still limited. The global model evaluation project on the vertical structure and diabatic processes of the Madden-Julian Oscillation (MJO) is a joint venture, coordinated by the Working Group on Numerical Experimentation (WGNE) MJO Task Force and the GEWEX Atmospheric System Study (GASS) program, for assessing model deficiencies in simulating intraseasonal variability and for improving our understanding of the underlying processes. In this study, the simulation of the northward-propagating BSISV is investigated in 26 climate models with special focus on the vertical diabatic heating structure and clouds. Following parallel lines of inquiry to those the MJO Task Force has pursued for the eastward-propagating MJO, we utilize previously proposed and newly developed model performance metrics and process diagnostics and apply them to the global climate model simulations of the BSISV.
Biodiesel Production using Heterogeneous Catalyst in CSTR: Sensitivity Analysis and Optimization
NASA Astrophysics Data System (ADS)
Keong, L. S.; Patle, D. S.; Shukor, S. R.; Ahmad, Z.
2016-03-01
Biodiesel as a renewable fuel has emerged as a potential replacement for petroleum-based diesel. Heterogeneous catalysts have become the focus of research in biodiesel production, with the intention of overcoming the problems associated with homogeneously catalyzed processes. The simulation of heterogeneously catalyzed biodiesel production has not been thoroughly studied. Hence, a simulation of carbon-based solid acid catalyzed biodiesel production from waste oil with high FFA content (50 wt%) was developed in the present work to study the feasibility and potential of the simulated process. The simulated process produces biodiesel through simultaneous transesterification and esterification with consideration of the reaction kinetics. The developed simulation is feasible and capable of producing 2.81 kmol/hr of FAME meeting the international standard (EN 14214). Yields of 68.61% and 97.19% are achieved for transesterification and esterification, respectively. Sensitivity analyses of the FFA composition in the waste oil, methanol to oil ratio, and reactor pressure and temperature with respect to the FAME yield from both reactions were carried out. Optimization of the reactor temperature was carried out to maximize the FAME product.
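For context, the two simultaneous reactions named above follow the standard overall stoichiometry (TG = triglyceride, FFA = free fatty acid, FAME = fatty acid methyl ester); this is textbook chemistry rather than detail taken from the paper:

    \mathrm{TG} + 3\,\mathrm{CH_3OH} \;\longrightarrow\; 3\,\mathrm{FAME} + \mathrm{glycerol} \qquad \text{(transesterification)}
    \mathrm{FFA} + \mathrm{CH_3OH} \;\longrightarrow\; \mathrm{FAME} + \mathrm{H_2O} \qquad \text{(esterification)}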
Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh
2010-02-01
Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires knowledge of the various process reactions and corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model and is often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species. Increasing the number of species and user-defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely simulate the experimental data. The results from this study may help in the application of this model to full-scale landfill operation.
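Biomass growth in such models is commonly described by Monod-type kinetics; as a generic reference (not the specific rate expressions implemented in BIOKEMOD-3P), the coupled biomass-substrate equations read

    \frac{dX}{dt} = \mu_{\max}\,\frac{S}{K_S + S}\,X - k_d X,
    \qquad
    \frac{dS}{dt} = -\frac{1}{Y}\,\mu_{\max}\,\frac{S}{K_S + S}\,X

where X is the biomass concentration, S the substrate concentration, μ_max the maximum specific growth rate, K_S the half-saturation constant, k_d the decay rate and Y the yield coefficient.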
Gao, Shan; Liao, Quanwen; Liu, Wei; Liu, Zhichun
2017-10-31
Recently, numerous studies have focused on the wetting process of droplets on various surfaces at the microscale. However, there are a limited number of studies on the mechanism of condensation on patterned surfaces. The present study investigated the dynamic wetting behavior of water droplets and the condensation process of water molecules on substrates with different pillar structure parameters through molecular dynamics simulation. The dynamic wetting results indicated that droplets exhibit the Cassie state, PW state, and Wenzel state successively on textured surfaces with decreasing solid fraction. The droplets possess a higher static contact angle and a smaller spreading exponent on textured surfaces than on smooth surfaces. The condensation processes, including the formation, growth, and coalescence of a nanodroplet, are simulated and quantitatively recorded, which is difficult to achieve in experiments. In addition, a wetting transition and a dewetting transition were observed and analyzed during condensation on textured surfaces. Combining these simulation results with previous theoretical and experimental studies will guide us to understand the nature and mechanism of condensation more clearly.
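The Wenzel and Cassie states mentioned above are conventionally described by the classical apparent-contact-angle relations, quoted here as standard background rather than as results of the simulations, where θ_Y is the Young contact angle on the smooth surface, r the roughness ratio and f the solid fraction of the pillar tops:

    \cos\theta_{\mathrm{W}} = r\,\cos\theta_{\mathrm{Y}},
    \qquad
    \cos\theta_{\mathrm{CB}} = f\,(\cos\theta_{\mathrm{Y}} + 1) - 1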
Model-Based Verification and Validation of the SMAP Uplink Processes
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun
2013-01-01
This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.
NASA Astrophysics Data System (ADS)
Ghavami, Seyed Morsal; Taleai, Mohammad
2017-04-01
Most spatial problems are multi-actor, multi-issue and multi-phase in nature. In addition to their intrinsic complexity, spatial problems usually involve groups of actors from different organizational and cognitive backgrounds, all of whom participate in a social structure to resolve or reduce the complexity of a given problem. Hence, it is important to study and evaluate which different aspects influence the spatial problem resolution process. Recently, multi-agent systems, consisting of groups of separate agent entities all interacting with each other, have been put forward as appropriate tools for studying and resolving such problems. In this study, in order to generate a better understanding of the group decision-making process for spatial problems, a conceptual multi-agent-based framework is used that represents and specifies all the concepts and entities needed to aid group decision making, based on a simulation of the group decision-making process as well as the relationships among the different concepts involved. The study uses five main influencing entities as concepts in the simulation process: spatial influence, individual-level influence, group-level influence, negotiation influence and group performance measures. Further, it explains the relationships among the different concepts in a descriptive rather than explanatory manner. To illustrate the proposed framework, the approval process for an urban land use master plan in Zanjan (a provincial capital in Iran) is simulated using MAS, with the results highlighting the effectiveness of applying an MAS-based framework when studying the group decision-making process used to resolve spatial problems.
NASA Astrophysics Data System (ADS)
Rodrigo Comino, Jesús; Iserloh, Thomas; Morvan, Xavier; Malam Issa, Oumarou; Naisse, Christophe; Keesstra, Saskia; Cerdà, Artemi; Prosdocimi, Massimo; Arnáez, José; Lasanta, Teodoro; Concepción Ramos, María; José Marqués, María; Ruiz Colmenero, Marta; Bienes, Ramón; Damián Ruiz Sinoga, José; Seeger, Manuel; Ries, Johannes B.
2016-04-01
Small portable rainfall simulators are considered a useful tool to analyze soil erosion processes in cultivated lands. European research groups in Spain (Valencia, Málaga, Lleida, Madrid and La Rioja), France (Reims) and Germany (Trier) have used different rainfall simulators (varying in drop size distribution and fall velocities, kinetic energy, plot forms and sizes, and field of application) to study soil loss, surface flow, runoff and infiltration coefficients in different experimental plots (Valencia, Montes de Málaga, Penedès, Campo Real and La Rioja in Spain, Champagne in France and the Mosel-Ruwer valley in Germany). The measurements and experiments developed by these research teams give an overview of the variety of methodologies using rainfall simulation to study the problem of soil erosion and to describe erosion features in different climatic environments, management practices and soil types. The aims of this study are: i) to investigate where, how and why researchers from different wine-growing regions applied rainfall simulations, with successful results, as a tool to measure soil erosion processes; ii) to make a qualitative comparison of the general soil erosion processes in European terroirs; iii) to demonstrate the importance of developing a standard method for studying soil erosion processes in vineyards using rainfall simulators; and iv) to analyze the key factors that should be taken into account to carry out rainfall simulations. In all cases, the rainfall simulations allowed determination of the infiltration capacity and the susceptibility of the soil to be detached and to deliver sediment loads to runoff. Despite using small plots, the experiments were useful to analyze the influence of soil cover in reducing soil erosion and to make comparisons between different locations or the influence of different soil characteristics.
Huang, J; Loeffler, M; Muehle, U; Moeller, W; Mulders, J J L; Kwakman, L F Tz; Van Dorp, W F; Zschech, E
2018-01-01
A Ga focused ion beam (FIB) is often used in transmission electron microscopy (TEM) sample preparation. In the case of a crystalline Si sample, an amorphous near-surface layer is formed by the FIB process. In order to optimize the FIB recipe by minimizing the amorphization, it is important to predict the amorphous layer thickness from simulation. Molecular Dynamics (MD) simulation has been used to describe the amorphization; however, it is limited by computational power for a realistic FIB process simulation. On the other hand, Binary Collision Approximation (BCA) simulation is able to, and has been used to, simulate the ion-solid interaction process at a realistic scale. In this study, a Point Defect Density approach is introduced into a dynamic BCA simulation, considering dynamic ion-solid interactions. We used this method to predict the c-Si amorphization caused by FIB milling of Si. To validate the method, dedicated TEM studies were performed. They show that the amorphous layer thickness predicted by the numerical simulation is consistent with the experimental data. In summary, the thickness of the near-surface Si amorphization layer caused by FIB milling can be well predicted using the Point Defect Density approach within the dynamic BCA model. Copyright © 2017 Elsevier B.V. All rights reserved.
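The essence of a point-defect-density criterion can be illustrated with a toy 1D model: each incident ion deposits a depth distribution of defects, densities accumulate with dose, and the material is counted as amorphous wherever the density exceeds a critical value. The Gaussian depth profile, defect yield per ion, critical density and dose below are placeholders for illustration only; the study itself derives the defect distributions from the dynamic BCA simulation.

    import numpy as np

    def amorphous_thickness(dose, depth_nm=30.0, bins=300, mean_nm=5.0,
                            sigma_nm=3.0, defects_per_ion=100.0, n_crit=5e22):
        """Toy estimate of the amorphous layer thickness in nm.
        dose in ions/cm^2; profile, yield and n_crit (defects/cm^3) are placeholders."""
        z = np.linspace(0.0, depth_nm, bins)
        dz_cm = (depth_nm / bins) * 1e-7                  # bin width in cm
        profile = np.exp(-0.5 * ((z - mean_nm) / sigma_nm) ** 2)
        profile /= profile.sum() * dz_cm                  # normalised depth profile, 1/cm
        density = dose * defects_per_ion * profile        # defects/cm^3 vs depth
        return np.count_nonzero(density >= n_crit) * depth_nm / bins

    print(amorphous_thickness(dose=1e16), "nm of amorphised Si (toy numbers)")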
NASA Astrophysics Data System (ADS)
Wichmann, Volker
2017-09-01
The Gravitational Process Path (GPP) model can be used to simulate the process path and run-out area of gravitational processes based on a digital terrain model (DTM). The conceptual model combines several components (process path, run-out length, sink filling and material deposition) to simulate the movement of a mass point from an initiation site to the deposition area. For each component several modeling approaches are provided, which makes the tool configurable for different processes such as rockfall, debris flows or snow avalanches. The tool can be applied to regional-scale studies such as natural hazard susceptibility mapping but also contains components for scenario-based modeling of single events. Both the modeling approaches and precursor implementations of the tool have proven their applicability in numerous studies, also including geomorphological research questions such as the delineation of sediment cascades or the study of process connectivity. This is the first open-source implementation, completely re-written, extended and improved in many ways. The tool has been committed to the main repository of the System for Automated Geoscientific Analyses (SAGA) and thus will be available with every SAGA release.
NASA Astrophysics Data System (ADS)
Yadav, Basant; Ch, Sudheer; Mathur, Shashi; Adamowski, Jan
2016-12-01
In-situ bioremediation is the most common groundwater remediation procedure used for treating organically contaminated sites. A simulation-optimization approach, which incorporates a simulation model for groundwater flow and transport processes within an optimization program, could help engineers in designing a remediation system that best satisfies management objectives as well as regulatory constraints. In-situ bioremediation is a highly complex, non-linear process, and the modelling of such a complex system requires significant computational effort. Soft computing techniques have a flexible mathematical structure which can generalize complex nonlinear processes. In in-situ bioremediation management, a physically-based model is used for the simulation and the simulated data are utilized by the optimization model to optimize the remediation cost. Repeatedly calling the simulator to check the constraints is an extremely tedious and time-consuming process, and thus there is a need for a surrogate simulator that can reduce the computational burden. This study presents a simulation-optimization approach to achieve an accurate and cost-effective in-situ bioremediation system design for groundwater contaminated with BTEX (benzene, toluene, ethylbenzene, and xylenes) compounds. In this study, the Extreme Learning Machine (ELM) is used as a proxy simulator to replace BIOPLUME III for the simulation. The selection of ELM is based on a comparative analysis with Artificial Neural Network (ANN) and Support Vector Machine (SVM) models, as they were successfully used in previous studies of in-situ bioremediation system design. Further, a single-objective optimization problem is solved by a coupled Extreme Learning Machine (ELM)-Particle Swarm Optimization (PSO) technique to achieve the minimum cost for the in-situ bioremediation system design. The results indicate that ELM is a faster and more accurate proxy simulator than ANN and SVM. The total cost obtained by the ELM-PSO approach is held to a minimum while successfully satisfying all the regulatory constraints of the contaminated site.
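An ELM proxy of the kind described is quick to sketch: the hidden-layer weights are drawn at random and only the output weights are fitted, by a regularized least-squares (pseudo-inverse) solve, which is what makes the surrogate so fast to train. The training arrays below stand in for simulator input/output pairs and are purely illustrative.

    import numpy as np

    class ELM:
        """Minimal single-hidden-layer Extreme Learning Machine."""
        def __init__(self, n_hidden=50, seed=0):
            self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

        def _hidden(self, X):
            return np.tanh(X @ self.W + self.b)        # random feature map

        def fit(self, X, y, ridge=1e-6):
            self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
            self.b = self.rng.normal(size=self.n_hidden)
            H = self._hidden(X)
            # regularized least squares for the output weights only
            self.beta = np.linalg.solve(H.T @ H + ridge * np.eye(self.n_hidden), H.T @ y)
            return self

        def predict(self, X):
            return self._hidden(X) @ self.beta

    # Illustrative stand-in for (design variables -> simulated contaminant level) pairs
    X = np.random.rand(200, 4)
    y = np.sin(X.sum(axis=1)) + 0.01 * np.random.randn(200)
    proxy = ELM(n_hidden=80).fit(X, y)
    print(np.abs(proxy.predict(X) - y).mean())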
NASA Astrophysics Data System (ADS)
Li, Zhanjie; Yu, Jingshan; Xu, Xinyi; Sun, Wenchao; Pang, Bo; Yue, Jiajia
2018-06-01
Hydrological models are important and effective tools for detecting complex hydrological processes. Different models have different strengths when capturing the various aspects of hydrological processes. Relying on a single model usually leads to simulation uncertainties. Ensemble approaches, based on multi-model hydrological simulations, can improve application performance over single models. In this study, the upper Yalongjiang River Basin was selected for a case study. Three commonly used hydrological models (SWAT, VIC, and BTOPMC) were selected and used for independent simulations with the same input and initial values. Then, the BP neural network method was employed to combine the results from the three models. The results show that the accuracy of BP ensemble simulation is better than that of the single models.
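A back-propagation combiner of the sort described can be sketched with a small multilayer perceptron that learns to weight and correct the three model outputs against observations; the discharge arrays below are hypothetical placeholders, and the Nash-Sutcliffe efficiency is shown only as one common skill score for such comparisons.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Hypothetical daily discharge series: columns = SWAT, VIC, BTOPMC simulations
    sims = np.random.rand(1000, 3) * 100                         # placeholder outputs (m^3/s)
    observed = sims.mean(axis=1) + np.random.randn(1000) * 5     # placeholder observations

    # Back-propagation network that combines the three single-model simulations
    combiner = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
    combiner.fit(sims[:700], observed[:700])                     # calibration period
    ensemble = combiner.predict(sims[700:])                      # validation period

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency, a common skill score for discharge."""
        return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    print(nse(observed[700:], ensemble))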
A chemical EOR benchmark study of different reservoir simulators
NASA Astrophysics Data System (ADS)
Goudarzi, Ali; Delshad, Mojdeh; Sepehrnoori, Kamy
2016-09-01
Interest in chemical EOR processes has intensified in recent years due to the advancements in chemical formulations and injection techniques. Polymer (P), surfactant/polymer (SP), and alkaline/surfactant/polymer (ASP) injection are techniques for improving sweep and displacement efficiencies with the aim of improving oil production in both secondary and tertiary floods. There has been great interest in chemical flooding recently for different challenging situations. These include high temperature reservoirs, formations with extreme salinity and hardness, naturally fractured carbonates, and sandstone reservoirs with heavy and viscous crude oils. More oil reservoirs are reaching maturity where secondary polymer floods and tertiary surfactant methods have become increasingly important. This significance has added to the industry's interest in using reservoir simulators as tools for reservoir evaluation and management to minimize costs and increase process efficiency. Reservoir simulators with special features are needed to represent the coupled chemical and physical processes present in chemical EOR. The simulators first need to be validated against well-controlled lab- and pilot-scale experiments to reliably predict full-field implementations. The available laboratory-scale data include 1) phase behavior and rheological data; and 2) results of secondary and tertiary coreflood experiments for P, SP, and ASP floods under reservoir conditions, i.e. chemical retentions, pressure drop, and oil recovery. Data collected from corefloods are used as benchmark tests comparing numerical reservoir simulators with chemical EOR modeling capabilities, such as STARS of CMG, ECLIPSE-100 of Schlumberger, and REVEAL of Petroleum Experts. The research UTCHEM simulator from The University of Texas at Austin is also included since it has been the benchmark for chemical flooding simulation for over 25 years. The results of this benchmark comparison will be utilized to improve chemical design for field-scale studies using commercial simulators. The benchmark tests illustrate the potential of commercial simulators for chemical flooding projects and provide a comprehensive table of the strengths and limitations of each simulator for a given chemical EOR process. Mechanistic simulations of chemical EOR processes will provide predictive capability and can aid in the optimization of field injection projects. The objective of this paper is not to compare computational efficiency and solution algorithms; it focuses only on the process modeling comparison.
The QuakeSim Project: Numerical Simulations for Active Tectonic Processes
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry
2004-01-01
In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.
Simulation and flavor compound analysis of dealcoholized beer via one-step vacuum distillation.
Andrés-Iglesias, Cristina; García-Serna, Juan; Montero, Olimpio; Blanco, Carlos A
2015-10-01
The coupled use of a laboratory-scale vacuum distillation process to produce alcohol-free beer and Aspen HYSYS simulation software was studied to define the chemical changes in the aroma profiles of two different lager beers during the dealcoholization process. In the lab-scale process, two different operating conditions were chosen to dealcoholize the beer samples: 102 mbar at 50°C and 200 mbar at 67°C. Samples taken at different steps of the process were analyzed by HS-SPME-GC-MS, focusing on the concentration of 7 flavor compounds (5 alcohols and 2 esters). For the simulation, the EoS parameters of the Wilson-2 property package were adjusted to the experimental data and one more pressure was tested (60 mbar). Simulation methods represent a viable alternative for predicting the volatile compound composition of the final dealcoholized beer. Copyright © 2015 Elsevier Ltd. All rights reserved.
Kim, Youngmi; Mosier, Nathan; Ladisch, Michael R
2008-08-01
Distillers' grains (DG), a co-product of the dry grind ethanol process, are an excellent source of supplemental proteins in livestock feed. Studies have shown that, due to their high polymeric sugar content and ease of hydrolysis, distillers' grains have potential as an additional source of fermentable sugars for ethanol fermentation. The benefit of processing the distillers' grains to extract fermentable sugars lies in an increased ethanol yield without significant modification of the current dry grind technology. Three different potential configurations of process alternatives, in which pretreated and hydrolyzed distillers' grains are recycled for an enhanced overall ethanol yield, are proposed and discussed in this paper based on the liquid hot water (LHW) pretreatment of distillers' grains. Possible limitations of each proposed process are also discussed. This paper presents a compositional analysis of distillers' grains, as well as a simulation of the modified dry grind processes with recycle of distillers' grains. Simulated material balances for the modified dry grind processes are established based on the base-case assumptions. These balances are compared to the conventional dry grind process in terms of ethanol yield, composition of the co-products, and accumulation of fermentation inhibitors. Results show that a 14% higher ethanol yield is achievable by processing and hydrolyzing the distillers' grains for additional fermentable sugars, as compared to the conventional dry grind process. Accumulation of fermentation by-products and inhibitory components in the proposed process is predicted to be 2-5 times higher than in the conventional dry grind process. The impact of fermentation inhibitors is reviewed and discussed. The final eDDGS (enhanced dried distillers' grains) from the modified processes has a 30-40% greater protein content per unit mass than DDGS, and its potential as a value-added product is also analyzed. While the case studies used to illustrate the process simulation are based on LHW-pretreated DG, the process simulation itself provides a framework for evaluating the impact of other pretreatments.
When teams shift among processes: insights from simulation and optimization.
Kennedy, Deanna M; McComb, Sara A
2014-09-01
This article introduces process shifts to study the temporal interplay among transition and action processes espoused in the recurring phase model proposed by Marks, Mathieu, and Zaccaro (2001). Process shifts are those points in time when teams complete a focal process and change to another process. By using team communication patterns to measure process shifts, this research explores (a) when teams shift among different transition processes and initiate action processes and (b) the potential of different interventions, such as communication directives, to manipulate process shift timing and order and, ultimately, team performance. Virtual experiments are employed to compare data from observed laboratory teams not receiving interventions, simulated teams receiving interventions, and optimal simulated teams generated using genetic algorithm procedures. Our results offer insights about the potential for different interventions to affect team performance. Moreover, certain interventions may promote discussions about key issues (e.g., tactical strategies) and facilitate shifting among transition processes in a manner that emulates optimal simulated teams' communication patterns. Thus, we contribute to theory regarding team processes in 2 important ways. First, we present process shifts as a way to explore the timing of when teams shift from transition to action processes. Second, we use virtual experimentation to identify those interventions with the greatest potential to affect performance by changing when teams shift among processes. Additionally, we employ computational methods including neural networks, simulation, and optimization, thereby demonstrating their applicability in conducting team research. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Process simulations for manufacturing of thick composites
NASA Astrophysics Data System (ADS)
Kempner, Evan A.
The availability of manufacturing simulations for composites can significantly reduce the costs associated with process development. Simulations provide a tool for evaluating the effect of processing conditions on the quality of parts produced without requiring numerous experiments. This is especially significant in parts that have troublesome features such as large thickness. The development of simulations for thick-walled composites has been approached by examining the mechanics of resin flow and fiber deformation during processing, applying these analyses to develop simulations, and evaluating the simulations with experimental results. A unified analysis is developed to describe the three-dimensional resin flow and fiber preform deformation during processing regardless of the manufacturing process used. It is shown how the generic governing equations in the unified analysis can be applied to autoclave molding, compression molding, pultrusion, filament winding, and resin transfer molding. A comparison is provided with earlier models derived individually for these processes. The equations described for autoclave curing were used to produce a one-dimensional cure simulation for autoclave curing of thick composites. The simulation consists of an analysis of heat transfer and resin flow in the composite as well as in the bleeder plies used to absorb resin removed from the part. Experiments were performed in a hot press to approximate curing in an autoclave. Graphite/epoxy laminates of 3 cm and 5 cm thickness were cured while monitoring the temperatures at several points inside the laminate as well as the thickness. The simulation predicted temperatures fairly closely, but difficulties were encountered in the correlation of thickness results. This simulation was also used to study the effects of prepreg aging on the processing of thick composites. An investigation was also performed on filament winding with prepreg tow. Cylinders of approximately 12 mm thickness were wound, with pressure gages at the mandrel-composite interface. The cylinders were hoop wound with tensions ranging from 13 to 34 N. An analytical model was developed to calculate the change in stress due to relaxation during winding. Although compressive circumferential stresses occurred throughout each of the cylinders, the magnitude was fairly low.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chuang, Claire Y.; Zepeda-Ruiz, Luis A.; Han, Sang M.
2015-06-01
Molecular dynamics simulations were used to study Ge island nucleation and growth on amorphous SiO2 substrates. This process is relevant in selective epitaxial growth of Ge on Si, for which SiO2 is often used as a template mask. The islanding process was studied over a wide range of temperatures and fluxes, using a recently proposed empirical potential model for the Si–SiO2–Ge system. The simulations provide an excellent quantitative picture of the Ge islanding and compare well with detailed experimental measurements. These quantitative comparisons were enabled by an analytical rate model as a bridge between simulations and experiments, despite the fact that deposition fluxes accessible in simulations and experiments are necessarily different by many orders of magnitude. In particular, the simulations led to accurate predictions of the critical island size and the scaling of island density as a function of temperature. Lastly, the overall approach used here should be useful not just for future studies in this particular system, but also for molecular simulations of deposition in other materials.
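The scaling referred to above is usually expressed with the classical result of submonolayer nucleation theory, quoted here as standard background rather than the paper's specific rate model: for complete condensation and a critical island size i*, the saturation island density N depends on the deposition flux F and the adatom diffusion coefficient D (itself Arrhenius in temperature) as

    N \;\propto\; \left(\frac{F}{D}\right)^{\chi},
    \qquad
    \chi = \frac{i^{*}}{i^{*} + 2},
    \qquad
    D = D_0 \exp\!\left(-\frac{E_d}{k_B T}\right)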
Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software
NASA Astrophysics Data System (ADS)
Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.
2017-09-01
This paper demonstrates a case study on simulation, modelling and analysis for the X1 motorcycle model. In this research, a motorcycle assembly plant was selected as the main place of the research study. Simulation techniques using the Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and find out the significant impact on the overall performance of the system for future improvement. The validation process started with the identification of the assembly line layout. All components are evaluated to validate whether the data are significant for future improvement. Machine and labor statistics are among the parameters that were evaluated for process improvement. The average total cycle time for given workstations is used as the criterion for comparison of possible variants. From the simulation process, the data used are appropriate and meet the criteria for two-sided assembly line problems.
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data, the process failure mode and effects analysis (FMEA) of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation into the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
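Two quantities underpin the quantitative assessment described: the coefficient of variation of the per-tablet coating mass (the uniformity measure) and, in a standard FMEA, the risk priority number formed from severity, occurrence and detectability ratings. Both are given here in their common textbook forms; the exact scoring scheme of the study is not detailed in the abstract.

    \mathrm{CV} = \frac{\sigma_{m}}{\bar{m}} \times 100\%,
    \qquad
    \mathrm{RPN} = S \times O \times D

Here σ_m and m̄ are the standard deviation and mean of the coating mass per tablet, and S, O, D are the severity, occurrence and detectability scores of a failure mode.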
3D Simulation Modeling of the Tooth Wear Process.
Dai, Ning; Hu, Jian; Liu, Hao
2015-01-01
Severe tooth wear is the most common non-carious dental disease, and it can seriously affect oral health. Studying the tooth wear process is time-consuming and difficult, and technological tools are frequently lacking. This paper presents a novel method of digital simulation modeling that represents a new way to study tooth wear. First, a feature extraction algorithm is used to obtain anatomical feature points of the tooth without attrition. Second, after the alignment of non-attrition areas, the initial homogeneous surface is generated by means of an RBF (Radial Basis Function) implicit surface and then deformed to the final homogeneous surface by the contraction and bounding algorithm. Finally, bilinear interpolation based on Laplacian coordinates between the tooth with attrition and the tooth without attrition is used to inversely reconstruct the sequence of changes in 3D tooth morphology during the gradual tooth wear process. This method can also be used to generate a process simulation of nonlinear tooth wear by fitting an attrition curve to statistical data on the attrition index in a certain region. The effectiveness and efficiency of the attrition simulation algorithm are verified through simulation experiments.
A simulation study on garment manufacturing process
NASA Astrophysics Data System (ADS)
Liong, Choong-Yeun; Rahim, Nur Azreen Abdul
2015-02-01
The garment industry is an important industry that continues to evolve in order to meet consumers' high demands. Therefore, elements of innovation and improvement are important. In this work, research studies were conducted at a local company in order to model the sewing process of clothes manufacturing by using simulation modeling. Clothes manufacturing at the company involves 14 main processes: connecting the pattern, center sewing and side neating, pocket sewing, backside sewing, attaching the front and back, sleeve preparation, attaching the sleeves and overlocking, collar preparation, collar sewing, bottom-edge sewing, buttonhole sewing, removing excess thread, marking buttons, and button cross sewing. Those fourteen processes are operated by six tailors only, with the last four processes performed by a single tailor. Data collection was conducted by on-site observation, and the probability distribution of the processing time for each of the processes was determined using @Risk's BestFit. A simulation model was then developed using the Arena software based on the data collected. An animated simulation model is developed in order to facilitate understanding and to verify that the model represents the actual system. With such a model, what-if analyses and different operating scenarios can be experimented with virtually. The animation and improvement models will be presented in further work.
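As an illustration of the same discrete-event idea outside Arena, the sketch below models a few serial sewing stations with SimPy; the station names, tailor assignments and triangular processing times are hypothetical placeholders, not the distributions fitted in the study.

    import random
    import simpy

    # Hypothetical stations: (name, (min, mode, max) processing time in minutes)
    STATIONS = [("connect pattern", (2, 3, 5)),
                ("center sewing",   (3, 4, 6)),
                ("attach sleeves",  (4, 5, 8)),
                ("button work",     (5, 7, 10))]

    def garment(env, name, tailors):
        start = env.now
        for (station, (lo, mode, hi)), tailor in zip(STATIONS, tailors):
            with tailor.request() as req:          # wait for the station's tailor
                yield req
                yield env.timeout(random.triangular(lo, hi, mode))
        print(f"{name} finished, cycle time {env.now - start:.1f} min")

    def arrivals(env, tailors):
        for i in range(20):                        # 20 garments, one every 4 minutes
            env.process(garment(env, f"garment-{i}", tailors))
            yield env.timeout(4)

    env = simpy.Environment()
    tailors = [simpy.Resource(env, capacity=1) for _ in STATIONS]  # one tailor per station
    env.process(arrivals(env, tailors))
    env.run(until=300)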
Simulating tracer transport in variably saturated soils and shallow groundwater
USDA-ARS?s Scientific Manuscript database
The objective of this study was to develop a realistic model to simulate the complex processes of flow and tracer transport in variably saturated soils and to compare simulation results with the detailed monitoring observations. The USDA-ARS OPE3 field site was selected for the case study due to ava...
NASA Astrophysics Data System (ADS)
Lindstrom, Erik Vilhelm Mathias
Gasification of black liquor could drastically increase the flexibility and improve the profit potential of a mature industry. The completed work focused on research into the economics and benefits of its implementation, utilizing laboratory pulping experiments and process simulation. The separation of sodium and sulfur achieved through gasification of recovered black liquor can be utilized in processes such as modified continuous cooking, split sulfidity and green liquor pretreatment pulping, and polysulfide-anthraquinone pulping, to improve pulp yield and properties. Laboratory pulping protocols have been developed for these modified pulping technologies and different process options evaluated. The process simulation work around BLG has led to the development of a WinGEMS module for the low-temperature MTCI steam reforming process, and case studies comparing a simulated conventional kraft process to different process options built around the implementation of a BLG unit operation in the kraft recovery cycle. Pulp yield increases of 1-3 percentage points with improved product quality, and the potential for capital and operating cost savings relative to the conventional kraft process, have been demonstrated. Process simulation work has shown that the net variable operating cost for a pulping process using BLGCC is highly dependent on the cost of lime kiln fuel and the selling price of green power to the grid. Under the assumptions taken in the performed case study, the BLGCC process combined with split sulfidity or PSAQ pulping operations had a net variable operating cost 2-4% greater than the kraft reference. The influence of the sales price of power to the grid is the most significant cost factor. If a sales price increase to 6 ¢/kWh for green power could be achieved, cost savings of about $40/ODtP could be realized in all investigated BLG processes. Other alternatives to improve the process economics around BLG would be to modify or eliminate the lime kiln unit operations, by utilizing high-sulfidity green liquor pretreatment or PSAQ with auto-causticization, or by converting the process to mini-sulfide sulfite-AQ.
NASA Technical Reports Server (NTRS)
Nusinov, M. D.; Kochnev, V. A.; Chernyak, Y. B.; Kuznetsov, A. V.; Kosolapov, A. I.; Yakovlev, O. I.
1974-01-01
Study of evaporation, condensation and sputtering on the moon can provide information on the same processes on other planets, and reveal details of the formation of the lunar regolith. Simulation methods include vacuum evaporation, laser evaporation, and bubbling gas through melts.
Nonlinear Epigenetic Variance: Review and Simulations
ERIC Educational Resources Information Center
Kan, Kees-Jan; Ploeger, Annemie; Raijmakers, Maartje E. J.; Dolan, Conor V.; van Der Maas, Han L. J.
2010-01-01
We present a review of empirical evidence that suggests that a substantial portion of phenotypic variance is due to nonlinear (epigenetic) processes during ontogenesis. The role of such processes as a source of phenotypic variance in human behaviour genetic studies is not fully appreciated. In addition to our review, we present simulation studies…
The Effects of Questioning on Thinking Processes.
ERIC Educational Resources Information Center
Shiang, Ching-Pyng; McDaniel, Ernest
This study investigated the effects of self-generated questions and external questions on thinking processes. Thirty-three college students acted as investigators in a computer simulation of a Congressional investigation into the Pearl Harbor attack. The simulation--known as "The Attack on Pearl Harbor: Cloud of Mystery?"--presented the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com; Suprijadi; Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Jalan Ganesha 10, Bandung, 40132
Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and the comparison of the image quality resulting from simulations on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode, and on two GPUs with 384 cores and 2304 cores. In the GPU simulations, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that the simulation times on the GPU were significantly shorter than on the CPU. The simulations on the 2304-core GPU were performed about 64-114 times faster than on the CPU, while the simulations on the 384-core GPU were performed about 20-31 times faster than on a single core of the CPU. Another result shows that the optimum image quality was obtained for histories of 10^8 and above and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is essentially the same.
NASA Astrophysics Data System (ADS)
Hieu, Nguyen Huu
2017-09-01
Pervaporation is a potential process for the final step of ethanol biofuel production. In this study, a mathematical model was developed based on the resistance-in-series approach, and a simulation was carried out using the specialized simulation software COMSOL Multiphysics to describe a tubular pervaporation module with membranes for the dehydration of an ethanol solution. The membrane permeances, operating conditions, and feed conditions used in the simulation were taken from experimental data previously reported in the literature. The simulated temperature and density profiles of pure water and the ethanol-water mixture were then validated against existing published data.
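As a minimal sketch of the resistance-in-series idea (with made-up resistances and partial pressures, not the values used in the paper), the component flux can be written as a driving force divided by the sum of the transport resistances:

```python
# Minimal sketch of a resistance-in-series permeation model (illustrative values only):
# the overall flux of a component is its partial-pressure driving force divided by the
# sum of the boundary-layer, membrane, and permeate-side resistances.

def pervaporation_flux(p_feed, p_permeate, resistances):
    """Flux = driving force / total resistance (consistent units assumed)."""
    return (p_feed - p_permeate) / sum(resistances)

# Hypothetical example for water through a dehydration membrane:
water_flux = pervaporation_flux(
    p_feed=7.0e3,                        # water partial pressure on the feed side, Pa
    p_permeate=0.5e3,                    # water partial pressure on the permeate side, Pa
    resistances=[2.0e5, 8.0e5, 1.0e5],   # liquid boundary layer, membrane, support layer
)
print(f"water flux ~ {water_flux:.3e} (arbitrary consistent units for these numbers)")
```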
Simulation of the effect of inclined incident angle in DMD Maskless Lithography
NASA Astrophysics Data System (ADS)
Liang, L. W.; Zhou, J. Y.; Xiang, L. L.; Wang, B.; Wen, K. H.; Lei, L.
2017-06-01
The aim of this study is to provide a simulation method for investigating the intensity fluctuation caused by the inclined incident angle in DMD (digital micromirror device) maskless lithography. The simulation consists of eight main processes involving the simplification of the DMD aperture function and light propagation using the non-parallel angular spectrum method. These processes make it possible to co-simulate the microlens array and the DMD of the maskless lithography system in the spatial frequency domain. The simulation provides the spot shape and the illumination distribution, two parameters that are crucial in determining the exposure dose in the existing maskless lithography system.
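For orientation, a textbook angular spectrum propagation step (not the authors' eight-process pipeline, and without the inclined-incidence correction) can be sketched in a few lines of NumPy; the 405 nm wavelength, 1 µm sampling, and square aperture below are illustrative assumptions.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, distance):
    """Propagate a sampled complex field by `distance` using the angular spectrum method.

    A standard textbook implementation, shown only to illustrate the kind of propagation
    step used in the DMD lithography simulation; it is not the authors' code.
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    k = 2 * np.pi / wavelength
    # Keep propagating components only; evanescent components are suppressed.
    kz_sq = k**2 - (2 * np.pi * FX)**2 - (2 * np.pi * FY)**2
    kz = np.sqrt(np.maximum(kz_sq, 0.0))
    transfer = np.exp(1j * kz * distance) * (kz_sq > 0)
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# Example: a square aperture (a crude stand-in for a block of DMD mirrors) propagated 1 mm.
aperture = np.zeros((512, 512), dtype=complex)
aperture[224:288, 224:288] = 1.0
spot = angular_spectrum_propagate(aperture, wavelength=405e-9, dx=1e-6, distance=1e-3)
print("peak intensity after propagation:", np.abs(spot).max()**2)
```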
Discovering mental models and frames in learning of nursing ethics through simulations.
Díaz Agea, J L; Martín Robles, M R; Jiménez Rodríguez, D; Morales Moreno, I; Viedma Viedma, I; Leal Costa, C
2018-05-15
The acquisition of ethical competence is necessary in nursing. The aims of the study were to analyse students' perceptions of the process of learning ethics through simulations and to describe the underlying frames that inform the decision-making process of nursing students. A qualitative study was performed, based on the analysis of simulated experiences and debriefings of six simulated scenarios with ethical content in three different groups of fourth-year nursing students (n = 30). The simulated situations were designed to contain ethical dilemmas. The students' perspective on their learning and acquisition of ethical competence through simulations was positive. A total of 15 mental models that underlie the students' ethical decision making were identified. The students' opinions reinforce the use of simulations as a tool for learning ethics, and putting into practice the knowledge of the frames that guide ethical actions is a suitable pedagogical strategy. Copyright © 2018 Elsevier Ltd. All rights reserved.
Atmospheric Modeling And Sensor Simulation (AMASS) study
NASA Technical Reports Server (NTRS)
Parker, K. G.
1984-01-01
The capabilities of the atmospheric modeling and sensor simulation (AMASS) system were studied in order to enhance them. This system is used in processing atmospheric measurements which are utilized in the evaluation of sensor performance, conducting design-concept simulation studies, and also in the modeling of the physical and dynamical nature of atmospheric processes. The study tasks proposed in order to both enhance the AMASS system utilization and to integrate the AMASS system with other existing equipment to facilitate the analysis of data for modeling and image processing are enumerated. The following array processors were evaluated for anticipated effectiveness and/or improvements in throughput by attachment of the device to the P-e: (1) Floating Point Systems AP-120B; (2) Floating Point Systems 5000; (3) CSP, Inc. MAP-400; (4) Analogic AP500; (5) Numerix MARS-432; and (6) Star Technologies, Inc. ST-100.
NASA Astrophysics Data System (ADS)
Danáčová, Michaela; Valent, Peter; Výleta, Roman
2017-12-01
Nowadays, rainfall simulators are used by many researchers in field and laboratory experiments. The main objective of most of these experiments is to better understand the underlying runoff generation processes and to use the results in the calibration and validation of hydrological models. Many research groups have assembled their own rainfall simulators, which comply with their understanding of rainfall processes and the requirements of their experiments. Most often, the existing rainfall simulators differ mainly in the size of the irrigated area and the way they generate rain drops. They can be characterized by the accuracy with which they produce a rainfall of a given intensity, the size of the irrigated area, and the rain drop generating mechanism. Rainfall simulation experiments can provide valuable information about the genesis of surface runoff, the infiltration of water into soil, and rainfall erodibility. Apart from the impact of the physical properties of soil, its moisture, and compaction on the generation of surface runoff and the amount of eroded particles, some studies also investigate the impact of the vegetation cover of the area of interest. In this study, the rainfall simulator was used to investigate the impact of the slope gradient of the irrigated area on the amount of generated runoff and sediment yield. In order to eliminate the impact of external factors and to improve the reproducibility of the initial conditions, the experiments were conducted under laboratory conditions. The laboratory experiments were carried out using a commercial rainfall simulator connected to an external peristaltic pump. The pump maintained a constant and adjustable inflow of water, which made it possible to exceed the maximum simulated precipitation volume of 2.3 l imposed by the construction of the rainfall simulator, while maintaining constant characteristics of the simulated precipitation. In this study, a 12-minute rainfall with a constant intensity of 5 mm/min was used to irrigate a disturbed soil sample. The experiment was undertaken for several different slopes, under the condition of no vegetation cover. The results of the rainfall simulation experiment complied with the expectation of a strong relationship between the slope gradient and the amount of surface runoff generated. The experiments with higher slope gradients were characterised by larger volumes of generated surface runoff and by shorter times to its onset. Experiments with rainfall simulators in both laboratory and field conditions play an important role in better understanding runoff generation processes. The results of such small-scale experiments could be used to estimate some of the parameters of complex hydrological models, which are used to model rainfall-runoff and erosion processes at the catchment scale.
Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M
2008-06-01
Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration for time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose limiting toxicities (DLT) and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution to examine study efficiency metrics and evaluate design modifications that would improve study efficiency is presented. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times and time to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule relative to the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.
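To make the discrete event framing concrete, here is a toy Python sketch of 3+3 dose escalation with sampled accrual and observation times. It is not the SAS macro solution described above; the dose-DLT probabilities, 14-day mean accrual interval, and 28-day observation window are all hypothetical.

```python
import random

# Toy sketch of a discrete event view of 3+3 dose escalation (not the authors' SAS macros):
# cohorts of three virtual patients are enrolled, DLT outcomes are drawn from a
# dose-dependent probability, and accrual/observation delays are sampled from priors.

def simulate_3plus3(dlt_prob_by_dose, accrual_mean=14.0, observation_days=28.0, seed=1):
    rng = random.Random(seed)
    clock, dose = 0.0, 0
    while dose < len(dlt_prob_by_dose):
        clock += sum(rng.expovariate(1.0 / accrual_mean) for _ in range(3))  # accrue 3 patients
        clock += observation_days                                            # follow the cohort
        dlts = sum(rng.random() < dlt_prob_by_dose[dose] for _ in range(3))
        if dlts == 0:
            dose += 1                      # escalate
        elif dlts == 1:                    # expand the cohort to 6 patients
            clock += sum(rng.expovariate(1.0 / accrual_mean) for _ in range(3)) + observation_days
            extra = sum(rng.random() < dlt_prob_by_dose[dose] for _ in range(3))
            if dlts + extra >= 2:
                break                      # >=2/6 DLTs: stop, previous dose is the MTD
            dose += 1
        else:
            break                          # >=2/3 DLTs: stop, previous dose is the MTD
    return dose - 1, clock                 # recommended dose index and study duration (days)

print(simulate_3plus3([0.05, 0.10, 0.20, 0.35, 0.50]))
```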
Neural Processing of Musical and Vocal Emotions Through Cochlear Implants Simulation.
Ahmed, Duha G; Paquette, Sebastian; Zeitouni, Anthony; Lehmann, Alexandre
2018-05-01
Cochlear implants (CIs) partially restore the sense of hearing in the deaf. However, the ability to recognize emotions in speech and music is reduced due to the implant's electrical signal limitations and the patient's altered neural pathways. The electrophysiological correlates of these limitations are not yet well established. Here we aimed to characterize the effect of CIs on auditory emotion processing and, for the first time, to directly compare vocal and musical emotion processing through a CI simulator. We recorded the electroencephalographic activity of 16 normal-hearing participants while they listened to vocal and musical emotional bursts in their original form and in a degraded (CI-simulated) condition. We found prolonged P50 latency and reduced N100-P200 complex amplitude in the CI-simulated condition. This points to a limitation in encoding sound signals processed through CI simulation. When comparing the processing of vocal and musical bursts, we found a delay in latency for the musical bursts compared to the vocal bursts in both conditions (original and CI-simulated). This suggests that despite the cochlear implant's limitations, the auditory cortex can distinguish between vocal and musical stimuli. In addition, it adds to the literature supporting the complexity of musical emotion. Replicating this study with actual CI users might help characterize emotional processing in CI users and could ultimately help develop optimal rehabilitation programs or device processing strategies to improve CI users' quality of life.
NASA Astrophysics Data System (ADS)
Choi, Kwang Yong; Kim, Yun Chang; Choi, Hee Kwan; Kang, Chul Ho; Kim, Heon Young
2013-12-01
During the sheet metal forming of automotive outer panels, the air trapped between a blank sheet and a die tool can become highly compressed, ultimately influencing the blank deformation and the press force. To prevent this problem, vent holes are drilled into the die tools; several tens to hundreds of holes are needed, depending on the model size. The design and drilling of vent holes are based on expert experience and try-out results, so this step can be one of the reasons the development cycle lengthens. A study of the size, number, and position of vent holes is therefore needed to shorten the development cycle, but there has been no simulation technology for analyzing the related forming defects, that is, no numerical sheet metal forming process simulation that incorporates the fluid dynamics of the trapped air. This study presents a sheet metal forming simulation of automotive outer panels (a roof and a body side outer) that simultaneously simulates the behavior of the air in the die cavity. Based on the CAE results, the effect of the air behavior and of the vent holes on blank deformation was analyzed. For this study, the commercial software packages PAM-STAMP and PAM-SAFE were used.
The simulation study on optical target laser active detection performance
NASA Astrophysics Data System (ADS)
Li, Ying-chun; Hou, Zhao-fei; Fan, Youchen
2014-12-01
Based on the working principle of a laser active detection system, this paper establishes an optical target laser active detection simulation system and carries out a simulation study of the detection process and detection performance of the system. The simulation includes performance models of laser emission, laser propagation in the atmosphere, reflection from the optical target, the receiver detection system, and signal processing and recognition. We focus on analysing and modeling the relationship between the laser emission angle, the defocus amount, and the "cat eye" effect echo produced by reflection from the optical target. Performance indices such as operating range, SNR, and detection probability have also been simulated. The laser emission parameters, the reflection of the optical target, and the laser propagation in the atmosphere strongly influence the performance of the optical target laser active detection system. Finally, using object-oriented software design methods, an open, fully functional simulation platform was built that simulates the process by which the detection system detects and recognizes the optical target, performs the performance simulation of each subsystem, and generates data reports and graphs. The visible simulation process makes the performance models of the laser active detection system more intuitive. The simulation data provide a reference for adjusting the system parameters and offer theoretical and technical support for the top-level design of the optical target laser active detection system and the optimization of its performance indices.
NASA Astrophysics Data System (ADS)
Johnson, Donald R.; Lenzen, Allen J.; Zapotocny, Tom H.; Schaack, Todd K.
2000-11-01
A challenge common to weather, climate, and seasonal numerical prediction is the need to simulate accurately reversible isentropic processes in combination with appropriate determination of sources/sinks of energy and entropy. Ultimately, this task includes the distribution and transport of internal, gravitational, and kinetic energies, the energies of water substances in all forms, and the related thermodynamic processes of phase changes involved with clouds, including condensation, evaporation, and precipitation processes. All of the processes noted above involve the entropies of matter, radiation, and chemical substances, conservation during transport, and/or changes in entropies by physical processes internal to the atmosphere. With respect to the entropy of matter, a means to study a model's accuracy in simulating internal hydrologic processes is to determine its capability to simulate the appropriate conservation of potential and equivalent potential temperature as surrogates of dry and moist entropy under reversible adiabatic processes in which clouds form, evaporate, and precipitate. In this study, a statistical strategy utilizing the concept of 'pure error' is set forth to assess the numerical accuracies of models to simulate reversible processes during 10-day integrations of the global circulation corresponding to the global residence time of water vapor. During the integrations, the sums of squared differences between the equivalent potential temperature θe numerically simulated by the governing equations of mass, energy, water vapor, and cloud water and a proxy equivalent potential temperature θte numerically simulated as a conservative property are monitored. Inspection of the differences of θe and θte in time and space and the relative frequency distribution of the differences details bias and random errors that develop from nonlinear numerical inaccuracies in the advection and transport of potential temperature and water substances within the global atmosphere. A series of nine global simulations employing various versions of Community Climate Models CCM2 and CCM3 (all Eulerian spectral numerics, all semi-Lagrangian numerics, and mixed Eulerian spectral and semi-Lagrangian numerics) and the University of Wisconsin-Madison (UW) isentropic-sigma gridpoint model provides an interesting comparison of numerical accuracies in the simulation of reversibility. By day 10, large bias and random differences were identified in the simulation of reversible processes in all of the models except for the UW isentropic-sigma model. The CCM2 and CCM3 simulations yielded systematic differences that varied zonally, vertically, and temporally. Within the comparison, the UW isentropic-sigma model was superior in transporting water vapor and cloud water/ice and in simulating reversibility involving the conservation of dry and moist entropy. The only relative frequency distribution of differences that appeared optimal, in that the distribution remained unbiased and equilibrated with minimal variance as it remained statistically stationary, was the distribution from the UW isentropic-sigma model. All other distributions revealed nonstationary characteristics with spreading and/or shifting of the maxima as the biases and variances of the numerical differences of θe and θte amplified.
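A minimal sketch of the 'pure error' bookkeeping described above, with random placeholder fields standing in for the model's θe and the conservatively advected proxy θte:

```python
import numpy as np

# Illustrative "pure error" bookkeeping: given two simulated fields of equivalent potential
# temperature, theta_e from the full model and theta_te advected as a conservative tracer,
# track the bias and spread of their difference as the integration proceeds.
# The random fields below are placeholders for model output, not data from the study.

rng = np.random.default_rng(0)
theta_e = 330.0 + rng.normal(0.0, 2.0, size=(64, 128))          # K, stand-in for model theta_e
theta_te = theta_e + rng.normal(0.3, 0.8, size=theta_e.shape)   # proxy with bias + random error

diff = theta_e - theta_te
print("bias (K):             ", diff.mean())
print("random error, std (K):", diff.std(ddof=1))
print("sum of squared diffs: ", np.sum(diff**2))
```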
DOE Office of Scientific and Technical Information (OSTI.GOV)
Provost, G.; Zitney, S.; Turton, R.
2009-01-01
To meet increasing demand for education and experience with commercial-scale, coal-fired, integrated gasification combined cycle (IGCC) plants with CO2 capture, the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) is leading a project to deploy a generic, full-scope, real-time IGCC dynamic plant simulator for use in establishing a world-class research and training center, and to promote and demonstrate IGCC technology to power industry personnel. The simulator, being built by Invensys Process Systems (IPS), will be installed at two separate sites, at NETL and West Virginia University (WVU), and will combine a process/gasification simulator and a power/combined-cycle simulator in a single dynamic simulation framework for use in engineering research studies and training applications. The simulator, scheduled to be launched in mid-year 2010, will have the following capabilities: a high-fidelity, dynamic model of the process side (gasification and gas cleaning with CO2 capture) and the power-block side (combined cycle) for a generic IGCC plant fueled by coal and/or petroleum coke; a highly flexible configuration that allows concurrent training on separate gasification and combined cycle simulators, or on up to two IGCC simulators; the ability to enhance and modify the plant model to facilitate studies of changes in plant configuration, equipment, and control strategies to support future R&D efforts; and training capabilities including startup, shutdown, load following and shedding, response to fuel and ambient condition variations, control strategy analysis (turbine vs. gasifier lead, etc.), representative malfunctions/trips, alarms, scenarios, trending, snapshots, data historian, etc. To support this effort, process descriptions and control strategies were developed for key sections of the plant as part of the detailed functional specification, which is serving as the basis of the simulator development. In this paper, we highlight the contents of the detailed functional specification for the simulator. We also describe the engineering, design, and expert testing process that the simulator will undergo in order to ensure that maximum fidelity is built into the generic simulator. Future applications and training programs associated with gasification, combined cycle, and IGCC simulations are discussed, including plant operation and control demonstrations, as well as education and training services.
NASA Astrophysics Data System (ADS)
Ribes Bertomeu, Josep
Wastewater treatments require the execution of many conversion processes simultaneously and/or consecutively, making them a complex object of study. Furthermore, the complexity of treatment processes is increasing, not only because of the more stringent effluent standards required, but also because of new trends towards sustainable development, which in this field focus mainly on energy saving and nutrient recovery from wastewater in order to improve its life cycle. For this reason it becomes necessary to use simulation tools that are able to represent all these processes by means of a suitable mathematical model; such tools can help in determining and predicting the behaviour of the different treatment schemes. These simulators have become essential for the design, control, and optimization of wastewater treatment plants (WWTPs). Settling processes play a significant role in the accomplishment of effluent standards and the correct operation of the plant. However, many models currently employed for WWTP design and simulation either do not take settling processes into account or handle them in a very simple way, neglecting the biochemical processes that can occur during sedimentation. The CALAGUA research group has focused its efforts on a new philosophy of simulating treatment plants, based on the use of a single model to represent all the physical, chemical, and biological processes taking place in WWTPs. Within this research topic, the group has worked on the development of a general quality model that considers the biological conversion processes carried out by different microorganism groups, the acid-base chemical interactions affecting the pH value in the system, and gas-liquid transfer processes. However, a generalized use of such a quality model requires its combination with a flux model, principally for those units where complete mixing cannot be assumed, such as the settlers and thickeners of a WWTP. The main objective of this work has been the development and validation of a general settling model that allows the main settling operations taking place in a WWTP to be simulated, considering both primary and secondary settlers and thickeners. It consists of a one-dimensional model based on the flux theory of Kynch and the double-exponential settling velocity function of Takács, which takes into account flocculation, hindered settling, and compression processes. The model has been applied to the simulation of settlers and thickeners by splitting the system into several horizontal layers, each considered a completely mixed reactor interconnected with its neighbours by the mass fluxes obtained from the settling model. In order to simulate the conversion processes taking place during sedimentation, the general quality model BNRM1 has been added, and an iterative procedure has been proposed for solving the equations of each layer into which the settler is divided. The settling flux model, together with the quality model, has been validated by applying them to the simulation of a primary sludge fermentation-elutriation process. This process has been studied in a pilot plant located in the Carraixet WWTP in Alboraia (Valencia). In order to simulate the observed decrease in solids separation efficiency in the studied fermentation-elutriation process, the quality model has been modified with the addition of a new process called "disintegration of complex particulate material".
This process influences the settleability of the sludge because the disintegrated solids are considered to become non-settleable solids. The modification implies the addition of two new kinetic parameters (the specific disintegration velocity for volatile particulate material and the specific disintegration velocity for non-volatile particulate material). In turn, the settling parameter that represents the non-settleable fraction of total suspended solids is eliminated from the model and transformed into an experimental variable that is quite easy to analyze. The result of this modification is a more general model, applicable to the fermentation-elutriation process under any operating condition. Finally, the behaviour and capabilities of the developed model have been tested by simulating a complete WWTP in the DESASS simulation software developed by the research group. This example includes the most important processes that can be used in a WWTP: biological nutrient removal, primary sludge fermentation, and sludge digestion. The model allows settling processes and biochemical processes to be considered as a whole (denitrification in secondary settlers, primary sludge fermentation and VFA elutriation, phosphorus release in thickeners due to PAO decay, etc.). The developed model represents an important advance in the study of new wastewater treatment processes because it makes it possible to address global process optimization problems by means of full-plant simulation. It is very useful for studying the effects of a modification of the operating conditions of one element on the operation of the rest of the elements of the WWTP. (Abstract shortened by UMI.)
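As an illustration of the kind of layered flux computation described above, the following sketch evaluates the Takács double-exponential settling velocity and the resulting gravity flux leaving each layer; the parameter values are common literature defaults, not the ones calibrated in this work.

```python
import numpy as np

def takacs_velocity(X, v0=474.0, rh=5.76e-4, rp=2.86e-3, Xmin=10.0):
    """Double-exponential settling velocity (m/d) of Takacs et al. for solids X (g/m^3).

    Parameter values are typical literature defaults, not the ones calibrated in this work;
    Xmin crudely approximates the non-settleable concentration.
    """
    Xs = np.maximum(X - Xmin, 0.0)                 # only the settleable fraction settles
    v = v0 * (np.exp(-rh * Xs) - np.exp(-rp * Xs))
    return np.clip(v, 0.0, v0)

# Gravity settling flux between the stacked, completely mixed layers of a 1-D settler model:
X_layers = np.array([50.0, 300.0, 1500.0, 4000.0, 8000.0])   # g/m^3, top to bottom
settling_flux = X_layers * takacs_velocity(X_layers)          # g/(m^2 d) leaving each layer
print(settling_flux)
```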
Forsythe, Lydia
2009-01-01
In healthcare, professionals usually function in a time-constrained paradigm because of the nature of care delivery functions and the acute patient populations usually in need of emergent and urgent care. This leaves little, if any, time for team reflection or team processing as a collaborative action. Simulation can be used to create a safe space as a structure for recognition and innovation, in order to continue to develop a culture of safety for healthcare delivery and patient care. To create and develop a safe space, three qualitative, modified action research, institutional review board-approved studies were developed using simulation to explore team communication as it unfolds in the acute care environment of the operating room. An action heuristic was used for data collection by capturing the participants' narratives in the form of collaborative recall and reflection to standardize task, process, and language. During the qualitative simulations, the team participants identified and changed multiple task, process, and language items. The simulations contributed to positive changes in tasks and efficiencies, team interactions, and the overall functionality of the team. The studies demonstrated that simulation can be used in healthcare to define safe spaces in which to practice, reflect, and develop collaborative relationships, which contribute to the realization of a culture of safety.
Multislice spiral CT simulator for dynamic cardiopulmonary studies
NASA Astrophysics Data System (ADS)
De Francesco, Silvia; Ferreira da Silva, Augusto M.
2002-04-01
We have developed a multi-slice spiral CT simulator modeling the acquisition process of a real tomograph over a 4-dimensional phantom (4D MCAT) of the human thorax. The simulator allows us to visually characterize artifacts due to insufficient temporal sampling and to evaluate a priori the quality of the images obtained in cardio-pulmonary studies (with single-/multi-slice as well as ECG-gated acquisition processes). The simulating environment allows for both conventional and spiral scanning modes and includes a model of noise in the acquisition process. In the case of spiral scanning, the reconstruction facilities include longitudinal interpolation methods (360LI and 180LI, for both single- and multi-slice acquisition), after which the section is reconstructed through FBP. The reconstructed images/volumes are affected by distortion due to insufficient temporal sampling of the moving object. The developed simulating environment allows us to investigate the nature of this distortion, characterizing it qualitatively and quantitatively (using, for example, Herman's measures). Much of our work is focused on the determination of adequate temporal sampling and sinogram regularization techniques. At the moment, the simulator is limited to the multi-slice tomograph case; the extension to cone-beam or area detectors is planned as the next development step.
Yang, Wenting; Wang, Dongmei; Lei, Zhoujixin; Wang, Chunhui; Chen, Shanguang
2017-12-01
Astronauts exposed to a weightless environment during long-term spaceflight may suffer loss of bone density and mass because the mechanical stimulus is smaller than its normal value. This study built a three-dimensional model of the human femur to simulate its remodeling process during a bed rest experiment based on finite element analysis (FEA). The remodeling parameters of this finite element model were validated by comparing experimental and numerical results. Then, the remodeling process of the human femur in a weightless environment was simulated, and the remodeling function of time was derived. The loading magnitude and the number of loading cycles on the femur in the weightless environment were then increased to simulate exercise against bone loss. Simulation results showed that increasing the loading magnitude is more effective in diminishing bone loss than increasing the number of loading cycles, demonstrating that exercise of sufficient intensity could help resist bone loss during long-term spaceflight. Finally, the study simulated the bone recovery process after spaceflight and found that the bone resorption rate is larger than the bone formation rate. We advise that astronauts take exercise during spaceflight to resist bone loss.
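A minimal sketch of a stimulus-driven remodeling rule of the kind often paired with FEA is shown below; the linear form of the law, the reference stimulus, and the rate coefficient are illustrative assumptions, not the model calibrated in this study.

```python
# Minimal sketch of a stimulus-driven remodeling rule (illustrative assumptions only):
# bone density increases when the local mechanical stimulus exceeds a reference value
# and decreases when it falls below it, as happens during unloading.

def remodel(density, stimulus, reference=0.0036, rate=1.0, dt=1.0):
    """One time step of density adaptation, d(rho)/dt = rate * (stimulus - reference)."""
    return max(0.05, density + rate * (stimulus - reference) * dt)

rho = 1.2  # g/cm^3, initial apparent density (hypothetical)
for day in range(90):                       # 90 days of simulated bed rest / weightlessness
    unloaded_stimulus = 0.4 * 0.0036        # stimulus reduced to 40% of the reference value
    rho = remodel(rho, unloaded_stimulus, rate=0.5, dt=1.0)
print("apparent density after 90 days of unloading:", round(rho, 3))
```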
Kumar, Sameer
2011-01-01
It is increasingly recognized that hospital operation is an intricate system with limited resources and many interacting sources of both positive and negative feedback. The purpose of this study is to design a surgical delivery process in a county hospital in the U.S. where patient flow through a surgical ward is optimized. System simulation modeling is used to address questions of capacity planning, throughput management, and interacting resources, which constitute the constantly changing complexity that characterizes the design of a contemporary surgical delivery process in a hospital. The steps in building a system simulation model are demonstrated using the example of a county hospital in a small city in the US, illustrating modular system simulation modeling of patient surgery process flows. The system simulation model shows planners and designers how to build overall efficiencies into a healthcare facility through optimal bed capacity for peak flows of emergency and routine patients.
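A compact sketch of such a patient-flow model is shown below using the simpy discrete event library (assumed available); the bed count, arrival rate, and surgery/recovery times are invented for illustration and are not the county hospital's data.

```python
import random
import simpy  # assumed available; any discrete event library would serve the same purpose

# Toy sketch of surgical patient flow through a ward with a limited number of beds,
# in the spirit of the capacity-planning model described above (all rates are made up).

def patient(env, beds, surgery_time=2.0, recovery_time=24.0):
    with beds.request() as bed:
        yield bed                                                    # wait for a free bed
        yield env.timeout(random.expovariate(1.0 / surgery_time))    # surgery
        yield env.timeout(random.expovariate(1.0 / recovery_time))   # recovery occupies the bed

def arrivals(env, beds, interarrival=3.0):
    while True:
        yield env.timeout(random.expovariate(1.0 / interarrival))    # next patient arrives
        env.process(patient(env, beds))

random.seed(42)
env = simpy.Environment()
beds = simpy.Resource(env, capacity=12)
env.process(arrivals(env, beds))
env.run(until=24 * 30)                                               # simulate one month (hours)
print("patients in a bed or waiting at month end:", beds.count + len(beds.queue))
```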
Koivisto, J-M; Haavisto, E; Niemi, H; Haho, P; Nylund, S; Multisilta, J
2018-01-01
Nurses sometimes lack the competence needed for recognising deterioration in patient conditions and this is often due to poor clinical reasoning. There is a need to develop new possibilities for learning this crucial competence area. In addition, educators need to be future oriented; they need to be able to design and adopt new pedagogical innovations. The purpose of the study is to describe the development process and to generate principles for the design of nursing simulation games. A design-based research methodology is applied in this study. Iterative cycles of analysis, design, development, testing and refinement were conducted via collaboration among researchers, educators, students, and game designers. The study facilitated the generation of reusable design principles for simulation games to guide future designers when designing and developing simulation games for learning clinical reasoning. This study makes a major contribution to research on simulation game development in the field of nursing education. The results of this study provide important insights into the significance of involving nurse educators in the design and development process of educational simulation games for the purpose of nursing education. Copyright © 2017 Elsevier Ltd. All rights reserved.
Kinetic Theory and Simulation of Single-Channel Water Transport
NASA Astrophysics Data System (ADS)
Tajkhorshid, Emad; Zhu, Fangqiang; Schulten, Klaus
Water translocation between various compartments of a system is a fundamental process in the biology of all living cells and in a wide variety of technological problems. The process is of interest in different fields of physiology, physical chemistry, and physics, and many scientists have tried to describe it through physical models. Owing to advances in the computer simulation of molecular processes at an atomic level, water transport has been studied in a variety of molecular systems ranging from biological water channels to artificial nanotubes. While simulations have successfully described various kinetic aspects of water transport, offering a simple, unified model to describe trans-channel translocation of water turned out to be a nontrivial task.
Kinetic Monte Carlo (kMC) simulation of carbon co-implant on pre-amorphization process.
Park, Soonyeol; Cho, Bumgoo; Yang, Seungsu; Won, Taeyoung
2010-05-01
We report a kinetic Monte Carlo (kMC) study of the effect of a carbon co-implant on the pre-amorphization implant (PAI) process. We employed a binary collision approximation (BCA) approach to obtain the initial as-implanted dopant profile and the kMC method to simulate the diffusion process during annealing. The simulation results imply that the carbon co-implant suppresses boron diffusion through recombination with interstitials. We also compared boron diffusion with carbon diffusion by calculating the carbon-interstitial reactions, and found that boron diffusion depends on the carbon co-implant energy, which enhances the trapping of interstitials that would otherwise pair with boron.
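For readers unfamiliar with the method, a single residence-time (Gillespie-style) kMC step can be sketched as follows; the event names and rates are illustrative placeholders, not the calibrated rates of this study.

```python
import math
import random

# Minimal kinetic Monte Carlo step (residence-time / Gillespie algorithm) of the kind used
# to advance diffusion and recombination events during annealing. The event list and rates
# are illustrative placeholders only.

def kmc_step(rates, rng):
    """Pick one event with probability proportional to its rate; return (event, time increment)."""
    total = sum(rates.values())
    r = rng.random() * total
    acc = 0.0
    for event, rate in rates.items():
        acc += rate
        if r <= acc:
            dt = -math.log(rng.random()) / total   # exponentially distributed waiting time
            return event, dt

rng = random.Random(0)
rates = {"B_interstitial_hop": 5.0e3, "C_I_trap": 1.2e3, "B_I_recombination": 4.0e2}  # 1/s
time = 0.0
for _ in range(5):
    event, dt = kmc_step(rates, rng)
    time += dt
    print(f"t = {time:.2e} s: {event}")
```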
Simulation study of the discharge characteristics of silos with cohesive particles
NASA Astrophysics Data System (ADS)
Hund, David; Weis, Dominik; Hesse, Robert; Antonyuk, Sergiy
2017-06-01
In many industrial applications the silo for bulk materials is an important part of an overall process. Silos are used for instance to buffer intermediate products to ensure a continuous supply for the next process step. This study deals with the discharging behaviour of silos containing cohesive bulk solids with particle sizes in the range of 100-500 μm. In this contribution the TOMAS [1,2] model developed for stationary and non-stationary discharging of a convergent hopper is verified with experiments and simulations using the Discrete Element Method. Moreover the influence of the cohesion of the bulk solids on the discharge behaviour is analysed by the simulation. The simulation results showed a qualitative agreement with the analytical model of TOMAS.
NASA Astrophysics Data System (ADS)
Huang, Pengnian; Li, Zhijia; Chen, Ji; Li, Qiaoling; Yao, Cheng
2016-11-01
Properly simulating the hydrological processes of semi-arid areas is still challenging. This study assesses the impact of different modeling strategies on simulating flood processes in semi-arid catchments. Four classic hydrological models, TOPMODEL, XINANJIANG (XAJ), SAC-SMA and TANK, were selected and applied to three semi-arid catchments in North China. Based on analysis and comparison of the simulation results of these classic models, four new flexible models were constructed and used to further investigate the suitability of various modeling strategies for semi-arid environments. Numerical experiments were also designed to examine the performance of the models. The results show that in semi-arid catchments a suitable model needs to include at least one nonlinear component to simulate the main process of surface runoff generation. If there are more than two nonlinear components in the hydrological model, they should be arranged in parallel rather than in series. In addition, the results show that parallel nonlinear components should be combined by multiplication rather than addition. Moreover, this study reveals that the key hydrological process in semi-arid catchments is infiltration-excess surface runoff, a nonlinear component.
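To make the additive-versus-multiplicative distinction concrete, here is a schematic sketch with two invented nonlinear runoff components; the functional forms are illustrative only and are not the equations of TOPMODEL, XAJ, SAC-SMA, or TANK.

```python
# Schematic contrast between combining two nonlinear runoff components in parallel by
# addition and by multiplication (functional forms are invented for illustration).

def saturation_component(soil_moisture, capacity=100.0, b=0.4):
    """Nonlinear saturation-excess fraction of rainfall converted to runoff."""
    return 1.0 - (1.0 - min(soil_moisture / capacity, 1.0)) ** b

def infiltration_excess_component(rain_intensity, f_max=12.0, k=0.15):
    """Nonlinear infiltration-excess fraction, growing with rainfall intensity."""
    return 1.0 - f_max / (f_max + k * rain_intensity**2)

rain, moisture = 30.0, 40.0          # mm/h rainfall intensity, mm of soil moisture
c1 = saturation_component(moisture)
c2 = infiltration_excess_component(rain)
print("additive combination (mm/h):      ", min(c1 + c2, 1.0) * rain)
print("multiplicative combination (mm/h):", c1 * c2 * rain)
```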
Simulating an Enactment Effect: Pronouns Guide Action Simulation during Narrative Comprehension
ERIC Educational Resources Information Center
Ditman, Tali; Brunye, Tad T.; Mahoney, Caroline R.; Taylor, Holly A.
2010-01-01
Recent research has suggested that reading involves the mental simulation of events and actions described in a text. It is possible however that previous findings did not tap into processes engaged during natural reading but rather those triggered by task demands. The present study examined whether readers spontaneously mentally simulate the…
An Empirical Study of Combining Communicating Processes in a Parallel Discrete Event Simulation
1990-12-01
dynamics of the cost/performance criteria which typically made up computer resource acquisition decisions. offering a broad range of tradeoffs in the way... processes has a significant impact on simulation performance. It is the hypothesis of this 3-4 SYSTEM DECOMPOSITION PHYSICAL SYSTEM 1: N PHYSICAL PROCESS 1...EMPTY)) next-event = pop(next-event-queue); lp-clock = next-event - time; Simulate next event departure- consume event-enqueue new event end while; If no
OpenSimulator Interoperability with DRDC Simulation Tools: Compatibility Study
2014-09-01
into two components: (1) backend data services consisting of user accounts, login service, assets, and inventory; and (2) the simulator server which...components are combined into a single OpenSimulator process. In grid mode, the two components are separated, placing the backend services into a ROBUST... mobile devices. Potential points of compatibility between Unity and OpenSimulator include: a Unity-based desktop computer OpenSimulator viewer; a
Thematic mapper design parameter investigation
NASA Technical Reports Server (NTRS)
Colby, C. P., Jr.; Wheeler, S. G.
1978-01-01
This study simulated the multispectral data sets to be expected from three different Thematic Mapper configurations, and the ground processing of these data sets by three different resampling techniques. The simulated data sets were then evaluated by processing them for multispectral classification, and the Thematic Mapper configuration and resampling technique that provided the best classification accuracy were identified.
Dynamic Simulation and Static Matching for Action Prediction: Evidence from Body Part Priming
ERIC Educational Resources Information Center
Springer, Anne; Brandstadter, Simone; Prinz, Wolfgang
2013-01-01
Accurately predicting other people's actions may involve two processes: internal real-time simulation (dynamic updating) and matching recently perceived action images (static matching). Using a priming of body parts, this study aimed to differentiate the two processes. Specifically, participants played a motion-controlled video game with…
ERIC Educational Resources Information Center
Ahmet, Kara
2015-01-01
This paper presents a simple model of the provision of higher educational services that considers and exemplifies nonlinear, stochastic, and potentially chaotic processes. I use the methods of system dynamics to simulate these processes in the context of a particular sociologically interesting case, namely that of the Turkish higher education…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Ye, Ming; Walker, Anthony P.
Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes the variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
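A schematic Monte Carlo version of such a process sensitivity calculation is sketched below; the two competing "recharge" models, the conductivity distribution, and the toy output function are invented stand-ins, and the binned conditional-mean estimator is only one simple way to approximate the variance of the conditional expectation.

```python
import numpy as np

# Schematic Monte Carlo estimate of a process sensitivity index that accounts for both
# process-model uncertainty and parametric uncertainty. All models and parameters below
# are toy stand-ins for the groundwater example described above.

rng = np.random.default_rng(7)
N = 20_000

def sample_recharge(rng, n):
    """Randomly pick one of two competing recharge models, then sample its parameters."""
    model = rng.integers(0, 2, size=n)                                           # equal prior weights
    linear = 0.2 * rng.normal(500.0, 50.0, size=n)                               # model 1: fixed fraction of precipitation
    threshold = np.maximum(rng.normal(500.0, 50.0, size=n) - 300.0, 0.0) * 0.5   # model 2: threshold response
    return np.where(model == 0, linear, threshold)

def sample_conductivity(rng, n):
    return rng.lognormal(mean=0.0, sigma=0.5, size=n)

recharge = sample_recharge(rng, N)
conductivity = sample_conductivity(rng, N)
y = recharge / conductivity                                # toy head-like model output

# First-order style index for the recharge process: variance of the conditional mean of the
# output given recharge (binned approximation), divided by the total output variance.
bins = np.quantile(recharge, np.linspace(0, 1, 51))
which = np.clip(np.digitize(recharge, bins) - 1, 0, 49)
cond_means = np.array([y[which == b].mean() for b in range(50)])
PS_recharge = cond_means.var() / y.var()
print("process sensitivity index for recharge ~", round(PS_recharge, 2))
```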
NASA Astrophysics Data System (ADS)
Xie, Z.; Zou, J.; Qin, P.; Sun, Q.
2014-12-01
In this study, we incorporated a groundwater exploitation scheme into the land surface model CLM3.5 to investigate the effects of the anthropogenic exploitation of groundwater on land surface processes in a river basin. Simulations of the Haihe River Basin in northern China were conducted for the years 1965-2000 using the model. A control simulation without exploitation and three exploitation simulations with different water demands derived from socioeconomic data related to the Basin were conducted. The results showed that groundwater exploitation for human activities resulted in increased wetting and cooling effects at the land surface and reduced groundwater storage. A lowering of the groundwater table, increased upper soil moisture, reduced 2 m air temperature, and enhanced latent heat flux were detected by the end of the simulated period, and the changes at the land surface were related linearly to the water demands. To determine the possible responses of the land surface processes in extreme cases (i.e., in which the exploitation process either continued or ceased), additional hypothetical simulations for the coming 200 years with constant climate forcing were conducted. These simulations revealed that, if exploitation continues at the current rate, the local groundwater storage on the plains cannot sustain high-intensity exploitation for long: the changes attributable to groundwater exploitation reach extreme values and then weaken within decades as the groundwater resources are depleted, and the exploitation process will therefore cease. However, if exploitation is stopped completely to allow the groundwater to recover, drying and warming effects, such as increased temperature, reduced soil moisture, and reduced total runoff, would occur in the Basin within the early decades of the simulation period. The effects of exploitation would then gradually disappear, and the land surface variables would approach the natural state and stabilize at different rates. Simulations were also conducted for cases in which exploitation either continues or ceases using future climate scenario outputs from a general circulation model. The resulting trends were almost the same as those of the simulations with constant climate forcing.
GPU-based Efficient Realistic Techniques for Bleeding and Smoke Generation in Surgical Simulators
Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu
2010-01-01
Background In actual surgery, smoke and bleeding due to cautery processes provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated effects of bleeding and smoke generation, they are not realistic due to the requirement of real-time performance. To be interactive, visual updates must be performed at least at 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Simulation of smoke and bleeding is, therefore, either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available CPU resources. Methods In this work, we develop a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators which outsources the computations to the graphics processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed using 20 subjects to determine the visual quality of the simulations compared to real surgical videos. Results The smoke and bleeding simulations were implemented as part of a Laparoscopic Adjustable Gastric Banding (LAGB) simulator. For the bleeding simulation, the original shader-based implementation did not incur noticeable overhead. However, for smoke generation, an I/O (input/output) bottleneck was observed and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale). Conclusions Based on the performance results and the subject study, both the bleeding and smoke simulations were concluded to be efficient, highly realistic, and well suited to VR-based surgical simulators. PMID:20878651
Numerical investigation of coupled density-driven flow and hydrogeochemical processes below playas
NASA Astrophysics Data System (ADS)
Hamann, Enrico; Post, Vincent; Kohfahl, Claus; Prommer, Henning; Simmons, Craig T.
2015-11-01
Numerical modeling approaches of varying complexity were explored to investigate coupled groundwater flow and geochemical processes in saline basins. Long-term model simulations of a playa system provide insights into the complex feedback mechanisms between density-driven flow and the spatiotemporal patterns of precipitating evaporites and evolving brines. Using a reactive multicomponent transport modeling approach, the simulations reproduced, for the first time in a numerical study, the evaporite precipitation sequences frequently observed in saline basins ("bull's eyes"). Playa-specific flow, evapoconcentration, and chemical divides were found to be the primary controls on the location of the evaporites formed and on the resulting brine chemistry. Comparative simulations with computationally far less demanding surrogate single-species transport models showed that these were still able to replicate the major flow patterns obtained by the more complex reactive transport simulations. However, the simulated degree of salinization was clearly lower than in the reactive multicomponent transport simulations. For example, in the late stages of the simulations, when the brine becomes halite-saturated, the nonreactive simulation overestimated the solute mass by almost 20%. The simulations highlight the importance of considering reactive transport processes for understanding and quantifying geochemical patterns, the concentrations of individual dissolved solutes, and evaporite evolution.
Stochastic simulation of spatially correlated geo-processes
Christakos, G.
1987-01-01
In this study, developments in the theory of stochastic simulation are discussed. The unifying element is the notion of the Radon projection in Euclidean spaces. This notion provides a natural way of reconstructing the real process from a corresponding process observable on a space of reduced dimensionality, where analysis is theoretically easier and computationally tractable. Within this framework, the concept of space transformation is defined and several of its properties, which are of significant importance within the context of spatially correlated processes, are explored. The turning bands operator is shown to follow from this. This strengthens considerably the theoretical background of the geostatistical method of simulation, and some new results are obtained in both the space and frequency domains. The inverse problem is solved generally and the applicability of the method is extended to anisotropic as well as integrated processes. Some ill-posed problems of the inverse operator are discussed. Effects of the measurement error and impulses at the origin are examined. Important features of the simulated process as described by geomechanical laws, the morphology of the deposit, etc., may be incorporated in the analysis. The simulation may become a model-dependent procedure and this, in turn, may provide numerical solutions to spatial-temporal geologic models. Because the spatial simulation may be technically reduced to unidimensional simulations, various techniques of generating one-dimensional realizations are reviewed. To link theory and practice, an example is computed in detail. © 1987 International Association for Mathematical Geology.
Study of Natural Fiber Breakage during Composite Processing
NASA Astrophysics Data System (ADS)
Quijano-Solis, Carlos Jafet
Biofiber-thermoplastic composites have gained considerable importance in the last century. To provide mechanical reinforcement to the polymer, fibers must be larger than a critical aspect ratio (length-to-width ratio). However, biofibers undergo breakage in length or width during processing, affecting their final aspect ratio in the composites. In this study, the influence of factors related to processing conditions, fiber morphology, and flow type on biofiber breakage was investigated through: a) experiments using an internal mixer, a twin-screw extruder (TSE) or a capillary rheometer; and b) a Monte Carlo computer simulation. Composites of thermomechanical fibers of aspen or wheat straw mixed with polypropylene were studied. The internal mixer experiments analyzed wheat straw and two batches of aspen fibers, named AL and AS; AL fibers had the longer average length. Processing variables included the temperature, rotor speed, and fiber concentration. The TSE experiments studied AL and AS fiber composites under various screw speeds, temperatures, and feeding rates of the polymer and fibers. Capillary rheometer experiments determined AL fiber breakage in shear and elongational flows for composites processed at different concentrations, temperatures, and strain rates. Finally, the internal mixer experimental results were compared to Monte Carlo simulation predictions. The simulation focused on fiber length breakage due to fiber-polymer interactions. The internal mixer results showed that the final average fiber length depended almost solely on the processing conditions, while the final average fiber width depended on both the processing conditions and the initial fiber morphology. In the TSE, the processing conditions as well as the initial fiber length influenced the final average length. The TSE results showed that the fiber concentration regime seems to influence the effect of the processing variables on fiber breakage. The capillary rheometer experiments demonstrated that biofiber breakage happens in both elongational and shear flows; in some cases, the percentage of biofiber breakage in elongational flow is higher. In general, the simulation predictions of final average lengths were in good agreement with the experiments, indicating the importance of fiber-polymer interactions for fiber breakage. The largest discrepancies were obtained for composites with higher fiber concentrations; these differences might be resolved, in future simulations, by including the effect of fiber-fiber interactions.
NASA Astrophysics Data System (ADS)
Schenck, Natalya A.; Horvath, Philip A.; Sinha, Amit K.
2018-02-01
While the literature on the price discovery process and information flow between dominant and satellite markets is extensive, most studies have applied an approach that can be traced back to Hasbrouck (1995) or Gonzalo and Granger (1995). In this paper, however, we propose a generalized Langevin process with an asymmetric double-well potential function, with co-integrated time series and interconnected diffusion processes, to model the information flow and price discovery process in two interconnected markets, a dominant and a satellite one. A simulated illustration of the model is also provided.
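A minimal Euler-Maruyama sketch of an overdamped Langevin process in an asymmetric double-well potential is given below; the potential V(x) = a*(x^2 - 1)^2 + b*x and the noise level are illustrative choices, not the calibrated, co-integrated specification proposed in the paper.

```python
import numpy as np

# Euler-Maruyama integration of an overdamped Langevin process in an asymmetric double-well
# potential. The potential and the noise level are illustrative assumptions only.

def potential_gradient(x, a=1.0, b=0.25):
    # V(x) = a*(x**2 - 1)**2 + b*x  -> asymmetric double well; this returns dV/dx.
    return 4.0 * a * x * (x**2 - 1.0) + b

rng = np.random.default_rng(3)
dt, n_steps, sigma = 1e-3, 200_000, 0.7
x = np.empty(n_steps)
x[0] = -1.0                                   # start in the deeper well
for t in range(1, n_steps):
    drift = -potential_gradient(x[t - 1])
    x[t] = x[t - 1] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()

print("fraction of time spent in the right-hand well:", np.mean(x > 0.0))
```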
Dai, Heng; Ye, Ming; Walker, Anthony P.; ...
2017-03-28
A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
Simulating variable source problems via post processing of individual particle tallies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bleuel, D.L.; Donahue, R.J.; Ludewigt, B.A.
2000-10-20
Monte Carlo is an extremely powerful method of simulating complex, three-dimensional environments without excessive problem simplification. However, it is often time consuming to simulate models in which the source can be highly varied. Similarly difficult are optimization studies involving sources in which many input parameters are variable, such as particle energy, angle, and spatial distribution. Such studies are often approached using brute force methods or intelligent guesswork. One field in which these problems are often encountered is accelerator-driven Boron Neutron Capture Therapy (BNCT) for the treatment of cancers. Solving the reverse problem of determining the best neutron source for optimal BNCT treatment can be accomplished by separating the time-consuming particle-tracking process of a full Monte Carlo simulation from the calculation of the source weighting factors which is typically performed at the beginning of a Monte Carlo simulation. By post-processing these weighting factors on a recorded file of individual particle tally information, the effect of changing source variables can be realized in a matter of seconds, instead of requiring hours or days for additional complete simulations. By intelligent source biasing, any number of different source distributions can be calculated quickly from a single Monte Carlo simulation. The source description can be treated as variable and the effect of changing multiple interdependent source variables on the problem's solution can be determined. Though the focus of this study is on BNCT applications, this procedure may be applicable to any problem that involves a variable source.
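The re-weighting idea can be illustrated with a small post-processing sketch: each recorded history carries the source coordinates it was sampled from and its tally contribution, so swapping in a new source spectrum only changes the per-history weights. The arrays and spectra below are synthetic placeholders, not an actual tally file format.

```python
import numpy as np

# Schematic post-processing of per-particle tallies: each recorded history keeps the source
# energy it was born with and its tally contribution, so a new source spectrum can be applied
# as a weight ratio without rerunning the transport. All data below are synthetic stand-ins.

rng = np.random.default_rng(11)
n = 100_000
source_energy = rng.uniform(0.1, 2.5, size=n)     # MeV, sampled from the original (flat) source
tally = rng.exponential(1.0, size=n)              # per-particle contribution to the tally

def pdf_original(E):
    return np.full_like(E, 1.0 / 2.4)             # flat spectrum between 0.1 and 2.5 MeV

def pdf_new(E):
    return np.where(E < 1.0, 2.0, 0.5) / 2.55     # hypothetical re-shaped spectrum (normalized)

weights = pdf_new(source_energy) / pdf_original(source_energy)
print("original-source tally estimate:", tally.mean())
print("re-weighted tally estimate:    ", np.average(tally, weights=weights))
```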
Reverse logistics system planning for recycling computers hardware: A case study
NASA Astrophysics Data System (ADS)
Januri, Siti Sarah; Zulkipli, Faridah; Zahari, Siti Meriam; Shamsuri, Siti Hajar
2014-09-01
This paper describes the modeling and simulation of a reverse logistics network for the collection of used computers at a company in Selangor. The study focuses on the design of a reverse logistics network for a used computer recycling operation. The simulation modeling presented in this work allows the user to analyze the future performance of the network and to understand the complex relationships between the parties involved. The findings from the simulation suggest that the model calculates processing time and resource utilization in a predictable manner. In this study, the simulation model was developed using the Arena simulation package.
NASA Technical Reports Server (NTRS)
Honikman, T.; Mcmahon, E.; Miller, E.; Pietrzak, L.; Yorsz, W.
1973-01-01
Digital image processing, image recorders, high-density digital data recorders, and data system element processing for use in an Earth Resources Survey image data processing system are studied. Loading to various ERS systems is also estimated by simulation.
Wu, Xiaohui; Yang, Yang; Wu, Gaoming; Mao, Juan; Zhou, Tao
2016-01-01
Applications of activated sludge models (ASM) in simulating industrial biological wastewater treatment plants (WWTPs) are still difficult due to refractory and complex components in influents as well as diversity in activated sludges. In this study, an ASM3 modeling study was conducted to simulate and optimize a practical coking wastewater treatment plant (CWTP). First, respirometric characterizations of the coking wastewater and CWTP biomasses were conducted to determine the specific kinetic and stoichiometric model parameters for the consecutive aeration-anoxic-aeration (O-A/O) biological process. All ASM3 parameters have been further estimated and calibrated through cross validation by the model dynamic simulation procedure. Consequently, an ASM3 model was successfully established to accurately simulate the CWTP performance in removing COD and NH4-N. An optimized CWTP operation condition could be proposed, reducing the operation cost from 6.2 to 5.5 €/m3 of wastewater. This study is expected to provide a useful reference for mathematical simulations of practical industrial WWTPs. Copyright © 2015 Elsevier Ltd. All rights reserved.
A New Numerical Simulation technology of Multistage Fracturing in Horizontal Well
NASA Astrophysics Data System (ADS)
Cheng, Ning; Kang, Kaifeng; Li, Jianming; Liu, Tao; Ding, Kun
2017-11-01
Horizontal multi-stage fracturing is recognized as an effective technology for developing unconventional oil resources. Geomechanics occupies a very important position in the numerical simulation of hydraulic fracturing, and the new approach differs from conventional numerical simulation technology in that it accounts for geomechanical effects. The new numerical simulation of hydraulic fracturing can therefore more effectively optimize the fracturing design and evaluate post-fracturing production. This study is based on a three-dimensional stress and rock physics parameter model and uses the latest fluid-solid coupling numerical simulation technology to capture the fracture propagation process, describe the change of the stress field during fracturing, and finally predict production.
Thakral, Preston P.; Benoit, Roland G.; Schacter, Daniel L.
2017-01-01
Neuroimaging data indicate that episodic memory (i.e., remembering specific past experiences) and episodic simulation (i.e., imagining specific future experiences) are associated with enhanced activity in a common set of neural regions, often referred to as the core network. This network comprises the hippocampus, parahippocampal cortex, lateral and medial parietal cortex, lateral temporal cortex, and medial prefrontal cortex. Evidence for a core network has been taken as support for the idea that episodic memory and episodic simulation are supported by common processes. Much remains to be learned about how specific core network regions contribute to specific aspects of episodic simulation. Prior neuroimaging studies of episodic memory indicate that certain regions within the core network are differentially sensitive to the amount of information recollected (e.g., the left lateral parietal cortex). In addition, certain core network regions dissociate as a function of their timecourse of engagement during episodic memory (e.g., transient activity in the posterior hippocampus and sustained activity in the left lateral parietal cortex). In the current study, we assessed whether similar dissociations could be observed during episodic simulation. We found that the left lateral parietal cortex modulates as a function of the amount of simulated details. Of particular interest, while the hippocampus was insensitive to the amount of simulated details, we observed a temporal dissociation within the hippocampus: transient activity occurred in relatively posterior portions of the hippocampus and sustained activity occurred in anterior portions. Because the posterior hippocampal and lateral parietal findings parallel those observed previously during episodic memory, the present results add to the evidence that episodic memory and episodic simulation are supported by common processes. Critically, the present study also provides evidence that regions within the core network support dissociable processes. PMID:28324695
Multi-scale Modeling of Arctic Clouds
NASA Astrophysics Data System (ADS)
Hillman, B. R.; Roesler, E. L.; Dexheimer, D.
2017-12-01
The presence and properties of clouds are critically important to the radiative budget in the Arctic, but clouds are notoriously difficult to represent in global climate models (GCMs). The challenge stems partly from a disconnect in the scales at which these models are formulated and the scale of the physical processes important to the formation of clouds (e.g., convection and turbulence). Because of this, these processes are parameterized in large-scale models. Over the past decades, new approaches have been explored in which a cloud system resolving model (CSRM), or in the extreme a large eddy simulation (LES), is embedded into each gridcell of a traditional GCM to replace the cloud and convective parameterizations to explicitly simulate more of these important processes. This approach is attractive in that it allows for more explicit simulation of small-scale processes while also allowing for interaction between the small and large-scale processes. The goal of this study is to quantify the performance of this framework in simulating Arctic clouds relative to a traditional global model, and to explore the limitations of such a framework using coordinated high-resolution (eddy-resolving) simulations. Simulations from the global model are compared with satellite retrievals of cloud fraction partitioned by cloud phase from CALIPSO, and limited-area LES simulations are compared with ground-based and tethered-balloon measurements from the ARM Barrow and Oliktok Point measurement facilities.
Ogata, Yuma; Ohnishi, Takashi; Moriya, Takahiro; Inadama, Naoko; Nishikido, Fumihiko; Yoshida, Eiji; Murayama, Hideo; Yamaya, Taiga; Haneishi, Hideaki
2014-01-01
The X'tal cube is a next-generation DOI detector for PET that we are developing to offer higher resolution and higher sensitivity than is available with present detectors. It is constructed from a cubic monolithic scintillation crystal and silicon photomultipliers which are coupled on various positions of the six surfaces of the cube. A laser-processing technique is applied to produce 3D optical boundaries composed of micro-cracks inside the monolithic scintillator crystal. The current configuration is based on an empirical trial of a laser-processed boundary. There is room to improve the spatial resolution by optimizing the setting of the laser-processed boundary. In fact, the laser-processing technique has high freedom in setting the parameters of the boundary such as size, pitch, and angle. Computer simulation can effectively optimize such parameters. In this study, to design optical characteristics properly for the laser-processed crystal, we developed a Monte Carlo simulator which can model arbitrary arrangements of laser-processed optical boundaries (LPBs). The optical characteristics of the LPBs were measured by use of a setup with a laser and a photo-diode, and then modeled in the simulator. The accuracy of the simulator was confirmed by comparison of position histograms obtained from the simulation and from experiments with a prototype detector composed of a cubic LYSO monolithic crystal with 6 × 6 × 6 segments and multi-pixel photon counters. Furthermore, the simulator was accelerated by parallel computing with general-purpose computing on a graphics processing unit. The calculation speed was about 400 times faster than that with a CPU.
NASA Astrophysics Data System (ADS)
Tourret, D.; Mertens, J. C. E.; Lieberman, E.; Imhoff, S. D.; Gibbs, J. W.; Henderson, K.; Fezzaa, K.; Deriy, A. L.; Sun, T.; Lebensohn, R. A.; Patterson, B. M.; Clarke, A. J.
2017-11-01
We follow an Al-12 at. pct Cu alloy sample from the liquid state to mechanical failure, using in situ X-ray radiography during directional solidification and tensile testing, as well as three-dimensional computed tomography of the microstructure before and after mechanical testing. The solidification processing stage is simulated with a multi-scale dendritic needle network model, and the micromechanical behavior of the solidified microstructure is simulated using voxelized tomography data and an elasto-viscoplastic fast Fourier transform model. This study demonstrates the feasibility of direct in situ monitoring of a metal alloy microstructure from the liquid processing stage up to its mechanical failure, supported by quantitative simulations of microstructure formation and its mechanical behavior.
Sun, Peishi; Huang, Bing; Huang, Ruohua; Yang, Ping
2002-05-01
For the process of biopurifying waste gas containing VOCs at low concentration using a biological trickling filter, the related kinetic model and simulation of the new adsorption-biofilm theory were investigated in this study. Using lab test data and industrial test data, the results of comparison and validation indicated that the model had good applicability for describing the practical bio-purification process of VOC waste gas. In the simulation study of the effects of the main factors, such as the toluene concentration in the inlet gas, the gas flow and the height of the biofilm packing, good agreement was shown between the calculated data and the test data, with correlation coefficients of 0.80-0.97.
Baseline process description for simulating plutonium oxide production for precalc project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pike, J. A.
Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective to study the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary as well as process and facility design details necessary for multi-scale, multi-physics models are provided.
Natural gas operations: considerations on process transients, design, and control.
Manenti, Flavio
2012-03-01
This manuscript highlights tangible benefits deriving from the dynamic simulation and control of operational transients of natural gas processing plants. Relevant improvements in safety, controllability, operability, and flexibility are obtained not only within the traditional applications, i.e. plant start-up and shutdown, but also in certain fields that are apparently time-independent, such as feasibility studies of gas processing plant layout and process design. Specifically, this paper contrasts the myopic steady-state approach and its main shortcomings with the more detailed studies that take non-steady-state behavior into consideration. A portion of a gas processing facility is considered as a case study. Process transient, design, and control solutions that appear more appealing from a steady-state approach are compared with the corresponding dynamic simulation solutions. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
Failure Analysis of a Sheet Metal Blanking Process Based on Damage Coupling Model
NASA Astrophysics Data System (ADS)
Wen, Y.; Chen, Z. H.; Zang, Y.
2013-11-01
In this paper, a blanking process of sheet metal is studied by the methods of numerical simulation and experimental observation. The effects of varying technological parameters related to the quality of products are investigated. An elastoplastic constitutive equation accounting for isotropic ductile damage is implemented into the finite element code ABAQUS with a user-defined material subroutine UMAT. The simulations of the damage evolution and ductile fracture in a sheet metal blanking process have been carried out by the FEM. In order to guarantee computation accuracy and avoid numerical divergence during large plastic deformation, a specified remeshing technique is successively applied when severe element distortion occurs. In the simulation, the evolution of damage at different stages of the blanking process has been evaluated, and the damage distributions obtained from simulation are in good agreement with the experimental results.
NASA Astrophysics Data System (ADS)
Wang, XiaoLiang; Li, JiaChun
2017-12-01
A new solver based on the high-resolution scheme with novel treatments of source terms and interface capture for the Savage-Hutter model is developed to simulate granular avalanche flows. The capability to simulate flow spread and deposit processes is verified through indoor experiments of a two-dimensional granular avalanche. Parameter studies show that reduction in bed friction enhances runout efficiency, and that lower earth pressure restraints enlarge the deposit spread. The April 9, 2000, Yigong avalanche in Tibet, China, is simulated as a case study by this new solver. The predicted results, including evolution process, deposit spread, and hazard impacts, generally agree with site observations. It is concluded that the new solver for the Savage-Hutter equation provides a comprehensive software platform for granular avalanche simulation at both experimental and field scales. In particular, the solver can be a valuable tool for providing necessary information for hazard forecasts, disaster mitigation, and countermeasure decisions in mountainous areas.
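For reference, the one-dimensional Savage-Hutter equations that such solvers discretize can be written in a commonly used form (the paper's exact formulation, source terms, and earth-pressure closure may differ):

$$
\frac{\partial h}{\partial t} + \frac{\partial (h u)}{\partial x} = 0, \qquad
\frac{\partial (h u)}{\partial t} + \frac{\partial}{\partial x}\!\left(h u^{2} + \tfrac{1}{2}\, k_{a/p}\, g_z h^{2}\right)
= g_x h - \operatorname{sgn}(u)\, \tan\delta \; g_z h,
$$

where $h$ is the flow depth, $u$ the depth-averaged velocity, $g_x$ and $g_z$ the slope-parallel and slope-normal gravity components, $\delta$ the bed friction angle, and $k_{a/p}$ the active/passive earth pressure coefficient. The last two terms correspond directly to the bed friction and earth pressure parameters whose influence on runout and deposit spread is reported above.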
NASA Astrophysics Data System (ADS)
Yang, Yuansheng; Zhao, Fuze; Feng, Xiaohui
2017-10-01
The dispersion of carbon nanotubes (CNTs) in AZ91D melt by ultrasonic processing and microstructure formation of the CNTs/AZ91D composite were studied using numerical and physical simulations. The sound field and acoustic streaming were predicted using the finite element method. Meanwhile, the optimal immersion depth of the ultrasonic probe and suitable ultrasonic power were obtained. A single-bubble model was used to predict ultrasonic cavitation in the AZ91D melt. The relationship between sound pressure amplitude and ultrasonic cavitation was established. Physical simulations of acoustic streaming and ultrasonic cavitation agreed well with the numerical simulations. It was confirmed that the dispersion of carbon nanotubes was remarkably improved by ultrasonic processing. Microstructure formation of the CNTs/AZ91D composite was numerically simulated using the cellular automaton method. In addition, grain refinement was achieved and the growth of dendrites was changed due to the uniform dispersion of CNTs.
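For context, single-bubble cavitation models of the kind mentioned above are commonly based on a Rayleigh-Plesset-type equation; whether this exact form was used in the study is an assumption:

$$
\rho\left(R\ddot{R} + \tfrac{3}{2}\dot{R}^{2}\right)
= \left(p_0 + \frac{2\sigma}{R_0} - p_v\right)\!\left(\frac{R_0}{R}\right)^{3\kappa}
+ p_v - p_0 - \frac{2\sigma}{R} - \frac{4\mu\dot{R}}{R} - p_a(t),
$$

where $R(t)$ is the bubble radius, $\rho$, $\mu$ and $\sigma$ the melt density, viscosity and surface tension, $p_0$ the ambient pressure, $p_v$ the vapor pressure, $\kappa$ the polytropic exponent, and $p_a(t)$ the acoustic driving pressure. The driving amplitude is the quantity through which the predicted sound pressure field is linked to cavitation intensity.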
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardiansyah, D.; Haryanto, F.; Male, S.
2014-09-30
Prism is a non-commercial radiotherapy treatment planning system (RTPS) developed by Ira J. Kalet at the University of Washington. An inhomogeneity factor is included in the Prism TPS dose calculation. The aim of this study is to investigate the sensitivity of the dose calculation in Prism using Monte Carlo simulation. A phase space source from the head of a linear accelerator (LINAC) is implemented for the Monte Carlo simulation. To achieve this aim, the Prism dose calculation is compared with an EGSnrc Monte Carlo simulation. The percentage depth dose (PDD) and R50 from both calculations are observed. BEAMnrc simulated electron transport in the LINAC head and produced a phase space file. This file is used as DOSXYZnrc input to simulate electron transport in the phantom. The study started with a commissioning process in a water phantom, in which the Monte Carlo simulation was adjusted to match the Prism RTPS. The commissioning result was then used for the study of an inhomogeneity phantom. The physical parameters of the inhomogeneity phantom varied in this study are: density, location and thickness of the tissue. The commissioning result shows that the optimum energy of the Monte Carlo simulation for the 6 MeV electron beam is 6.8 MeV. The commissioning used R50 and PDD with the practical range (Rp) as references. From the inhomogeneity study, the average deviation for all cases in the region of interest is below 5%. Based on ICRU recommendations, Prism has good ability to calculate the radiation dose in inhomogeneous tissue.
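As an illustration of the two comparison metrics mentioned above, a minimal sketch for extracting the PDD and R50 from a central-axis depth-dose array might look like the following (the depth-dose curve here is hypothetical, not the study's output):

```python
import numpy as np

# Hypothetical central-axis depth-dose curve for a ~6 MeV electron beam in water.
depth = np.linspace(0.0, 5.0, 251)                            # cm
dose = np.exp(-((depth - 1.3) ** 2) / 1.1) * (depth < 3.5)    # illustrative shape only

pdd = 100.0 * dose / dose.max()                               # percentage depth dose
d_max = depth[np.argmax(dose)]                                # depth of maximum dose

# R50: depth beyond d_max at which the dose falls to 50% of its maximum,
# obtained by linear interpolation on the descending part of the curve.
descending = depth >= d_max
r50 = np.interp(50.0, pdd[descending][::-1], depth[descending][::-1])

print(f"d_max = {d_max:.2f} cm, R50 = {r50:.2f} cm")
```

The same two quantities can be computed from the Prism export and the DOSXYZnrc dose file, which is all the commissioning comparison requires.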
NASA Technical Reports Server (NTRS)
Salmasi, A. B. (Editor); Springett, J. C.; Sumida, J. T.; Richter, P. H.
1984-01-01
The design and implementation of the Land Mobile Satellite Service (LMSS) channel simulator as a facility for an end to end hardware simulation of the LMSS communications links, primarily with the mobile terminal is described. A number of studies are reported which show the applications of the channel simulator as a facility for validation and assessment of the LMSS design requirements and capabilities by performing quantitative measurements and qualitative audio evaluations for various link design parameters and channel impairments under simulated LMSS operating conditions. As a first application, the LMSS channel simulator was used in the evaluation of a system based on the voice processing and modulation (e.g., NBFM with 30 kHz of channel spacing and a 2 kHz rms frequency deviation for average talkers) selected for the Bell System's Advanced Mobile Phone Service (AMPS). The various details of the hardware design, qualitative audio evaluation techniques, signal to channel impairment measurement techniques, the justifications for criteria of different parameter selection in regards to the voice processing and modulation methods, and the results of a number of parametric studies are further described.
NASA Astrophysics Data System (ADS)
Boisson, F.; Wimberley, C. J.; Lehnert, W.; Zahra, D.; Pham, T.; Perkins, G.; Hamze, H.; Gregoire, M.-C.; Reilhac, A.
2013-10-01
Monte Carlo-based simulation of positron emission tomography (PET) data plays a key role in the design and optimization of data correction and processing methods. Our first aim was to adapt and configure the PET-SORTEO Monte Carlo simulation program for the geometry of the widely distributed Inveon PET preclinical scanner manufactured by Siemens Preclinical Solutions. The validation was carried out against actual measurements performed on the Inveon PET scanner at the Australian Nuclear Science and Technology Organisation in Australia and at the Brain & Mind Research Institute and by strictly following the NEMA NU 4-2008 standard. The comparison of simulated and experimental performance measurements included spatial resolution, sensitivity, scatter fraction and count rates, image quality and Derenzo phantom studies. Results showed that PET-SORTEO reliably reproduces the performances of this Inveon preclinical system. In addition, imaging studies showed that the PET-SORTEO simulation program provides raw data for the Inveon scanner that can be fully corrected and reconstructed using the same programs as for the actual data. All correction techniques (attenuation, scatter, randoms, dead-time, and normalization) can be applied on the simulated data leading to fully quantitative reconstructed images. In the second part of the study, we demonstrated its ability to generate fast and realistic biological studies. PET-SORTEO is a workable and reliable tool that can be used, in a classical way, to validate and/or optimize a single PET data processing step such as a reconstruction method. However, we demonstrated that by combining a realistic simulated biological study ([11C]Raclopride here) involving different condition groups, simulation allows one also to assess and optimize the data correction, reconstruction and data processing line flow as a whole, specifically for each biological study, which is our ultimate intent.
An Aerodynamic Simulation Process for Iced Lifting Surfaces and Associated Issues
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Vickerman, Mary B.; Hackenberg, Anthony W.; Rigby, David L.
2003-01-01
This paper discusses technologies and software tools that are being implemented in a software toolkit currently under development at NASA Glenn Research Center. Its purpose is to help study the effects of icing on airfoil performance and assist with the aerodynamic simulation process which consists of characterization and modeling of ice geometry, application of block topology and grid generation, and flow simulation. Tools and technologies for each task have been carefully chosen based on their contribution to the overall process. For the geometry characterization and modeling, we have chosen an interactive rather than automatic process in order to handle numerous ice shapes. An Appendix presents features of a software toolkit developed to support the interactive process. Approaches taken for the generation of block topology and grids, and flow simulation, though not yet implemented in the software, are discussed with reasons for why particular methods are chosen. Some of the issues that need to be addressed and discussed by the icing community are also included.
In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway
NASA Astrophysics Data System (ADS)
Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun
2016-12-01
HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.
Jürgensen, Lars; Ehimen, Ehiaze Augustine; Born, Jens; Holm-Nielsen, Jens Bo
2015-02-01
This study aimed to investigate the feasibility of substitute natural gas (SNG) generation using biogas from anaerobic digestion and hydrogen from renewable energy systems. Using thermodynamic equilibrium analysis, kinetic reactor modeling and transient simulation, an integrated approach for the operation of a biogas-based Sabatier process was put forward, which was then verified using a lab-scale heterogeneous methanation reactor. The process simulation using a kinetic reactor model demonstrated the feasibility of the production of SNG at gas grid standards using a single reactor setup. The Wobbe index, CO2 content and calorific value were found to be controllable by the H2/CO2 ratio fed to the methanation reactor. An optimal H2/CO2 ratio of 3.45-3.7 was seen to result in a product gas with high calorific value and Wobbe index. The dynamic reactor simulation verified that the process start-up was feasible within several minutes to facilitate surplus electricity use from renewable energy systems. Copyright © 2014 Elsevier Ltd. All rights reserved.
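For context, the Sabatier reaction and the Wobbe index referred to above are

$$
\mathrm{CO_2} + 4\,\mathrm{H_2} \;\rightleftharpoons\; \mathrm{CH_4} + 2\,\mathrm{H_2O},
\qquad \Delta H^{0} \approx -165\ \mathrm{kJ\,mol^{-1}},
\qquad
W = \frac{H_s}{\sqrt{d}},
$$

where $H_s$ is the higher calorific value of the product gas and $d$ its relative density with respect to air. The stoichiometric H2/CO2 ratio is 3, so the optimal feed ratio of 3.45-3.7 reported above corresponds to a moderate hydrogen excess over stoichiometry.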
Li, Yang; Zhao, Qiangsheng; Mirdamadi, Mansour; ...
2016-01-06
Woven fabric carbon fiber/epoxy composites made through compression molding are one of the promising choices of material for the vehicle light-weighting strategy. Previous studies have shown that the processing conditions can have substantial influence on the performance of this type of the material. Therefore the optimization of the compression molding process is of great importance to the manufacturing practice. An efficient way to achieve the optimized design of this process would be through conducting finite element (FE) simulations of compression molding for woven fabric carbon fiber/epoxy composites. However, performing such simulation remains a challenging task for FE as multiple types of physics are involved during the compression molding process, including the epoxy resin curing and the complex mechanical behavior of woven fabric structure. In the present study, the FE simulation of the compression molding process of resin based woven fabric composites at continuum level is conducted, which is enabled by the implementation of an integrated material modeling methodology in LS-Dyna. Specifically, the chemo-thermo-mechanical problem of compression molding is solved through the coupling of three material models, i.e., one thermal model for temperature history in the resin, one mechanical model to update the curing-dependent properties of the resin and another mechanical model to simulate the behavior of the woven fabric composites. Preliminary simulations of the carbon fiber/epoxy woven fabric composites in LS-Dyna are presented as a demonstration, while validations and models with real part geometry are planned in the future work.
NASA Technical Reports Server (NTRS)
Nicholson, Wayne L.; Schuerger, Andrew C.
2005-01-01
Bacterial endospores in the genus Bacillus are considered good models for studying interplanetary transfer of microbes by natural or human processes. Although spore survival during transfer itself has been the subject of considerable study, the fate of spores in extraterrestrial environments has received less attention. In this report we subjected spores of a strain of Bacillus subtilis, containing luciferase resulting from expression of an sspB-luxAB gene fusion, to simulated martian atmospheric pressure (7-18 mbar) and composition (100% CO(2)) for up to 19 days in a Mars simulation chamber. We report here that survival was similar between spores exposed to Earth conditions and spores exposed up to 19 days to simulated martian conditions. However, germination-induced bioluminescence was lower in spores exposed to simulated martian atmosphere, which suggests sublethal impairment of some endogenous spore germination processes.
Simulation of secondary emission calorimeter for future colliders
NASA Astrophysics Data System (ADS)
Yetkin, E. A.; Yetkin, T.; Ozok, F.; Iren, E.; Erduran, M. N.
2018-03-01
We present updated results from a simulation study of a conceptual sampling electromagnetic calorimeter based on secondary electron emission process. We implemented the secondary electron emission process in Geant4 as a user physics list and produced the energy spectrum and yield of secondary electrons. The energy resolution of the SEE calorimeter was σ/E = (41%)·√(GeV/E) and the response linearity to electromagnetic showers was to within 1.5%. The simulation results were also compared with a traditional scintillator calorimeter.
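As a quick reading of the stochastic term quoted above: with σ/E = (41%)·√(GeV/E), the fractional resolution is about 41% at 1 GeV, 13% at 10 GeV, and 4.1% at 100 GeV, i.e. it improves as 1/√E, as expected for a sampling calorimeter.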
NASA Astrophysics Data System (ADS)
Robinson, Wayne D.; Patt, Frederick S.; Franz, Bryan A.; Turpie, Kevin R.; McClain, Charles R.
2009-08-01
One of the roles of the VIIRS Ocean Science Team (VOST) is to assess the performance of the instrument and scientific processing software that generates ocean color parameters such as normalized water-leaving radiances and chlorophyll. A VIIRS data simulator is being developed to aid in this work. The simulator will create a sufficient set of simulated Sensor Data Records (SDR) so that the ocean component of the VIIRS processing system can be tested. It will also have the ability to study the impact of instrument artifacts on the derived parameter quality. The simulator will use existing resources available to generate the geolocation information and to transform calibrated radiances to geophysical parameters and vice versa. In addition, the simulator will be able to introduce land features, cloud fields, and expected VIIRS instrument artifacts. The design of the simulator and its progress will be presented.
Croft, Hayley; Gilligan, Conor; Rasiah, Rohan; Levett-Jones, Tracy; Schneider, Jennifer
2017-01-01
Medication review and supply by pharmacists involves both cognitive and technical skills related to the safety and appropriateness of prescribed medicines. The cognitive ability of pharmacists to recall, synthesise and memorise information is a critical aspect of safe and optimal medicines use, yet few studies have investigated the clinical reasoning and decision-making processes pharmacists use when supplying prescribed medicines. The objective of this study was to examine the patterns and processes of pharmacists' clinical reasoning and to identify the information sources used, when making decisions about the safety and appropriateness of prescribed medicines. Ten community pharmacists participated in a simulation in which they were required to review a prescription and make decisions about the safety and appropriateness of supplying the prescribed medicines to the patient, whilst at the same time thinking aloud about the tasks required. Following the simulation each pharmacist was asked a series of questions to prompt retrospective thinking aloud using video-stimulated recall. The simulated consultation and retrospective interview were recorded and transcribed for thematic analysis. All of the pharmacists made a safe and appropriate supply of two prescribed medicines to the simulated patient. Qualitative analysis identified seven core thinking processes used during the supply process: considering prescription in context, retrieving information, identifying medication-related issues, processing information, collaborative planning, decision making and reflection; these processes align closely with those of other health professionals. The insights from this study have implications for enhancing awareness of decision making processes in pharmacy practice and informing teaching and assessment approaches in medication supply. PMID:29301223
USDA-ARS?s Scientific Manuscript database
The objective of this study was to develop a realistic model to simulate the complex processes of flow and tracer transport in USDA-ARS OPE3 field site and to compare simulation results with the detailed monitoring observations. The site has been studied for over 10 years with the extensive availabl...
ERIC Educational Resources Information Center
Njoo, Melanie; de Jong, Ton
This paper contains the results of a study on the importance of discovery learning using computer simulations. The purpose of the study was to identify what constitutes discovery learning and to assess the effects of instructional support measures. College students were observed working with an assignment and a computer simulation in the domain of…
NASA Astrophysics Data System (ADS)
Raj, Rahul; van der Tol, Christiaan; Hamm, Nicholas Alexander Samuel; Stein, Alfred
2018-01-01
Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP) data. We used GPP partitioned from flux tower measurements of a net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and enhanced Nash-Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.
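A minimal sketch of the kind of Bayesian calibration loop described above, with a toy surrogate standing in for Biome-BGC and a simple random-walk Metropolis sampler (the parameter names, priors, likelihood and driver data below are illustrative assumptions, not the study's actual setup):

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_simulator(params, drivers):
    """Stand-in for Biome-BGC: maps two illustrative parameters to daily GPP."""
    flnr, rooting_depth = params
    return flnr * 40.0 * drivers + 0.5 * rooting_depth        # gC m-2 d-1, illustrative

drivers = rng.uniform(0.2, 1.0, size=365)                     # e.g. scaled radiation
true_params = np.array([0.08, 1.2])
obs_gpp = toy_simulator(true_params, drivers) + rng.normal(0.0, 0.5, size=365)

def log_posterior(params):
    # Uniform priors on plausible ranges, Gaussian likelihood with sigma = 0.5.
    if not (0.01 < params[0] < 0.2 and 0.1 < params[1] < 3.0):
        return -np.inf
    resid = obs_gpp - toy_simulator(params, drivers)
    return -0.5 * np.sum((resid / 0.5) ** 2)

# Random-walk Metropolis sampling of the posterior.
n_iter, step = 20_000, np.array([0.005, 0.05])
chain = np.empty((n_iter, 2))
current = np.array([0.05, 1.0])
current_lp = log_posterior(current)
for i in range(n_iter):
    proposal = current + step * rng.normal(size=2)
    lp = log_posterior(proposal)
    if np.log(rng.random()) < lp - current_lp:
        current, current_lp = proposal, lp
    chain[i] = current

posterior = chain[5000:]                                      # discard burn-in
print("posterior means:", posterior.mean(axis=0))
```

Replacing the surrogate with runs of the actual simulator (and a more carefully specified likelihood) turns the same loop into the calibration and uncertainty estimation procedure the abstract describes.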
IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
William M. Bond; Salih Ersayin
2007-03-30
This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern Minnesota, and future proposals are pending with non-taconite mineral processing applications.
Simulating the decentralized processes of the human immune system in a virtual anatomy model.
Sarpe, Vladimir; Jacob, Christian
2013-01-01
Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model - such as immune cells, viruses and cytokines - interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments namely, (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation and interaction of immune system agents between these sites. The distribution of simulated processes, that can communicate across multiple, local CPUs or through a network of machines, provides a starting point to build decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model. We also demonstrate how to distribute a collection of connected simulations over a network of computers. As a future endeavour, we plan to use parameter tuning techniques on our model to further enhance its biological credibility. We consider these in silico experiments and their associated modeling and optimization techniques as essential components in further enhancing our capabilities of simulating a whole-body, decentralized immune system, to be used both for medical education and research as well as for virtual studies in immunoinformatics.
Experiments and FEM simulations of fracture behaviors for ADC12 aluminum alloy under impact load
NASA Astrophysics Data System (ADS)
Hu, Yumei; Xiao, Yue; Jin, Xiaoqing; Zheng, Haoran; Zhou, Yinge; Shao, Jinhua
2016-11-01
Using a combination of experiment and simulation, the fracture behavior of the brittle ADC12 aluminum alloy was studied. Five typical experiments were carried out on this material, with corresponding data collected for different stress states and dynamic strain rates. Fractographs revealed that the morphologies of the fractured specimens differed between rates, indicating that the fracture was predominantly brittle in nature. Simulations of the fracture processes of those specimens were conducted by the finite element method, and consistency was observed between simulations and experiments. In the simulation, the Johnson-Cook model was chosen to describe the damage development and to predict failure, using parameters determined from the experimental data. Subsequently, a crash simulation of an ADC12 engine mount bracket was conducted and the results indicated good agreement with the experiments. This accordance shows that the research can provide an accurate description of the deformation and fracture processes of the studied alloy.
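The Johnson-Cook damage model referred to above accumulates damage as $D=\sum \Delta\bar{\varepsilon}_p/\varepsilon_f$, with the failure strain usually written as follows (the parameter values fitted for ADC12 in the study are not reproduced here):

$$
\varepsilon_f = \left[D_1 + D_2 \exp\!\left(D_3\,\sigma^{*}\right)\right]
\left[1 + D_4 \ln \dot{\varepsilon}^{*}\right]
\left[1 + D_5\, T^{*}\right],
$$

where $\sigma^{*}$ is the stress triaxiality, $\dot{\varepsilon}^{*}$ the dimensionless plastic strain rate, $T^{*}$ the homologous temperature, and $D_1$-$D_5$ material constants; an element is considered failed when the accumulated damage $D$ reaches 1. The rate term is what allows the same parameter set to cover the different dynamic strain rates tested above.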
Shot Peening Numerical Simulation of Aircraft Aluminum Alloy Structure
NASA Astrophysics Data System (ADS)
Liu, Yong; Lv, Sheng-Li; Zhang, Wei
2018-03-01
After shot peening, the 7050 aluminum alloy exhibits good fatigue and stress-corrosion resistance. In the shot peening process, the pellets collide with the target material randomly and generate a residual stress distribution on the target surface, which is of great significance for improving material properties. In this paper, a simplified numerical simulation model of shot peening was established. The influence of pellet collision velocity, pellet collision position and pellet collision time interval on the residual stress from shot peening was studied through simulations with the ANSYS/LS-DYNA software. The analysis results show that different velocities, positions and time intervals have a great influence on the residual stress after shot peening. Comparison with numerical simulation results based on a Kriging model verified the accuracy of the simulation results in this paper. This study provides a reference for the optimization of the shot peening process and an effective exploration of precise shot peening numerical simulation.
Numerical computation of hurricane effects on historic coastal hydrology in Southern Florida
Swain, Eric D.; Krohn, M. Dennis; Langtimm, Catherine A.
2015-01-01
The hindcast simulation estimated hydrologic processes for the 1926 to 1932 period. It shows promise as a simulator in long-term ecological studies to test hypotheses based on theoretical or empirical-based studies at larger landscape scales.
UOE Pipe Manufacturing Process Simulation: Equipment Designing and Construction
NASA Astrophysics Data System (ADS)
Delistoian, Dmitri; Chirchor, Mihael
2017-12-01
The UOE pipe manufacturing process directly influences pipeline resilience and operational capacity. At present, the most widespread pipe manufacturing method is UOE. This method is based on cold forming, and a certain stress and strain level appears after each technological step. For the study of pipe stress and strain, special equipment that simulates the entire technological process was designed and constructed. The UOE pipe equipment is dedicated to manufacturing longitudinally submerged arc welded DN 400 (16 inch) steel pipe.
NASA Astrophysics Data System (ADS)
1981-01-01
Oriel Corporation's simulators have a high pressure xenon lamp whose reflected light is processed by an optical system to produce a uniform solar beam. Because of many different types of applications, the simulators must be adjustable to replicate many different areas of the solar radiation spectrum. Simulators are laboratory tools for such purposes as testing and calibrating solar cells, or other solar energy systems, testing dyes, paints and pigments, pharmaceuticals and cosmetic preparations, plant and animal studies, food and agriculture studies and oceanographic research.
Continuity-based model interfacing for plant-wide simulation: a general approach.
Volcke, Eveline I P; van Loosdrecht, Mark C M; Vanrolleghem, Peter A
2006-08-01
In plant-wide simulation studies of wastewater treatment facilities, often existing models from different origin need to be coupled. However, as these submodels are likely to contain different state variables, their coupling is not straightforward. The continuity-based interfacing method (CBIM) provides a general framework to construct model interfaces for models of wastewater systems, taking into account conservation principles. In this contribution, the CBIM approach is applied to study the effect of sludge digestion reject water treatment with a SHARON-Anammox process on a plant-wide scale. Separate models were available for the SHARON process and for the Anammox process. The Benchmark simulation model no. 2 (BSM2) is used to simulate the behaviour of the complete WWTP including sludge digestion. The CBIM approach is followed to develop three different model interfaces. At the same time, the generally applicable CBIM approach was further refined and particular issues when coupling models in which pH is considered as a state variable, are pointed out.
ERIC Educational Resources Information Center
Luo, Wei; Pelletier, Jon; Duffin, Kirk; Ormand, Carol; Hung, Wei-chen; Shernoff, David J.; Zhai, Xiaoming; Iverson, Ellen; Whalley, Kyle; Gallaher, Courtney; Furness, Walter
2016-01-01
The long geological time needed for landform development and evolution poses a challenge for understanding and appreciating the processes involved. The Web-based Interactive Landform Simulation Model--Grand Canyon (WILSIM-GC, http://serc.carleton.edu/landform/) is an educational tool designed to help students better understand such processes,…
ERIC Educational Resources Information Center
Koka, Andre
2017-01-01
This study examined the effectiveness of a brief theory-based intervention on muscular strength among adolescents in a physical education setting. The intervention adopted a process-based mental simulation technique. The self-reported frequency of practising for and actual levels of abdominal muscular strength/endurance as one component of…
Numerical simulation of plasma processes driven by transverse ion heating
NASA Technical Reports Server (NTRS)
Singh, Nagendra; Chan, C. B.
1993-01-01
The plasma processes driven by transverse ion heating in a diverging flux tube are investigated with numerical simulation. The heating is found to drive a host of plasma processes, in addition to the well-known phenomenon of ion conics. The downward electric field near the reverse shock generates a double-streaming situation consisting of two upflowing ion populations with different average flow velocities. The electric field in the reverse shock region is modulated by the ion-ion instability driven by the multistreaming ions. The oscillating fields in this region have the possibility of heating electrons. These results from the simulations are compared with results from a previous study based on a hydrodynamical model. Effects of the spatial resolution provided by the simulations on the evolution of the plasma are discussed.
Machine learning in sentiment reconstruction of the simulated stock market
NASA Astrophysics Data System (ADS)
Goykhman, Mikhail; Teimouri, Ali
2018-02-01
In this paper we continue the study of the simulated stock market framework defined by the driving sentiment processes. We focus on the market environment driven by the buy/sell trading sentiment process of the Markov chain type. We apply the methodology of the Hidden Markov Models and the Recurrent Neural Networks to reconstruct the transition probabilities matrix of the Markov sentiment process and recover the underlying sentiment states from the observed stock price behavior. We demonstrate that the Hidden Markov Model can successfully recover the transition probabilities matrix for the hidden sentiment process of the Markov Chain type. We also demonstrate that the Recurrent Neural Network can successfully recover the hidden sentiment states from the observed simulated stock price time series.
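A minimal sketch of the Hidden Markov Model step described above, assuming a two-state buy/sell sentiment chain and the hmmlearn package applied to simulated log-returns (the paper's own market simulator and state definitions are not reproduced here):

```python
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(1)

# Simulate a two-state Markov sentiment chain (0 = sell, 1 = buy).
trans = np.array([[0.95, 0.05],
                  [0.10, 0.90]])
states = np.zeros(2000, dtype=int)
for t in range(1, len(states)):
    states[t] = rng.choice(2, p=trans[states[t - 1]])

# Generate stock log-returns whose drift depends on the hidden sentiment.
drift = np.where(states == 1, 0.001, -0.001)
returns = drift + 0.01 * rng.normal(size=len(states))

# Fit a Gaussian HMM on the observed returns and recover the hidden structure.
model = hmm.GaussianHMM(n_components=2, covariance_type="full",
                        n_iter=200, random_state=0)
model.fit(returns.reshape(-1, 1))
recovered_states = model.predict(returns.reshape(-1, 1))

print("estimated transition matrix:\n", np.round(model.transmat_, 3))
```

The estimated transition matrix and the decoded state sequence are the two quantities the abstract reports recovering; a recurrent neural network classifier trained on the same return series is the alternative route it mentions.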
NASA Astrophysics Data System (ADS)
Kiani, Hossein; Sun, Da-Wen
2018-03-01
As novel processes such as ultrasound-assisted heat transfer emerge, new models and simulations are needed to describe them. In this paper, a numerical model was developed to study the freezing process of potatoes. Different thermal conductivity models were investigated, and the effect of sonication on convective heat transfer was evaluated in a fluid-to-particle heat transfer system. Potato spheres and sticks were the geometries studied, and the effect of different processing parameters on the results was examined. The numerical model successfully predicted the ultrasound-assisted freezing of various shapes in comparison with experimental data. The model was sensitive to variation of the processing parameters (sound intensity, duty cycle, shape, etc.) and could accurately simulate the freezing process. Among the thermal conductivity correlations studied, the de Vries and Maxwell models gave closer estimations. The maximum temperature difference was obtained for the series equation, which underestimated the thermal conductivity. Both numerical and experimental data confirmed that an optimum combination of intensity and duty cycle is needed to reduce the freezing time, since increasing the intensity increased both the heat transfer rate and the acoustic heating rate, and these acted against each other.
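For reference, the Maxwell (Maxwell-Eucken) and series mixture models compared above can be written for a two-phase material with continuous-phase conductivity $k_c$, dispersed-phase conductivity $k_d$ and dispersed-phase volume fraction $v_d$ (the de Vries form and the study's exact multi-phase extensions are omitted):

$$
k_{\mathrm{Maxwell}} = k_c\,\frac{2k_c + k_d - 2 v_d (k_c - k_d)}{2k_c + k_d + v_d (k_c - k_d)},
\qquad
\frac{1}{k_{\mathrm{series}}} = \frac{v_d}{k_d} + \frac{1 - v_d}{k_c}.
$$

The series arrangement gives the lowest effective conductivity of the common mixture models, which is consistent with the underestimation noted above.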
Bigand, Emmanuel; Delbé, Charles; Poulin-Charronnat, Bénédicte; Leman, Marc; Tillmann, Barbara
2014-01-01
During the last decade, it has been argued that (1) music processing involves syntactic representations similar to those observed in language, and (2) that music and language share similar syntactic-like processes and neural resources. This claim is important for understanding the origin of music and language abilities and, furthermore, it has clinical implications. The Western musical system, however, is rooted in psychoacoustic properties of sound, and this is not the case for linguistic syntax. Accordingly, musical syntax processing could be parsimoniously understood as an emergent property of auditory memory rather than a property of abstract processing similar to linguistic processing. To support this view, we simulated numerous empirical studies that investigated the processing of harmonic structures, using a model based on the accumulation of sensory information in auditory memory. The simulations revealed that most of the musical syntax manipulations used with behavioral and neurophysiological methods as well as with developmental and cross-cultural approaches can be accounted for by the auditory memory model. This led us to question whether current research on musical syntax can really be compared with linguistic processing. Our simulation also raises methodological and theoretical challenges to study musical syntax while disentangling the confounded low-level sensory influences. In order to investigate syntactic abilities in music comparable to language, research should preferentially use musical material with structures that circumvent the tonal effect exerted by psychoacoustic properties of sounds. PMID:24936174
Simulation of aerobic and anaerobic biodegradation processes at a crude oil spill site
Essaid, Hedeff I.; Bekins, Barbara A.; Godsy, E. Michael; Warren, Ean; Baedecker, Mary Jo; Cozzarelli, Isabelle M.
1995-01-01
A two-dimensional, multispecies reactive solute transport model with sequential aerobic and anaerobic degradation processes was developed and tested. The model was used to study the field-scale solute transport and degradation processes at the Bemidji, Minnesota, crude oil spill site. The simulations included the biodegradation of volatile and nonvolatile fractions of dissolved organic carbon by aerobic processes, manganese and iron reduction, and methanogenesis. Model parameter estimates were constrained by published Monod kinetic parameters, theoretical yield estimates, and field biomass measurements. Despite the considerable uncertainty in the model parameter estimates, results of simulations reproduced the general features of the observed groundwater plume and the measured bacterial concentrations. In the simulation, 46% of the total dissolved organic carbon (TDOC) introduced into the aquifer was degraded. Aerobic degradation accounted for 40% of the TDOC degraded. Anaerobic processes accounted for the remaining 60% of degradation of TDOC: 5% by Mn reduction, 19% by Fe reduction, and 36% by methanogenesis. Thus anaerobic processes account for more than half of the removal of DOC at this site.
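The Monod kinetics underlying these biodegradation terms take the general form below; each terminal electron-accepting process in the model carries its own parameters, and the exact multiplicative terms used in the study are not reproduced here:

$$
\frac{dS}{dt} = -\,k_{\max}\, X\, \frac{S}{K_S + S},
\qquad
\frac{dX}{dt} = Y\left(-\frac{dS}{dt}\right) - b\,X,
$$

where $S$ is the dissolved organic carbon substrate, $X$ the biomass of the relevant microbial population, $k_{\max}$ the maximum specific utilization rate, $K_S$ the half-saturation constant, $Y$ the yield coefficient, and $b$ a first-order biomass decay rate. The yield constraint is what allows the field biomass measurements mentioned above to bound the parameter estimates.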
Monte Carlo simulations of safeguards neutron counter for oxide reduction process feed material
NASA Astrophysics Data System (ADS)
Seo, Hee; Lee, Chaehun; Oh, Jong-Myeong; An, Su Jung; Ahn, Seong-Kyu; Park, Se-Hwan; Ku, Jeong-Hoe
2016-10-01
One of the options for spent-fuel management in Korea is pyroprocessing, whose main process flow is the head-end process followed by oxide reduction, electrorefining, and electrowinning. In the present study, a well-type passive neutron coincidence counter, namely, the ACP (Advanced spent fuel Conditioning Process) safeguards neutron counter (ASNC), was redesigned for safeguards of a hot-cell facility related to the oxide reduction process. To this end, first, the isotopic composition, gamma/neutron emission yield and energy spectrum of the feed material (i.e., the UO2 porous pellet) were calculated using the OrigenARP code. Then, the proper thickness of the gamma-ray shield was determined, both by irradiation testing at a standard dosimetry laboratory and by MCNP6 simulations using the parameters obtained from the OrigenARP calculation. Finally, the neutron coincidence counter's calibration curve for 100- to 1000-g porous pellets, in consideration of the process batch size, was determined through simulations. Based on these simulation results, the neutron counter is currently under construction. In the near future, it will be installed in a hot cell and tested with spent fuel materials.
Zhang, Jun
To explore the subjective learning experiences of baccalaureate nursing students participating in simulation sessions in a Chinese nursing school. This was a qualitative descriptive study. We used semi-structured interviews to explore students' perceptions of simulation-assisted learning. Each interview was audio-taped and transcribed verbatim. Thematic analysis was used to identify the major themes or categories from the transcripts and the field notes. Only 10 students were needed to achieve theoretical saturation, due to high group homogeneity. Three main themes emerged from the study: (1) students' positive views of the new educational experience of simulation; (2) factors currently making simulation less attractive to students; and (3) the teacher's role in ensuring a positive learning experience. Simulation-assisted teaching has been a positive experience for the majority of nursing students. Further efforts are needed in developing quality simulation-based course curricula as well as in planning and structuring the teaching process. This pedagogical approach requires close collaboration between faculty and students. Copyright © 2016 Elsevier Inc. All rights reserved.
Modified two-layer social force model for emergency earthquake evacuation
NASA Astrophysics Data System (ADS)
Zhang, Hao; Liu, Hong; Qin, Xin; Liu, Baoxi
2018-02-01
Studies of crowd behavior, together with related research on computer simulation, provide an effective basis for architectural design and crowd management. Based on low-density group organization patterns, a modified two-layer social force model is proposed in this paper to simulate and reproduce the group gathering process. First, this paper studies evacuation videos from the Luan'xian earthquake in 2012 and extends the study of group organization patterns to higher densities. Taking advantage of the strengths of crowd-gathering simulation, a new grouping and guidance method based on crowd dynamics is then proposed. Second, a real-life grouping situation in earthquake evacuation is simulated and reproduced. Compared with the fundamental social force model and an existing guided-crowd model, the modified model reduces congestion time and faithfully reflects group behaviors. The experimental results also show that a stable group pattern and a suitable leader can decrease collisions and allow a safer evacuation process.
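For context, the sketch below implements the basic single-layer social force idea (a driving force toward a goal plus exponential repulsion between agents) on which such modified models build; it is not the authors' two-layer model, and all parameter values are assumed.

```python
# Minimal sketch of the basic (single-layer) social force idea that modified
# models build on; parameters and geometry are illustrative, not from the paper.
import numpy as np

A, B = 2000.0, 0.08      # repulsion strength (N) and range (m)   [assumed]
tau, m = 0.5, 80.0       # relaxation time (s), pedestrian mass (kg)
v0, radius = 1.34, 0.3   # desired speed (m/s), body radius (m)

def social_force(x, v, goal, others):
    """Driving force toward the goal plus pairwise repulsion from other agents."""
    e = (goal - x) / np.linalg.norm(goal - x)
    f = m * (v0 * e - v) / tau                      # driving (relaxation) term
    for xo in others:                               # repulsive interaction terms
        d = x - xo
        dist = np.linalg.norm(d)
        f += A * np.exp((2 * radius - dist) / B) * d / dist
    return f

x, v = np.array([0.0, 0.0]), np.array([0.0, 0.0])
f = social_force(x, v, goal=np.array([10.0, 0.0]), others=[np.array([1.0, 0.2])])
print("net force (N):", f)
```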
Studies of particle wake potentials in plasmas
NASA Astrophysics Data System (ADS)
Ellis, Ian N.; Graziani, Frank R.; Glosli, James N.; Strozzi, David J.; Surh, Michael P.; Richards, David F.; Decyk, Viktor K.; Mori, Warren B.
2011-09-01
A detailed understanding of electron stopping and scattering in plasmas with variable values for the number of particles within a Debye sphere is still not at hand. Presently, there is some disagreement in the literature concerning the proper description of these processes. Theoretical models assume electrostatic (Coulomb force) interactions between particles and neglect magnetic effects. Developing and validating proper descriptions requires studying the processes using first-principle plasma simulations. We are using the particle-particle particle-mesh (PPPM) code ddcMD and the particle-in-cell (PIC) code BEPS to perform these simulations. As a starting point in our study, we examine the wake of a particle passing through a plasma in 3D electrostatic simulations performed with ddcMD and BEPS. In this paper, we compare the wakes observed in these simulations with each other and predictions from collisionless kinetic theory. The relevance of the work to Fast Ignition is discussed.
Analysis and Comparison on the Flood Simulation in Typical Hilly & Semi-mountainous Region
NASA Astrophysics Data System (ADS)
Luan, Qinghua; Wang, Dong; Zhang, Xiang; Liu, Jiahong; Fu, Xiaoran; Zhang, Kun; Ma, Jun
2017-12-01
Water-logging and flooding are both serious problems in the hilly and semi-mountainous cities of China, but related research is rare. The Lincheng Economic Development Zone (EDZ) in Hebei Province was selected as a typical case, and the storm water management model (SWMM) was applied for flood simulation in this study. The regional model was constructed by calibrating and verifying the runoff coefficients of different flood events. Designed runoff processes for five-, ten- and twenty-year return periods were then simulated and compared under a baseline scenario and a low impact development (LID) scenario. The results show that LID measures reduce the flood peak in the study area, but the effect is not significant, and their ability to delay the peak time is poor. These simulation results provide decision support for the rational construction of LID in the study area and serve as a reference for regional rain-flood management.
ASPEN simulation of a fixed-bed integrated gasification combined-cycle power plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, K.R.
1986-03-01
A fixed-bed integrated gasification combined-cycle (IGCC) power plant has been modeled using the Advanced System for Process ENgineering (ASPEN). The ASPEN simulation is based on a conceptual design of a 509-MW IGCC power plant that uses British Gas Corporation (BGC)/Lurgi slagging gasifiers and the Lurgi acid gas removal process. The 39.3-percent thermal efficiency of the plant that was calculated by the simulation compares very favorably with the 39.4 percent that was reported by EPRI. The simulation addresses only thermal performance and does not calculate capital cost or process economics. Portions of the BGC-IGCC simulation flowsheet are based on the SLAGGER fixed-bed gasifier model (Stefano May 1985) and on the Kellogg-Rust-Westinghouse (KRW) IGCC and Texaco-IGCC simulations (Stone July 1985) that were developed at the Department of Energy (DOE), Morgantown Energy Technology Center (METC). The simulation runs in 32 minutes of Central Processing Unit (CPU) time on the VAX-11/780. The BGC-IGCC simulation was developed to give accurate mass and energy balances and to track coal tars and environmental species such as SOx and NOx for a fixed-bed, coal-to-electricity system. This simulation is the third in a series of three IGCC simulations that represent fluidized-bed, entrained-flow, and fixed-bed gasification processes. Alternate process configurations can be considered by adding, deleting, or rearranging unit operation blocks. The gasifier model is semipredictive; it can properly respond to a limited range of coal types and gasifier operating conditions. However, some models in the flowsheet are based on correlations that were derived from the EPRI study, and are therefore limited to coal types and operating conditions that are reasonably close to those given in the EPRI design. 4 refs., 7 figs., 2 tabs.
Hydrogen Production via a High-Efficiency Low-Temperature Reformer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul KT Liu; Theo T. Tsotsis
2006-05-31
Fuel cells are promoted by the US government as a viable alternative for clean and efficient energy generation. It is anticipated that the fuel cell market will grow if the key technical barriers can be overcome. One of them is certainly fuel processing and purification. Existing fuel reforming processes are energy intensive, extremely complicated and capital intensive; these disadvantages handicap the scale-down of existing reforming processes targeting distributed or on-board/stationary hydrogen production applications. Our project involves the bench-scale demonstration of a high-efficiency low-temperature steam reforming process. Hydrogen production can be operated at 350 to 400 °C with our invention, as opposed to >800 °C for existing reforming. In addition, our proposed process improves the start-up deficiency of conventional reforming due to its low-temperature operation. The objective of this project is to demonstrate the invented process concept via a bench-scale unit and to verify the mathematical simulation for a future process optimization study. Under this project, we performed the experimental work to determine the adsorption isotherm, reaction kinetics, and membrane permeances required to perform the process simulation based upon the mathematical model developed by us. A ceramic membrane coated with a palladium thin film fabricated by us was employed in this study. The adsorption isotherm for a selected hydrotalcite adsorbent was determined experimentally. Further, the capacity loss under cyclic adsorption/desorption was confirmed to be negligible. Finally, a commercial steam reforming catalyst was used to produce the reaction kinetic parameters required for the proposed operating condition. With these input parameters, a mathematical simulation was performed to predict the performance of the invented process. According to our simulation, our invented hybrid process can deliver 35 to 55% methane conversion, in comparison with the 12% and 18-21% conversion of the packed bed and an adsorptive reactor, respectively. In addition, CO contamination of <10 to 120 ppm is predicted for the invented process, depending upon the cycle time for the PSA-type operation. In comparison, the adsorptive reactor can also deliver a similar CO contaminant level at the low end; however, its high end reaches as high as 300 ppm based upon the simulation of our proposed operating condition. Our experimental results for the packed bed and the membrane reactor deliver 12 and 18% conversion at 400 °C, approaching the conversion predicted by the mathematical simulation. Due to time constraints, the experimental study on the conversion of the invented process has not been completed. However, our in-house study using a similar process concept for the water gas shift reaction has demonstrated the reliability of our mathematical simulation for the invented process. In summary, we are confident that the invented process can efficiently deliver high-purity hydrogen at a low temperature (~400 °C). According to our projection, the invented process can further achieve 5% energy savings and ~50% capital savings over conventional reforming for fuel cell applications. The pollution abatement potential associated with the implementation of fuel cells, including the elimination of nitrogen oxides and CO, and the reduction in volatile organics and CO2, can thus be realized with the implementation of this invented process. The projected total market size for equipment sales for the proposed process in the US is $1.5 billion annually.
Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes.
Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J; Wang, Liliang; Lin, Jianguo
2016-12-13
The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions.
NASA Astrophysics Data System (ADS)
Rezvanpanah, Elham; Ghaffarian Anbaran, S. Reza
2017-11-01
This study establishes a model and simulation scheme to describe the effect of crystallinity, one of the most influential parameters, on cell growth phenomena in a solid batch foaming process. The governing model of cell growth dynamics, based on the well-known 'Cell model', is derived in detail. To include the effect of crystallinity, the properties of the polymer/gas mixture (i.e. solubility, diffusivity, surface tension and viscosity) are estimated by modifying the corresponding relations to account for crystallinity. A finite element-finite difference (FEFD) method is employed to solve the highly nonlinear and coupled equations of cell growth dynamics. The proposed simulation evaluates all properties of the system at the given process condition and uses them to calculate the evolution of cell size, pressure and gas concentration gradient with time. A high-density polyethylene/nitrogen (HDPE/N2) system is used herein as a case study. Comparison of the simulation results with other works and with experimental results verifies the accuracy of the simulation scheme. Cell growth is a complex combination of several phenomena, and this study attempts to reach a better understanding of the cell growth trend, the driving and retarding forces, and the effect of crystallinity on them.
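As a minimal illustration of the growth dynamics that the full cell model resolves, the sketch below integrates a viscosity-controlled bubble growth equation with a constant internal gas pressure; it ignores gas diffusion and crystallinity, and every parameter value is assumed rather than taken from the paper.

```python
# Highly simplified, viscosity-controlled bubble ("cell") growth sketch in the
# spirit of the cell model; it ignores gas diffusion and crystallinity effects,
# and all parameter values are illustrative rather than from the paper.
import numpy as np
from scipy.integrate import solve_ivp

eta = 1.0e4        # melt viscosity (Pa*s)                                   [assumed]
sigma = 0.03       # surface tension (N/m)                                   [assumed]
p_amb = 1.0e5      # ambient pressure (Pa)                                   [assumed]
p_gas = 5.0e5      # gas pressure inside the cell (Pa), held constant here   [assumed]

def growth(t, y):
    R = y[0]
    dRdt = R * (p_gas - p_amb - 2.0 * sigma / R) / (4.0 * eta)
    return [dRdt]

sol = solve_ivp(growth, (0.0, 1.0), [1.0e-6], max_step=1e-3)
print(f"cell radius after 1 s: {sol.y[0, -1]*1e6:.1f} um")
```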
NASA Astrophysics Data System (ADS)
Safaei, Hadi; Emami, Mohsen Davazdah; Jazi, Hamidreza Salimi; Mostaghimi, Javad
2017-12-01
Applications of hollow spherical particles in thermal spraying have developed in recent years, accompanied by experimental and numerical studies aiming to better understand the impact of a hollow droplet on a surface. During this process, the volume and density of the gas trapped inside the droplet change, and numerical models should be able to capture such changes and their consequent effects. The aim of this study is to numerically simulate the impact of a hollow ZrO2 droplet on a flat surface using the volume of fluid technique for compressible flows. An open-source, finite-volume-based CFD code was used to perform the simulations, with appropriate subprograms added to handle the studied cases. Simulation results were compared with the available experimental data. Results showed that at high impact velocities (U0 > 100 m/s), the compression of the trapped gas inside the droplet played a significant role in the impact dynamics; at such velocities, the droplet splashed explosively. Compressibility effects result in a more porous splat compared to the corresponding incompressible model. Moreover, the compressible model predicted a higher spread factor than the incompressible model, due to the planetary structure of the splat.
NASA Astrophysics Data System (ADS)
Yang, Bo; Tong, Yuting
2017-04-01
With the rapid development of the economy, logistics enterprises in China face great challenges; in particular, they generally lack core competitiveness and have weak service innovation awareness. Studies of the core competence of logistics enterprises have mainly taken a static perspective rather than exploring its dynamic evolution. The author therefore analyzes the influencing factors and the evolution process of the core competence of logistics enterprises, uses system dynamics to study the causal relationships driving this evolution, and constructs a system dynamics model of the evolution of logistics enterprises' core competence, which is simulated in Vensim PLE. Effectiveness and sensitivity analyses indicate that the model can reproduce the evolution of logistics enterprises' core competence, reveal the process and mechanism of this evolution, and provide management strategies for improving it. The construction and operation of the computer simulation model offer an effective method for studying the evolution of logistics enterprises' core competence.
Building team adaptive capacity: the roles of sensegiving and team composition.
Randall, Kenneth R; Resick, Christian J; DeChurch, Leslie A
2011-05-01
The current study draws on motivated information processing in groups theory to propose that leadership functions and composition characteristics provide teams with the epistemic and social motivation needed for collective information processing and strategy adaptation. Three-person teams performed a city management decision-making simulation (N=74 teams; 222 individuals). Teams first managed a simulated city that was newly formed and required growth strategies and were then abruptly switched to a second simulated city that was established and required revitalization strategies. Consistent with hypotheses, external sensegiving and team composition enabled distinct aspects of collective information processing. Sensegiving prompted the emergence of team strategy mental models (i.e., cognitive information processing); psychological collectivism facilitated information sharing (i.e., behavioral information processing); and cognitive ability provided the capacity for both the cognitive and behavioral aspects of collective information processing. In turn, team mental models and information sharing enabled reactive strategy adaptation.
Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd
2016-03-01
The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback of participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected on process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.
Estarellas Martin, Carolina; Seira Castan, Constantí; Luque Garriga, F Javier; Bidon-Chanal Badia, Axel
2015-10-01
Residue conformational changes and internal cavity migration processes play a key role in regulating the kinetics of ligand migration and binding events in globins. Molecular dynamics simulations have demonstrated their value in the study of these processes in different haemoglobins, but derivation of kinetic data demands the use of more complex techniques like enhanced sampling molecular dynamics methods. This review discusses the different methodologies that are currently applied to study the ligand migration process in globins and highlight those specially developed to derive kinetic data. Copyright © 2015 Elsevier Ltd. All rights reserved.
Rizal, Datu; Tani, Shinichi; Nishiyama, Kimitoshi; Suzuki, Kazuhiko
2006-10-11
In this paper, a novel methodology for batch plant safety and reliability analysis is proposed using a dynamic simulator. A batch process involves several safety objects (e.g. sensors, controllers, valves, etc.) that are activated during the operational stage. The performance of the safety objects is evaluated by dynamic simulation and a fault propagation model is generated. Using the fault propagation model, an improved fault tree analysis (FTA) method based on the switching signal mode (SSM) is developed for estimating the probability of failures. Time-dependent failures can be treated as the unavailability of safety objects, which can cause accidents in a plant. Finally, the ranking of safety objects is formulated as a performance index (PI) and can be estimated using importance measures. The PI gives the prioritization of safety objects that should be investigated in a plant safety improvement program. The output of this method can be used for optimal policies in safety object improvement and maintenance. The dynamic simulator was constructed using Visual Modeler (VM, the plant simulator developed by Omega Simulation Corp., Japan). A case study focuses on a loss of containment (LOC) incident in a polyvinyl chloride (PVC) batch process that consumes the hazardous material vinyl chloride monomer (VCM).
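To make the fault tree part of the approach concrete, here is a minimal sketch of computing a top-event probability from basic-event unavailabilities with independent AND/OR gates; the gate structure and the numbers are illustrative and are not the paper's PVC/LOC case study.

```python
# Minimal fault-tree sketch: top-event probability from basic-event
# unavailabilities via AND/OR gates, assuming independent events. The gate
# structure and values below are illustrative, not the PVC/LOC case study.
def p_and(*ps):                 # all inputs must fail
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):                  # any single input failing is enough
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

p_sensor, p_controller, p_valve, p_operator = 0.02, 0.01, 0.05, 0.1  # [assumed]
# Hypothetical top event: automatic protection (sensor OR controller OR valve)
# fails AND the operator misses the alarm.
p_protection_fails = p_or(p_sensor, p_controller, p_valve)
p_top = p_and(p_protection_fails, p_operator)
print(f"P(loss of containment) = {p_top:.4f}")
```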
NASA Astrophysics Data System (ADS)
Cheng, T.; Xu, Z.; Hong, S.
2017-12-01
Flood disasters have frequently struck the urban area of Jinan City in past years, and the city faces severe road flooding that greatly threatens pedestrian safety. It is therefore of great significance to investigate pedestrian risk during floods under the specific topographic conditions. In this study, a model coupling hydrological and hydrodynamic processes is developed for the study area to simulate the flood routing process on roads during the "7.18" rainstorm, and it is validated with post-disaster damage survey information. The risk to pedestrians is estimated with a flood risk assessment model. The results show that the coupled model performs well for the rainstorm flood process. On the basis of the simulation results, areas with extreme, medium, and mild risk are identified. Regions with high risk are generally located near the mountain-front area with steep slopes. This study will provide scientific support for flood control and disaster reduction in Jinan City.
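As an illustration of how a pedestrian risk class can be derived from simulated depth and velocity, the sketch below uses a generic depth-velocity hazard rating of the form HR = d(v + 0.5) + DF, as found in common flood hazard guidance; the formula choice, class thresholds and sample values are assumptions, not the study's assessment model.

```python
# Sketch of a depth-velocity pedestrian hazard rating of the kind often used in
# flood risk assessment; thresholds and sample values are illustrative and are
# not the assessment model used in the study.
def hazard_rating(depth_m, velocity_ms, debris_factor=0.5):
    return depth_m * (velocity_ms + 0.5) + debris_factor

def risk_class(hr):
    if hr < 0.75:
        return "mild"
    if hr < 2.0:
        return "medium"
    return "extreme"

for d, v in [(0.2, 0.5), (0.5, 1.5), (1.0, 2.5)]:   # sample road cells [assumed]
    hr = hazard_rating(d, v)
    print(f"depth={d} m, velocity={v} m/s -> HR={hr:.2f} ({risk_class(hr)})")
```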
Numerical simulation of X90 UOE pipe forming process
NASA Astrophysics Data System (ADS)
Zou, Tianxia; Ren, Qiang; Peng, Yinghong; Li, Dayong; Tang, Ding; Han, Jianzeng; Li, Xinwen; Wang, Xiaoxiu
2013-12-01
The UOE process is an important technique for manufacturing large-diameter welded pipes, which are increasingly applied in oil pipelines and offshore platforms. The UOE forming process mainly consists of five successive operations: crimping, U-forming, O-forming, welding and mechanical expansion, through which a blank is formed into a pipe in a UOE pipe mill. The blank, with an appropriate edge bevel, is bent into a cylindrical shape by crimping (C-forming), U-forming and O-forming successively. After O-forming, there is an open seam between the two ends of the plate. The blank is then welded using the automatic four-electrode submerged arc welding technique. Subsequently, the welded pipe is expanded with a mechanical expander to obtain a high-precision circular shape. The multiple operations in the UOE mill make it difficult to control the quality of the formed pipe, so process design mainly relies on experience in practical production. In this study, the UOE forming of an API X90 pipe is studied using finite element simulation. Mechanical property tests are performed on the API X90 pipeline steel blank. A two-dimensional finite element model under the plane strain hypothesis is developed to simulate the UOE process according to data from the workshop. A kinematic hardening model is used in the simulation to take the Bauschinger effect into account. The deformation characteristics of the blank during the forming processes are analyzed. The simulation results show good agreement between the simulated geometric configurations and those observed in practical manufacturing.
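To illustrate how a kinematic hardening law captures the Bauschinger effect mentioned above, here is a one-dimensional return-mapping sketch with linear (Prager) kinematic hardening; the material constants are illustrative and not the X90 calibration used in the paper.

```python
# One-dimensional sketch of a linear kinematic-hardening (Prager) stress update,
# the simplest model family that reproduces the Bauschinger effect; material
# constants are illustrative, not the X90 calibration.
import numpy as np

E, H, sigma_y = 210e3, 10e3, 600.0   # Young's modulus, hardening modulus, yield (MPa) [assumed]

def update(stress, back, eps_p, d_eps):
    """Elastic predictor / plastic corrector for one strain increment."""
    trial = stress + E * d_eps
    f = abs(trial - back) - sigma_y
    if f <= 0.0:                       # elastic step
        return trial, back, eps_p
    dgamma = f / (E + H)               # plastic multiplier
    n = np.sign(trial - back)
    stress = trial - E * dgamma * n
    back += H * dgamma * n             # back stress shifts the yield surface
    return stress, back, eps_p + dgamma

# Load to +1% strain, then reverse to -1%: reverse yielding starts early (Bauschinger).
stress, back, eps_p = 0.0, 0.0, 0.0
for d_eps in [1e-4] * 100 + [-1e-4] * 200:
    stress, back, eps_p = update(stress, back, eps_p, d_eps)
print(f"final stress = {stress:.1f} MPa, back stress = {back:.1f} MPa")
```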
Improving operational anodising process performance using simulation approach
NASA Astrophysics Data System (ADS)
Liong, Choong-Yeun; Ghazali, Syarah Syahidah
2015-10-01
The use of aluminium is very widespread, especially in the transportation, electrical and electronics, architectural, automotive and engineering sectors. The anodising process is therefore important for making aluminium durable, attractive and weather resistant. This research focuses on the anodising process operations in the manufacturing and supply of aluminium extrusions. The data required for the development of the model were collected from observations and interviews conducted in the study. To study the current system, the anodising line is modeled using Arena 14.5 simulation software. The line consists of five main processes, namely degreasing, etching, desmutting, anodising and sealing, together with 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on comparisons between the improvement methods, productivity could be increased by reallocating workers and reducing loading time.
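A discrete-event model of this kind can also be sketched outside Arena; the following SimPy example (an assumption, since the study used Arena 14.5) runs a simplified five-stage line tended by a small worker crew, with all process times and the arrival rate invented for illustration.

```python
# Minimal discrete-event sketch of a sequential treatment line in the spirit of
# the Arena model (degrease -> etch -> desmut -> anodise -> seal); process times,
# arrival rate and crew size are illustrative, not the plant's data.
# Requires the SimPy package (pip install simpy).
import random
import simpy

STAGES = [("degrease", 5), ("etch", 10), ("desmut", 4), ("anodise", 40), ("seal", 20)]  # minutes [assumed]

def job(env, crew, done_times):
    start = env.now
    for _stage, minutes in STAGES:
        with crew.request() as req:      # a worker tends the rack through each tank (simplification)
            yield req
            yield env.timeout(minutes)
    done_times.append(env.now - start)

def arrivals(env, crew, done_times):
    while True:
        env.process(job(env, crew, done_times))
        yield env.timeout(random.expovariate(1 / 15.0))   # mean 15 min between racks [assumed]

random.seed(1)
env = simpy.Environment()
crew = simpy.Resource(env, capacity=2)                    # two workers [assumed]
done = []
env.process(arrivals(env, crew, done))
env.run(until=8 * 60)                                     # one 8-hour shift
print(f"racks completed: {len(done)}, mean flow time: {sum(done)/len(done):.1f} min")
```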
NASA Astrophysics Data System (ADS)
Xu, Ziwei; Yan, Tianying; Liu, Guiwu; Qiao, Guanjun; Ding, Feng
2015-12-01
To explore the mechanism of graphene chemical vapor deposition (CVD) growth on a catalyst surface, a molecular dynamics (MD) simulation of carbon atom self-assembly on a Ni(111) surface based on a well-designed empirical reactive bond order potential was performed. We simulated single layer graphene with record size (up to 300 atoms per super-cell) and reasonably good quality by MD trajectories up to 15 ns. Detailed processes of graphene CVD growth, such as carbon atom dissolution and precipitation, formation of carbon chains of various lengths, polygons and small graphene domains were observed during the initial stage of the MD simulation. The atomistic processes of typical defect healing, such as the transformation from a pentagon into a hexagon and from a pentagon-heptagon pair (5|7) to two adjacent hexagons (6|6), were revealed as well. The study also showed that higher temperature and longer annealing time are essential to form high quality graphene layers, which is in agreement with experimental reports and previous theoretical results. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr06016h
Defense Waste Processing Facility Simulant Chemical Processing Cell Studies for Sludge Batch 9
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Tara E.; Newell, J. David; Woodham, Wesley H.
The Savannah River National Laboratory (SRNL) received a technical task request from Defense Waste Processing Facility (DWPF) and Saltstone Engineering to perform simulant tests to support the qualification of Sludge Batch 9 (SB9) and to develop the flowsheet for SB9 in the DWPF. These efforts pertained to the DWPF Chemical Process Cell (CPC). CPC experiments were performed using SB9 simulant (SB9A) to qualify SB9 for sludge-only and coupled processing using the nitric-formic flowsheet in the DWPF. Two simulant batches were prepared, one representing SB8 Tank 40H and another representing SB9 Tank 51H. The simulant used for SB9 qualification testing was prepared by blending the SB8 Tank 40H and SB9 Tank 51H simulants. The blended simulant is referred to as SB9A. Eleven CPC experiments were run with an acid stoichiometry ranging between 105% and 145% of the Koopman minimum acid equation (KMA), which is equivalent to 109.7% and 151.5% of the Hsu minimum acid factor. Three runs were performed in the 1-L laboratory-scale setup, whereas the remainder were in the 4-L laboratory-scale setup. Sludge Receipt and Adjustment Tank (SRAT) and Slurry Mix Evaporator (SME) cycles were performed on nine of the eleven; the other two were SRAT cycles only. One coupled flowsheet run and one extended run were performed for SRAT and SME processing. Samples of the condensate, sludge, and off-gas were taken to monitor the chemistry of the CPC experiments.
Study on Roadheader Cutting Load at Different Properties of Coal and Rock
2013-01-01
The mechanism of the cutting process of a roadheader with a cutting head was researched, and the influence of coal and rock properties on the cutting load was analyzed in depth. To address the limitations of the traditional cutting-load calculation method in fully representing the complex cutting process of the cutting head, finite element simulation was proposed to model the dynamic cutting process. Considering the coal and rock characteristics that affect the cutting load, repeated simulations with different firmness coefficients were performed, and the relationship between the three-axis force and the firmness coefficient was derived. A comparative analysis of the cutting pick load between the simulation results and the theoretical formula was carried out, and good consistency was achieved. On this basis, the cutting process with a complete cutting head was then simulated. The results show that the simulation analysis not only provides a reliable basis for accurate calculation of the cutting head load and improves the efficiency of cutting head tests, but also offers a basis for selecting cutting heads for different geological conditions of coal and rock. PMID:24302866
Is Moving More Memorable than Proving? Effects of Embodiment and Imagined Enactment on Verb Memory
Sidhu, David M.; Pexman, Penny M.
2016-01-01
Theories of embodied cognition propose that sensorimotor information is simulated during language processing (e.g., Barsalou, 1999). Previous studies have demonstrated that differences in simulation can have implications for word processing; for instance, lexical processing is facilitated for verbs that have relatively more embodied meanings (e.g., Sidhu et al., 2014). Here we examined the effects of these differences on memory for verbs. We observed higher rates of recognition (Experiments 1a-2a) and recall accuracy (Experiments 2b-3b) for verbs with a greater amount of associated bodily information (i.e., an embodiment effect). We also examined how this interacted with the imagined enactment effect: a memory benefit for actions that one imagines performing (e.g., Ditman et al., 2010). We found that these two effects did not interact (Experiment 3b), suggesting that the memory benefits of automatic simulation (i.e., the embodiment effect) and deliberate simulation (i.e., the imagined enactment effect) are distinct. These results provide evidence for the role of simulation in language processing, and its effects on memory. PMID:27445956
Numerical modeling of the fracture process in a three-unit all-ceramic fixed partial denture.
Kou, Wen; Kou, Shaoquan; Liu, Hongyuan; Sjögren, Göran
2007-08-01
The main objectives were to examine the fracture mechanism and fracture process of a ceramic fixed partial denture (FPD) framework under simulated mechanical loading using a recently developed numerical modeling code, the R-T(2D) code, and to evaluate the suitability of the R-T(2D) code as a tool for this purpose. Using the R-T(2D) code, the fracture mechanism and process of a three-unit yttria-tetragonal zirconia polycrystal ceramic (Y-TZP) FPD framework were simulated under static loading. In addition, the fracture pattern obtained from the numerical simulation was compared with the fracture pattern obtained in a previous laboratory test. The results revealed that the framework fracture pattern obtained from the numerical simulation agreed with that observed in the previous laboratory test. The quasi-photoelastic stress fringe pattern and acoustic emission showed that the fracture mechanism was tensile failure and that the crack started at the lower boundary of the framework. The fracture process could be followed step by step. Based on the findings of the current study, the R-T(2D) code seems suitable for use as a complement to other tests and clinical observations in studying stress distribution, fracture mechanisms and fracture processes in ceramic FPD frameworks.
Simulations of surface stress effects in nanoscale single crystals
NASA Astrophysics Data System (ADS)
Zadin, V.; Veske, M.; Vigonski, S.; Jansson, V.; Muszinsky, J.; Parviainen, S.; Aabloo, A.; Djurabekova, F.
2018-04-01
Onset of vacuum arcing near a metal surface is often associated with nanoscale asperities, which may appear dynamically due to different processes ongoing in the surface and subsurface layers in the presence of high electric fields. Thermally activated processes, as well as plastic deformation caused by the tensile stress due to an applied electric field, are usually not accessible to atomistic simulations because of the long times needed for these processes to occur. On the other hand, finite element methods, which can describe plastic deformation in materials at realistic stresses, do not include surface properties. The latter are particularly important for problems in which the surface plays a crucial role in the studied process, as for instance in the case of plastic deformation at a nanovoid. In the current study, by means of molecular dynamics (MD) and finite element simulations, we analyse the stress distribution in single-crystal copper containing a nanovoid buried deep under the surface. We have developed a methodology to incorporate surface effects into the solid mechanics framework by utilizing elastic properties of crystals pre-calculated with MD simulations. The method leads to computationally efficient stress calculations and can be easily implemented in commercially available finite element software, making it an attractive analysis tool.
ERIC Educational Resources Information Center
BRATTEN, JACK E.
The biology course of Theodore High School at Theodore, Alabama, was studied as a system for "processing" students and was simulated on a computer. An experimental version of the course was simulated and compared with the actual course. The purposes of this study were (1) to examine the concept of individual progress as it related to the…
A low-cost fabrication method for sub-millimeter wave GaAs Schottky diode
NASA Astrophysics Data System (ADS)
Jenabi, Sarvenaz; Deslandes, Dominic; Boone, Francois; Charlebois, Serge A.
2017-10-01
In this paper, a submillimeter-wave Schottky diode is designed and simulated. The effect of the Schottky layer thickness on the cut-off frequency is studied. A novel microfabrication process is proposed and implemented. The presented microfabrication process avoids electron-beam (e-beam) lithography, which reduces the cost. The process also provides more flexibility in the selection of design parameters and allows a significant reduction in the device parasitic capacitance. A key feature of the process is that the Schottky contact, the air bridges, and the transmission lines are fabricated in a single lift-off step. This process relies on a planarization method that is suitable for trenches 1-10 μm deep and is tolerant to end-point variations. The fabricated diode is measured and the results are compared with simulations; very good agreement between simulation and measurement is observed.
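For orientation, the cut-off frequency of a Schottky diode is commonly estimated as fc = 1/(2*pi*Rs*Cj0); the snippet below evaluates this with assumed series resistance and junction capacitance values, not the fabricated device's measured parameters.

```python
# Quick numerical illustration of the standard Schottky-diode cut-off frequency
# fc = 1 / (2*pi*Rs*Cj0); both device parameters below are assumed values.
import math

Rs = 5.0          # series resistance (ohm)             [assumed]
Cj0 = 1.5e-15     # zero-bias junction capacitance (F)  [assumed]

fc = 1.0 / (2.0 * math.pi * Rs * Cj0)
print(f"cut-off frequency ~ {fc/1e12:.1f} THz")
```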
ERIC Educational Resources Information Center
Wang, Hung-Yuan; Duh, Henry Been-Lirn; Li, Nai; Lin, Tzung-Jin; Tsai, Chin-Chung
2014-01-01
The purpose of this study is to investigate and compare students' collaborative inquiry learning behaviors and their behavior patterns in an augmented reality (AR) simulation system and a traditional 2D simulation system. Their inquiry and discussion processes were analyzed by content analysis and lag sequential analysis (LSA). Forty…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yang; Zhao, Qiangsheng; Mirdamadi, Mansour
Woven fabric carbon fiber/epoxy composites made through compression molding are one of the promising material choices for the vehicle light-weighting strategy. Previous studies have shown that the processing conditions can have a substantial influence on the performance of this type of material. Therefore, the optimization of the compression molding process is of great importance to manufacturing practice. An efficient way to achieve an optimized design of this process is through finite element (FE) simulations of compression molding for woven fabric carbon fiber/epoxy composites. However, performing such simulations remains a challenging task for FE, as multiple types of physics are involved during the compression molding process, including epoxy resin curing and the complex mechanical behavior of the woven fabric structure. In the present study, an FE simulation of the compression molding process of resin-based woven fabric composites at the continuum level is conducted, enabled by the implementation of an integrated material modeling methodology in LS-Dyna. Specifically, the chemo-thermo-mechanical problem of compression molding is solved through the coupling of three material models, i.e., one thermal model for the temperature history in the resin, one mechanical model to update the curing-dependent properties of the resin, and another mechanical model to simulate the behavior of the woven fabric composites. Preliminary simulations of the carbon fiber/epoxy woven fabric composites in LS-Dyna are presented as a demonstration, while validations and models with real part geometry are planned for future work.
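As a small example of the resin-curing physics that such coupled models track, the sketch below integrates an autocatalytic (Kamal-type) cure kinetics equation under isothermal conditions; the rate constants and exponents are assumed and are not the study's material data.

```python
# Isothermal sketch of an autocatalytic (Kamal-type) epoxy cure-kinetics model,
# da/dt = (k1 + k2*a^m)*(1-a)^n, of the kind coupled to thermal and mechanical
# solvers in such simulations; all constants are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2, m, n = 1e-3, 5e-2, 0.7, 1.6     # rate constants (1/s) and exponents [assumed]

def cure_rate(t, y):
    a = min(max(y[0], 0.0), 1.0)         # keep the degree of cure in [0, 1]
    return [(k1 + k2 * a**m) * (1.0 - a)**n]

sol = solve_ivp(cure_rate, (0.0, 300.0), [0.0], max_step=1.0)
print(f"degree of cure after 300 s: {sol.y[0, -1]:.2f}")
```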
Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S
2016-05-01
Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.
Chhatre, Sunil; Jones, Carl; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel; Newcombe, Anthony; Keshavarz-Moore, Eli
2006-01-01
Growing commercial pressures in the pharmaceutical industry are establishing a need for robust computer simulations of whole bioprocesses to allow rapid prediction of the effects of changes made to manufacturing operations. This paper presents an integrated process simulation that models the cGMP manufacture of the FDA-approved biotherapeutic CroFab, an IgG fragment used to treat rattlesnake envenomation (Protherics U.K. Limited, Blaenwaun, Ffostrasol, Llandysul, Wales, U.K.). Initially, the product is isolated from ovine serum by precipitation and centrifugation, before enzymatic digestion of the IgG to produce FAB and FC fragments. These are purified by ion exchange and affinity chromatography to remove the FC and non-specific FAB fragments from the final venom-specific FAB product. The model was constructed in a discrete event simulation environment and used to determine the potential impact of a series of changes to the process, such as increasing the step efficiencies or volumes of chromatographic matrices, upon product yields and process times. The study indicated that the overall FAB yield was particularly sensitive to changes in the digestive and affinity chromatographic step efficiencies, which have a predicted 30% greater impact on process FAB yield than do the precipitation or centrifugation stages. The study showed that increasing the volume of affinity matrix has a negligible impact upon total process time. Although results such as these would require experimental verification within the physical constraints of the process and the facility, the model predictions are still useful in allowing rapid "what-if" scenario analysis of the likely impacts of process changes within such an integrated production process.
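The sensitivity argument can be illustrated with a toy multiplicative yield chain: improving a low-efficiency step (such as digestion or affinity) moves the overall yield more than improving an already-efficient step. The step efficiencies below are assumed for illustration and are not the CroFab process values.

```python
# Toy illustration of why the digestion and affinity steps dominate overall
# yield in a multiplicative step chain; step efficiencies are illustrative.
steps = {
    "precipitation": 0.95,
    "centrifugation": 0.95,
    "digestion": 0.80,
    "ion exchange": 0.90,
    "affinity": 0.75,
}

def overall(effs):
    y = 1.0
    for e in effs.values():
        y *= e
    return y

base = overall(steps)
print(f"baseline overall yield: {base:.3f}")
for name in steps:
    improved = dict(steps, **{name: min(1.0, steps[name] + 0.05)})
    print(f"+5% on {name:14s} -> overall yield {overall(improved):.3f}")
```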
Simulating the flow of entangled polymers.
Masubuchi, Yuichi
2014-01-01
To optimize automation for polymer processing, attempts have been made to simulate the flow of entangled polymers. In industry, fluid dynamics simulations with phenomenological constitutive equations have been practically established. However, to account for molecular characteristics, a method to obtain the constitutive relationship from the molecular structure is required. Molecular dynamics simulations with atomic description are not practical for this purpose; accordingly, coarse-grained models with reduced degrees of freedom have been developed. Although the modeling of entanglement is still a challenge, mesoscopic models with a priori settings to reproduce entangled polymer dynamics, such as tube models, have achieved remarkable success. To use the mesoscopic models as staging posts between atomistic and fluid dynamics simulations, studies have been undertaken to establish links from the coarse-grained model to the atomistic and macroscopic simulations. Consequently, integrated simulations from materials chemistry to predict the macroscopic flow in polymer processing are forthcoming.
The National Shipbuilding Research Program, Computer Aided Process Planning for Shipyards
1986-08-01
[Garbled table-of-contents extract; recoverable section titles include "Factory Simulation: Approach to Integration of Computer-Based Factory Simulation with Conventional Factory Planning Techniques" and "Financial Justification of State-of-the-Art Investment: A Study Using CAPP".]
NASA Astrophysics Data System (ADS)
Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang
2010-11-01
This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to accommodate multiple algorithms, which helps to decrease simulation time at low expense. Imaging simulation for a TDI-CCD mounted on a satellite contains four processes: 1) degradation due to the atmosphere, 2) degradation due to the optical system, 3) degradation due to the TDI-CCD electronics plus a re-sampling process, and 4) data integration. Processes 1) to 3) use data-intensive algorithms such as FFT, convolution and Lagrange interpolation, which require powerful CPUs. Even on an Intel Xeon X5550 processor, the conventional serial method takes more than 30 hours for a simulation whose result image size is 1500 × 1462. A literature survey found no mature distributed HPC framework in this field. Here we developed a distributed computing framework for TDI-CCD imaging simulation, which is based on WCF [1], uses a client/server (C/S) architecture and harnesses the free CPU resources in the LAN. The server pushes the tasks of processes 1) to 3) to this free computing capacity, ultimately providing HPC at low cost. In a computing experiment with 4 symmetric nodes and 1 server, the framework reduced the simulation time by about 74%. Adding more asymmetric nodes to the computing network reduced the time further. In conclusion, this framework can provide large computation capacity provided that the network and the task management server are affordable, and it offers a new HPC solution for TDI-CCD imaging simulation and similar applications.
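A language-neutral sketch of the two ideas described above (a strategy pattern selecting the degradation algorithm, and a server farming independent image blocks out to idle workers) is given below in Python rather than the paper's WCF/C# stack; the stage functions are simple stand-ins, not the actual degradation algorithms.

```python
# Sketch of a strategy pattern plus block-parallel dispatch for an imaging
# pipeline; the stage functions are placeholders, not the paper's algorithms.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def optics_blur(block):          # stand-in for the optical-degradation stage
    return np.fft.ifft2(np.fft.fft2(block) * 0.9).real

def electronics_noise(block):    # stand-in for the TDI-CCD electronics stage
    return block + np.random.normal(0.0, 0.01, block.shape)

STRATEGIES = {"optics": optics_blur, "electronics": electronics_noise}

def run_stage(args):
    name, block = args
    return STRATEGIES[name](block)       # strategy chosen by name at run time

if __name__ == "__main__":
    image = np.random.rand(1500, 1462)
    blocks = np.array_split(image, 8)    # independent tasks for the worker pool
    with ProcessPoolExecutor() as pool:
        blurred = list(pool.map(run_stage, [("optics", b) for b in blocks]))
    result = np.vstack(blurred)
    print("processed image shape:", result.shape)
```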
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gigley, H.M.
1982-01-01
An artificial intelligence approach to the simulation of neurolinguistically constrained processes in sentence comprehension is developed using control strategies for simulation of cooperative computation in associative networks. The desirability of this control strategy in contrast to ATN and production system strategies is explained. A first pass implementation of HOPE, an artificial intelligence simulation model of sentence comprehension, constrained by studies of aphasic performance, psycholinguistics, neurolinguistics, and linguistic theory is described. Claims that the model could serve as a basis for sentence production simulation and for a model of language acquisition as associative learning are discussed. HOPE is a model that performs in a normal state and includes a lesion simulation facility. HOPE is also a research tool. Its modifiability and use as a tool to investigate hypothesized causes of degradation in comprehension performance by aphasic patients are described. Issues of using behavioral constraints in modelling and obtaining appropriate data for simulated process modelling are discussed. Finally, problems of validation of the simulation results are raised; and issues of how to interpret clinical results to define the evolution of the model are discussed. Conclusions with respect to the feasibility of artificial intelligence simulation process modelling are discussed based on the current state of research.
Experimental validation of thermo-chemical algorithm for a simulation of pultrusion processes
NASA Astrophysics Data System (ADS)
Barkanov, E.; Akishin, P.; Miazza, N. L.; Galvez, S.; Pantelelis, N.
2018-04-01
To provide a better understanding of pultrusion processes with or without temperature control and to support pultrusion tooling design, an algorithm based on a mixed time integration scheme and the nodal control volumes method has been developed. In the present study, its experimental validation is carried out using the developed cure sensors, which measure the electrical resistivity and temperature on the profile surface. Through this verification process, the set of initial data used for simulating the pultrusion of a rod profile has been successfully corrected and finally defined.
Zhang, Xiaobin; Li, Qiong; Eskine, Kendall J; Zuo, Bin
2014-01-01
The current studies extend perceptual symbol systems theory to the processing of gender categorization by revealing that gender categorization recruits perceptual simulations of spatial height and size dimensions. In Study 1, categorization of male faces was faster when the faces were in the "up" (i.e., higher on the vertical axis) rather than the "down" (i.e., lower on the vertical axis) position, and vice versa for female face categorization. Study 2 found that responses to male names depicted in a larger font were faster than to male names depicted in a smaller font, whereas the opposite response pattern was found for female names. Study 3 confirmed that the effect in Study 2 was not due to metaphoric relationships between gender and social power. Together, these findings suggest that the representation of gender (a social categorization) also involves processes of perceptual simulation.
Using a Laboratory Simulator in the Teaching and Study of Chemical Processes in Estuarine Systems
ERIC Educational Resources Information Center
Garcia-Luque, E.; Ortega, T.; Forja, J. M.; Gomez-Parra, A.
2004-01-01
The teaching of Chemical Oceanography in the Faculty of Marine and Environmental Sciences of the University of Cadiz (Spain) has been improved since 1994 by the employment of a device for the laboratory simulation of estuarine mixing processes and the characterisation of the chemical behaviour of many substances that pass through an estuary. The…
Kappel, Kalli; Miao, Yinglong; McCammon, J Andrew
2015-11-01
Elucidating the detailed process of ligand binding to a receptor is pharmaceutically important for identifying druggable binding sites. With the ability to provide atomistic detail, computational methods are well poised to study these processes. Here, accelerated molecular dynamics (aMD) is proposed to simulate processes of ligand binding to a G-protein-coupled receptor (GPCR), in this case the M3 muscarinic receptor, which is a target for treating many human diseases, including cancer, diabetes and obesity. Long-timescale aMD simulations were performed to observe the binding of three chemically diverse ligand molecules: antagonist tiotropium (TTP), partial agonist arecoline (ARc) and full agonist acetylcholine (ACh). In comparison with earlier microsecond-timescale conventional MD simulations, aMD greatly accelerated the binding of ACh to the receptor orthosteric ligand-binding site and the binding of TTP to an extracellular vestibule. Further aMD simulations also captured binding of ARc to the receptor orthosteric site. Additionally, all three ligands were observed to bind in the extracellular vestibule during their binding pathways, suggesting that it is a metastable binding site. This study demonstrates the applicability of aMD to protein-ligand binding, especially the drug recognition of GPCRs.
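For reference, the standard aMD boost adds dV = (E - V)^2 / (alpha + E - V) whenever the potential V drops below a threshold E; the snippet below evaluates this boost for sample potential values, with E and alpha chosen arbitrarily rather than taken from the reported simulations.

```python
# Sketch of the standard aMD boost potential: when V falls below a threshold E,
# a boost dV = (E - V)^2 / (alpha + E - V) is added to flatten the landscape.
# The threshold, alpha and sample potentials are illustrative values.
import numpy as np

def amd_boost(V, E, alpha):
    """Return the boosted potential V* = V + dV (dV = 0 wherever V >= E)."""
    V = np.asarray(V, dtype=float)
    dV = np.zeros_like(V)
    mask = V < E
    dV[mask] = (E - V[mask]) ** 2 / (alpha + (E - V[mask]))
    return V + dV

V = np.linspace(-120.0, -60.0, 7)     # sample potential values (kcal/mol) [assumed]
print(amd_boost(V, E=-70.0, alpha=20.0))
```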
Studies of Particle Wake Potentials in Plasmas
NASA Astrophysics Data System (ADS)
Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren
2011-10-01
Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable values for the number of particles within a Debye sphere. Presently there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principle electrostatic simulations and possibly including magnetic fields. We are using the particle-particle particle-mesh (PPPM) code ddcMD and the particle-in-cell (PIC) code BEPS to perform these simulations. As a starting point in our study, we examine the wake of a particle passing through a plasma in 3D electrostatic simulations performed with ddcMD and with BEPS using various cell sizes. In this poster, we compare the wakes we observe in these simulations with each other and predictions from Vlasov theory. Prepared by LLNL under Contract DE-AC52-07NA27344 and by UCLA under Grant DE-FG52-09NA29552.
Three-dimensional numerical and experimental studies on transient ignition of hybrid rocket motor
NASA Astrophysics Data System (ADS)
Tian, Hui; Yu, Ruipeng; Zhu, Hao; Wu, Junfeng; Cai, Guobiao
2017-11-01
This paper presents transient simulations and experimental studies of the ignition process of hybrid rocket motors (HRMs) using 90% hydrogen peroxide (HP) as the oxidizer and polymethyl methacrylate (PMMA) and polyethylene (PE) as fuels. A fluid-solid coupled numerical method is established based on the conserved form of the three-dimensional unsteady Navier-Stokes (N-S) equations, considering gas flow with chemical reactions and heat transfer between the fluid and solid regions. Experiments are subsequently conducted using a high-speed camera to record the ignition process. The flame propagation, chamber pressurization process and average fuel regression rate from the numerical simulations show good agreement with the experimental results, which demonstrates the validity of the simulations in this study. The results also indicate that the flame propagation time is mainly governed by fluid dynamics and increases with increasing grain port area. The chamber pressurization process begins when flame propagation is complete in the grain port. Furthermore, the chamber pressurization time is about 4 times longer than the flame propagation time.
INTEGRATION OF COST MODELS AND PROCESS SIMULATION TOOLS FOR OPTIMUM COMPOSITE MANUFACTURING PROCESS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pack, Seongchan; Wilson, Daniel; Aitharaju, Venkat
The manufacturing cost of resin transfer molded composite parts is significantly influenced by the cycle time, which is strongly related to the time for both filling and curing of the resin in the mold. The filling time can be optimized by various injection strategies and by suitably reducing the length of the resin flow distance during injection. The curing time can be reduced by using faster-curing resins, but this requires high-pressure injection equipment, which is capital intensive. Predictive manufacturing simulation tools that have recently been developed for composite materials are able to explore various processing-condition scenarios virtually, well in advance of manufacturing the parts. In the present study, we integrate cost models with process simulation tools to study the influence of parameters such as injection strategy, injection pressure, compression control to minimize high-pressure injection, resin curing rate, and demold time on the manufacturing cost as affected by the annual part volume. A representative automotive component was selected for the study and the results are presented in this paper.
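As a back-of-the-envelope illustration of how such integrated tools trade cycle time against capital, the sketch below estimates cost per part from fill, cure and demold times and an annual volume; every number and the simple cost structure are assumptions, not the study's cost model.

```python
# Toy cost/cycle-time model: cycle time drives how many machines (and how much
# capital) a target annual volume needs. All numbers are illustrative.
def cost_per_part(fill_min, cure_min, demold_min, annual_volume,
                  machine_cost=2.0e6, machine_life_yr=10, rate_per_hr=120.0,
                  material_cost=35.0, hours_per_yr=6000.0):
    cycle_hr = (fill_min + cure_min + demold_min) / 60.0
    parts_per_machine = hours_per_yr / cycle_hr
    machines = -(-annual_volume // int(parts_per_machine))        # ceiling division
    capital = machines * machine_cost / machine_life_yr / annual_volume
    processing = cycle_hr * rate_per_hr
    return capital + processing + material_cost

for cure in (10.0, 5.0, 2.0):     # a faster-curing resin shortens the cycle
    c = cost_per_part(fill_min=3.0, cure_min=cure, demold_min=2.0, annual_volume=50000)
    print(f"cure time {cure:4.1f} min -> cost per part ~ ${c:,.2f}")
```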
Computational study of noise in a large signal transduction network.
Intosalmi, Jukka; Manninen, Tiina; Ruohonen, Keijo; Linne, Marja-Leena
2011-06-21
Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies. © 2011 Intosalmi et al; licensee BioMed Central Ltd.
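For readers unfamiliar with the method, the following is a minimal sketch of the Gillespie direct method for a toy birth-death process; the study's network is of course far larger, and the rate constants here are illustrative.

```python
# Minimal Gillespie direct-method sketch for a toy birth-death process, the
# same class of exact stochastic algorithm used in the study; rate constants
# are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def gillespie_birth_death(k_prod=10.0, k_deg=0.1, x0=0, t_end=200.0):
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a = np.array([k_prod, k_deg * x])      # propensities of the two reactions
        a0 = a.sum()
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)         # time to the next reaction event
        if rng.random() < a[0] / a0:           # choose which reaction fires
            x += 1                             # production
        else:
            x -= 1                             # degradation
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

t, x = gillespie_birth_death()
print(f"mean copy number over the run: {x.mean():.1f} (theory: {10.0/0.1:.0f})")
```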
ERIC Educational Resources Information Center
Gohring, Ralph J.
1979-01-01
A case study describing the process involved in publishing a personally developed simulation game including finding a publisher, obtaining a copyright, negotiating the contract, controlling front-end costs, marketing the product, and receiving feedback from users. (CMV)
Nanostructures nucleation in carbon-metal gaseous phase: A molecular dynamics study
NASA Astrophysics Data System (ADS)
Galiullina, G. M.; Orekhov, N. D.; Stegailov, V. V.
2018-01-01
We perform nonequilibrium molecular dynamics simulations of carbon nanocluster nucleation and the early stages of growth from the gaseous phase. We analyze the catalytic effect of iron atoms on the nucleation kinetics and the structure of the resultant nanoparticles. The reactive force field ReaxFF is used in the simulations to describe bond formation and dissociation during the nucleation process at the nanoscale. The catalytic effect of iron reveals itself even on nanosecond simulation timescales: iron atoms accelerate the clustering process but result in less graphitized carbon structures.
Molecular simulation studies on chemical reactivity of methylcyclopentadiene.
Wang, Qingsheng; Zhang, Yingchun; Rogers, William J; Mannan, M Sam
2009-06-15
Molecular simulations are important to predict thermodynamic values for reactive chemicals especially when sufficient experimental data are not available. Methylcyclopentadiene (MCP) is an example of a highly reactive and hazardous compound in the chemical process industry. In this work, chemical reactivity of 2-methylcyclopentadiene, including isomerization, dimerization, and oxidation reactions, is investigated in detail by theoretical computational chemistry methods and empirical thermodynamic-energy correlation. On the basis of molecular simulations, an average value of -15.2 kcal/mol for overall heat of dimerization and -45.6 kcal/mol for overall heat of oxidation were obtained in gaseous phase at 298 K and 1 atm. These molecular simulation studies can provide guidance for the design of safer chemical processes, safer handling of MCP, and also provide useful information for an investigation of the T2 Laboratories explosion on December 19, 2007, in Florida.
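As a hedged illustration of how such overall reaction heats are assembled from computed species enthalpies, the snippet below evaluates a dimerization heat from placeholder enthalpies of formation; the numbers are chosen only so the example lands on the reported overall value and are not the study's quantum-chemical results.

```python
# Hypothetical gas-phase enthalpies of formation at 298 K (kcal/mol); placeholder values
# chosen so the example reproduces the reported overall heat of dimerization, NOT the
# computed values from the study.
h_f = {"MCP": 20.0, "MCP_dimer": 24.8}

# Overall heat of dimerization for 2 MCP -> dimer: products minus reactants.
dH_dim = h_f["MCP_dimer"] - 2 * h_f["MCP"]
print(f"Heat of dimerization: {dH_dim:+.1f} kcal/mol")   # negative => exothermic
```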
Yang, Jubiao; Wang, Xingshi; Krane, Michael; Zhang, Lucy T.
2017-01-01
In this study, a fully-coupled fluid–structure interaction model is developed for studying dynamic interactions between compressible fluid and aeroelastic structures. The technique is built on the modified Immersed Finite Element Method (mIFEM), a robust numerical technique for simulating fluid–structure interactions that can handle high Reynolds number flows and large density disparities between the fluid and the solid. For accurate assessment of this intricate dynamic process between a compressible fluid, such as air, and aeroelastic structures, we include in the model the fluid compressibility for an isentropic process and a solid contact model. The accuracy of the compressible fluid solver is verified by examining acoustic wave propagation in a closed and an open duct, respectively. The fully-coupled fluid–structure interaction model is then used to simulate and analyze vocal fold vibrations using compressible air interacting with vocal folds represented as layered viscoelastic structures. Using a physiological geometric and parametric setup, we are able to obtain self-sustained vocal fold vibration with a constant inflow pressure. Parametric studies are also performed to examine the effects of lung pressure and vocal fold tissue stiffness on vocal fold vibration. All the case studies produce the expected airflow behavior and a sustained vibration, which provides verification and confidence for our future acoustical studies of the phonation process. PMID:29527067
The Development of a 3D LADAR Simulator Based on a Fast Target Impulse Response Generation Approach
NASA Astrophysics Data System (ADS)
Al-Temeemy, Ali Adnan
2017-09-01
A new laser detection and ranging (LADAR) simulator has been developed, using MATLAB and its graphical user interface, to simulate direct detection time-of-flight LADAR systems and to produce 3D simulated scanning images under a wide variety of conditions. This simulator models each stage from the laser source to data generation and can be considered an efficient simulation tool for developing LADAR systems and their data processing algorithms. The novel approach proposed for this simulator is to generate the actual target impulse response. This approach is fast and able to handle demanding scanning requirements without the loss of fidelity that usually accompanies increases in speed. This leads to a more efficient LADAR simulator and opens up the possibility of simulating LADAR beam propagation more accurately by using a large number of laser footprint samples. The approach is to select only the parts of the target that lie within the laser beam's angular field by mathematically deriving the required equations and calculating the target angular ranges. The performance of the new simulator has been evaluated under different scanning conditions, and the results show significant increases in processing speed in comparison to conventional approaches, which are also used in this study as a point of comparison. The results also show the simulator's ability to reproduce phenomena related to the scanning process, for example, type of noise, scanning resolution and laser beam width.
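The angular-range selection idea can be sketched as follows: target points are converted to azimuth and elevation relative to the sensor, only points inside the instantaneous beam divergence are kept for the impulse-response calculation, and range follows from the round-trip time of flight. The function names, beam divergence, and target geometry below are illustrative assumptions, not the simulator's actual MATLAB interface.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def points_in_beam(points_xyz, beam_az, beam_el, divergence):
    """Keep only target points that fall within the laser beam's angular field.

    points_xyz : (N, 3) array of target points in the sensor frame (m)
    beam_az, beam_el : beam pointing angles (rad)
    divergence : full-cone beam divergence (rad)
    """
    x, y, z = points_xyz.T
    az = np.arctan2(y, x)                       # azimuth of each point
    el = np.arctan2(z, np.hypot(x, y))          # elevation of each point
    inside = np.hypot(az - beam_az, el - beam_el) <= divergence / 2.0
    return points_xyz[inside]

def range_from_tof(t_round_trip):
    """Direct-detection time-of-flight range: half the round-trip distance."""
    return C * t_round_trip / 2.0

# Example: points on a wall 50 m away, 1 mrad beam pointed along the x axis.
pts = np.column_stack([np.full(1000, 50.0),
                       np.random.uniform(-0.5, 0.5, 1000),
                       np.random.uniform(-0.5, 0.5, 1000)])
hit = points_in_beam(pts, beam_az=0.0, beam_el=0.0, divergence=1e-3)
print(len(hit), "points inside the beam;",
      f"range for a 333.6 ns echo ≈ {range_from_tof(333.6e-9):.1f} m")
```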
Just, Sarah; Toschkoff, Gregor; Funke, Adrian; Djuric, Dejan; Scharrer, Georg; Khinast, Johannes; Knop, Klaus; Kleinebudde, Peter
2013-03-01
Coating of solid dosage forms is an important unit operation in the pharmaceutical industry. In recent years, numerical simulations of drug manufacturing processes have been gaining interest as process analytical technology tools. The discrete element method (DEM) in particular is suitable to model tablet-coating processes. For the development of accurate simulations, information on the material properties of the tablets is required. In this study, the mechanical parameters Young's modulus, coefficient of restitution (CoR), and coefficients of friction (CoF) of gastrointestinal therapeutic systems (GITS) and of active-coated GITS were measured experimentally. The dynamic angle of repose of these tablets in a drum coater was investigated to revise the CoF. The resulting values were used as input data in DEM simulations to compare simulation and experiment. A mean value of Young's modulus of 31.9 MPa was determined by the uniaxial compression test. The CoR was found to be 0.78. For both tablet-steel and tablet-tablet friction, active-coated GITS showed a higher CoF compared with GITS. According to the values of the dynamic angle of repose, the CoF was adjusted to obtain consistent tablet motion in the simulation and in the experiment. On the basis of this experimental characterization, mechanical parameters are integrated into DEM simulation programs to perform numerical analysis of coating processes.
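As a hedged illustration of how two of the measured contact parameters feed a DEM setup, the snippet below estimates the coefficient of restitution from a drop-rebound test and gathers it with Young's modulus and friction coefficients into a parameter set; the drop heights, friction placeholders, and dictionary layout are illustrative and do not reproduce any specific DEM package's input format.

```python
import math

def coefficient_of_restitution(drop_height_m, rebound_height_m):
    """CoR from a free-fall drop test: ratio of rebound to impact speed."""
    return math.sqrt(rebound_height_m / drop_height_m)

# Example drop test (heights are illustrative).
cor = coefficient_of_restitution(0.100, 0.061)   # ~0.78, as reported for the tablets

# Contact parameters gathered for a DEM tablet-coating simulation (illustrative container).
dem_params = {
    "youngs_modulus_Pa": 31.9e6,      # uniaxial compression test result
    "restitution": round(cor, 2),
    "friction_tablet_tablet": 0.40,   # placeholder; revised against the dynamic angle of repose
    "friction_tablet_steel": 0.35,    # placeholder
}
print(dem_params)
```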
Urban Expansion Modeling Approach Based on Multi-Agent System and Cellular Automata
NASA Astrophysics Data System (ADS)
Zeng, Y. N.; Yu, M. M.; Li, S. N.
2018-04-01
Urban expansion is a land-use change process that transforms non-urban land into urban land. This process results in the loss of natural vegetation and an increase in impervious surfaces. Urban expansion also alters hydrologic cycling, atmospheric circulation, and nutrient cycling processes and generates enormous environmental and social impacts. Urban expansion monitoring and modeling are crucial to understanding the urban expansion process, its mechanisms, and its environmental impacts, and to predicting urban expansion under future scenarios. Therefore, it is important to study urban expansion monitoring and modeling approaches. We propose to simulate urban expansion by combining a cellular automata (CA) model with a multi-agent system (MAS). The proposed urban expansion model based on MAS and CA was applied to a case study area of the Changsha-Zhuzhou-Xiangtan urban agglomeration, China. The results show that this model can capture urban expansion with good adaptability. The Kappa coefficient of the simulation results is 0.75, which indicates that the combination of MAS and CA offers an improved simulation result.
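The quoted Kappa coefficient compares the simulated urban/non-urban map with the observed one cell by cell. A minimal sketch of that agreement measure for two binary raster maps is shown below; the maps themselves are synthetic.

```python
import numpy as np

def cohens_kappa(observed, simulated):
    """Cohen's kappa for two equally sized categorical (here binary) maps."""
    obs = np.asarray(observed).ravel()
    sim = np.asarray(simulated).ravel()
    classes = np.union1d(obs, sim)
    p_o = np.mean(obs == sim)                                              # observed agreement
    p_e = sum(np.mean(obs == c) * np.mean(sim == c) for c in classes)      # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

# Synthetic 100 x 100 urban (1) / non-urban (0) maps for illustration.
rng = np.random.default_rng(1)
obs_map = (rng.random((100, 100)) < 0.3).astype(int)
sim_map = obs_map.copy()
flip = rng.random((100, 100)) < 0.10          # disagree on ~10% of cells
sim_map[flip] = 1 - sim_map[flip]
print(f"kappa = {cohens_kappa(obs_map, sim_map):.2f}")
```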
Observations and simulations of specularly reflected He++ at Earth's quasiperpendicular bow shock
NASA Astrophysics Data System (ADS)
Broll, J. M.; Fuselier, S. A.; Trattner, K. J.; Anderson, B. J.; Burch, J. L.; Giles, B. L.
2016-12-01
Specular reflection of protons at Earth's quasiperpendicular bow shock is an important process for supercritical shock dissipation. Previous studies have found evidence of He++ specular reflection from reduced particle distributions downstream from the shock, but confirmation of the process for heavier ions in the shock foot was not possible due to time resolution constraints. We present He++ distributions, observed by MMS in a quasiperpendicular bow shock crossing, that are consistent with specularly reflected He++. We also investigate the He++ dynamics with test-particle simulations in a simulated shock based on this crossing and we conduct wave analysis to determine what processes lead to separate gyrotropization timescales for the transmitted and reflected populations.
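Test-particle studies of shock reflection typically integrate the Lorentz force for ions in prescribed electromagnetic fields. The sketch below is a generic Boris-push integrator applied to a He++ ion in uniform fields; the field values and initial velocity are illustrative and are not the MMS-based simulated shock fields.

```python
import numpy as np

Q_HEPP = 2 * 1.602176634e-19      # charge of He++ (C)
M_HEPP = 4 * 1.67262192369e-27    # ~alpha-particle mass (kg)

def boris_push(x, v, e_field, b_field, dt, q=Q_HEPP, m=M_HEPP):
    """One Boris step: half electric kick, magnetic rotation, half electric kick."""
    qmdt2 = q * dt / (2.0 * m)
    v_minus = v + qmdt2 * e_field
    t = qmdt2 * b_field
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)
    v_new = v_plus + qmdt2 * e_field
    return x + v_new * dt, v_new

# Uniform-field example: B along z, the particle gyrates in the x-y plane.
B = np.array([0.0, 0.0, 10e-9])                    # 10 nT
E = np.zeros(3)
x, v = np.zeros(3), np.array([100e3, 0.0, 0.0])    # 100 km/s
dt = 0.01 * 2 * np.pi * M_HEPP / (Q_HEPP * np.linalg.norm(B))  # 1% of a gyroperiod
for _ in range(100):                               # one full gyration
    x, v = boris_push(x, v, E, B, dt)
print("speed change after one gyration:", abs(np.linalg.norm(v) - 100e3), "m/s")
```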
NASA Astrophysics Data System (ADS)
Zou, Jing; Xie, Zhenghui; Zhan, Chesheng; Qin, Peihua; Sun, Qin; Jia, Binghao; Xia, Jun
2015-05-01
In this study, we incorporated a groundwater exploitation scheme into the land surface model CLM3.5 to investigate the effects of the anthropogenic exploitation of groundwater on land surface processes in a river basin. Simulations of the Haihe River Basin in northern China were conducted for the years 1965-2000 using the model. A control simulation without exploitation and three exploitation simulations with different water demands derived from socioeconomic data related to the Basin were conducted. The results showed that groundwater exploitation for human activities resulted in increased wetting and cooling effects at the land surface and reduced groundwater storage. A lowering of the groundwater table, increased upper soil moisture, reduced 2 m air temperature, and enhanced latent heat flux were detected by the end of the simulated period, and the changes at the land surface were related linearly to the water demands. To determine the possible responses of the land surface processes in extreme cases (i.e., in which the exploitation process either continued or ceased), additional hypothetical simulations for the coming 200 years were conducted with constant climate forcing, regardless of changes in climate. The simulations revealed that, if exploitation continues at the current rate, the local groundwater storage on the plains could not sustain high-intensity exploitation for long. Changes attributable to groundwater exploitation would reach extreme values and then weaken within decades as groundwater resources were depleted and exploitation therefore ceased. However, if exploitation is stopped completely to allow groundwater to recover, drying and warming effects, such as increased temperature, reduced soil moisture, and reduced total runoff, would occur in the Basin within the early decades of the simulation period. The effects of exploitation would then gradually disappear, and the variables would approach the natural state and stabilize at different rates. Simulations were also conducted for cases in which exploitation either continues or ceases using future climate scenario outputs from a general circulation model. The resulting trends were almost the same as those of the simulations with constant climate forcing, despite differences in the climate data input. Therefore, a balance between slow groundwater restoration and rapid human development of the land must be achieved to maintain a sustainable water resource.
ERIC Educational Resources Information Center
Evans, Steven T.; Huang, Xinqun; Cramer, Steven M.
2010-01-01
The commercial simulator Aspen Chromatography was employed to study and optimize an important new industrial separation process, weak partitioning chromatography. This case study on antibody purification was implemented in a chromatographic separations course. Parametric simulations were performed to investigate the effect of operating parameters…
Koivisto, Jaana-Maija; Multisilta, Jari; Niemi, Hannele; Katajisto, Jouko; Eriksson, Elina
2016-10-01
Clinical reasoning is viewed as a problem-solving activity; in games, players solve problems. To provide excellent patient care, nursing students must gain competence in clinical reasoning. Utilising gaming elements and virtual simulations may enhance learning of clinical reasoning. To investigate nursing students' experiences of learning the clinical reasoning process by playing a 3D simulation game. Cross-sectional descriptive study. Thirteen gaming sessions at two universities of applied sciences in Finland. The prototype of the simulation game used in this study was single-player in format. The game mechanics were built around the clinical reasoning process. Nursing students from the surgical nursing course of autumn 2014 (N=166). Data were collected by means of an online questionnaire. In terms of the clinical reasoning process, students learned how to take action and collect information but were less successful in learning to establish goals for patient care or to evaluate the effectiveness of interventions. Learning of the different phases of the clinical reasoning process was strongly positively correlated. The students described that they learned mainly to apply theoretical knowledge while playing. The results show that those who played digital games daily or occasionally felt that they learned clinical reasoning by playing the game more than those who did not play at all. Nursing students' experiences of learning the clinical reasoning process by playing a 3D simulation game showed that such games can be used successfully for learning. To ensure that students follow a systematic approach, the game mechanics need to be built around the clinical reasoning process. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCorkle, D.; Yang, C.; Jordan, T.
2007-06-01
Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not explore some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus wrapped by the CASI library developed by Reaction Engineering International to run the process simulation and query for unit operation results. This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.
Numerical study of vortex rope during load rejection of a prototype pump-turbine
NASA Astrophysics Data System (ADS)
Liu, J. T.; Liu, S. H.; Sun, Y. K.; Wu, Y. L.; Wang, L. Q.
2012-11-01
A transient process of load rejection of a prototype pump-turbine was studied by three-dimensional, unsteady simulations, as well as by steady calculations. The dynamic mesh (DM) method and a remeshing method were used to simulate the rotation of the guide vanes and runner. The rotational speed of the runner was predicted by a fluid coupling method. Both the transient and steady calculations were performed with a turbulence model. Results show that the steady calculation results have large errors in predicting the external characteristics of the transient process. The runaway speed can reach 1.15 times the initial rotational speed during the transient process. The vortex rope occurs before the pump-turbine reaches the zero-moment point. The vortex rope rotates in the same direction as the runner and is separated into two parts as the flow rate decreases to zero. The pressure level decreases during the whole transient process. The transient simulation results were also compared with and verified by experimental results. This computational method could be used in the fault diagnosis of transient operation, as well as in the optimization of a transient process.
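The fluid-coupling idea of updating the runner speed from the computed hydraulic torque can be sketched as a rigid-rotor integration; the torque history, inertia, and initial speed below are invented placeholders rather than data from the prototype pump-turbine, with the torque amplitude chosen only so the example lands near the reported runaway ratio.

```python
import math

def integrate_runner_speed(omega0, inertia, torque_of_t, dt, n_steps):
    """Explicitly integrate the rigid-rotor equation J * d(omega)/dt = T_hydraulic(t)."""
    omega, history = omega0, [omega0]
    for i in range(n_steps):
        omega += dt * torque_of_t(i * dt) / inertia
        history.append(omega)
    return history

# Placeholder load-rejection torque history: decays as the guide vanes close and flow drops.
torque = lambda t: 1.2e5 * math.exp(-t / 4.0)     # N*m, illustrative amplitude only
hist = integrate_runner_speed(omega0=31.4, inertia=1.0e5, torque_of_t=torque,
                              dt=0.01, n_steps=2000)
print(f"runaway speed ratio ≈ {max(hist) / hist[0]:.2f}")   # lands near the ~1.15 reported above
```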
NASA Astrophysics Data System (ADS)
Li, N.; Mohamed, M. S.; Cai, J.; Lin, J.; Balint, D.; Dean, T. A.
2011-05-01
The formability of steel and aluminium alloys in hot stamping and cold die quenching processes is studied in this research. Viscoplastic-damage constitutive equations are developed and determined from experimental data for the prediction of the viscoplastic flow and ductility of the materials. The determined unified constitutive equations are then implemented into the commercial finite element code Abaqus/Explicit via a user-defined subroutine, VUMAT. An FE process simulation model and numerical procedures are established for the modeling of hot stamping processes for a spherical part with a central hole. Different failure modes (failure takes place either near the central hole or in the mid-span of the part) are obtained. To validate the simulation results, a test programme was developed, a test die set was designed and manufactured, and tests were carried out on the materials at different forming rates. Very close agreement between experimental and numerical process simulation results is obtained over the ranges of temperatures and forming rates investigated.
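A VUMAT-style material routine advances stress from strain increments at every time step. The fragment below is a much-simplified one-dimensional power-law viscoplastic update written in Python to illustrate that structure only; the constants are arbitrary and the actual unified viscoplastic-damage equations are not reproduced.

```python
import numpy as np

def viscoplastic_update(strain_rate_total, dt, n_steps,
                        E=70e3, k=150.0, K=80.0, n=5.0):
    """Explicit 1D elastic/viscoplastic stress update (units: MPa, s).

    sigma_dot   = E * (eps_dot_total - eps_dot_p)
    eps_dot_p   = <(|sigma| - k) / K>^n * sign(sigma)   (overstress flow rule)
    Constants are illustrative, not the calibrated alloy parameters.
    """
    sigma, eps_p = 0.0, 0.0
    history = []
    for _ in range(n_steps):
        overstress = max(abs(sigma) - k, 0.0)
        eps_dot_p = (overstress / K) ** n * np.sign(sigma)
        sigma += E * (strain_rate_total - eps_dot_p) * dt
        eps_p += eps_dot_p * dt
        history.append(sigma)
    return np.array(history), eps_p

stress, plastic_strain = viscoplastic_update(strain_rate_total=1.0, dt=1e-5, n_steps=20000)
print(f"flow stress ≈ {stress[-1]:.1f} MPa after {plastic_strain:.3f} plastic strain")
```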
An intersubject variable regional anesthesia simulator with a virtual patient architecture.
Ullrich, Sebastian; Grottke, Oliver; Fried, Eduard; Frommen, Thorsten; Liao, Wei; Rossaint, Rolf; Kuhlen, Torsten; Deserno, Thomas M
2009-11-01
The main purpose is to provide an intuitive VR-based training environment for regional anesthesia (RA). The research question is how to process subject-specific datasets, organize them in a meaningful way, and perform the simulation for peripheral regions. We propose a flexible virtual patient architecture and methods to process datasets. Image acquisition, image processing (especially segmentation), interactive nerve modeling and permutations (nerve instantiation) are described in detail. The simulation of electric impulse stimulation and the corresponding responses is essential for the training of peripheral RA and is solved by an approach based on the electric distance. We have created an XML-based virtual patient database with several subjects. Prototypes of the simulation are implemented and run on multimodal VR hardware (e.g., stereoscopic display and haptic device). A first user pilot study has confirmed our approach. The virtual patient architecture enables support for arbitrary scenarios on different subjects. This concept can also be used for other simulators. In future work, we plan to extend the simulation and conduct further evaluations in order to provide a tool for routine RA training.
Feasibility Study On Missile Launch Detection And Trajectory Tracking
2016-09-01
... Vehicles (UAVs) in military operations, their role in a missile defense operation is not well defined. The simulation program discussed in this thesis ... targeting information to an attacking UAV to reliably intercept the missile. B. FURTHER STUDIES. The simulation program can be enhanced to improve the ... intercept the threat. This thesis explores the challenges in creating a simulation program to process video footage from an unstable platform and the ...
Dependence of Snowmelt Simulations on Scaling of the Forcing Processes (Invited)
NASA Astrophysics Data System (ADS)
Winstral, A. H.; Marks, D. G.; Gurney, R. J.
2009-12-01
The spatial organization and scaling relationships of snow distribution in mountain environs are ultimately dependent on the controlling processes. These processes include interactions between weather, topography, vegetation, snow state, and seasonally-dependent radiation inputs. In large-scale snow modeling it is vital to know these dependencies in order to obtain accurate predictions while reducing computational costs. This study examined the scaling characteristics of the forcing processes and the dependency of distributed snowmelt simulations on their scaling. A base model simulation characterized these processes at 10 m resolution over a 14.0 km2 basin with an elevation range of 1474-2244 masl. Each of the major processes affecting snow accumulation and melt - precipitation, wind speed, solar radiation, thermal radiation, temperature, and vapor pressure - was independently degraded to 1 km resolution. Seasonal and event-specific results were analyzed. Results indicated that scale effects on melt vary by process and weather conditions. The dependence of melt simulations on the scaling of solar radiation fluxes also had a seasonal component. These process-based scaling characteristics should remain static through time as they are based on physical considerations. As such, these results not only provide guidance for current modeling efforts, but are also well suited to predicting how potential climate changes will affect the heterogeneity of mountain snow distributions.
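Degrading a forcing field from the 10 m base grid to 1 km resolution amounts to averaging 100 x 100 blocks of cells. A minimal numpy sketch of that aggregation, applied to a synthetic radiation field, is given below.

```python
import numpy as np

def block_average(field, factor):
    """Aggregate a 2-D field by averaging non-overlapping factor x factor blocks."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0, "grid must divide evenly into blocks"
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

# Synthetic 10 m solar-radiation field over a 4 km x 4 km tile (400 x 400 cells).
rng = np.random.default_rng(0)
rad_10m = 600.0 + 150.0 * rng.standard_normal((400, 400))   # W m-2, illustrative
rad_1km = block_average(rad_10m, factor=100)                # -> 4 x 4 cells
print(f"spatial variability: {rad_10m.std():.1f} -> {rad_1km.std():.1f} W m-2")
```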
Using an operator training simulator in the undergraduate chemical engineering curriculum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharyya, D.; Turton, R.; Zitney, S.
2012-01-01
An operator training simulator (OTS) is to the chemical engineer what a flight simulator is to the aerospace engineer. The basis of an OTS is a high-fidelity dynamic model of a chemical process that allows an engineer to simulate start-up, shut-down, and normal operation. It can also be used to test the skill and ability of an engineer or operator to respond and control some unforeseen situation(s) through the use of programmed malfunctions. West Virginia University (WVU) is a member of the National Energy Technology Laboratory’s Regional University Alliance (NETL-RUA). Working through the NETL-RUA, the authors have spent the last four years collaborating on the development of a high-fidelity OTS for an Integrated Gasification Combined Cycle (IGCC) power plant with CO2 capture that is the cornerstone of the AVESTAR™ (Advanced Virtual Energy Simulation Training And Research) Center with sister facilities at NETL and WVU in Morgantown, WV. This OTS is capable of real-time dynamic simulation of IGCC plant operation, including start-up, shut-down, and power demand load following. The dynamic simulator and its human machine interfaces (HMIs) are based on the DYNSIM and InTouch software, respectively, from Invensys Operations Management. The purpose of this presentation is to discuss the authors’ experiences in using this sophisticated dynamic simulation-based OTS as a hands-on teaching tool in the undergraduate chemical engineering curriculum. At present, the OTS has been used in two separate courses: a new process simulation course and a traditional process control course. In the process simulation course, concepts of steady-state and dynamic simulations were covered prior to exposing the students to the OTS. Moreover, digital logic and the concept of equipment requiring one or more permissive states to be enabled prior to successful operation were also covered. Students were briefed about start-up procedures and the importance of following a predetermined sequence of actions in order to start-up the plant successfully. Student experience with the dynamic simulator consisted of a six-hour training session in which the Claus sulfur capture unit of the IGCC plant was started up. The students were able to operate the simulator through the InTouch-based HMI displays and study and understand the underlying dynamic modeling approach used in the DYNSIM-based simulator. The concepts learned during the training sessions were further reinforced when students developed their own DYNSIM models for a chemical process and wrote a detailed start-up procedure. In the process control course, students learned how the plant responds dynamically to changes in the manipulated inputs, as well as how the control system impacts plant performance, stability, robustness and disturbance rejection characteristics. The OTS provided the opportunity to study the dynamics of complicated, “real-life” process plants consisting of hundreds of pieces of equipment. Students implemented ideal forcing functions, tracked the time-delay through the entire plant, studied the response of open-loop unstable systems, and learned “good practices” in control system design by taking into account the real-world events where significant deviations from the “ideal” or “expected” response can occur. The theory of closed-loop stability was reinforced by implementing limiting proportional gain for stability limits of real plants.
Finally, students were divided into several groups, where each group was tasked to control a section of the plant within a set of operating limits in the face of disturbances and simulated process faults. At the end of this test, they suggested ways to improve the control system performance based on the theory they learned in class and the hands-on experience they gained while working on the OTS.
NASA Astrophysics Data System (ADS)
Mahdaoui, O.; Agassant, J.-F.; Laure, P.; Valette, R.; Silva, L.
2007-04-01
The polymer coextrusion process is a new method of sheet metal lining. It makes it possible to replace the lacquers used for steel protection in the food packaging industry. The coextrusion process may exhibit flow instabilities at the interface between the two polymer layers. The objective of this study is to examine the influence of processing and rheology parameters on these instabilities. Finite element numerical simulations of coextrusion are used to investigate various stable and unstable flow configurations.
A modified dynamical model of drying process of polymer blend solution coated on a flat substrate
NASA Astrophysics Data System (ADS)
Kagami, Hiroyuki
2008-05-01
We have proposed and modified a model of the drying process of a polymer solution coated on a flat substrate for flat polymer film fabrication. Numerical simulation of the model reproduces, for example, a typical thickness profile of the polymer film formed after drying. Based on analysis of the numerical simulations, we have clarified how the distribution of polymer molecules on a flat substrate depends on various parameters, and we have derived nonlinear equations of the drying process from the dynamical model and reported the results. The subject of the above studies was limited to solutions containing one kind of solute, although the model could essentially deal with solutions containing several kinds of solutes. Nowadays, however, the drying of solutions containing several kinds of solutes needs to be addressed, because such processes appear in many industrial settings. Polymer blend solutions are one instance, and a typical resist consists of a few kinds of polymers. We therefore previously introduced a dynamical model of the drying process of a polymer blend solution coated on a flat substrate, together with results of numerical simulations of that model; however, that model was the simplest one. In this study, we modify the above dynamical model of the drying process of a polymer blend solution by adding effects in which some parameters change with time as functions of other variables. We then consider the essence of the drying process of polymer blend solutions through comparison between results of numerical simulations of the modified model and those of the former model.
Oh, Hong-Choon; Toh, Hong-Guan; Giap Cheong, Eddy Seng
2011-11-01
Using the classical process improvement framework of Plan-Do-Study-Act (PDSA), the diagnostic radiology department of a tertiary hospital identified several patient cycle time reduction strategies. Experimentation of these strategies (which included procurement of new machines, hiring of new staff, redesign of queue system, etc.) through pilot scale implementation was impractical because it might incur substantial expenditure or be operationally disruptive. With this in mind, simulation modeling was used to test these strategies via performance of "what if" analyses. Using the output generated by the simulation model, the team was able to identify a cost-free cycle time reduction strategy, which subsequently led to a reduction of patient cycle time and achievement of a management-defined performance target. As healthcare professionals work continually to improve healthcare operational efficiency in response to rising healthcare costs and patient expectation, simulation modeling offers an effective scientific framework that can complement established process improvement framework like PDSA to realize healthcare process enhancement. © 2011 National Association for Healthcare Quality.
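A hedged sketch of the kind of "what if" experiment described above is shown below as a small discrete-event queueing model, assuming the simpy library is available; the arrival rate, scan duration, and scanner counts are invented placeholders, not the department's data.

```python
import random
import simpy  # discrete-event simulation library (assumed available)

def patient(env, scanner, mean_scan, cycle_times):
    """One patient: wait for a scanner, get scanned, record the total cycle time."""
    arrival = env.now
    with scanner.request() as req:
        yield req
        yield env.timeout(random.expovariate(1.0 / mean_scan))
    cycle_times.append(env.now - arrival)

def arrivals(env, scanner, mean_interarrival, mean_scan, cycle_times):
    """Poisson patient arrivals feeding the scanner queue."""
    while True:
        yield env.timeout(random.expovariate(1.0 / mean_interarrival))
        env.process(patient(env, scanner, mean_scan, cycle_times))

def run_scenario(n_scanners, mean_interarrival=12.0, mean_scan=20.0, horizon_min=480):
    random.seed(2011)
    env = simpy.Environment()
    scanner = simpy.Resource(env, capacity=n_scanners)
    cycle_times = []
    env.process(arrivals(env, scanner, mean_interarrival, mean_scan, cycle_times))
    env.run(until=horizon_min)
    return sum(cycle_times) / len(cycle_times)

for n in (2, 3):  # "what if" we added a scanner or redeployed staff?
    print(f"{n} scanners -> mean patient cycle time {run_scenario(n):.1f} min")
```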
Launch Site Computer Simulation and its Application to Processes
NASA Technical Reports Server (NTRS)
Sham, Michael D.
1995-01-01
This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.
Molecular Dynamic Studies of Particle Wake Potentials in Plasmas
NASA Astrophysics Data System (ADS)
Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren
2010-11-01
Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable values for the number of particles within a Debye sphere. Presently there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principles electrostatic simulations, possibly including magnetic fields. We are using the particle-particle particle-mesh (P^3M) code ddcMD to perform these simulations. As a starting point in our study, we examined the wake of a particle passing through a plasma. In this poster, we compare the wake observed in 3D ddcMD simulations with that predicted by Vlasov theory and with those observed in the electrostatic PIC code BEPS where the cell size was reduced to 0.03 λD.
An intelligent processing environment for real-time simulation
NASA Technical Reports Server (NTRS)
Carroll, Chester C.; Wells, Buren Earl, Jr.
1988-01-01
The development of a highly efficient and thus truly intelligent processing environment for real-time general purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA-type hardware configurations that support improved simulation are investigated. Three direct execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.
The Effect of Simulation Games on the Learning of Computational Problem Solving
ERIC Educational Resources Information Center
Liu, Chen-Chung; Cheng, Yuan-Bang; Huang, Chia-Wen
2011-01-01
Simulation games are now increasingly applied to many subject domains as they allow students to engage in discovery processes, and may facilitate a flow learning experience. However, the relationship between learning experiences and problem solving strategies in simulation games still remains unclear in the literature. This study, thus, analyzed…
This study analyzes simulated regional-scale ozone burdens both near the surface and aloft, estimates process contributions to these burdens, and calculates the sensitivity of the simulated regional-scale ozone burden to several key model inputs with a particular emphasis on boun...
NASA Astrophysics Data System (ADS)
Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai
2018-01-01
Casualty prediction for a building during earthquakes supports economic loss estimation within the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties during earthquakes, few current studies consider occupant movements within the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using a refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The occupant evacuation simulation is verified against a recorded evacuation process from a school classroom during the real-life 2013 Ya'an earthquake in China. The occupant casualties in the building during earthquakes are evaluated by coupling the building collapse process simulated by the finite element method, the occupant evacuation simulation, and the casualty occurrence criteria with time and space synchronization. A case study of casualty prediction in a building during an earthquake is provided to demonstrate the effect of occupant movements on casualty prediction.
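A refined cellular automaton for evacuation updates occupant positions on a grid toward the exit at each time step. The toy sketch below moves occupants greedily along a precomputed distance-to-exit field; it is only a schematic of that idea, with an invented room layout, and is not the authors' refined model.

```python
import numpy as np

def distance_to_exit(shape, exit_cell):
    """Manhattan distance from every cell to the exit (a simple static floor field)."""
    yy, xx = np.indices(shape)
    return np.abs(yy - exit_cell[0]) + np.abs(xx - exit_cell[1])

def evacuation_step(occupied, dist, exit_cell):
    """Move each occupant to a free 4-neighbour cell closer to the exit; the exit removes them."""
    new = np.zeros_like(occupied)
    for y, x in np.argwhere(occupied):
        if (y, x) == tuple(exit_cell):
            continue                      # occupant at the door leaves the room
        best, best_d = (y, x), dist[y, x]
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < occupied.shape[0] and 0 <= nx < occupied.shape[1]
                    and not occupied[ny, nx] and not new[ny, nx]
                    and dist[ny, nx] < best_d):
                best, best_d = (ny, nx), dist[ny, nx]
        new[best] = 1
    return new

# 10 x 10 classroom, door at cell (9, 0), ~30% of cells initially occupied.
rng = np.random.default_rng(3)
room = (rng.random((10, 10)) < 0.3).astype(int)
exit_cell = (9, 0)
dist = distance_to_exit(room.shape, exit_cell)
for t in range(40):
    room = evacuation_step(room, dist, exit_cell)
print("occupants remaining after 40 steps:", int(room.sum()))
```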
NASA Technical Reports Server (NTRS)
Ross, M. D.; Linton, S. W.; Parnas, B. R.
2000-01-01
A quasi-three-dimensional finite-volume numerical simulator was developed to study passive voltage spread in vestibular macular afferents. The method, borrowed from computational fluid dynamics, discretizes events transpiring in small volumes over time. The afferent simulated had three calyces with processes. The number of processes and synapses, and direction and timing of synapse activation, were varied. Simultaneous synapse activation resulted in shortest latency, while directional activation (proximal to distal and distal to proximal) yielded most regular discharges. Color-coded visualizations showed that the simulator discretized events and demonstrated that discharge produced a distal spread of voltage from the spike initiator into the ending. The simulations indicate that directional input, morphology, and timing of synapse activation can affect discharge properties, as must also distal spread of voltage from the spike initiator. The finite volume method has generality and can be applied to more complex neurons to explore discrete synaptic effects in four dimensions.
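Passive voltage spread of the kind simulated above can be sketched with a small compartment-chain update in which each finite volume exchanges charge with its neighbours and leaks through its membrane. The chain geometry and constants below are generic illustrations in arbitrary units, not the quasi-three-dimensional simulator itself.

```python
import numpy as np

def passive_cable_step(v, dt, g_axial, g_leak, c_m, i_syn):
    """One explicit finite-volume update of membrane voltage along a compartment chain.

    c_m * dV_i/dt = g_axial*(V_{i-1} - 2*V_i + V_{i+1}) - g_leak*V_i + I_syn_i
    with sealed ends; V is the deviation from rest (arbitrary consistent units).
    """
    lap = np.empty_like(v)
    lap[1:-1] = v[:-2] - 2.0 * v[1:-1] + v[2:]
    lap[0] = v[1] - v[0]            # sealed-end boundary conditions
    lap[-1] = v[-2] - v[-1]
    return v + dt * (g_axial * lap - g_leak * v + i_syn) / c_m

# 50-compartment process with a synapse activated at the distal tip (compartment 0).
n = 50
v = np.zeros(n)
i_syn = np.zeros(n)
i_syn[0] = 0.05
for _ in range(5000):               # integrate toward steady state
    v = passive_cable_step(v, dt=1e-3, g_axial=0.5, g_leak=0.01, c_m=0.01, i_syn=i_syn)
print(f"voltage attenuation from distal tip to proximal end: {v[-1] / v[0]:.4f}")
```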
NASA Astrophysics Data System (ADS)
Kuras, P. K.; Weiler, M.; Alila, Y.; Spittlehouse, D.; Winkler, R.
2006-12-01
Hydrologic models have been increasingly used in forest hydrology to overcome the limitations of paired watershed experiments, where vegetative recovery and natural variability obscure the inferences and conclusions that can be drawn from such studies. Models, however, are also plagued by uncertainty stemming from a limited understanding of hydrological processes in forested catchments and parameter equifinality is a common concern. This has created the necessity to improve our understanding of how hydrological systems work, through the development of hydrological measures, analyses and models that address the question: are we getting the right answers for the right reasons? Hence, physically-based, spatially-distributed hydrologic models should be validated with high-quality experimental data describing multiple concurrent internal catchment processes under a range of hydrologic regimes. The distributed hydrology soil vegetation model (DHSVM) frequently used in forest management applications is an example of a process-based model used to address the aforementioned circumstances, and this study takes a novel approach at collectively examining the ability of a pre-calibrated model application to realistically simulate outlet flows along with the spatial-temporal variation of internal catchment processes including: continuous groundwater dynamics at 9 locations, stream and road network flow at 67 locations for six individual days throughout the freshet, and pre-melt season snow distribution. Model efficiency was improved over prior evaluations due to continuous efforts in improving the quality of meteorological data in the watershed. Road and stream network flows were very well simulated for a range of hydrological conditions, and the spatial distribution of the pre-melt season snowpack was in general agreement with observed values. The model was effective in simulating the spatial variability of subsurface flow generation, except at locations where strong stream-groundwater interactions existed, as the model is not capable of simulating such processes and subsurface flows always drain to the stream network. The model has proven overall to be quite capable in realistically simulating internal catchment processes in the watershed, which creates more confidence in future model applications exploring the effects of various forest management scenarios on the watershed's hydrological processes.
Simulative design and process optimization of the two-stage stretch-blow molding process
NASA Astrophysics Data System (ADS)
Hopmann, Ch.; Rasche, S.; Windeck, C.
2015-05-01
The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.
Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.
2017-01-01
Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830
Robot graphic simulation testbed
NASA Technical Reports Server (NTRS)
Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.
1991-01-01
The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.
Chung, Tae Nyoung; Kim, Sun Wook; You, Je Sung; Chung, Hyun Soo
2016-01-01
Objective Tube thoracostomy (TT) is a commonly performed intensive care procedure. Simulator training may be a good alternative method for TT training, compared with conventional methods such as apprenticeship and animal skills laboratories. However, there is insufficient evidence supporting the use of a simulator. The aim of this study is to determine whether training with a medical simulator is associated with a faster TT procedure, compared to conventional training without a simulator. Methods This is a simulation study. Eligible participants were emergency medicine residents with very little TT experience (≤3 procedures). Participants were randomized to two groups: the conventional training group and the simulator training group. While the simulator training group used the simulator to practice TT, the conventional training group watched the instructor performing TT on a cadaver. After training, all participants performed a TT on a cadaver. The performance quality was measured as correct placement and time delay. Subjects were graded on whether they had difficulty during the procedure. Results The estimated median procedure time was 228 seconds in the conventional training group and 75 seconds in the simulator training group, a statistically significant difference (P=0.040). The difficulty grading did not show any significant difference between groups (overall performance scale, 2 vs. 3; P=0.094). Conclusion Tube thoracostomy training with a medical simulator, when compared to no simulator training, is associated with a significantly faster procedure when performed on a human cadaver. PMID:27752610
QM/MM MD and Free Energy Simulation Study of Methyl Transfer Processes Catalyzed by PKMTs and PRMTs.
Chu, Yuzhuo; Guo, Hong
2015-09-01
Methyl transfer processes catalyzed by protein lysine methyltransferases (PKMTs) and protein arginine methyltransferases (PRMTs) control important biological events including transcriptional regulation and cell signaling. One important property of these enzymes is that different PKMTs and PRMTs catalyze the formation of different methylated product (product specificity). These different methylation states lead to different biological outcomes. Here, we review the results of quantum mechanics/molecular mechanics molecular dynamics and free energy simulations that have been performed to study the reaction mechanism of PKMTs and PRMTs and the mechanism underlying the product specificity of the methyl transfer processes.
Development of a numerical methodology for flowforming process simulation of complex geometry tubes
NASA Astrophysics Data System (ADS)
Varela, Sonia; Santos, Maite; Arroyo, Amaia; Pérez, Iñaki; Puigjaner, Joan Francesc; Puigjaner, Blanca
2017-10-01
Nowadays, the incremental flowforming process is widely explored because the use of complex tubular products is increasing, driven by the light-weighting trend and the use of expensive materials. The enhanced mechanical properties of the finished parts, combined with the efficiency of the process in terms of raw material and energy consumption, are the key factors for its competitiveness and sustainability, which is consistent with EU industry policy. As a promising technology, additional steps for extending the existing flowforming limits in the production of tubular products are required. The objective of the present research is to further expand the current state of the art regarding limitations on tube thickness and diameter, exploring the feasibility of flowforming complex geometries such as tubes with elevated wall thickness of up to 60 mm. In this study, the backward flowforming of a 7075 aluminum tubular preform is analyzed as a demonstration case to define the optimum process parameters, machine requirements and tooling geometry. Numerical simulation studies on the flowforming of thin-walled tubular components have been considered to increase knowledge of the technology. The calculation of the rotational movement of the preform mesh, the high thickness-to-length ratio and the thermomechanical conditions significantly increase the computation time of the numerical simulation model. This means that efficient and reliable tools able to predict the forming loads and the quality of flowformed thick tubes are not available. This paper aims to overcome this situation by developing a simulation methodology based on an FEM simulation code that includes new strategies. Material characterization has also been performed through tensile tests in order to design the process. Finally, to check the reliability of the model, flowforming tests have been carried out in an industrial environment.
A microphysical pathway analysis to investigate aerosol effects on convective clouds
NASA Astrophysics Data System (ADS)
Heikenfeld, Max; White, Bethan; Labbouz, Laurent; Stier, Philip
2017-04-01
The impact of aerosols on ice- and mixed-phase processes in convective clouds remains highly uncertain, which has strong implications for estimates of the role of aerosol-cloud interactions in the climate system. The wide range of interacting microphysical processes are still poorly understood and generally not resolved in global climate models. To understand and visualise these processes and to conduct a detailed pathway analysis, we have added diagnostic output of all individual process rates for number and mass mixing ratios to two commonly-used cloud microphysics schemes (Thompson and Morrison) in WRF. This allows us to investigate the response of individual processes to changes in aerosol conditions and the propagation of perturbations throughout the development of convective clouds. Aerosol effects on cloud microphysics could strongly depend on the representation of these interactions in the model. We use different model complexities with regard to aerosol-cloud interactions ranging from simulations with different levels of fixed cloud droplet number concentration (CDNC) as a proxy for aerosol, to prognostic CDNC with fixed modal aerosol distributions. Furthermore, we have implemented the HAM aerosol model in WRF-chem to also perform simulations with a fully interactive aerosol scheme. We employ a hierarchy of simulation types to understand the evolution of cloud microphysical perturbations in atmospheric convection. Idealised supercell simulations are chosen to present and test the analysis methods for a strongly confined and well-studied case. We then extend the analysis to large case study simulations of tropical convection over the Amazon rainforest. For both cases we apply our analyses to individually tracked convective cells. Our results show the impact of model uncertainties on the understanding of aerosol-convection interactions and have implications for improving process representation in models.
NASA Astrophysics Data System (ADS)
Pilz, Tobias; Francke, Till; Bronstert, Axel
2016-04-01
To date, a large number of competing computer models has been developed to understand hydrological processes and to simulate and predict the streamflow dynamics of rivers. This is primarily the result of the lack of a unified theory in catchment hydrology, owing to insufficient process understanding and uncertainties related to model development and application. Therefore, the goal of this study is to analyze the uncertainty structure of a process-based hydrological catchment model employing a multiple hypotheses approach. The study focuses on three major problems that have received only little attention in previous investigations. First, to estimate the impact of model structural uncertainty by employing several alternative representations for each simulated process. Second, to explore the influence of landscape discretization and of parameterization from multiple datasets and user decisions. Third, to employ several numerical solvers for the integration of the governing ordinary differential equations and study their effect on the simulation results. The generated ensemble of model hypotheses is then analyzed and the three sources of uncertainty compared against each other. To ensure consistency and comparability, all model structures and numerical solvers are implemented within a single simulation environment. First results suggest that the selection of a sophisticated numerical solver for the differential equations positively affects simulation outcomes. However, some simple and easy-to-implement explicit methods already perform surprisingly well and need less computational effort than more advanced but time-consuming implicit techniques. There is general evidence that ambiguous and subjective user decisions form a major source of uncertainty and can greatly influence model development and application at all stages.
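The solver comparison can be illustrated on a single linear-reservoir equation, dS/dt = P - K*S, integrated with a hand-coded explicit Euler scheme and with an implicit method from scipy; the parameters are arbitrary and the example is only a schematic of the solver-choice experiment, not the catchment model.

```python
import numpy as np
from scipy.integrate import solve_ivp

K = 0.5      # 1/d, linear-reservoir recession constant (arbitrary)
P = 5.0      # mm/d, constant effective precipitation (arbitrary)

def explicit_euler(S0, dt, t_end):
    """Hand-coded explicit Euler integration of dS/dt = P - K*S."""
    t = np.arange(0.0, t_end + dt, dt)
    S = np.empty_like(t)
    S[0] = S0
    for i in range(1, t.size):
        S[i] = S[i - 1] + dt * (P - K * S[i - 1])
    return S

S_euler = explicit_euler(S0=0.0, dt=0.1, t_end=20.0)
sol = solve_ivp(lambda t, S: P - K * S, (0.0, 20.0), [0.0],
                method="Radau", rtol=1e-8, atol=1e-10)     # implicit solver
exact = P / K * (1.0 - np.exp(-K * 20.0))                  # analytical solution for S0 = 0

print(f"explicit Euler  S(20) = {S_euler[-1]:.4f} mm")
print(f"implicit Radau  S(20) = {sol.y[0, -1]:.4f} mm")
print(f"analytical      S(20) = {exact:.4f} mm")
```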
Simulation of Assembly Line Balancing in Automotive Component Manufacturing
NASA Astrophysics Data System (ADS)
Jamil, Muthanna; Mohd Razali, Noraini
2016-02-01
This study focuses on the simulation of assembly line balancing for an automotive component at a vendor manufacturing company. A mixed-model assembly line for a charcoal canister product, which is used in an engine system as a fuel vapour filter, was observed, and it was found that the current production rate of the line does not meet customer demand even though the company holds two days of buffer stock in advance. This study was carried out by performing detailed process flow and time studies along the line. To set up a simulation model of the line, real data were taken from the factory floor and tested for distribution fit. The gathered data were then transformed into a simulation model. After verification of the model by comparison with the actual system, it was found that the current line efficiency is not at its optimum due to blockage and idle time. Various what-if analyses were applied to eliminate the causes. The proposed layout shows that the line can be balanced by adding buffers to avoid blockage, while manpower is added to stations to reduce process time and thereby reduce idle time. The simulation study was carried out using ProModel software.
Two-dimensional numerical simulation of boron diffusion for pyramidally textured silicon
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Fa-Jun; Duttagupta, Shubham
2014-11-14
Multidimensional numerical simulation of boron diffusion is of great relevance for the improvement of industrial n-type crystalline silicon wafer solar cells. However, surface passivation of boron diffused area is typically studied in one dimension on planar lifetime samples. This approach neglects the effects of the solar cell pyramidal texture on the boron doping process and resulting doping profile. In this work, we present a theoretical study using a two-dimensional surface morphology for pyramidally textured samples. The boron diffusivity and segregation coefficient between oxide and silicon in simulation are determined by reproducing measured one-dimensional boron depth profiles prepared using different boron diffusion recipes on planar samples. The established parameters are subsequently used to simulate the boron diffusion process on textured samples. The simulated junction depth is found to agree quantitatively well with electron beam induced current measurements. Finally, chemical passivation on planar and textured samples is compared in device simulation. Particularly, a two-dimensional approach is adopted for textured samples to evaluate chemical passivation. The intrinsic emitter saturation current density, which is only related to Auger and radiative recombination, is also simulated for both planar and textured samples. The differences between planar and textured samples are discussed.
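In one dimension, a constant-source boron diffusion is often approximated by a complementary-error-function depth profile, the kind of planar profile used to calibrate diffusivity and segregation before moving to the textured two-dimensional geometry. The diffusivity, time, and surface concentration below are illustrative numbers, not the fitted recipe parameters.

```python
import numpy as np
from scipy.special import erfc

def constant_source_profile(depth_cm, diffusivity_cm2_s, time_s, surface_conc_cm3):
    """1-D constant-source diffusion: C(x, t) = C_s * erfc(x / (2*sqrt(D*t)))."""
    return surface_conc_cm3 * erfc(depth_cm / (2.0 * np.sqrt(diffusivity_cm2_s * time_s)))

# Illustrative drive-in: D = 2e-15 cm^2/s for 30 min, surface concentration 1e20 cm^-3.
depth = np.linspace(0.0, 0.5e-4, 6)                 # 0 to 0.5 um, expressed in cm
conc = constant_source_profile(depth, 2e-15, 1800.0, 1e20)
for x, c in zip(depth, conc):
    print(f"x = {x * 1e4:.2f} um   C = {c:.2e} cm^-3")
```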
NASA Astrophysics Data System (ADS)
Park, Han-Earl; Park, Sang-Young; Kim, Sung-Woo; Park, Chandeok
2013-12-01
Development and experiment of an integrated orbit and attitude hardware-in-the-loop (HIL) simulator for autonomous satellite formation flying are presented. The integrated simulator system consists of an orbit HIL simulator for orbit determination and control, and an attitude HIL simulator for attitude determination and control. The integrated simulator involves four processes (orbit determination, orbit control, attitude determination, and attitude control), which interact with each other in the same way as actual flight processes do. Orbit determination is conducted by a relative navigation algorithm using double-difference GPS measurements based on the extended Kalman filter (EKF). Orbit control is performed by a state-dependent Riccati equation (SDRE) technique that is utilized as a nonlinear controller for the formation control problem. Attitude is determined from an attitude heading reference system (AHRS) sensor, and a proportional-derivative (PD) feedback controller is used to control the attitude HIL simulator using three momentum wheel assemblies. Integrated orbit and attitude simulations are performed for a formation reconfiguration scenario. By performing the four processes adequately, the desired formation reconfiguration from a baseline of 500-1000 m was achieved with meter-level position error and millimeter-level relative position navigation. This HIL simulation demonstrates the performance of the integrated HIL simulator and the feasibility of the applied algorithms in a real-time environment. Furthermore, the integrated HIL simulator system developed in the current study can be used as a ground-based testing environment to reproduce possible actual satellite formation operations.
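As a rough illustration of the attitude-control loop described above, the following sketch implements a single-axis PD law of the form torque = -kp*error - kd*rate with assumed gains and inertia; the testbed's actual three-wheel controller and parameters are not reproduced here.

```python
# Minimal sketch (assumed gains and inertia, not the testbed's values): a
# single-axis version of a PD attitude control law of the kind used with a
# momentum wheel, stepped with simple rigid-body dynamics.
import numpy as np

I_axis, kp, kd, dt = 0.05, 0.8, 0.4, 0.01     # kg*m^2, N*m/rad, N*m*s/rad, s
theta, omega, theta_cmd = np.deg2rad(10.0), 0.0, 0.0

for _ in range(2000):                          # 20 s of simulated time
    torque = -kp * (theta - theta_cmd) - kd * omega   # PD feedback torque
    omega += dt * torque / I_axis                     # rigid-body rate update
    theta += dt * omega                               # attitude update
print(f"attitude error after 20 s: {np.rad2deg(theta):.3f} deg")
```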
Zhang, Xiaobin; Li, Qiong; Eskine, Kendall J.; Zuo, Bin
2014-01-01
The current studies extend perceptual symbol systems theory to the processing of gender categorization by revealing that gender categorization recruits perceptual simulations of spatial height and size dimensions. In Study 1, categorization of male faces was faster when the faces were in the “up” (i.e., higher on the vertical axis) rather than the “down” (i.e., lower on the vertical axis) position, and vice versa for female face categorization. Study 2 found that responses to male names depicted in a larger font were faster than responses to male names depicted in a smaller font, whereas the opposite response pattern was found for female names. Study 3 confirmed that the effect in Study 2 was not due to metaphoric relationships between gender and social power. Together, these findings suggest that the representation of gender (social categorization) also involves processes of perceptual simulation. PMID:24587022
ERIC Educational Resources Information Center
Fan, Xinxin; Geelan, David; Gillies, Robyn
2018-01-01
This study investigated the effectiveness of a novel inquiry-based instructional sequence using interactive simulations for supporting students' development of conceptual understanding, inquiry process skills and confidence in learning. The study, conducted in Beijing, involved two teachers and 117 students in four classes. The teachers…
Lectures and Simulation Laboratories to Improve Learners' Conceptual Understanding
ERIC Educational Resources Information Center
Brophy, Sean P.; Magana, Alejandra J.; Strachan, Alejandro
2013-01-01
We studied the use of online molecular dynamics simulations (MD) to enhance student abilities to understand the atomic processes governing plastic deformation in materials. The target population included a second-year undergraduate engineering course in the School of Materials Engineering at Purdue University. The objectives of the study were to…
Evaluating Vertical Moisture Structure of the Madden-Julian Oscillation in Contemporary GCMs
NASA Astrophysics Data System (ADS)
Guan, B.; Jiang, X.; Waliser, D. E.
2013-12-01
The Madden-Julian Oscillation (MJO) remains a major challenge in our understanding and modeling of tropical convection and circulation. Many models have trouble realistically simulating key characteristics of the MJO, such as its strength, period, and eastward propagation. For models that do simulate aspects of the MJO, it remains to be understood which parameters and processes are most critical in determining the quality of the simulations. This study focuses on the vertical structure of moisture in MJO simulations, with the aim of identifying and understanding the relationship between MJO simulation quality and key moisture-related parameters. A series of 20-year simulations conducted by 26 GCMs are analyzed, including four that are coupled to ocean models and two that have a two-dimensional cloud-resolving model embedded (i.e., superparameterized). TRMM precipitation and the ERA-Interim reanalysis are used to evaluate the model simulations. MJO simulation quality is evaluated based on pattern correlations of lead/lag regressions of precipitation, a measure of the model representation of the eastward-propagating MJO convection. Models with the strongest and weakest MJOs (top and bottom quartiles) are compared in terms of differences in moisture content, moisture convergence, moistening rate, and moist static energy. It is found that models with the strongest MJOs have better representations of the observed vertical tilt of moisture. The relative importance of convection, advection, the boundary layer, and large-scale convection/precipitation is discussed in terms of their contributions to the moistening process. The results highlight the overall importance of the vertical moisture structure in MJO simulations. The work contributes to the climatological component of the joint WCRP-WWRP/THORPEX YOTC MJO Task Force and the GEWEX Atmosphere System Study (GASS) global model evaluation project focused on the vertical structure and diabatic processes of the MJO.
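The evaluation metric mentioned above, pattern correlation of lead/lag regressions of precipitation, can be sketched as follows with synthetic placeholder data; the function names and the choice of reference index are illustrative assumptions, not the MJO Task Force diagnostics code.

```python
# Minimal sketch (assumed, not the Task Force code): scoring a model's MJO
# propagation by pattern-correlating its lag-regression structure of
# precipitation against an observed one. Data here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
ntime, nlon, lags = 1000, 144, np.arange(-20, 21)

def lag_regression(precip, index, lags):
    """Regress precip(time, lon) onto a unit-variance index at each lead/lag."""
    n = len(index)
    index = (index - index.mean()) / index.std()
    maps = np.empty((len(lags), precip.shape[1]))
    for i, lag in enumerate(lags):
        if lag >= 0:
            x, y = index[: n - lag], precip[lag:, :]
        else:
            x, y = index[-lag:], precip[: n + lag, :]
        maps[i] = (x[:, None] * (y - y.mean(axis=0))).mean(axis=0)
    return maps

def pattern_correlation(a, b):
    a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

obs_precip = rng.standard_normal((ntime, nlon))
mod_precip = obs_precip + 0.5 * rng.standard_normal((ntime, nlon))
index = obs_precip[:, 60]   # e.g. precipitation averaged over a reference region

skill = pattern_correlation(lag_regression(obs_precip, index, lags),
                            lag_regression(mod_precip, index, lags))
print(f"lead/lag pattern correlation skill: {skill:.2f}")
```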
NASA Astrophysics Data System (ADS)
Cruz Inclán, Carlos M.; González Lazo, Eduardo; Rodríguez Rodríguez, Arturo; Guzmán Martínez, Fernando; Abreu Alfonso, Yamiel; Piñera Hernández, Ibrahin; Leyva Fabelo, Antonio
2017-09-01
The present work deals with the numerical simulation of gamma and electron radiation damage processes under high-brightness, high-particle-fluence conditions, with regard to two new radiation-induced atom displacement processes. It concerns both the Monte Carlo based numerical simulation of atom displacement events resulting from gamma and electron interactions and transport in a solid matrix, and the atom displacement threshold energies calculated by Molecular Dynamics methodologies. The two new radiation damage processes considered here under high-brightness and high-fluence irradiation conditions are: 1) radiation-induced atom displacement due to a single primary knockout atom excitation in a defective target crystal matrix, whose defect concentrations (vacancies, interstitials and Frenkel pairs) increase as a result of severe and progressive radiation damage to the material; and 2) atom displacements related to multiple primary knockout atom excitations of the same or different atomic species in a perfect target crystal matrix, due to subsequent electron elastic atomic scattering in the same atomic neighborhood within a crystal lattice relaxation time. A review of numerical simulation attempts for these two new radiation damage processes is presented, starting from the previously developed algorithms and codes for Monte Carlo simulation of atom displacements induced by electron and gamma in
Computer Simulation in Predicting Biochemical Processes and Energy Balance at WWTPs
NASA Astrophysics Data System (ADS)
Drewnowski, Jakub; Zaborowska, Ewa; Hernandez De Vega, Carmen
2018-02-01
Nowadays, the use of mathematical models and computer simulation allows the analysis of many different technological solutions, as well as the testing of various scenarios, in a short time and on a low budget, in order to simulate scenarios under conditions typical of the real system and help find the best solution in the design or operation process. The aim of the study was to evaluate different concepts of biochemical process and energy balance modelling using the simulation platform GPS-x and the comprehensive model Mantis2. The paper presents an example of the calibration and validation processes in the biological reactor, as well as scenarios showing the influence of operational parameters on the WWTP energy balance. The results of batch tests and a full-scale campaign obtained in earlier work were used to predict biochemical and operational parameters in a newly developed plant model. The model was extended with sludge treatment devices, including an anaerobic digester. Primary sludge removal efficiency was found to be a significant factor determining biogas production and, consequently, renewable energy production in cogeneration. Water and wastewater utilities, which run and control WWTPs, are interested in optimizing the process in order to protect the environment, save money and decrease pollutant emissions to water and air. In this context, computer simulation can be the easiest and a very useful tool to improve efficiency without interfering with the actual process performance.
Molecular dynamics studies on the DNA-binding process of ERG.
Beuerle, Matthias G; Dufton, Neil P; Randi, Anna M; Gould, Ian R
2016-11-15
The ETS family of transcription factors regulate gene targets by binding to a core GGAA DNA-sequence. The ETS factor ERG is required for homeostasis and lineage-specific functions in endothelial cells, some subset of haemopoietic cells and chondrocytes; its ectopic expression is linked to oncogenesis in multiple tissues. To date details of the DNA-binding process of ERG including DNA-sequence recognition outside the core GGAA-sequence are largely unknown. We combined available structural and experimental data to perform molecular dynamics simulations to study the DNA-binding process of ERG. In particular we were able to reproduce the ERG DNA-complex with a DNA-binding simulation starting in an unbound configuration with a final root-mean-square-deviation (RMSD) of 2.1 Å to the core ETS domain DNA-complex crystal structure. This allowed us to elucidate the relevance of amino acids involved in the formation of the ERG DNA-complex and to identify Arg385 as a novel key residue in the DNA-binding process. Moreover we were able to show that water-mediated hydrogen bonds are present between ERG and DNA in our simulations and that those interactions have the potential to achieve sequence recognition outside the GGAA core DNA-sequence. The methodology employed in this study shows the promising capabilities of modern molecular dynamics simulations in the field of protein DNA-interactions.
The forming simulation of flexible glass with the slit down draw method
NASA Astrophysics Data System (ADS)
Yansheng, Hou; Jinshu, Cheng; Junfeng, Kang; Jing, Cui
2018-03-01
The slit down draw method is the main manufacturing process for flexible glass. In this study, Flow3DTM software was used to simulate the drawing and thinning of glass during the slit down draw process. The influence of glass viscosity, initial plate thickness and initial plate speed on the glass spreading process was studied. The maximum pull-down force that the root can bear is proportional to the viscosity, to the initial plate thickness raised to the power of 1.3837, and to the initial plate speed. The best way to improve the tensile strength of flexible glass is to increase the viscosity. Flexible glass was more easily obtained with low viscosity, low thickness and low drawing speed.
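The reported scaling of the maximum pull-down force can be written as a one-line relation; the sketch below encodes it with an unknown, purely illustrative prefactor C that would have to be fitted to the Flow3D results.

```python
# Minimal sketch of the reported scaling: the maximum pull-down force the glass
# root can bear scales linearly with viscosity and initial draw speed, and with
# the initial sheet thickness to the 1.3837 power. The prefactor C is unknown
# here and purely illustrative; it would have to be fitted to the Flow3D runs.
def max_pull_down_force(viscosity_pa_s, thickness_m, speed_m_s, C=1.0):
    """F_max ~ C * mu * h0**1.3837 * v0 (illustrative prefactor C)."""
    return C * viscosity_pa_s * thickness_m**1.3837 * speed_m_s

# Doubling viscosity doubles the allowable pull-down force in this scaling:
print(max_pull_down_force(1e4, 1e-3, 0.01) / max_pull_down_force(5e3, 1e-3, 0.01))
```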
The perceived value of using BIM for energy simulation
NASA Astrophysics Data System (ADS)
Lewis, Anderson M.
Building Information Modeling (BIM) is becoming an increasingly important tool in the Architectural, Engineering & Construction (AEC) industries. Some of the benefits associated with BIM include, but are not limited to, cost and time savings through greater trade and design coordination, and more accurate estimating take-offs. BIM is virtual 3D, parametric design software that allows users to store information of a model within it and can be used as a communication platform between project stakeholders. Likewise, energy simulation is an integral tool for predicting and optimizing a building's performance during design. Creating energy models and running energy simulations can be a time-consuming activity due to the large number of parameters and assumptions that must be addressed to achieve reasonably accurate results. However, leveraging information embedded within Building Information Models (BIMs) has the potential to increase accuracy and reduce the amount of time required to run energy simulations, and can facilitate continuous energy simulation throughout the design process, thus optimizing building performance. Although some literature exists on how design stakeholders perceive the benefits associated with leveraging BIM for energy simulation, little is known about how these perceptions differ between various green design stakeholder user groups. Through an e-survey instrument, this study seeks to determine how perceptions of using BIMs to inform energy simulation differ among distinct design stakeholder groups, which include BIM-only users, energy simulation-only users, and BIM and energy simulation users. Additionally, this study seeks to determine what design stakeholders perceive as the main barriers and benefits of implementing BIM-based energy simulation. Results from this study suggest that little to no correlation exists between green design stakeholders' perceptions of the value associated with using information from BIMs to inform energy simulation and their engagement level with BIM and/or energy simulation. However, these perceptions and engagement levels may differ between user groups (i.e. BIM-only users, energy simulation-only users, and BIM and energy simulation users). For example, the BIM-only user group appeared to show a strong positive correlation between perceptions of the value associated with using information from BIMs to inform energy simulation and engagement with BIM. Additionally, this study suggests that the top perceived benefits of using BIMs to inform energy simulations among green design stakeholders are facilitation of communication, reduction of process-related costs, and the ability to examine more design options. The main perceived barrier of using BIMs to inform energy simulations among green design stakeholders was a lack of BIM standards for model integration with multidisciplinary teams. Results from this study will help readers understand how to better implement BIM-based energy simulation while mitigating barriers and optimizing benefits. Additionally, examining discrepancies between user groups can lead to the identification and improvement of shortfalls in current BIM-based energy simulation processes.
Understanding how perceptions and engagement levels differ among different software user groups will help in developing strategies for implementing BIM-based energy simulation that are tailored to each specific user group.
Visualization Methods for Viability Studies of Inspection Modules for the Space Shuttle
NASA Technical Reports Server (NTRS)
Mobasher, Amir A.
2005-01-01
An effective simulation of an object, process, or task must be similar to that object, process, or task. A simulation could consist of a physical device, a set of mathematical equations, a computer program, a person, or some combination of these. There are many reasons for the use of simulators. Although some of the reasons are unique to a specific situation, there are many general reasons and purposes for using simulators, including but not limited to (1) safety, (2) scarce resources, (3) teaching/education, (4) additional capabilities, (5) flexibility and (6) cost. Robot simulators are in use for all of these reasons. Virtual environments such as simulators eliminate physical contact with humans and hence increase the safety of the work environment. Corporations with limited funding and resources may utilize simulators to accomplish their goals while saving manpower and money. A computer simulation is safer than working with a real robot. Robots are typically a scarce resource: schools typically don't have a large number of robots, if any, and factories don't want robots taken away from useful work unless absolutely necessary. Robot simulators are useful in teaching robotics; a simulator gives a student hands-on experience, if only with a simulator. The simulator is also more flexible: a user can quickly change the robot configuration, workcell, or even replace the robot with a different one altogether. In order to be useful, a robot simulator must create a model that accurately performs like the real robot. A powerful simulator is usually thought of as a combination of a CAD package with simulation capabilities. Computer Aided Design (CAD) techniques are used extensively by engineers in virtually all areas of engineering. Parts are designed interactively, aided by the graphical display of both wireframe and more realistic shaded renderings. Once a part's dimensions have been specified to the CAD package, designers can view the part from any direction to examine how it will look and perform in relation to other parts. If changes are deemed necessary, the designer can easily make the changes and view the results graphically. However, a complex process of moving parts intended for operation in a complex environment can only be fully understood through animated graphical simulation. A CAD package with simulation capabilities allows the designer to develop geometrical models of the process being designed, as well as the environment in which the process will be used, and then test the process in graphical animation much as the actual physical system would be run. By being able to operate the system of moving and stationary parts, the designer is able to see in simulation how the system will perform under a wide variety of conditions. If, for example, undesired collisions occur between parts of the system, design changes can be easily made without the expense or potential danger of testing the physical system.
Beltrán, F R; Lorenzo, V; Acosta, J; de la Orden, M U; Martínez Urreaga, J
2018-06-15
The aim of this work is to study the effects of different simulated mechanical recycling processes on the structure and properties of PLA. A commercial grade of PLA was melt compounded and compression molded, and then subjected to two different recycling processes. The first recycling process consisted of accelerated ageing and a second melt processing step, while the other recycling process included accelerated ageing, a demanding washing process and a second melt processing step. The intrinsic viscosity measurements indicate that both recycling processes produce degradation of the PLA, which is more pronounced in the sample subjected to the washing process. DSC results suggest an increase in the mobility of the polymer chains in the recycled materials; however, the degree of crystallinity of the PLA seems unchanged. The optical, mechanical and gas barrier properties of PLA do not seem to be greatly affected by the degradation suffered during the different recycling processes. These results suggest that, despite the degradation of PLA, the impact of the different simulated mechanical recycling processes on the final properties is limited. Thus, the potential use of recycled PLA in packaging applications is not jeopardized. Copyright © 2017 Elsevier Ltd. All rights reserved.
Virtual Collaborative Simulation Environment for Integrated Product and Process Development
NASA Technical Reports Server (NTRS)
Gulli, Michael A.
1997-01-01
Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.
Computer simulations and experimental study on crash box of automobile in low speed collision
NASA Astrophysics Data System (ADS)
Liu, Yanjie; Ding, Lin; Yan, Shengyuan; Yang, Yongsheng
2008-11-01
Motivated by the problems of energy-absorbing components in low speed automobile collisions, and taking a low speed frontal crash test of a crash box as the example, a simulation analysis of the crash box impact process was carried out with HyperMesh and LS-DYNA. The influence of each modeling parameter was analyzed through analytical solutions and comparison with tests, which ensured that the model was accurate. The combination of experiment and simulation results identified the weak part of the crash box structure with respect to crashworthiness, and methods for improving crash box crashworthiness were discussed. The analysis results obtained from the numerical simulation of the crash box impact process were used to optimize the crash box design. This is helpful for improving the vehicle structure and minimizing losses from collision accidents, and it also provides a useful method for further research on automobile collisions.
Evaluating WRF Simulations of Urban Boundary Layer Processes during DISCOVER-AQ
NASA Astrophysics Data System (ADS)
Hegarty, J. D.; Henderson, J.; Lewis, J. R.; McGrath-Spangler, E. L.; Scarino, A. J.; Ferrare, R. A.; DeCola, P.; Welton, E. J.
2015-12-01
The accurate representation of processes in the planetary boundary layer (PBL) in meteorological models is of prime importance to air quality and greenhouse gas simulations, as it governs the depth to which surface emissions are vertically mixed and influences the efficiency with which they are transported downwind. In this work we evaluate high-resolution (~1 km) WRF simulations of PBL processes in the Washington DC - Baltimore and Houston urban areas during the respective DISCOVER-AQ 2011 and 2013 field campaigns using MPLNET micro-pulse lidar (MPL), mini-MPL, airborne high spectral resolution lidar (HSRL), Doppler wind profiler and CALIPSO satellite measurements, along with complementary surface and aircraft measurements. We will discuss how well WRF simulates the spatiotemporal variability of the PBL height in the urban areas and the development of fine-scale meteorological features, such as bay and sea breezes, that influence the air quality of the urban areas studied.
Mejía, Vilma; Gonzalez, Carlos; Delfino, Alejandro E; Altermatt, Fernando R; Corvetto, Marcia A
The primary purpose of this study was to compare the effect of high fidelity simulation versus a computer-based case-solving self-study on skills acquisition for malignant hyperthermia in first year anesthesiology residents. After institutional ethics committee approval, 31 first year anesthesiology residents were enrolled in this prospective randomized single-blinded study. Participants were randomized to either a High Fidelity Simulation Scenario or a computer-based Case Study about malignant hyperthermia. After the intervention, all subjects' performance was assessed through a high fidelity simulation scenario using a previously validated assessment rubric. Additionally, knowledge tests and a satisfaction survey were applied. Finally, a semi-structured interview was conducted to assess self-perception of the reasoning process and decision-making. Twenty-eight first year residents successfully finished the study. Residents' management skill scores were globally higher in the High Fidelity Simulation group than in the Case Study group; however, the differences were significant in 4 of the 8 performance rubric elements: recognition of signs and symptoms (p = 0.025), prioritization of initial management actions (p = 0.003), recognition of complications (p = 0.025) and communication (p = 0.025). Average scores from pre- and post-test knowledge questionnaires improved from 74% to 85% in the High Fidelity Simulation group, and decreased from 78% to 75% in the Case Study group (p = 0.032). Regarding the qualitative analysis, there was no difference in the factors influencing the students' process of reasoning and decision-making between the two teaching strategies. Simulation-based training with a malignant hyperthermia high-fidelity scenario was superior to a computer-based case study, improving knowledge and skills in malignant hyperthermia crisis management, with a very good satisfaction level among anesthesia residents. Copyright © 2018 Sociedade Brasileira de Anestesiologia. Publicado por Elsevier Editora Ltda. All rights reserved.
The Monash University Interactive Simple Climate Model
NASA Astrophysics Data System (ADS)
Dommenget, D.
2013-12-01
The Monash University interactive simple climate model is a web-based interface that allows students and the general public to explore the physical simulation of the climate system with a real global climate model. It is based on the Globally Resolved Energy Balance (GREB) model, a climate model published by Dommenget and Floeter [2011] in the international peer-reviewed journal Climate Dynamics. The model simulates most of the main physical processes in the climate system in a very simplistic way and therefore allows very fast and simple climate model simulations on a normal PC. Despite its simplicity, the model simulates the climate response to external forcings, such as a doubling of the CO2 concentration, very realistically (similar to state-of-the-art climate models). The Monash simple climate model web interface allows you to study the results of more than 2000 different model experiments in an interactive way, to work through a number of tutorials on the interactions of physical processes in the climate system, and to solve some puzzles. By switching physical processes off and on you can deconstruct the climate and learn how the different processes interact to generate the observed climate and how they interact to generate the IPCC-predicted climate change for an anthropogenic CO2 increase. The presentation will illustrate how this web-based tool works and what the possibilities for teaching students with this tool are.
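As a much simpler illustration of the energy-balance idea behind GREB (not the GREB model itself), the following sketch steps a zero-dimensional global energy balance forward in time and shows the equilibrium response to a CO2-doubling forcing; all parameter values are textbook assumptions.

```python
# Minimal sketch (far simpler than GREB itself): a zero-dimensional global
# energy balance dT/dt = (S0/4*(1-albedo) - eps*sigma*T**4 + F) / C,
# stepped forward to show the equilibrium response to a CO2 doubling.
# All parameter values are standard textbook numbers, not taken from GREB.
S0, albedo, sigma, eps = 1361.0, 0.30, 5.67e-8, 0.62   # W/m2, -, W/m2/K4, -
C = 4.0e8          # effective heat capacity of the climate system [J/m2/K], assumed
F_2xCO2 = 3.7      # radiative forcing of a CO2 doubling [W/m2]

def integrate(forcing, years=200, dt_days=1.0, T0=288.0):
    T, dt = T0, dt_days * 86400.0
    for _ in range(int(years * 365 / dt_days)):
        net = S0 / 4 * (1 - albedo) - eps * sigma * T**4 + forcing
        T += dt * net / C
    return T

T_control = integrate(0.0)
T_doubled = integrate(F_2xCO2)
print(f"equilibrium warming for 2xCO2 (no feedbacks): {T_doubled - T_control:.2f} K")
```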
Simulation of plasma loading of high-pressure RF cavities
NASA Astrophysics Data System (ADS)
Yu, K.; Samulyak, R.; Yonehara, K.; Freemire, B.
2018-01-01
Muon beam-induced plasma loading of radio-frequency (RF) cavities filled with high-pressure hydrogen gas with a 1% dry air dopant has been studied via numerical simulations. The electromagnetic code SPACE, which resolves relevant atomic physics processes, including ionization by the muon beam, electron attachment to dopant molecules, and electron-ion and ion-ion recombination, has been used. Simulation studies have been performed in the range of parameters typical for practical muon cooling channels.
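A zero-dimensional sketch of the competing processes listed above (beam ionization, attachment to the dopant, and electron-ion and ion-ion recombination) is given below; the rate coefficients are placeholders and the sketch is not the SPACE code.

```python
# Minimal sketch (assumed rates, not the SPACE code): 0-D rate equations for
# beam-induced plasma in a doped-gas-filled RF cavity, with ionization source S,
# electron attachment to the dopant (rate k_a) and electron-ion / ion-ion
# recombination (rates beta_ei, beta_ii). All coefficients are placeholders.
from scipy.integrate import solve_ivp

S       = 1.0e16   # ionization rate by the muon beam [1/m3/s], assumed
k_a     = 1.0e5    # electron attachment frequency to dopant [1/s], assumed
beta_ei = 1.0e-13  # electron-ion recombination coefficient [m3/s], assumed
beta_ii = 1.0e-14  # ion-ion recombination coefficient [m3/s], assumed

def rhs(t, y):
    n_e, n_pos, n_neg = y
    return [S - k_a * n_e - beta_ei * n_e * n_pos,                 # electrons
            S - beta_ei * n_e * n_pos - beta_ii * n_pos * n_neg,   # positive ions
            k_a * n_e - beta_ii * n_pos * n_neg]                   # negative ions

sol = solve_ivp(rhs, (0.0, 1e-3), [0.0, 0.0, 0.0], method="BDF")
print("densities at t = 1 ms [1/m3]:", sol.y[:, -1])
```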
LISP based simulation generators for modeling complex space processes
NASA Technical Reports Server (NTRS)
Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing
1987-01-01
The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.
NASA Technical Reports Server (NTRS)
Menon, Suresh
2003-01-01
This report summarizes the progress made in the first 8 to 9 months of this research. The Lattice Boltzmann Equation (LBE) methodology for Large-Eddy Simulations (LES) of microblowing has been validated using a jet-in-crossflow test configuration. In this study, the flow intake is also simulated to allow the interaction to occur naturally. The Lattice Boltzmann Equation Large-Eddy Simulation (LBELES) approach not only captures the flow features associated with the jet, such as hairpin vortices and recirculation behind the jet, but also shows better agreement with experiments than previous RANS predictions. The LBELES is shown to be computationally very efficient and is therefore a viable method for simulating the injection process. Two strategies have been developed to simulate the multi-hole injection process as in the experiment. In order to allow natural interaction between the injected fluid and the primary stream, the flow intakes for all the holes have to be simulated. The LBE method is computationally efficient but is still 3D in nature, so there may be some computational penalty. In order to study a large number of holes, a new 1D subgrid model has been developed that simulates a reduced form of the Navier-Stokes equations in these holes.
Investigation of roughing machining simulation by using visual basic programming in NX CAM system
NASA Astrophysics Data System (ADS)
Hafiz Mohamad, Mohamad; Nafis Osman Zahid, Muhammed
2018-03-01
This paper outlines a simulation study to investigate the characteristics of roughing machining simulation in 4th-axis milling processes by utilizing Visual Basic programming in the NX CAM system. The selection and optimization of the cutting orientation in rough milling operations is critical in 4th-axis machining. The main purpose of the roughing operation is to approximately shape the machined part into its finished form by removing the bulk of material from the workpiece. In this paper, the simulations are executed by manipulating a set of different cutting orientations to estimate the volume removed from the machined parts. The cutting orientation with the highest volume removal is taken as the optimum and chosen for the roughing operation. In order to run the simulations, customized software is developed to assist the routines. Operation build-up instructions in the NX CAM interface are translated into program code via the advanced tools available in Visual Basic Studio. The code is customized and equipped with decision-making tools to run and control the simulations, and it permits integration with independent program files to execute specific operations. This paper discusses the simulation program and identifies optimum cutting orientations for roughing processes. The output of this study will broaden the simulation routines performed in NX CAM systems.
Numerical simulations for active tectonic processes: increasing interoperability and performance
NASA Technical Reports Server (NTRS)
Donnellan, A.; Fox, G.; Rundle, J.; McLeod, D.; Tullis, T.; Grant, L.
2002-01-01
The objective of this project is to produce a system to fully model earthquake-related data. This task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling.
Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L
2016-09-21
Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived commitment, influence and confidence of stakeholders in implementing policy and program decisions identified in the modelling process; and the impact of the process in terms of policy and program change. The study will generate empirical evidence on the feasibility and potential value of simulation modelling to support knowledge mobilisation and consensus building in health settings.
Improving operational anodising process performance using simulation approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liong, Choong-Yeun, E-mail: lg@ukm.edu.my; Ghazali, Syarah Syahidah, E-mail: syarah@gapps.kptm.edu.my
The use of aluminium is very widespread, especially in the transportation, electrical and electronics, architectural, automotive and engineering sectors. Anodizing is therefore an important process for making aluminium durable, attractive and weather resistant. This research is focused on the anodizing process operations in the manufacture and supply of aluminium extrusions. The data required for the development of the model were collected from the observations and interviews conducted in the study. To study the current system, the processes involved in anodizing are modelled using Arena 14.5 simulation software. These consist of five main processes, namely degreasing, etching, desmut, anodizing and sealing, together with 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on the comparisons made between the improvement methods, productivity could be increased by reallocating the workers and reducing loading time.
A tool for simulating parallel branch-and-bound methods
NASA Astrophysics Data System (ADS)
Golubeva, Yana; Orlov, Yury; Posypkin, Mikhail
2016-01-01
The Branch-and-Bound method is known as one of the most powerful but very resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution. Therefore, the design and study of load balancing algorithms is a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, sizes of the search tree, and characteristics of the supercomputer's interconnect, thereby fostering deep study of load distribution strategies. The process of solving the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.
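The core idea of replacing the B&B search by a stochastic branching process and measuring a load-balancing policy against it can be sketched as follows; the branching probability, the round-robin policy and the worker count are illustrative assumptions rather than the paper's simulator.

```python
# Minimal sketch (assumed, not the paper's simulator): the branch-and-bound
# search tree is replaced by a stochastic branching process, and subtrees are
# handed out to workers by a naive static round-robin policy so that the load
# imbalance of the policy can be measured. Branching probabilities are made up.
import random

def random_subtree_size(p_branch=0.49, max_nodes=10_000):
    """Nodes in a random binary subtree: each node branches with prob p_branch."""
    nodes, frontier = 0, 1
    while frontier and nodes < max_nodes:
        nodes += 1
        frontier -= 1
        if random.random() < p_branch:
            frontier += 2
    return nodes

def simulate(n_workers=8, n_root_subtrees=64, seed=1):
    random.seed(seed)
    load = [0] * n_workers
    for i in range(n_root_subtrees):          # round-robin assignment of subtrees
        load[i % n_workers] += random_subtree_size()
    # With no dynamic redistribution, makespan is set by the busiest worker.
    return max(load) / (sum(load) / n_workers)

print(f"makespan / ideal makespan: {simulate():.2f}")
```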
A study of the required Rayleigh number to sustain dynamo with various inner core radius
NASA Astrophysics Data System (ADS)
Nishida, Y.; Katoh, Y.; Matsui, H.; Kumamoto, A.
2017-12-01
It is widely accepted that the geomagnetic field is sustained by thermally and compositionally driven convection of a liquid iron alloy in the outer core. The generation process of the geomagnetic field has been studied by a number of MHD dynamo simulations. Recent studies of the Earth's core evolution suggest that the ratio of the inner solid core radius ri to the outer liquid core radius ro changed from ri/ro = 0 to 0.35 during the last one billion years. There are some studies of the dynamo in the early Earth, when the inner core was smaller than at present. Heimpel et al. (2005) determined from simulations the Rayleigh number Ra at the onset of the dynamo process as a function of ri/ro, while paleomagnetic observations show that the geomagnetic field has been sustained for 3.5 billion years. While Heimpel and Evans (2013) studied dynamo processes taking into account the thermal history of the Earth's interior, there were few cases corresponding to the early Earth. Driscoll (2016) performed a series of dynamo simulations based on a thermal evolution model. Despite the number of dynamo simulations, the dynamo process occurring in the interior of the early Earth has not been fully understood, because the magnetic Prandtl numbers in these simulations are much larger than that of the actual outer core. In the present study, we performed thermally driven dynamo simulations with different aspect ratios ri/ro = 0.15, 0.25 and 0.35 to evaluate the critical Ra for thermal convection and the Ra required to maintain the dynamo. For this purpose, we performed simulations with various Ra while fixing the other control parameters, such as the Ekman, Prandtl, and magnetic Prandtl numbers. For the initial and boundary conditions, we followed dynamo benchmark case 1 of Christensen et al. (2001). The results show that the critical Ra increases as the aspect ratio ri/ro decreases. It is confirmed that a larger buoyancy amplitude is required to maintain the dynamo when the inner core is smaller.
Numerical simulation of the casting process of titanium removable partial denture frameworks.
Wu, Menghuai; Wagner, Ingo; Sahm, Peter R; Augthun, Michael
2002-03-01
The objective of this work was to study filling incompleteness and porosity defects in titanium removable partial denture frameworks by means of numerical simulation. Two frameworks, one for the lower jaw and one for the upper jaw, were chosen for simulation according to dentists' recommendations. The geometry of the frameworks was laser-digitized and imported into simulation software (MAGMASOFT). Both mold filling and solidification of the castings with different sprue designs (e.g. tree, ball, and runner-bar) were numerically calculated. The shrinkage porosity was quantitatively predicted by a feeding criterion, and the potential filling defects and gas pore sensitivity were estimated based on the filling and solidification results. A satisfactory sprue design and process parameters were finally recommended for real casting trials (four replicas of each framework). All the frameworks were successfully cast. Through X-ray radiographic inspection it was found that all the castings were acceptably sound, except for one case in which gas bubbles were detected in the clasp region of the frame. It is concluded that numerical simulation helps in understanding the casting process and defect formation in titanium frameworks, and hence in minimizing the risk of producing defective castings by improving the sprue design and process parameters.
Simulated interprofessional education: an analysis of teaching and learning processes.
van Soeren, Mary; Devlin-Cop, Sandra; Macmillan, Kathleen; Baker, Lindsay; Egan-Lee, Eileen; Reeves, Scott
2011-11-01
Simulated learning activities are increasingly being used in health professions and interprofessional education (IPE). Specifically, IPE programs are frequently adopting role-play simulations as a key learning approach. Despite this widespread adoption, there is little empirical evidence exploring the teaching and learning processes embedded within this type of simulation. This exploratory study provides insight into the nature of these processes through the use of qualitative methods. A total of 152 clinicians, 101 students and 9 facilitators, representing a range of health professions, participated in video-recorded role-plays and debrief sessions. Videotapes were analyzed to explore emerging issues and themes related to the teaching and learning processes embedded in this type of interprofessional simulated learning experience. In addition, three focus groups were conducted with a subset of participants to explore perceptions of their educational experiences. Five key themes emerged from the data analysis: enthusiasm and motivation, professional role assignment, scenario realism, facilitator style and background, and team facilitation. Our findings suggest that program developers need to be mindful of these five themes when using role-plays in an interprofessional context and point to the importance of deliberate and skilled facilitation in meeting desired learning outcomes.
DWPF Simulant CPC Studies For SB8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newell, J. D.
2013-09-25
Prior to processing a Sludge Batch (SB) in the Defense Waste Processing Facility (DWPF), flowsheet studies using simulants are performed. Typically, the flowsheet studies are conducted based on projected composition(s). The results from the flowsheet testing are used to 1) guide decisions during sludge batch preparation, 2) serve as a preliminary evaluation of potential processing issues, and 3) provide a basis to support the Shielded Cells qualification runs performed at the Savannah River National Laboratory (SRNL). SB8 was initially projected to be a combination of the Tank 40 heel (Sludge Batch 7b), Tank 13, Tank 12, and the Tank 51 heel. In order to accelerate preparation of SB8, the decision was made to defer the oxalate-rich material from Tank 12 to a future sludge batch. SB8 simulant studies without Tank 12 were reported in a separate report.1 The data presented in this report will be useful when processing future sludge batches containing Tank 12. The wash endpoint target for SB8 was set at a significantly higher sodium concentration to allow acceptable glass compositions at the targeted waste loading. Four non-coupled tests were conducted using simulant representing Tank 40 at 110-146% of the Koopman Minimum Acid requirement. Hydrogen was generated during high acid stoichiometry (146% acid) SRAT testing up to 31% of the DWPF hydrogen limit. SME hydrogen generation reached 48% of the DWPF limit for the high acid run. Two non-coupled tests were conducted using simulant representing Tank 51 at 110-146% of the Koopman Minimum Acid requirement. Hydrogen was generated during high acid stoichiometry SRAT testing up to 16% of the DWPF limit. SME hydrogen generation reached 49% of the DWPF limit for hydrogen in the SME for the high acid run. Simulant processing was successful using the previously established antifoam addition strategy. Foaming during formic acid addition was not observed in any of the runs. Nitrite was destroyed in all runs and no N2O was detected during SME processing. Mercury behavior was consistent with that seen in previous SRAT runs. Mercury was stripped below the DWPF limit of 0.8 wt% for all runs. Rheology yield stress fell within or below the design basis of 1-5 Pa. The low acid Tank 40 run (106% acid stoichiometry) had the highest yield stress at 3.78 Pa.
NASA Technical Reports Server (NTRS)
Gerber, C. R.
1972-01-01
The computation and logical functions which are performed by the data processing assembly of the modular space station are defined. The subjects discussed are: (1) requirements analysis, (2) baseline data processing assembly configuration, (3) information flow study, (4) throughput simulation, (5) redundancy study, (6) memory studies, and (7) design requirements specification.
Perceptual Processing Affects Conceptual Processing
ERIC Educational Resources Information Center
van Dantzig, Saskia; Pecher, Diane; Zeelenberg, Rene; Barsalou, Lawrence W.
2008-01-01
According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…
Cuerva, Marcos J; Piñel, Carlos S; Martin, Lourdes; Espinosa, Jose A; Corral, Octavio J; Mendoza, Nicolás
2018-02-12
The design of optimal courses for undergraduate obstetric teaching is a relevant question. This study evaluates two different designs of a simulator-based learning activity on childbirth with regard to respect for the patient, obstetric manoeuvres, interpretation of cardiotocography tracings (CTG) and infection prevention. This randomised experimental study consisted of two groups of undergraduate students who performed simulator-based learning activities on childbirth that differed in the content of their briefing sessions. For the first group, the briefing session included observation of a scenario properly performed by the teachers according to the Spanish clinical practice guidelines on care in normal childbirth, whereas for the second group the briefing did not include this observation and the students observed the properly performed scenario only after the simulation. The group that observed the properly performed scenario after the simulation obtained worse grades during the simulation, but better grades during the debriefing and evaluation. Simulator use in childbirth may be more fruitful when medical students observe correct performance at the completion of the scenario rather than at the start of the scenario. Impact statement What is already known on this subject? There is a scarcity of literature about the design of optimal high-fidelity simulation training in childbirth. It is known that preparing simulator-based learning activities is a complex process. Simulator-based learning includes the following steps: briefing, simulation, debriefing and evaluation. The most important part of high-fidelity simulation is the debriefing. A good briefing and simulation are of high relevance in order to have a fruitful debriefing session. What do the results of this study add? Our study describes a full simulator-based learning activity on childbirth that can be reproduced in similar facilities. The findings of this study add that high-fidelity simulation training in childbirth is favoured by a short briefing session and an abrupt start to the scenario, rather than a long briefing session that includes direct instruction in the scenario. What are the implications of these findings for clinical practice and/or further research? The findings of this study reveal what to include in the briefing of simulator-based learning activities on childbirth. These findings have implications for medical teaching and medical practice.
Scenario Development Process at the Vertical Motion Simulator
NASA Technical Reports Server (NTRS)
Reardon, Scott E.; Beard, Steven D.; Lewis, Emily
2017-01-01
There has been a significant effort within the simulation community to standardize many aspects of flight simulation. More recently, an effort has begun to develop a formal scenario definition language for aviation. A working group within the AIAA Modeling and Simulation Technical Committee has been created to develop a standard aviation scenario definition language, though much of the initial effort has been tailored to training simulators. Research and development (R&D) simulators, like the Vertical Motion Simulator (VMS), and training simulators have different missions and thus have different scenario requirements. The purpose of this paper is to highlight some of the unique tasks and scenario elements used at the VMS so they may be captured by scenario standardization efforts. The VMS most often performs handling qualities studies and transfer of training studies. Three representative handling qualities simulation studies and two transfer of training simulation studies are described in this paper. Unique scenario elements discussed in this paper included special out-the-window (OTW) targets and environmental conditions, motion system parameters, active inceptor parameters, and configurable vehicle math model parameters.
Acceleration techniques for dependability simulation. M.S. Thesis
NASA Technical Reports Server (NTRS)
Barnette, James David
1995-01-01
As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
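Two of the generic building blocks mentioned above, random variate generation and statistics gathering, can be sketched as follows; this is an assumed, generic illustration, not the thesis code.

```python
# Minimal sketch of two building blocks the thesis relies on (assumed, generic):
# inverse-transform generation of exponential inter-event times and a running
# (Welford) accumulator for gathering statistics during a long simulation run
# without storing every sample.
import math
import random

def exponential_variate(rate, u=None):
    """Inverse-transform method: X = -ln(1-U)/rate for U ~ Uniform(0,1)."""
    u = random.random() if u is None else u
    return -math.log(1.0 - u) / rate

class RunningStats:
    """Welford's online mean/variance, suitable for very long runs."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
    def add(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else float("nan")

stats = RunningStats()
for _ in range(100_000):
    stats.add(exponential_variate(rate=2.0))
print(f"mean ~ {stats.mean:.3f} (theory 0.5), var ~ {stats.variance:.3f} (theory 0.25)")
```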
Efficiency and Accuracy in Thermal Simulation of Powder Bed Fusion of Bulk Metallic Glass
NASA Astrophysics Data System (ADS)
Lindwall, J.; Malmelöv, A.; Lundbäck, A.; Lindgren, L.-E.
2018-05-01
Additive manufacturing by powder bed fusion processes can be used to create bulk metallic glass, as the process yields very high cooling rates. However, there is a risk that material deposited in earlier layers and reheated during processing may become devitrified, i.e., crystallize. It is therefore advantageous to simulate the process in order to fully understand it and to design it to avoid this risk. However, a detailed simulation is computationally demanding. It is necessary to increase the computational speed while maintaining the accuracy of the computed temperature field in critical regions. The current study evaluates a few approaches based on temporal reduction to achieve this. It is found that the evaluated approaches save considerable time while accurately predicting the temperature history.
NASA Astrophysics Data System (ADS)
Amiri, Amir; Nikpour, Amin; Saraeian, Payam
2018-05-01
Forging is one of the manufacturing processes for aluminium parts and has two major categories: hot and cold forging. In cold forging, the dimensional and geometrical accuracy of the final part is high. However, fracture may occur in some aluminium alloys during the process because of their limited workability. Fracture in cold forging can be ductile, brittle, or a combination of both, depending on the alloy type. There are several criteria for predicting fracture in cold forging. In this study, the cold forging process of 6063 aluminium alloy is simulated for three different parts in order to predict fracture. The results of the numerical simulations using the Freudenthal criterion are in agreement with the experimental tests.
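As a hedged illustration of a workability criterion of the Freudenthal type, which is commonly written as the accumulated plastic work per unit volume compared against a critical value, the sketch below uses an assumed flow-stress curve and an assumed critical value; neither is taken from the paper.

```python
# Minimal sketch (assumed form): a Freudenthal-type criterion accumulates
# plastic work per unit volume (integral of effective stress over effective
# plastic strain) and predicts fracture once a critical value C_f is exceeded.
# The flow-stress curve and C_f below are illustrative, not the paper's values.
import numpy as np

def freudenthal_damage(eps_path, flow_stress):
    """Trapezoidal accumulation of plastic work along an effective-strain path."""
    sigma = flow_stress(eps_path)
    return float(np.sum(0.5 * (sigma[1:] + sigma[:-1]) * np.diff(eps_path)))

flow_stress = lambda eps: 200.0e6 * (0.002 + eps) ** 0.2   # Hollomon-type fit [Pa], assumed
C_f = 1.0e8                                                # critical work [J/m3], assumed

eps_path = np.linspace(0.0, 0.8, 200)
damage = freudenthal_damage(eps_path, flow_stress)
print(f"accumulated plastic work {damage:.3e} J/m3 -> fracture predicted: {damage > C_f}")
```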
Mapping the Limitations of Breakthrough Analysis in Fixed-Bed Adsorption
NASA Technical Reports Server (NTRS)
Knox, James Clinton
2017-01-01
The separation of gases through adsorption plays an important role in the chemical processing industry, where the separation step is often the costliest part of a chemical process and thus worthy of careful study and optimization. This work developed a number of new, archival contributions to the computer simulations used for the refinement and design of these gas adsorption processes: (1) it presented a new approach for fitting the undetermined heat and mass transfer coefficients in the axially dispersed plug flow equation and associated balance equations; (2) it examined and described the conditions under which non-physical simulation results can arise; and (3) it presented an approach for determining the limits of the axial dispersion and LDF mass transfer terms above which non-physical simulation results occur.
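The coefficient-fitting step named in item (1) can be sketched generically as a least-squares problem; the breakthrough-curve simulator below is a deliberately simplified logistic placeholder standing in for the full axially dispersed plug-flow model, and all numbers are illustrative.

```python
# Minimal sketch of the coefficient-fitting idea (not the dissertation's code):
# undetermined transfer coefficients are adjusted until a simulated breakthrough
# curve matches the measured one in a least-squares sense. The stand-in
# `simulate_breakthrough` below is a logistic placeholder; in practice it would
# be the full axially dispersed plug-flow column model.
import numpy as np
from scipy.optimize import least_squares

def simulate_breakthrough(params, t):
    k_ldf, t_break = params            # placeholder parameterization
    return 1.0 / (1.0 + np.exp(-k_ldf * (t - t_break)))

t_obs = np.linspace(0, 200, 101)                       # minutes, illustrative
c_obs = simulate_breakthrough([0.08, 120.0], t_obs)    # synthetic "measurement"
c_obs += 0.01 * np.random.default_rng(0).standard_normal(t_obs.size)

def residuals(params):
    return simulate_breakthrough(params, t_obs) - c_obs

fit = least_squares(residuals, x0=[0.02, 80.0])
print("fitted (k_ldf, t_break):", fit.x)
```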
Experimental study of modification mechanism at a wear-resistant surfacing
NASA Astrophysics Data System (ADS)
Dema, R. R.; Amirov, R. N.; Kalugina, O. B.
2018-01-01
In this study, the crystallization process during the deposition of near-eutectic alloys in the presence of inoculants was simulated in order to reveal how the inoculants and the parameters of the simulated surfacing regime affect the structure of the crystallization front, the nucleation rate, and the growth kinetics of the equiaxed crystallites of primary phases forming in the volume of the melt. A technique for simulating the primary crystallization of near-eutectic alloys in the presence of modifiers is proposed. The possibility of obtaining a fully eutectic structure during the surfacing of nominally hypereutectic alloys of the white cast iron type over a wide range of deviations from the nominal composition is demonstrated.
Naveros, Francisco; Luque, Niceto R; Garrido, Jesús A; Carrillo, Richard R; Anguita, Mancia; Ros, Eduardo
2015-07-01
Time-driven simulation methods in traditional CPU architectures perform well and precisely when simulating small-scale spiking neural networks. Nevertheless, they still have drawbacks when simulating large-scale systems. Conversely, event-driven simulation methods in CPUs and time-driven simulation methods in graphic processing units (GPUs) can outperform CPU time-driven methods under certain conditions. With this performance improvement in mind, we have developed an event-and-time-driven spiking neural network simulator suitable for a hybrid CPU-GPU platform. Our neural simulator is able to efficiently simulate bio-inspired spiking neural networks consisting of different neural models, which can be distributed heterogeneously in both small layers and large layers or subsystems. For the sake of efficiency, the low-activity parts of the neural network can be simulated in CPU using event-driven methods while the high-activity subsystems can be simulated in either CPU (a few neurons) or GPU (thousands or millions of neurons) using time-driven methods. In this brief, we have undertaken a comparative study of these different simulation methods. For benchmarking the different simulation methods and platforms, we have used a cerebellar-inspired neural-network model consisting of a very dense granular layer and a Purkinje layer with a smaller number of cells (according to biological ratios). Thus, this cerebellar-like network includes a dense diverging neural layer (increasing the dimensionality of its internal representation and sparse coding) and a converging neural layer (integration) similar to many other biologically inspired and also artificial neural networks.
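The contrast between time-driven and event-driven updating can be illustrated on a single leaky integrate-and-fire neuron, as in the hedged sketch below; the parameter values are arbitrary and the code is far simpler than the hybrid CPU-GPU simulator described.

```python
# Minimal sketch (assumed, far simpler than the simulator described) contrasting
# the two update schemes on a single leaky integrate-and-fire neuron: the
# time-driven scheme updates the state every dt, while the event-driven scheme
# jumps directly between input-spike times using the closed-form decay.
import math

tau, v_rest, dt = 20.0, 0.0, 0.1        # ms, mV, ms (illustrative values)
spike_times, weight = [5.0, 6.0, 30.0], 1.5

# Time-driven: one state update per time step, regardless of activity.
v, steps = v_rest, 0
for k in range(int(50.0 / dt)):
    t = k * dt
    v += dt * (-(v - v_rest) / tau)
    if any(abs(t - ts) < dt / 2 for ts in spike_times):
        v += weight
    steps += 1
print(f"time-driven:  v(50 ms) = {v:.3f} mV after {steps} updates")

# Event-driven: one state update per incoming spike, using exact decay.
v, t_last, steps = v_rest, 0.0, 0
for ts in spike_times:
    v = v_rest + (v - v_rest) * math.exp(-(ts - t_last) / tau) + weight
    t_last, steps = ts, steps + 1
v = v_rest + (v - v_rest) * math.exp(-(50.0 - t_last) / tau)
print(f"event-driven: v(50 ms) = {v:.3f} mV after {steps} updates")
```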
WRF nested large-eddy simulations of deep convection during SEAC4RS
NASA Astrophysics Data System (ADS)
Heath, Nicholas Kyle
Deep convection is an important component of atmospheric circulations that affects many aspects of weather and climate. Therefore, improved understanding and realistic simulations of deep convection are critical to both operational and climate forecasts. Large-eddy simulations (LESs) often are used with observations to enhance understanding of convective processes. This study develops and evaluates a nested-LES method using the Weather Research and Forecasting (WRF) model. Our goal is to evaluate the extent to which the WRF nested-LES approach is useful for studying deep convection during a real-world case. The method was applied on 2 September 2013, a day of continental convection having a robust set of ground and airborne data available for evaluation. A three domain mesoscale WRF simulation is run first. Then, the finest mesoscale output (1.35 km grid length) is used to separately drive nested-LES domains with grid lengths of 450 and 150 m. Results reveal that the nested-LES approach reasonably simulates a broad spectrum of observations, from reflectivity distributions to vertical velocity profiles, during the study period. However, reducing the grid spacing does not necessarily improve results for our case, with the 450 m simulation outperforming the 150 m version. We find that simulated updrafts in the 150 m simulation are too narrow to overcome the negative effects of entrainment, thereby generating convection that is weaker than observed. Increasing the sub-grid mixing length in the 150 m simulation leads to deeper, more realistic convection, but comes at the expense of delaying the onset of the convection. Overall, results show that both the 450 m and 150 m simulations are influenced considerably by the choice of sub-grid mixing length used in the LES turbulence closure. Finally, the simulations and observations are used to study the processes forcing strong midlevel cloud-edge downdrafts that were observed on 2 September. Results suggest that these downdrafts are forced by evaporative cooling due to mixing near cloud edge and by vertical perturbation pressure gradient forces acting to restore mass continuity around neighboring updrafts. We conclude that the WRF nested-LES approach provides an effective method for studying deep convection for our real-world case. The method can be used to provide insight into physical processes that are important to understanding observations. The WRF nested-LES approach could be adapted for other case studies in which high-resolution observations are available for validation.
Improving Transfer of Learning: Relationship to Methods of Using Business Simulation
ERIC Educational Resources Information Center
Mayer, Brad W.; Dale, Kathleen M.; Fraccastoro, Katherine A.; Moss, Gisele
2011-01-01
This study investigates whether the processes associated with the use of business simulations can be structured to improve transfer of learning from the classroom environment to the workplace. The answer to this question is explored by investigating teaching methods used to introduce the simulation, the amount of time students spend on decisions,…
ERIC Educational Resources Information Center
Ceberio, Mikel; Almudí, José Manuel; Franco, Ángel
2016-01-01
In recent years, interactive computer simulations have been progressively integrated in the teaching of the sciences and have contributed significant improvements in the teaching-learning process. Practicing problem-solving is a key factor in science and engineering education. The aim of this study was to design simulation-based problem-solving…
Jay Renew
2016-02-06
Results from a nanofiltration study utilizing simulated geothermal brines. The data includes a PDF documenting the process used to remove Calcium, Magnesium, Sodium, Silica, Lithium, Chlorine, and Sulfate from simulated geothermal brines. Three different membranes were evaluated. The results were analyzed using inductively coupled plasma mass spectrometry (ICP-MS).
ERIC Educational Resources Information Center
Korfiatis, K.; Papatheodorou, E.; Paraskevopoulous, S.; Stamou, G. P.
1999-01-01
Describes a study of the effectiveness of computer-simulation programs in enhancing biology students' familiarity with ecological modeling and concepts. Finds that computer simulations improved student comprehension of ecological processes expressed in mathematical form, but did not allow a full understanding of ecological concepts. Contains 28…
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Day, John H. (Technical Monitor)
2000-01-01
Post-processing of data related to a Global Positioning System (GPS) simulation is an important activity in qualification of a GPS receiver for space flight. Because a GPS simulator is a critical resource, it is desirable to move the pertinent simulation data off the simulator as soon as a test is completed. The simulator data files are usually moved to a Personal Computer (PC), where the post-processing of the receiver-logged measurements and solutions data and the simulated data is performed. Typically post-processing is accomplished using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their general-purpose functions are notoriously slow and more often than not are the bottleneck even for short-duration experiments. For example, it may take 8 hours to post-process data from a 6-hour simulation. There is a need to do post-processing faster, especially in order to use the previous test results as feedback for the next simulation setup. This paper demonstrates that a fast software linear interpolation algorithm is applicable to a large class of engineering problems, like GPS simulation data post-processing, where computational time is a critical resource and is one of the most important considerations. An approach is developed that speeds up post-processing by an order of magnitude. It is based on improving the bottleneck interpolation algorithm using a priori information that is specific to the GPS simulation application. The presented post-processing scheme was used in support of a few successful space flight missions carrying GPS receivers. A future approach to solving the post-processing performance problem using Field Programmable Gate Array (FPGA) technology is described.
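A minimal sketch of the kind of speed-up described, assuming (as the a priori information suggests) that the simulator truth data are logged on a uniform time grid, so the bracketing sample can be indexed directly instead of searched for; the array names (t0, dt, truth) are illustrative, not the actual file format:

import numpy as np

def fast_interp_uniform(t_query, t0, dt, truth):
    # direct index computation replaces a general-purpose search: O(1) per query point
    idx = np.clip(((t_query - t0) / dt).astype(int), 0, len(truth) - 2)
    frac = (t_query - (t0 + idx * dt)) / dt
    return truth[idx] * (1.0 - frac) + truth[idx + 1] * frac

t0, dt = 0.0, 1.0                        # truth logged once per second (assumed)
truth = np.sin(0.01 * np.arange(21600))  # synthetic 6-hour truth series
queries = np.random.uniform(0, 21599, size=1_000_000)
values = fast_interp_uniform(queries, t0, dt, truth)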
Advances in multi-scale modeling of solidification and casting processes
NASA Astrophysics Data System (ADS)
Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang
2011-04-01
The development of the aviation, energy and automobile industries requires advanced integrated product/process R&D systems that can optimize both the product and the process design. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make the product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphology of magnesium and aluminum alloys during solidification, modeled using phase field and cellular automaton methods, mathematical models of segregation in large steel ingots, and microstructure models of unidirectionally solidified turbine blade castings are studied and discussed. In addition, some engineering case studies, including microstructure simulation of aluminum castings for the automobile industry, segregation of large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry, are discussed.
An Overview of the State of the Art in Atomistic and Multiscale Simulation of Fracture
NASA Technical Reports Server (NTRS)
Saether, Erik; Yamakov, Vesselin; Phillips, Dawn R.; Glaessgen, Edward H.
2009-01-01
The emerging field of nanomechanics is providing a new focus in the study of the mechanics of materials, particularly in simulating fundamental atomic mechanisms involved in the initiation and evolution of damage. Simulating fundamental material processes using first principles in physics strongly motivates the formulation of computational multiscale methods to link macroscopic failure to the underlying atomic processes from which all material behavior originates. This report gives an overview of the state of the art in applying concurrent and sequential multiscale methods to analyze damage and failure mechanisms across length scales.
A Multiagent Modeling Environment for Simulating Work Practice in Organizations
NASA Technical Reports Server (NTRS)
Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron
2004-01-01
In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost savings of newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in the work practices studied by the design team played a significant role in how work actually got done, the actual lived work. Multi-tasking, informal assistance and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm. In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to represent the relations of people, locations, systems, artifacts, communication and information content.
Evaluation of tocopherol recovery through simulation of molecular distillation process.
Moraes, E B; Batistella, C B; Alvarez, M E Torres; Filho, Rubens Maciel; Maciel, M R Wolf
2004-01-01
The DISMOL simulator was used to determine the best possible operating conditions to guide, in future studies, experimental work. This simulator needs several physical-chemical properties, and it is often very difficult to determine them because of the complexity of the components involved. They must be determined through correlations and/or predictions in order to characterize the system and calculate it. The first step is to obtain simulation results for a system that can later be validated with experimental data. Implementing the necessary parameters of complex systems in the simulator is a difficult task. In this work, we aimed to determine these properties in order to evaluate tocopherol (vitamin E) recovery using the DISMOL simulator. The raw material used was the crude deodorizer distillate of soya oil. With this procedure, it is possible to determine the best operating conditions for experimental work and to evaluate the process in the separation of new systems, analyzing the profiles obtained from these simulations for the falling-film molecular distillator.
NASA Astrophysics Data System (ADS)
Dethlefsen, Frank; Tilmann Pfeiffer, Wolf; Schäfer, Dirk
2016-04-01
Numerical simulations of hydraulic, thermal, geomechanical, or geochemical (THMC) processes in the subsurface have been conducted for decades. Often, such simulations are commenced by applying a parameter set that is as realistic as possible. Then, a base scenario is calibrated on field observations. Finally, scenario simulations can be performed, for instance to forecast the system behavior after varying input data. In the context of subsurface energy and mass storage, however, such model calibrations based on field data are often not available, as these storage operations have not been carried out so far. Consequently, the numerical models rely merely on the parameter set initially selected, and uncertainties arising from a lack of parameter values or process understanding may not be perceivable, let alone quantifiable. Therefore, conducting THMC simulations in the context of energy and mass storage deserves a particular review of the model parameterization with its input data, and such a review so far hardly exists to the required extent. Variability or aleatory uncertainty exists for geoscientific parameter values in general, and parameters for which numerous data points are available, such as aquifer permeabilities, may be described statistically, thereby exhibiting statistical uncertainty. In this case, sensitivity analyses can be conducted to quantify the uncertainty in the simulation that results from varying this parameter. There are other parameters where the lack of data quantity and quality means that the ongoing processes change fundamentally when such a parameter value is varied in numerical scenario simulations. As an example of such a scenario uncertainty, varying the capillary entry pressure as one of the multiphase flow parameters can either allow or completely inhibit the penetration of an aquitard by gas. As a last example, the uncertainty of cap-rock fault permeabilities, and consequently of potential leakage rates of stored gases into shallow compartments, is regarded as recognized ignorance by the authors of this study, as no realistic approach exists to determine this parameter and values are best guesses only. In addition to these aleatory uncertainties, an equivalent classification is possible for rating epistemic uncertainties, which describe the degree to which processes are understood, such as the geochemical and hydraulic effects following potential gas intrusions from deeper reservoirs into shallow aquifers. As an outcome of this grouping of uncertainties, prediction errors of scenario simulations can be calculated by sensitivity analyses if the uncertainties are identified as statistical. However, if scenario uncertainties exist, or recognized ignorance even has to be attested to a parameter or a process in question, the outcomes of simulations mainly depend on the decisions of the modeler in choosing parameter values or in interpreting which processes occur. In that case, the informative value of numerical simulations is limited by ambiguous simulation results, which cannot be refined without improving the geoscientific database through laboratory or field studies on a longer-term basis, so that the effects of subsurface use may be predicted realistically. This discussion, amended by a compilation of available geoscientific data to parameterize such simulations, will be presented in this study.
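For the statistical-uncertainty case described above, a sensitivity analysis can be as simple as sampling the well-measured parameter from its fitted distribution and propagating it through the model; the sketch below uses an assumed log-normal aquifer permeability and a trivial Darcy-flux stand-in for a full THMC simulator, so all numbers are illustrative only:

import numpy as np

rng = np.random.default_rng(0)
log10_k = rng.normal(loc=-12.0, scale=0.5, size=5000)     # fitted log10 permeability [m^2] (assumed)
k = 10.0 ** log10_k
dp_dx, mu = 1.0e4, 1.0e-3                                  # pressure gradient [Pa/m], viscosity [Pa s]
darcy_flux = k / mu * dp_dx                                # q = (k / mu) * dp/dx  [m/s]
print(np.percentile(darcy_flux, [5, 50, 95]))              # spread reported as the prediction error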
Takizawa, Yuumi; Shimomura, Takeshi; Miura, Toshiaki
2013-05-23
We study the initial nucleation dynamics of poly(3-hexylthiophene) (P3HT) in solution, focusing on the relationship between the ordering process of the main chains and that of the side chains. We carried out Langevin dynamics simulations and found that the initial nucleation process consists of three steps: the ordering of ring orientation, the ordering of main-chain vectors, and the ordering of side chains. At the start, the normal vectors of the thiophene rings aligned in a very short time, followed by alignment of the main-chain end-to-end vectors. The flexible side-chain ordering took almost 5 times longer than the rigid main-chain ordering. The simulation results indicated that the ordering of side chains was induced after the formation of the regular stack structure of the main chains. This slow ordering dynamics of the flexible side chains is one of the factors that cause anisotropic nuclei growth, which would be closely related to the formation of nanofiber structures without an external flow field. Our simulation results revealed how the combined structure of the planar, rigid main-chain backbone and the sparse flexible side chains leads to specific ordering behaviors that are not observed in ordinary linear polymer crystallization processes.
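For readers unfamiliar with the method, the sketch below shows an overdamped Langevin update of the kind used in such simulations; it is only a schematic stand-in, since the coarse-grained P3HT force field of the study is not reproduced and the harmonic force, friction, and temperature values are assumed:

import numpy as np

def langevin_step(x, force, dt=1e-3, gamma=1.0, kT=1.0, rng=np.random.default_rng()):
    # overdamped Langevin dynamics: deterministic drift from the force plus thermal noise
    noise = np.sqrt(2.0 * kT * dt / gamma) * rng.standard_normal(x.shape)
    return x + dt * force(x) / gamma + noise

harmonic = lambda x: -1.0 * x             # placeholder bonded force, not the P3HT potential
x = np.zeros((100, 3))                    # 100 coarse-grained bead positions
for _ in range(10_000):
    x = langevin_step(x, harmonic)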
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Day, John H. (Technical Monitor)
2000-01-01
Post-processing of data related to a GPS receiver test in a GPS simulator and test facility is an important step towards qualifying a receiver for space flight. Although the GPS simulator provides all the parameters needed to analyze a simulation, as well as excellent analysis tools on the simulator workstation, post-processing is not a GPS simulator or receiver function alone, and it must be planned as a separate pre-flight test program requirement. A GPS simulator is a critical resource, and it is desirable to move the pertinent test data off the simulator as soon as a test is completed. The receiver and simulator databases are used to extract the test data files for post-processing. These files are then usually moved from the simulator and receiver systems to a personal computer (PC) platform, where post-processing is done typically using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their functions are notoriously slow and more often than not are the bottleneck even for short-duration simulator-based tests. There is a need to do post-processing faster, within an hour of test completion, including all required operations on the simulator and receiver to prepare and move off the post-processing files. This is especially significant in order to use the previous test feedback for the next simulation setup or to run near back-to-back simulation scenarios. Solving the post-processing timing problem is critical to the success of a pre-flight test program. Towards this goal an approach was developed that speeds up post-processing by an order of magnitude. It is based on improving the bottleneck post-processing algorithm using a priori information that is specific to a GPS simulation application and using only the necessary volume of truth data. The presented post-processing scheme was used in support of a few successful space flight missions carrying GPS receivers.
NASA Astrophysics Data System (ADS)
Rousseau, A. N.; Álvarez; Yu, X.; Savary, S.; Duffy, C.
2015-12-01
Most physically based hydrological models simulate, to various extents, the relevant watershed processes occurring at different spatiotemporal scales. These models use different physical domain representations (e.g., hydrological response units, discretized control volumes) and numerical solution techniques (e.g., finite difference method, finite element method), as well as a variety of approximations for representing the physical processes. Despite the fact that several models have been developed so far, very few inter-comparison studies have been conducted to check, beyond streamflows, whether different modeling approaches simulate the other watershed-scale processes in a similar fashion. In this study, PIHM (Qu and Duffy, 2007), a fully coupled, distributed model, and HYDROTEL (Fortin et al., 2001; Turcotte et al., 2003, 2007), a pseudo-coupled, semi-distributed model, were compared to check whether the models could corroborate observed streamflows while also representing other processes such as evapotranspiration, snow accumulation/melt, and infiltration. For this study, the Young Womans Creek watershed, PA, was used to compare: streamflows (channel routing), actual evapotranspiration, snow water equivalent (snow accumulation and melt), infiltration, recharge, shallow water depth above the soil surface (surface flow), lateral flow into the river (surface and subsurface flow), and height of the saturated soil column (subsurface flow). Despite a lack of observed data for contrasting most of the simulated processes, it can be said that the two models can be used as simulation tools for streamflows, actual evapotranspiration, infiltration, lateral flows into the river, and height of the saturated soil column. However, each process presents particular differences as a result of the physical parameters and the modeling approaches used by each model. These differences should be the object of further analyses to definitively confirm or reject modeling hypotheses.
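Such an inter-comparison ultimately rests on scoring each model's simulated series against the available observations; below is a minimal sketch using the Nash-Sutcliffe efficiency (one common skill score, not necessarily the metric used by the authors) with synthetic placeholder series:

import numpy as np

def nse(obs, sim):
    # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the observed mean
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1.0, 2.5, 4.0, 3.0, 2.0])                  # placeholder observed streamflows
for name, sim in {"model_A": [1.1, 2.4, 3.8, 3.2, 2.1],
                  "model_B": [0.8, 2.9, 4.5, 2.5, 1.6]}.items():
    print(name, round(nse(obs, sim), 3))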
Benchmark Problems of the Geothermal Technologies Office Code Comparison Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Mark D.; Podgorney, Robert; Kelkar, Sharad M.
A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study represented U.S. national laboratories, universities, and industry, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications, whereas others were designed for different applications but can simulate processes similar to those in EGS. Solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems involved two phases of research, stimulation, development, and circulation in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners.
Three dimensional modeling of cirrus during the 1991 FIRE IFO 2: Detailed process study
NASA Technical Reports Server (NTRS)
Jensen, Eric J.; Toon, Owen B.; Westphal, Douglas L.
1993-01-01
A three-dimensional model of cirrus cloud formation and evolution, including microphysical, dynamical, and radiative processes, was used to simulate cirrus observed in the FIRE Phase 2 Cirrus field program (13 Nov. - 7 Dec. 1991). Sulfate aerosols, solution drops, ice crystals, and water vapor are all treated as interactive elements in the model. Ice crystal size distributions are fully resolved based on calculations of homogeneous freezing of solution drops, growth by water vapor deposition, evaporation, aggregation, and vertical transport. Visible and infrared radiative fluxes and radiative heating rates are calculated using the two-stream algorithm described by Toon et al. Wind velocities, diffusion coefficients, and temperatures were taken from the MAPS analyses and the MM4 mesoscale model simulations. Within the model, moisture is transported and converted to liquid or vapor by the microphysical processes. The simulated cloud bulk and microphysical properties are shown in detail for the Nov. 26 and Dec. 5 case studies. Comparisons with lidar, radar, and in situ data are used to determine how well the simulations reproduced the observed cirrus. The roles played by various processes in the model are described in detail. The potential modes of nucleation are evaluated, and the importance of small-scale variations in temperature and humidity is discussed. The importance of competing ice crystal growth mechanisms (water vapor deposition and aggregation) is evaluated based on model simulations. Finally, the importance of ice crystal shape for crystal growth and vertical transport of ice is discussed.
Practical Unitary Simulator for Non-Markovian Complex Processes
NASA Astrophysics Data System (ADS)
Binder, Felix C.; Thompson, Jayne; Gu, Mile
2018-06-01
Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction for simulators for a large class of stochastic processes hence directly opening the possibility for experimental implementations with current platforms for quantum computation. The results are illustrated for an example process.
Effect of the presence of oil on foam performance; A field simulation study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Law, D.H.S.; Yang, Z.M.; Stone, T.W.
1992-05-01
This paper describes a field-scale sensitivity study of the effect of the presence of oil on foam performance in a steam-foam-drive process. The 2D field-scale simulation was based on a field pilot in the Karamay formation in Zin-Jiang, China. Numerical results showed that the detrimental effect of oil on the foam performance in field operations is significant. The success of a steam-foam process depended mainly on the ability of the foam to divert steam from the depleted zone.
A Theory for the Neural Basis of Language Part 2: Simulation Studies of the Model
ERIC Educational Resources Information Center
Baron, R. J.
1974-01-01
Computer simulation studies of the proposed model are presented. Processes demonstrated are (1) verbally directed recall of visual experience; (2) understanding of verbal information; (3) aspects of learning and forgetting; (4) the dependence of recognition and understanding in context; and (5) elementary concepts of sentence production. (Author)
NASA Astrophysics Data System (ADS)
Si, Lina; Guo, Dan; Luo, Jianbin; Lu, Xinchun
2010-03-01
Molecular dynamics simulations of nanoscratching processes were used to study the atomic-scale removal mechanism of single-crystalline silicon in the chemical mechanical polishing (CMP) process, with particular attention paid to the effect of scratching depth. The simulation results under a scratching depth of 1 nm showed that a thick layer of silicon material was removed by chip formation and an amorphous layer was formed on the silicon surface after nanoscratching. By contrast, the simulation results at a depth of 0.1 nm indicated that just one monoatomic layer of the workpiece was removed and a well-ordered crystalline surface was obtained, which is quite consistent with previous CMP experimental results. Therefore, a monoatomic layer removal mechanism was proposed, in which the material is considered to be removed one monoatomic layer after another during the CMP process; this mechanism provides a reasonable understanding of how the high-precision surface is obtained. In addition, the effects of the silica particle size and scratching velocity on the removal mechanism were investigated, and the wear regimes and interatomic forces between the silica particle and the workpiece were studied to account for the different removal mechanisms at indentation depths of 0.1 and 1 nm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Konomi, Bledar A.; Karagiannis, Georgios; Sarkar, Avik
2014-05-16
Computer experiments (numerical simulations) are widely used in scientific research to study and predict the behavior of complex systems, which usually have responses consisting of a set of distinct outputs. Simulations at high resolution are often computationally expensive and become impractical for parametric studies at different input values. To overcome these difficulties we develop a Bayesian treed multivariate Gaussian process (BTMGP) as an extension of the Bayesian treed Gaussian process (BTGP) in order to model and evaluate a multivariate process. A suitable choice of covariance function and prior distributions facilitates the different Markov chain Monte Carlo (MCMC) movements. We utilize this model to sequentially sample the input space for the most informative values, taking into account model uncertainty and expertise gained. A simulation study demonstrates the use of the proposed method and compares it with alternative approaches. We apply the sequential sampling technique and the BTMGP to model the multiphase flow in a full-scale regenerator of a carbon capture unit. The application presented in this paper is an important tool for research into carbon dioxide emissions from thermal power plants.
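The sequential sampling idea can be illustrated with an ordinary Gaussian process in place of the treed multivariate BTMGP: refit after each run and place the next simulation where the predictive variance is largest. The simulator below is a cheap analytic placeholder, and the kernel and candidate grid are assumptions:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

simulator = lambda x: np.sin(3.0 * x) + 0.1 * x            # toy stand-in for the expensive code
X = np.array([[0.1], [0.9]])                               # two initial runs
y = simulator(X).ravel()
candidates = np.linspace(0.0, 1.0, 201).reshape(-1, 1)

for _ in range(8):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_next = candidates[np.argmax(std)].reshape(1, -1)     # most informative candidate input
    X = np.vstack([X, x_next])
    y = np.append(y, simulator(x_next).ravel())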
Quench simulations for superconducting elements in the LHC accelerator
NASA Astrophysics Data System (ADS)
Sonnemann, F.; Schmidt, R.
2000-08-01
The design of the protection system for the superconducting elements in an accelerator such as the Large Hadron Collider (LHC), now under construction at CERN, requires a detailed understanding of the thermo-hydraulic and electrodynamic processes during a quench. A numerical program (SPQR - Simulation Program for Quench Research) has been developed to evaluate temperature and voltage distributions during a quench as a function of space and time. The quench process is simulated by approximating the heat balance equation with the finite difference method in the presence of variable cooling and powering conditions. The simulation predicts quench propagation along a superconducting cable, forced quenching with heaters, the impact of eddy currents induced by a magnetic field change, and heat transfer through an insulation layer into helium, an adjacent conductor, or other material. The simulation studies allowed a better understanding of experimental quench data and were used for determining the adequate dimensioning and protection of the highly stabilised superconducting cables for connecting magnets (busbars), optimising the quench heater strip layout for the main magnets, and studying quench-back caused by induced eddy currents in the superconductor. After an introduction to the theoretical approach, some applications of the simulation model to the LHC dipole and corrector magnets are presented, and the outcome of the studies is compared with experimental data.
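A heavily reduced 1-D version of the finite-difference heat balance idea (not SPQR itself: material properties are held constant here and the Joule heating term is a fixed value switched on above an assumed current-sharing temperature) looks like this:

import numpy as np

nx, dx, dt = 200, 1e-2, 1e-4                 # nodes, cell size [m], time step [s]
k, rho_c = 300.0, 2.0e3                      # thermal conductivity [W/m/K], volumetric heat capacity [J/m^3/K] (assumed constants)
q_joule, T_cs = 5.0e6, 9.0                   # heating term [W/m^3], current-sharing temperature [K]
T = np.full(nx, 4.5)                         # conductor initially at helium-bath temperature
T[nx // 2] = 20.0                            # local disturbance that starts the quench

for _ in range(5_000):
    lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2
    heating = np.where(T > T_cs, q_joule, 0.0)   # Joule heating only in the normal-conducting zone
    T = T + dt * (k * lap + heating) / rho_c
    T[0] = T[-1] = 4.5                           # ends held at bath temperature

print(T.max(), (T > T_cs).sum() * dx)            # hot-spot temperature and normal-zone length [m]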
NASA Astrophysics Data System (ADS)
Yang, Peng
The focus of this dissertation is the Molecular Dynamics (MD) simulation study of two different systems. In the first system, we study the dynamic process of graphene exfoliation, particularly graphene dispersion using ionic surfactants (Chapter 2). In the second system, we investigate the mesoscopic structure of binary solute/ionic liquid (IL) mixtures through the comparison between simulations and corresponding experiments (Chapters 3 and 4). In the graphene exfoliation study, we consider two separation mechanisms: changing the interlayer distance and sliding the two single-layer graphene sheets away from each other. By calculating the energy barrier as a function of separation (interlayer or sliding-away) distance and performing sodium dodecyl sulfate (SDS) structure analysis around the graphene surface in SDS surfactant/water + bilayer graphene mixture systems, we find that the sliding-away mechanism is the dominant, feasible separation process. In this process, the SDS-graphene interaction gradually replaces the graphene-graphene van der Waals (vdW) interaction and decreases the energy barrier until it is almost zero at the critical SDS concentration. In the solute/IL study, we investigate nonpolar (CS2) and dipolar (CH3CN) solute/IL mixture systems. MD simulation shows that at low concentrations, the IL is nanosegregated into an ionic network and a nonpolar domain. It is also found that CS2 molecules tend to be localized in the nonpolar domain, while CH3CN interacts with the nonpolar domain as well as with the charged head groups in the ionic network because of its amphiphilicity. At high concentrations, CH3CN molecules eventually disrupt the nanostructural organization. This dissertation is organized in five chapters: (1) introduction to graphene, ionic liquids and the methodology of MD; (2) MD simulation of graphene exfoliation; (3) nanostructural organization in acetonitrile/IL mixtures; (4) nanostructural organization in carbon disulfide/IL mixtures; (5) conclusions. Results of the MD simulations of liquid mixture systems carried out in this research explain observed experiments and show the details of nanostructural organization in small-solute-molecule/IL mixtures. Additionally, the research successfully reveals the correct mechanism of the graphene exfoliation process in liquid solution (summarized in Chapter 5). The research presented in this dissertation enhances our understanding of the microscopic behavior of complex liquid systems as well as the theoretical methods used to explore them.
Simulation of plasma loading of high-pressure RF cavities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, K.; Samulyak, R.; Yonehara, K.
2018-01-11
Muon beam-induced plasma loading of radio-frequency (RF) cavities filled with high-pressure hydrogen gas with a 1% dry-air dopant has been studied via numerical simulations. The electromagnetic code SPACE, which resolves the relevant atomic physics processes, including ionization by the muon beam, electron attachment to dopant molecules, and electron-ion and ion-ion recombination, has been used. Simulation studies have also been performed in the range of parameters typical of practical muon cooling channels.
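A zero-dimensional sketch of that atomic-physics bookkeeping (beam ionization source, attachment to the dopant, electron-ion and ion-ion recombination) is shown below; SPACE resolves these processes in space with field-dependent rates, whereas every coefficient here is an assumed constant used purely for illustration:

import numpy as np
from scipy.integrate import solve_ivp

S       = 1.0e16   # beam ionization source [m^-3 s^-1] (assumed)
nu_att  = 1.0e6    # electron attachment frequency to the dopant [s^-1] (assumed)
beta_ei = 1.0e-13  # electron-ion recombination coefficient [m^3 s^-1] (assumed)
beta_ii = 1.0e-12  # ion-ion recombination coefficient [m^3 s^-1] (assumed)

def rates(t, y):
    ne, nneg, npos = y                     # electrons, negative ions, positive ions
    return [S - nu_att * ne - beta_ei * ne * npos,
            nu_att * ne - beta_ii * nneg * npos,
            S - beta_ei * ne * npos - beta_ii * nneg * npos]

sol = solve_ivp(rates, (0.0, 1e-3), [0.0, 0.0, 0.0], method="LSODA", rtol=1e-8)
print(sol.y[:, -1])                        # quasi-steady electron and ion densities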
Study of CFB Simulation Model with Coincidence at Multi-Working Condition
NASA Astrophysics Data System (ADS)
Wang, Z.; He, F.; Yang, Z. W.; Li, Z.; Ni, W. D.
A circulating fluidized bed (CFB) two-stage simulation model was developed. To make the model results coincide with the design values or real operating values at specified multiple working conditions, while retaining real-time calculation capability, only the main key processes were taken into account, and the dominant factors were further abstracted from these key processes. The simulation results showed sound agreement at multiple working conditions and confirmed the advantage of the two-stage model over the original single-stage simulation model. The combustion-support effect of secondary air was investigated using the two-stage model. This model provides a solid platform for investigating the pant-leg structured CFB furnace, which is now under design for a supercritical power plant.
Stanley, Claire; Lindsay, Sally; Parker, Kathryn; Kawamura, Anne; Samad Zubairi, Mohammad
2018-05-09
We previously reported that experienced clinicians find that the process of collectively building and participating in simulations provides (1) a unique reflective opportunity; (2) a venue to identify different perspectives through discussion and action in a group; and (3) a safe environment for learning. No studies have assessed the value of collaborating with standardized patients (SPs) and patient facilitators (PFs) in the process. In this work, we describe this collaboration in building a simulation and the key elements that facilitate reflection. Three simulation scenarios surrounding communication were built by teams of clinicians, a PF, and SPs. Six build sessions were audio recorded, transcribed, and thematically analyzed through an iterative process to (1) describe the steps of building a simulation scenario and (2) identify the key elements involved in the collaboration. The five main steps to build a simulation scenario were (1) storytelling and reflection; (2) defining objectives and brainstorming ideas; (3) building a stem and creating a template; (4) refining the scenario with feedback from SPs; and (5) mock run-throughs with follow-up discussion. During these steps, the PF shared personal insights, challenging participants to reflect more deeply to better understand and consider the patient's perspective. The SPs provided a unique outside perspective to the group. In addition, the interaction between the SPs and the PF helped refine character roles. A collaborative approach incorporating feedback from PFs and SPs to create a simulation scenario is a valuable method to enhance reflective practice for clinicians.
Repetition-Related Reductions in Neural Activity during Emotional Simulations of Future Events.
Szpunar, Karl K; Jing, Helen G; Benoit, Roland G; Schacter, Daniel L
2015-01-01
Simulations of future experiences are often emotionally arousing, and the tendency to repeatedly simulate negative future outcomes has been identified as a predictor of the onset of symptoms of anxiety. Nonetheless, next to nothing is known about how the healthy human brain processes repeated simulations of emotional future events. In this study, we present a paradigm that can be used to study repeated simulations of the emotional future in a manner that overcomes phenomenological confounds between positive and negative events. The results show that pulvinar nucleus and orbitofrontal cortex respectively demonstrate selective reductions in neural activity in response to frequently as compared to infrequently repeated simulations of negative and positive future events. Implications for research on repeated simulations of the emotional future in both non-clinical and clinical populations are discussed.
Multi-material 3D Models for Temporal Bone Surgical Simulation.
Rose, Austin S; Kimbell, Julia S; Webster, Caroline E; Harrysson, Ola L A; Formeister, Eric J; Buchman, Craig A
2015-07-01
A simulated, multicolor, multi-material temporal bone model can be created using 3-dimensional (3D) printing that will prove both safe and beneficial in training for actual temporal bone surgical cases. As the process of additive manufacturing, or 3D printing, has become more practical and affordable, a number of applications for the technology in the field of Otolaryngology-Head and Neck Surgery have been considered. One area of promise is temporal bone surgical simulation. Three-dimensional representations of human temporal bones were created from temporal bone computed tomography (CT) scans using biomedical image processing software. Multi-material models were then printed and dissected in a temporal bone laboratory by attending and resident otolaryngologists. A 5-point Likert scale was used to grade the models for their anatomical accuracy and suitability as a simulation of cadaveric and operative temporal bone drilling. The models produced for this study demonstrate significant anatomic detail and a likeness to human cadaver specimens for drilling and dissection. Simulated temporal bones created by this process have potential benefit in surgical training, preoperative simulation for challenging otologic cases, and the standardized testing of temporal bone surgical skills.
Evers, J B; Vos, J; Yin, X; Romero, P; van der Putten, P E L; Struik, P C
2010-05-01
Intimate relationships exist between the form and function of plants, determining many processes governing their growth and development. However, in most crop simulation models that have been created to simulate plant growth and, for example, predict biomass production, plant structure has been neglected. In this study, a detailed simulation model of growth and development of spring wheat (Triticum aestivum) is presented, which integrates degree of tillering and canopy architecture with organ-level light interception, photosynthesis, and dry-matter partitioning. An existing spatially explicit 3D architectural model of wheat development was extended with routines for organ-level microclimate, photosynthesis, assimilate distribution within the plant structure according to organ demands, and organ growth and development. Outgrowth of tiller buds was made dependent on the ratio between assimilate supply and demand of the plants. Organ-level photosynthesis, biomass production, and bud outgrowth were simulated satisfactorily. However, to improve crop simulation results, more effort is needed to mechanistically model other major plant physiological processes such as nitrogen uptake and distribution, tiller death, and leaf senescence. Nevertheless, the work presented here is a significant step forward towards a mechanistic functional-structural plant model, which integrates plant architecture with key plant processes.
Simulation of Soil Frost and Thaw Fronts Dynamics with Community Land Model 4.5
NASA Astrophysics Data System (ADS)
Gao, J.; Xie, Z.
2016-12-01
Freeze-thaw processes in soils, including changes in frost and thaw fronts (FTFs), are important physical processes. The movement of FTFs affects soil water and thermal characteristics, as well as energy and water exchanges between the land surface and the atmosphere, and hence the land surface hydrothermal regime. In this study, a two-directional freeze and thaw algorithm for simulating FTFs is incorporated into the community land surface model CLM4.5; the resulting model is called CLM4.5-FTF. The FTF depths and soil temperatures simulated by CLM4.5-FTF compared well with observations at both the D66 station (permafrost) and the Hulugou station (seasonally frozen soil). Because the soil temperature profile within a soil layer can be estimated according to the position of the FTFs, CLM4.5-FTF performed better in soil temperature simulation. Permafrost and seasonally frozen ground conditions in China from 1980 to 2010 were simulated using CLM4.5-FTF. Numerical experiments show that the spatial distribution of the maximum frost depth simulated by CLM4.5-FTF has obvious seasonal variation. Significant positive active-layer depth trends for permafrost regions and negative maximum freezing depth trends for seasonally frozen soil regions are simulated in response to positive air temperature trends, except west of the Black Sea.
NASA Astrophysics Data System (ADS)
Huppert, J.; Michal Lomask, S.; Lazarowitz, R.
2002-08-01
Computer-assisted learning, including simulated experiments, has great potential to address the problem-solving process, which is a complex activity. A highly structured approach is required in order to understand the use of simulations as an instructional device. This study is based on a computer simulation program, 'The Growth Curve of Microorganisms', which required tenth-grade biology students to use problem-solving skills whilst simultaneously manipulating three independent variables in one simulated experiment. The aims were to investigate the computer simulation's impact on students' academic achievement and on their mastery of science process skills in relation to their cognitive stages. The results indicate that the concrete and transition operational students in the experimental group achieved significantly higher academic achievement than their counterparts in the control group. The higher the cognitive operational stage, the higher the students' achievement, except in the control group, where students in the concrete and transition operational stages did not differ. Girls achieved equally with boys in the experimental group. Students' academic achievement may indicate the potential impact a computer simulation program can have, enabling students with low reasoning abilities to cope successfully with learning concepts and principles in science that require high cognitive skills.
Galato, Dayani; Alano, Graziela M.; Trauthman, Silvana C.; França, Tainã F.
Objective A simulation process known as the objective structured clinical examination (OSCE) was applied to assess pharmacy practice performed by senior pharmacy students. Methods A cross-sectional study was conducted based on documentary analysis of performance evaluation records of pharmacy practice simulations that occurred between 2005 and 2009. These simulations were related to the processes of self-medication and dispensing, and were performed with the use of simulated patients. The simulations were filmed to facilitate the evaluation process. The paper presents the OSCE educational experience of pharmacy trainees of the University of Southern Santa Catarina, as experienced by two evaluators. The students' general performance was analyzed, and the pharmacy practice assessment criteria frequently identified trainees in difficulty. Results The results of 291 simulations showed that students had an average performance of 70.0%. Several difficulties were encountered, such as the lack of information about the selected/prescribed treatment regimen (65.1%); inadequate communication style (21.9%); lack of identification of patients' needs (7.7%); and inappropriate drug selection for self-medication (5.3%). Conclusions These data show that there is a need for reorientation of clinical pharmacy students: they need to improve their communication skills and gain deeper knowledge of medicines and health problems in order to properly orient their patients. PMID:24367467
NASA Astrophysics Data System (ADS)
Ambarita, H.; Widodo, T. I.; Nasution, D. M.
2017-01-01
In order to reduce the fossil fuel consumption of compression ignition (CI) engines, which are widely used in transportation and heavy machinery, they can be operated in dual-fuel mode (diesel-biogas). However, the literature shows that the thermal efficiency is then lower due to incomplete combustion. In order to increase the efficiency, the combustion process in the combustion chamber needs to be explored. Here, a commercial CFD code is used to explore the combustion process of a small CI engine run in dual-fuel mode (diesel-biogas). The turbulent governing equations are solved using the finite volume method. A simulation of the compression and expansion strokes at an engine speed of 1000 rpm and a load of 2500 W has been carried out. The pressure and temperature distributions and streamlines are plotted. The simulation results show that at an engine power of 732.27 W the thermal efficiency is 9.05%. The experimental and simulation results show good agreement. The method developed in this study can be used to investigate the combustion process of a CI engine run in dual-fuel mode.
Simulation and analysis of tape spring for deployed space structures
NASA Astrophysics Data System (ADS)
Chang, Wei; Cao, DongJing; Lian, MinLong
2018-03-01
The tape spring is an open cylindrical shell structure whose mechanical properties are significantly affected by changes in its geometrical parameters. There are few studies on the influence of geometrical parameters on the mechanical properties of the tape spring. The bending process of a single tape spring was simulated using simulation software. The variations of the critical moment, unfolding moment, and maximum strain energy in the bending process were investigated, and the effects of the section radius angle, thickness, and length on the driving capability of the single tape spring were studied using these parameters. Results show that the driving capability and disturbance-resisting capacity grow with increasing section radius angle in the bending process of the single tape spring. On the other hand, these capabilities decrease with increasing length of the single tape spring. Finally, the driving capability and disturbance-resisting capacity grow with increasing thickness in the bending process of the single tape spring. The research has a certain reference value for improving the kinematic accuracy and reliability of deployable structures.
Dietz, Mathias; Hohmann, Volker; Jürgens, Tim
2015-01-01
For normal-hearing listeners, speech intelligibility improves if speech and noise are spatially separated. While this spatial release from masking has already been quantified in normal-hearing listeners in many studies, it is less clear how spatial release from masking changes in cochlear implant listeners with and without access to low-frequency acoustic hearing. Spatial release from masking depends on differences in access to speech cues due to hearing status and hearing device. To investigate the influence of these factors on speech intelligibility, the present study measured speech reception thresholds in spatially separated speech and noise for 10 different listener types. A vocoder was used to simulate cochlear implant processing and low-frequency filtering was used to simulate residual low-frequency hearing. These forms of processing were combined to simulate cochlear implant listening, listening based on low-frequency residual hearing, and combinations thereof. Simulated cochlear implant users with additional low-frequency acoustic hearing showed better speech intelligibility in noise than simulated cochlear implant users without acoustic hearing and had access to more spatial speech cues (e.g., higher binaural squelch). Cochlear implant listener types showed higher spatial release from masking with bilateral access to low-frequency acoustic hearing than without. A binaural speech intelligibility model with normal binaural processing showed overall good agreement with measured speech reception thresholds, spatial release from masking, and spatial speech cues. This indicates that differences in speech cues available to listener types are sufficient to explain the changes of spatial release from masking across these simulated listener types. PMID:26721918
Fu, Guang; Zhang, David Z; He, Allen N; Mao, Zhongfa; Zhang, Kaifei
2018-05-10
A deep understanding of the laser-material interaction mechanism, characterized by laser absorption, is very important in simulating the laser metal powder bed fusion (PBF) process. This is because the laser absorption of the material affects the temperature distribution, which influences thermal stress development and the final quality of parts. In this paper, a three-dimensional finite element analysis model of heat transfer, taking into account the effect of material state and phase changes on laser absorption, is presented to gain insight into the absorption mechanism and the evolution of instantaneous absorptance in the laser metal PBF process. The results showed that the instantaneous absorptance was significantly affected by the duration of laser radiation, as well as by process parameters such as hatch space, scanning velocity, and laser power, which was consistent with the experiment-based findings. The applicability of this model to temperature simulation was demonstrated by a comparative study, wherein the peak temperature in the fusion process was simulated in two scenarios, with and without considering the effect of material state and phase changes on laser absorption, and the simulated results in the two scenarios were then compared with experimental data.
ATLAS Simulation using Real Data: Embedding and Overlay
NASA Astrophysics Data System (ADS)
Haas, Andrew; ATLAS Collaboration
2017-10-01
For some physics processes studied with the ATLAS detector, a more accurate simulation in some respects can be achieved by including real data into simulated events, with substantial potential improvements in the CPU, disk space, and memory usage of the standard simulation configuration, at the cost of significant database and networking challenges. Real proton-proton background events can be overlaid (at the detector digitization output stage) on a simulated hard-scatter process, to account for pileup background (from nearby bunch crossings), cavern background, and detector noise. A similar method is used to account for the large underlying event from heavy ion collisions, rather than directly simulating the full collision. Embedding replaces the muons found in Z→μμ decays in data with simulated taus at the same 4-momenta, thus preserving the underlying event and pileup from the original data event. In all these cases, care must be taken to exactly match detector conditions (beamspot, magnetic fields, alignments, dead sensors, etc.) between the real data event and the simulation. We will discuss the status of these overlay and embedding techniques within ATLAS software and computing.
Simulation and Analysis of One-time Forming Process of Automobile Steering Ball Head
NASA Astrophysics Data System (ADS)
Shi, Peicheng; Zhang, Xujun; Xu, Zengwei; Zhang, Rongyun
2018-03-01
To address problems such as the large machining allowance, low production efficiency, and material waste in die forging of the ball pin, the cold extrusion process of the ball head was studied and the forming process was numerically simulated using the finite element analysis software DEFORM-3D. Through analysis of the equivalent stress and strain, velocity vector field, and load-displacement curve, the metal flow behavior during the cold extrusion of the ball pin was clarified, and possible defects during forming were predicted. The results showed that this process can solve the forming problem of the ball pin and provide a theoretical basis for actual production.
Park, Pyung-Kyu; Lee, Sangho; Cho, Jae-Seok; Kim, Jae-Hong
2012-08-01
The objective of this study is to further develop a previously reported mechanistic predictive model that simulates boron removal in full-scale seawater reverse osmosis (RO) desalination processes so that it takes into account the effect of membrane fouling. The decrease in boron removal and the reduction in water production rate caused by membrane fouling, through enhanced concentration polarization, were simulated as a decrease in the solute mass transfer coefficient in the boundary layer on the membrane surface. Various design and operating options under fouling conditions were examined, including single- versus double-pass configurations, different numbers of RO elements per vessel, use of RO membranes with enhanced boron rejection, and pH adjustment. These options were quantitatively compared by normalizing the performance of the system in terms of E(min), the minimum energy cost per unit of product water. Simulation results suggested that the most viable options to enhance boron rejection among those tested in this study include: i) minimizing fouling, ii) exchanging the existing SWRO elements for boron-specific ones, and iii) increasing pH in the second pass. The model developed in this study is expected to help design and optimize RO processes to achieve the target boron removal at the target water recovery under realistic conditions where membrane fouling occurs during operation.
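The mechanism the model captures can be illustrated with a simple film-theory / solution-diffusion estimate (far cruder than the full-scale model of the paper): fouling lowers the boundary-layer mass transfer coefficient k, raising the polarization factor exp(Jw/k) and letting more boron pass. The flux and permeability values below are assumed for illustration:

import numpy as np

def observed_boron_rejection(Jw, k, B, c_feed=5.0):
    # Jw: water flux [m/s], k: mass transfer coefficient [m/s], B: boron permeability [m/s]
    c_wall = c_feed * np.exp(Jw / k)       # film-theory concentration polarization at the membrane wall
    c_perm = B * c_wall / (Jw + B)         # solution-diffusion permeate concentration
    return 1.0 - c_perm / c_feed

Jw, B = 1.5e-5, 2.0e-6                     # assumed water flux and boron permeability
for k in (3.0e-5, 1.5e-5, 8.0e-6):         # progressively fouled membrane (decreasing k)
    print(f"k = {k:.1e} m/s -> boron rejection = {observed_boron_rejection(Jw, k, B):.2f}")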
NASA Astrophysics Data System (ADS)
Li, Yizhen; McGillicuddy, Dennis J.; Dinniman, Michael S.; Klinck, John M.
2017-02-01
Both remotely sensed and in situ observations in the austral summer of early 2012 in the Ross Sea suggest the presence of cold, low-salinity, and high-biomass eddies along the edge of the Ross Ice Shelf (RIS). Satellite measurements include sea surface temperature and ocean color, and shipboard data sets include hydrographic profiles, towed instrumentation, and underway acoustic Doppler current profilers. Idealized model simulations are utilized to examine the processes responsible for ice shelf eddy formation. 3-D model simulations produce similar cold and fresh eddies, although the simulated vertical lenses are quantitatively thinner than observed. Model sensitivity tests show that both basal melting underneath the ice shelf and irregularity of the ice shelf edge facilitate the generation of cold and fresh eddies. 2-D model simulations further suggest that both basal melting and downwelling-favorable winds play crucial roles in forming the thick layer of low-salinity water observed along the edge of the RIS. These properties may have been entrained into the observed eddies, whereas that entrainment process was not captured in the specific eddy formation events studied in our 3-D model, which may explain, at least in part, the discrepancy between the simulated and observed eddies. Additional sensitivity experiments imply that uncertainties associated with background stratification and wind stress may also explain why the model underestimates the thickness of the low-salinity lens in the eddy interiors. Our study highlights the importance of incorporating accurate wind forcing, basal melting, and ice shelf irregularity when simulating eddy formation near the RIS edge. The processes responsible for generating the high phytoplankton biomass inside these eddies remain to be elucidated.
Numerical Simulations of the Digital Microfluidic Manipulation of Single Microparticles.
Lan, Chuanjin; Pal, Souvik; Li, Zhen; Ma, Yanbao
2015-09-08
Single-cell analysis techniques have been developed as a valuable bioanalytical tool for elucidating cellular heterogeneity at genomic, proteomic, and cellular levels. Cell manipulation is an indispensable process for single-cell analysis. Digital microfluidics (DMF) is an important platform for conducting cell manipulation and single-cell analysis in a high-throughput fashion. However, the manipulation of single cells in DMF has not been quantitatively studied so far. In this article, we investigate the interaction of a single microparticle with a liquid droplet on a flat substrate using numerical simulations. The droplet is driven by capillary force generated from the wettability gradient of the substrate. Considering the Brownian motion of microparticles, we utilize many-body dissipative particle dynamics (MDPD), an off-lattice mesoscopic simulation technique, in this numerical study. The manipulation processes (including pickup, transport, and drop-off) of a single microparticle with a liquid droplet are simulated. Parametric studies are conducted to investigate the effects on the manipulation processes from the droplet size, wettability gradient, wetting properties of the microparticle, and particle-substrate friction coefficients. The numerical results show that the pickup, transport, and drop-off processes can be precisely controlled by these parameters. On the basis of the numerical results, a trap-free delivery of a hydrophobic microparticle to a destination on the substrate is demonstrated in the numerical simulations. The numerical results not only provide a fundamental understanding of interactions among the microparticle, the droplet, and the substrate but also demonstrate a new technique for the trap-free immobilization of single hydrophobic microparticles in the DMF design. Finally, our numerical method also provides a powerful design and optimization tool for the manipulation of microparticles in DMF systems.
NASA Astrophysics Data System (ADS)
Li, Xiaoyi; Gao, Hui; Soteriou, Marios C.
2017-08-01
Atomization of extremely high-viscosity liquids can be of interest for many applications in the aerospace, automotive, pharmaceutical, and food industries. While detailed atomization measurements usually face grand challenges, high-fidelity numerical simulations offer the advantage of comprehensively exploring the atomization details. In this work, a previously validated high-fidelity first-principles simulation code, HiMIST, is utilized to simulate high-viscosity liquid jet atomization in crossflow. The code is used to perform a parametric study of the atomization process in a wide range of Ohnesorge numbers (Oh = 0.004-2) and Weber numbers (We = 10-160). Direct comparisons between the present study and previously published low-viscosity jet in crossflow results are performed. The effects of viscous damping and slowing on jet penetration, liquid surface instabilities, ligament formation/breakup, and subsequent droplet formation are investigated. Complex variations in near-field and far-field jet penetration with increasing Oh at different We are observed and linked with the underlying jet deformation and breakup physics. A transition in breakup regimes and an increase in droplet size with increasing Oh are observed, mostly consistent with literature reports. The detailed simulations elucidate a distinctive edge-ligament-breakup dominated process with long-surviving ligaments for the higher-Oh cases, as opposed to a two-stage edge-stripping/column-fracture process for the lower-Oh counterparts. The trend of decreasing column deflection with increasing We is reversed as Oh increases. A predominantly unimodal droplet size distribution is predicted at higher Oh, in contrast to the bimodal distribution at lower Oh. It has been found that both Rayleigh-Taylor and Kelvin-Helmholtz linear stability theories cannot be easily applied to interpret the distinct edge breakup process, and further study of the underlying physics is needed.
Modeling cell adhesion and proliferation: a cellular-automata based approach.
Vivas, J; Garzón-Alvarado, D; Cerrolaza, M
Cell adhesion is a process that involves the interaction between the cell membrane and another surface, either a cell or a substrate. Unlike experimental tests, computer models can simulate processes and study the results of experiments in less time and at lower cost. One of the tools used to simulate biological processes is the cellular automaton, a dynamical system that is discrete in both space and time. This work describes a computer model based on cellular automata for the adhesion process and cell proliferation to predict the behavior of a cell population in suspension and adhered to a substrate. The values of the simulated system were obtained through experimental tests on fibroblast monolayer cultures. The results allow us to estimate the cells' settling time in culture as well as the adhesion and proliferation times. The change in cell morphology as adhesion over the contact surface progresses was also observed. The formation of the initial adhesion link between the cell and the substrate was observed after 100 min, with the cell retaining its spherical morphology on the substrate during the simulation. The cellular automata model developed is, however, a simplified representation of the steps in the adhesion process and the subsequent proliferation. A combined framework of experimental and computational simulation based on cellular automata was proposed to represent fibroblast adhesion on substrates and the macro-scale changes observed in the cell during the adhesion process. The approach proved to be simple and efficient.
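The cellular-automaton idea described above can be illustrated with a minimal sketch (not the authors' model; the lattice, the neighbourhood, and the probabilities p_adhere and p_divide below are illustrative assumptions): each site is empty, holds a suspended cell, or holds an adhered cell, and the states are updated with simple stochastic rules for adhesion and proliferation.

```python
import numpy as np

EMPTY, SUSPENDED, ADHERED = 0, 1, 2

def step(grid, rng, p_adhere=0.05, p_divide=0.01):
    """One update: suspended cells may adhere, adhered cells may divide."""
    new = grid.copy()
    n, m = grid.shape
    for i in range(n):
        for j in range(m):
            if grid[i, j] == SUSPENDED and rng.random() < p_adhere:
                new[i, j] = ADHERED                      # cell spreads on the substrate
            elif grid[i, j] == ADHERED and rng.random() < p_divide:
                nbrs = [((i + di) % n, (j + dj) % m)     # von Neumann neighbours (periodic)
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                empty = [p for p in nbrs if new[p] == EMPTY]
                if empty:                                # daughter cell starts in suspension
                    new[empty[rng.integers(len(empty))]] = SUSPENDED
    return new

rng = np.random.default_rng(0)
grid = np.zeros((50, 50), dtype=int)
seeds = rng.choice(grid.size, size=100, replace=False)   # 100 cells seeded in suspension
grid.flat[seeds] = SUSPENDED

for _ in range(200):
    grid = step(grid, rng)
print("suspended:", int((grid == SUSPENDED).sum()), "adhered:", int((grid == ADHERED).sum()))
```

In the published model the transition rules are calibrated against fibroblast monolayer measurements; here the probabilities are arbitrary and serve only to show the bookkeeping of a cellular-automaton adhesion/proliferation update.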
BIOASPEN: System for technology development
NASA Technical Reports Server (NTRS)
1986-01-01
The public version of ASPEN was installed in the VAX 11/750 computer. To examine the idea of BIOASPEN, a test example (the manufacture of acetone, butanol, and ethanol through a biological route) was chosen for simulation. Previous reports on the BIOASPEN project revealed the limitations of ASPEN in modeling this process. To overcome some of the difficulties, modules were written for the acid and enzyme hydrolyzers, the fermentor, and a sterilizer. Information required for these modules was obtained from the literature whenever possible. Additional support modules necessary for interfacing with ASPEN were also written. Some of the ASPEN subroutines were themselves altered in order to ensure the correct running of the simulation program. After testing of these additions and changes was completed, the Acetone-Butanol-Ethanol (ABE) process was simulated. A release of ASPEN (which contained the Economic Subsystem) was obtained and installed. This subsystem was tested and numerous changes were made in the FORTRAN code. Capital investment and operating cost studies were performed on the ABE process. Some alternatives in certain steps of the ABE simulation were investigated in order to elucidate their effects on the overall economics of the process.
Low-Frequency Waves in HF Heating of the Ionosphere
NASA Astrophysics Data System (ADS)
Sharma, A. S.; Eliasson, B.; Milikh, G. M.; Najmi, A.; Papadopoulos, K.; Shao, X.; Vartanyan, A.
2016-02-01
Ionospheric heating experiments have enabled an exploration of the ionosphere as a large-scale natural laboratory for the study of many plasma processes. These experiments inject high-frequency (HF) radio waves using high-power transmitters and an array of ground- and space-based diagnostics. This chapter discusses the excitation and propagation of low-frequency waves in HF heating of the ionosphere. The theoretical aspects and the associated models and simulations, and the results from experiments, mostly from the HAARP facility, are presented together to provide a comprehensive interpretation of the relevant plasma processes. The chapter presents the plasma model of the ionosphere for describing the physical processes during HF heating, the numerical code, and the simulations of the excitation of low-frequency waves by HF heating. It then gives the simulations of the high-latitude ionosphere and mid-latitude ionosphere. The chapter also briefly discusses the role of kinetic processes associated with wave generation.
Process simulation of ethanol production from biomass gasification and syngas fermentation.
Pardo-Planas, Oscar; Atiyeh, Hasan K; Phillips, John R; Aichele, Clint P; Mohammad, Sayeed
2017-12-01
The hybrid gasification-syngas fermentation platform can produce more bioethanol utilizing all biomass components compared to the biochemical conversion technology. Syngas fermentation operates at mild temperatures and pressures and avoids using expensive pretreatment processes and enzymes. This study presents a new process simulation model developed with Aspen Plus® of a biorefinery based on a hybrid conversion technology for the production of anhydrous ethanol using 1200 tons per day (wb) of switchgrass. The simulation model consists of three modules: gasification, fermentation, and product recovery. The results revealed a potential production of about 36.5 million gallons of anhydrous ethanol per year. Sensitivity analyses were also performed to investigate the effects of gasification and fermentation parameters that are key to the development of an efficient process in terms of energy conservation and ethanol production. Copyright © 2017 Elsevier Ltd. All rights reserved.
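As a quick sanity check of the reported throughput, the figures in the abstract imply a yield on the order of 80 gallons of anhydrous ethanol per wet ton of switchgrass, assuming for illustration 365 operating days per year (the model's actual on-stream factor is not stated here):

```python
# Back-of-the-envelope check of the reported figures; the 365-day on-stream
# assumption is ours, not the paper's.
feed_tons_per_day = 1200          # switchgrass, wet basis
ethanol_gal_per_year = 36.5e6     # anhydrous ethanol

feed_tons_per_year = feed_tons_per_day * 365
yield_gal_per_ton = ethanol_gal_per_year / feed_tons_per_year
print(f"{yield_gal_per_ton:.0f} gal anhydrous ethanol per wet ton of switchgrass")
# -> roughly 83 gal/ton under the 365-day assumption
```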
Evaluation of the flame propagation within an SI engine using flame imaging and LES
NASA Astrophysics Data System (ADS)
He, Chao; Kuenne, Guido; Yildar, Esra; van Oijen, Jeroen; di Mare, Francesca; Sadiki, Amsini; Ding, Carl-Philipp; Baum, Elias; Peterson, Brian; Böhm, Benjamin; Janicka, Johannes
2017-11-01
This work shows experiments and simulations of the fired operation of a spark ignition engine with port-fuelled injection. The test rig considered is an optically accessible single cylinder engine specifically designed at TU Darmstadt for the detailed investigation of in-cylinder processes and model validation. The engine was operated under lean conditions using iso-octane as a substitute for gasoline. Experiments have been conducted to provide a sound database of the combustion process. A planar flame imaging technique has been applied within the swirl- and tumble-planes to provide statistical information on the combustion process to complement a pressure-based comparison between simulation and experiments. This data is then analysed and used to assess the large eddy simulation performed within this work. For the simulation, the engine code KIVA has been extended by the dynamically thickened flame model combined with chemistry reduction by means of pressure dependent tabulation. Sixty cycles have been simulated to perform a statistical evaluation. Based on a detailed comparison with the experimental data, a systematic study has been conducted to obtain insight into the most crucial modelling uncertainties.
Exploring the physical layer frontiers of cellular uplink: The Vienna LTE-A Uplink Simulator.
Zöchmann, Erich; Schwarz, Stefan; Pratschner, Stefan; Nagel, Lukas; Lerch, Martin; Rupp, Markus
Communication systems in practice are subject to many technical/technological constraints and restrictions. Multiple input, multiple output (MIMO) processing in current wireless communications, as an example, mostly employs codebook-based pre-coding to save computational complexity at the transmitters and receivers. In such cases, closed form expressions for capacity or bit-error probability are often unattainable; effects of realistic signal processing algorithms on the performance of practical communication systems rather have to be studied in simulation environments. The Vienna LTE-A Uplink Simulator is a 3GPP LTE-A standard compliant MATLAB-based link level simulator that is publicly available under an academic use license, facilitating reproducible evaluations of signal processing algorithms and transceiver designs in wireless communications. This paper reviews research results that have been obtained by means of the Vienna LTE-A Uplink Simulator, highlights the effects of single-carrier frequency-division multiplexing (as the distinguishing feature to LTE-A downlink), extends known link adaptation concepts to uplink transmission, shows the implications of the uplink pilot pattern for gathering channel state information at the receiver and completes with possible future research directions.
NASA Astrophysics Data System (ADS)
Yiran, P.; Li, J.; von Salzen, K.; Dai, T.; Liu, D.
2014-12-01
Mineral dust is a significant contributor to the global and Asian aerosol burden. Currently, large uncertainties still exist in simulated aerosol processes in global climate models (GCMs), which lead to a diversity in dust mass loading and spatial distribution among GCM projections. In this study, satellite measurements from CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization) and observed aerosol data from Asian stations are compared with modelled aerosol in the Canadian Atmospheric Global Climate Model (CanAM4.2). Both seasonal and annual variations in Asian dust distribution are investigated. The vertical profile of simulated aerosol in the troposphere is evaluated against CALIOP Level 3 products and locally observed extinction for dust and total aerosols. Physical processes in the GCM, such as horizontal advection, vertical mixing, and dry and wet removal, are analyzed using the model simulation and available aerosol measurements. This work aims to improve the current understanding of Asian dust transport and vertical exchange on a large scale, which may help to increase the accuracy of GCM aerosol simulations.
Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah
Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.
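The rollback bookkeeping that such instrumentation reports can be illustrated with a toy optimistic logical process (a sketch only, not ROSS code): state is checkpointed before each event, a straggler message with a timestamp below local virtual time triggers a rollback, and a per-LP rollback counter is the kind of metric the tool collects.

```python
import heapq

class LogicalProcess:
    def __init__(self, name):
        self.name = name
        self.lvt = 0.0                 # local virtual time
        self.state = 0                 # toy state: number of events handled
        self.pending = []              # (timestamp, payload) min-heap
        self.processed = []            # [(timestamp, state_before_event)]
        self.rollbacks = 0             # metric of interest

    def schedule(self, ts, payload):
        if ts < self.lvt:              # straggler: arrived out of timestamp order
            self.rollback(ts)
        heapq.heappush(self.pending, (ts, payload))

    def rollback(self, ts):
        self.rollbacks += 1
        while self.processed and self.processed[-1][0] >= ts:
            undone_ts, saved_state = self.processed.pop()
            self.state = saved_state                       # restore checkpoint
            heapq.heappush(self.pending, (undone_ts, "replay"))
        self.lvt = self.processed[-1][0] if self.processed else 0.0

    def process_next(self):
        ts, _ = heapq.heappop(self.pending)
        self.processed.append((ts, self.state))            # checkpoint before the event
        self.state += 1
        self.lvt = ts

lp = LogicalProcess("lp0")
for t in (1.0, 2.0, 5.0):
    lp.schedule(t, "msg")
while lp.pending:
    lp.process_next()
lp.schedule(3.0, "straggler")          # forces one rollback
while lp.pending:
    lp.process_next()
print(lp.name, "rollbacks:", lp.rollbacks, "final lvt:", lp.lvt)
```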
Modeling, simulation, and analysis of optical remote sensing systems
NASA Technical Reports Server (NTRS)
Kerekes, John Paul; Landgrebe, David A.
1989-01-01
Remote Sensing of the Earth's resources from space-based sensors has evolved in the past 20 years from a scientific experiment to a commonly used technological tool. The scientific applications and engineering aspects of remote sensing systems have been studied extensively. However, most of these studies have been aimed at understanding individual aspects of the remote sensing process while relatively few have studied their interrelations. A motivation for studying these interrelationships has arisen with the advent of highly sophisticated configurable sensors as part of the Earth Observing System (EOS) proposed by NASA for the 1990's. Two approaches to investigating remote sensing systems are developed. In one approach, detailed models of the scene, the sensor, and the processing aspects of the system are implemented in a discrete simulation. This approach is useful in creating simulated images with desired characteristics for use in sensor or processing algorithm development. A less complete, but computationally simpler method based on a parametric model of the system is also developed. In this analytical model the various informational classes are parameterized by their spectral mean vector and covariance matrix. These class statistics are modified by models for the atmosphere, the sensor, and processing algorithms and an estimate made of the resulting classification accuracy among the informational classes. Application of these models is made to the study of the proposed High Resolution Imaging Spectrometer (HRIS). The interrelationships among observational conditions, sensor effects, and processing choices are investigated with several interesting results.
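A minimal sketch of the parametric (analytical) approach described above, with assumed numbers and an assumed linear sensor model: each informational class is carried as a spectral mean vector and covariance matrix, pushed through a gain/offset/noise transformation, and the resulting separability is summarized with the Bhattacharyya bound on classification error. The class statistics and sensor parameters below are illustrative placeholders, not HRIS values.

```python
import numpy as np

def sensor_model(mean, cov, gain, offset, noise_cov):
    """Propagate class statistics through y = G x + b + n, with n ~ N(0, N)."""
    G = np.diag(gain)
    return G @ mean + offset, G @ cov @ G.T + noise_cov

def bhattacharyya(m1, c1, m2, c2):
    """Bhattacharyya distance between two Gaussian class models."""
    cm = 0.5 * (c1 + c2)
    dm = m1 - m2
    term1 = 0.125 * dm @ np.linalg.solve(cm, dm)
    term2 = 0.5 * np.log(np.linalg.det(cm) /
                         np.sqrt(np.linalg.det(c1) * np.linalg.det(c2)))
    return term1 + term2

# Two hypothetical informational classes in a 3-band feature space.
m_a, c_a = np.array([0.30, 0.25, 0.20]), np.diag([0.002, 0.002, 0.003])
m_b, c_b = np.array([0.35, 0.28, 0.22]), np.diag([0.003, 0.002, 0.002])

gain, offset = np.array([0.9, 0.95, 0.9]), np.array([0.01, 0.01, 0.02])
noise = np.diag([0.001, 0.001, 0.001])     # assumed sensor noise covariance

m_a2, c_a2 = sensor_model(m_a, c_a, gain, offset, noise)
m_b2, c_b2 = sensor_model(m_b, c_b, gain, offset, noise)

B = bhattacharyya(m_a2, c_a2, m_b2, c_b2)
print(f"Bhattacharyya distance: {B:.3f}, "
      f"Bayes-error upper bound (equal priors): {0.5 * np.exp(-B):.3f}")
```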
Zhang, Yuxin; Huang, Xiaoqin; Han, Keli; Zheng, Fang; Zhan, Chang-Guo
2016-11-25
Combined molecular dynamics (MD) and potential of mean force (PMF) simulations have been performed to determine the free energy profile of the CocE-(+)-cocaine binding process in comparison with that of the corresponding CocE-(-)-cocaine binding process. According to the MD simulations, the equilibrium CocE-(+)-cocaine binding mode is similar to the CocE-(-)-cocaine binding mode. However, based on the simulated free energy profiles, a significant free energy barrier (∼5 kcal/mol) exists in the CocE-(+)-cocaine binding process whereas no obvious free energy barrier exists in the CocE-(-)-cocaine binding process, although the free energy barrier of ∼5 kcal/mol is not high enough to really slow down the CocE-(+)-cocaine binding process. In addition, the obtained free energy profiles also demonstrate that (+)-cocaine and (-)-cocaine have very close binding free energies with CocE, with a negligible difference (∼0.2 kcal/mol), which is qualitatively consistent with the nearly identical experimental KM values of the CocE enzyme for (+)-cocaine and (-)-cocaine. The consistency between the computational results and available experimental data suggests that the mechanistic insights obtained from this study are reasonable. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
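As a worked illustration of why a ~0.2 kcal/mol difference in binding free energy is negligible (a textbook thermodynamic estimate, not a calculation from the paper), the implied ratio of binding constants at room temperature is only about 1.4:

```python
import math

R = 1.987e-3          # kcal/(mol K)
T = 298.15            # K
delta_delta_G = 0.2   # kcal/mol, difference between the two enantiomers

ratio = math.exp(delta_delta_G / (R * T))
print(f"Binding-constant ratio implied by 0.2 kcal/mol: {ratio:.2f}")   # ~1.40
```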
Computational approach on PEB process in EUV resist: multi-scale simulation
NASA Astrophysics Data System (ADS)
Kim, Muyoung; Moon, Junghwan; Choi, Joonmyung; Lee, Byunghoon; Jeong, Changyoung; Kim, Heebom; Cho, Maenghyo
2017-03-01
For decades, downsizing has been a key issue for achieving high performance and low cost in semiconductors, and extreme ultraviolet lithography is one of the promising candidates to achieve this goal. As the predominant process in extreme ultraviolet lithography for determining resolution and sensitivity, post-exposure bake has mainly been studied by experimental groups, but photoresist development has reached a bottleneck because the underlying mechanisms of the process have not been unveiled. Herein, we provide a theoretical approach to investigate the underlying mechanisms of the post-exposure bake process in a chemically amplified resist, covering three important reactions during the process: acid generation by photo-acid generator dissociation, acid diffusion, and deprotection. Density functional theory calculations (quantum mechanical simulation) were conducted to quantitatively predict the activation energies and probabilities of the chemical reactions, and these were applied to molecular dynamics simulations to construct a reliable computational model. The overall chemical reactions were then simulated in the molecular dynamics unit cell, and the final configuration of the photoresist was used to predict the line edge roughness. The presented multiscale model unifies quantum-scale and atomic-scale phenomena during the post-exposure bake process, and it will be helpful for understanding the critical factors affecting the performance of the resulting photoresist and for designing next-generation materials.
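The hand-off from the quantum calculation to the molecular dynamics step can be sketched as follows (illustrative only; the activation energies, attempt frequency, and time step below are placeholders, not values from the paper): a DFT activation energy is converted into a per-time-step reaction probability through an Arrhenius expression at the bake temperature.

```python
import math

k_B = 8.617e-5            # Boltzmann constant, eV/K

def reaction_probability(Ea_eV, T_K, prefactor=1.0e13, dt=1.0e-12):
    """Probability that a reaction with barrier Ea occurs within one MD step dt."""
    rate = prefactor * math.exp(-Ea_eV / (k_B * T_K))      # Arrhenius rate, 1/s
    return 1.0 - math.exp(-rate * dt)

for Ea in (0.6, 0.8, 1.0):                                 # eV, hypothetical barriers
    p = reaction_probability(Ea, T_K=383.15)               # ~110 °C bake, assumed
    print(f"Ea = {Ea:.1f} eV -> per-step reaction probability {p:.2e}")
```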
Linkage of mike she to wetland-dndc for carbon budgeting and anaerobic biogeochemistry simulation
Jianbo Cui; Changsheng Li; Ge Sun; Carl Trettin
2005-01-01
This study reports the linkage between MIKE SHE and Wetland-DNDC for simulating carbon dynamics and greenhouse gas (GHG) emissions in forested wetlands. Wetland-DNDC was modified by parameterizing management measures and refining anaerobic biogeochemical processes, and was linked to the hydrological model MIKE SHE. As a preliminary application, we simulated the effect...
Sensitivity of air quality simulation to smoke plume rise
Yongqiang Liu; Gary Achtemeier; Scott Goodrick
2008-01-01
Plume rise is the height smoke plumes can reach. This information is needed by air quality models such as the Community Multiscale Air Quality (CMAQ) model to simulate physical and chemical processes of point-source fire emissions. This study seeks to understand the importance of plume rise to CMAQ air quality simulation of prescribed burning. CMAQ...
NASA Astrophysics Data System (ADS)
Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.
2012-12-01
Multitemporal data acquired by modern mapping technologies provide unique insights into processes driving land surface dynamics. These high resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss the development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze the dynamics of coastal topography (Figure 1), and we outline the development of a coastal analysis toolbox. The simulations focus on the particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models, already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS and highlight their interoperability. Figure 1. Isosurfaces representing the evolution of the shoreline and a z=4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
Finite Element Analysis of ECAP, TCAP, RUE and CGP Processes
NASA Astrophysics Data System (ADS)
Patil, Deepak C.; Kallannavar, Vinayak; Bhovi, Prabhakar M.; Kori, S. A.; Venkateswarlu, K.
2016-02-01
A finite element method was applied to study the various severe plastic deformation processes like, Equal Channel Angular Pressing (ECAP), Tubular Channel Angular Pressing (TCAP), Repetitive Upsetting and Extrusion (RUE) and Constrained Groove Pressing (CGP), considering aluminum AA-390 alloy as specimen material for all these processes. FEA simulation was carried out using AFDEX simulation tool. Effect of the various ECAP process parameters like, die corner angle, channel angle, and the coefficient of friction were analyzed. The die corner angles were divided into 2 equal parts for increasing the effectiveness of ECAP process, thereby increasing the channel number from 2 to 3 and further, their influence on ECAP process was investigated. A 3D simulation of TCAP was carried out for die shapes like triangular and trapezoidal, and variation of the generated stress and strain was plotted. In CGP, four cycle operation was carried out; wherein each cycle is composed of corrugating the specimen and subsequent straightening to original dimension. During RUE process, a maximum effective stress of 683.1 MPa was induced in the specimen after processing it for four complete cycles of RUE process; whereas the maximum strain induced during the same condition was 3.715.
Process Modeling and Dynamic Simulation for EAST Helium Refrigerator
NASA Astrophysics Data System (ADS)
Lu, Xiaofei; Fu, Peng; Zhuang, Ming; Qiu, Lilong; Hu, Liangbing
2016-06-01
In this paper, the process modeling and dynamic simulation for the EAST helium refrigerator have been completed. The cryogenic process model is described and the main components are customized in detail. The process model is controlled by the PLC simulator, and real-time communication between the process model and the controllers is achieved by a customized interface. The process model has been validated against EAST experimental data during the cool-down process from 300 K to 80 K. Simulation results indicate that this process simulator is able to reproduce the dynamic behavior of the EAST helium refrigerator very well for the operation of long-pulse plasma discharges. The cryogenic process simulator based on the control architecture is available for operation optimization and control design of EAST cryogenic systems to cope with long-pulse heat loads in the future. Supported by the National Natural Science Foundation of China (No. 51306195) and the Key Laboratory of Cryogenics, Technical Institute of Physics and Chemistry, CAS (No. CRYO201408).
Shakhawath Hossain, Md; Bergstrom, D J; Chen, X B
2015-12-01
The in vitro chondrocyte cell culture for cartilage tissue regeneration in a perfusion bioreactor is a complex process. Mathematical modeling and computational simulation can provide important insights into the culture process, which would be helpful for selecting culture conditions to improve the quality of the developed tissue constructs. However, simulation of the cell culture process is a challenging task due to the complicated interaction between the cells and local fluid flow and nutrient transport inside the complex porous scaffolds. In this study, a mathematical model and computational framework has been developed to simulate the three-dimensional (3D) cell growth in a porous scaffold placed inside a bi-directional flow perfusion bioreactor. The model was developed by taking into account the two-way coupling between the cell growth and local flow field and associated glucose concentration, and then used to perform a resolved-scale simulation based on the lattice Boltzmann method (LBM). The simulation predicts the local shear stress, glucose concentration, and 3D cell growth inside the porous scaffold for a period of 30 days of cell culture. The predicted cell growth rate was in good overall agreement with the experimental results available in the literature. This study demonstrates that the bi-directional flow perfusion culture system can enhance the homogeneity of the cell growth inside the scaffold. The model and computational framework developed is capable of providing significant insight into the culture process, thus providing a powerful tool for the design and optimization of the cell culture process. © 2015 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Johnson, Timothy C.; Hammond, Glenn E.; Chen, Xingyuan
2017-02-01
Time-lapse electrical resistivity tomography (ERT) is finding increased application for remotely monitoring processes occurring in the near subsurface in three dimensions (i.e. 4D monitoring). However, there are few codes capable of simulating the evolution of subsurface resistivity and corresponding tomographic measurements arising from a particular process, particularly in parallel and with an open source license. Herein we describe and demonstrate an electrical resistivity tomography module for the PFLOTRAN subsurface flow and reactive transport simulation code, named PFLOTRAN-E4D. The PFLOTRAN-E4D module operates in parallel using a dedicated set of compute cores in a master-slave configuration. At each time step, the master process receives subsurface states from PFLOTRAN, converts those states to bulk electrical conductivity, and instructs the slave processes to simulate a tomographic data set. The resulting multi-physics simulation capability enables accurate feasibility studies for ERT imaging, the identification of the ERT signatures that are unique to a given process, and facilitates the joint inversion of ERT data with hydrogeological data for subsurface characterization. PFLOTRAN-E4D is demonstrated herein using a field study of stage-driven groundwater/river water interaction ERT monitoring along the Columbia River, Washington, USA. Results demonstrate the complex nature of subsurface electrical conductivity changes, in both the saturated and unsaturated zones, arising from river stage fluctuations and associated river water intrusion into the aquifer. The results also demonstrate the sensitivity of surface-based ERT measurements to those changes over time. PFLOTRAN-E4D is available with the PFLOTRAN development version with an open-source license at https://bitbucket.org/pflotran/pflotran-dev.
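The state-to-conductivity step can be sketched with a common petrophysical transform (Archie's law is used here purely for illustration; the transform actually implemented in PFLOTRAN-E4D may differ): porosity, saturation, and fluid conductivity from the flow simulator are mapped to the bulk electrical conductivity that an ERT forward model needs.

```python
import numpy as np

def archie_bulk_conductivity(porosity, saturation, sigma_fluid,
                             m=1.8, n=2.0, a=1.0):
    """sigma_bulk = (1/a) * sigma_fluid * porosity**m * saturation**n (Archie's law)."""
    return sigma_fluid * porosity**m * saturation**n / a

# Hypothetical cell states from one flow/transport time step.
porosity = np.array([0.25, 0.30, 0.28])
saturation = np.array([1.00, 0.60, 0.85])       # unsaturated zone cells < 1
sigma_fluid = np.array([0.05, 0.02, 0.03])      # S/m; e.g. groundwater vs river water

print(archie_bulk_conductivity(porosity, saturation, sigma_fluid))  # bulk sigma per cell, S/m
```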
Experimental study and numerical simulation of evacuation from a dormitory
NASA Astrophysics Data System (ADS)
Lei, Wenjun; Li, Angui; Gao, Ran; Zhou, Ning; Mei, Sen; Tian, Zhenguo
2012-11-01
The evacuation process of students from a dormitory is investigated by both experiment and modeling. We examine the video record of pedestrian movement in a dormitory and find some typical characteristics of evacuation, including continuous pedestrian flow and mass behavior. Based on the experimental observations, we found that simulation results that account for pre-movement time are closer to the experimental results. With the model considering pre-movement time, we simulate the evacuation process, compare the simulation results with the experimental results, and find that they agree closely. The crowd massing phenomenon is also examined in this paper. It is found that different crowd massing patterns emerge for different desired velocities, and that crowd massing becomes more severe as the desired velocity increases. In this study, we also found the faster-is-slower effect: when the positive effect of increasing the desired velocity is not sufficient to make up for its negative effect, a greater desired velocity leads to a longer evacuation time. From the video record, it can be observed that mass behavior is obvious during the evacuation process, and the mass phenomenon is also found in the simulation. The results of our study are also applicable to buildings in which living and resting areas occupy the majority of the space, such as dormitories, residential buildings, hotels (restaurants) and so on.
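The role of the desired velocity can be illustrated with the driving term used in social-force-style pedestrian models (a generic sketch; the specific evacuation model used in the study is not reproduced here). Increasing v_desired strengthens the pull toward the exit, which is the ingredient behind the faster-is-slower competition discussed above.

```python
import numpy as np

def driving_force(position, velocity, exit_pos, v_desired, tau=0.5, mass=70.0):
    """F = m * (v_desired * e_exit - v) / tau, with e_exit the unit vector toward the exit."""
    e_exit = (exit_pos - position) / np.linalg.norm(exit_pos - position)
    return mass * (v_desired * e_exit - velocity) / tau

pos = np.array([2.0, 3.0])        # m, pedestrian position
vel = np.array([0.5, 0.0])        # m/s, current velocity
exit_pos = np.array([10.0, 3.0])  # m, exit location (assumed geometry)

for v_d in (1.0, 1.5, 3.0):       # m/s, increasing desired velocity
    print(v_d, driving_force(pos, vel, exit_pos, v_d))
```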
Using cognitive architectures to study issues in team cognition in a complex task environment
NASA Astrophysics Data System (ADS)
Smart, Paul R.; Sycara, Katia; Tang, Yuqing
2014-05-01
Cognitive social simulation is a computer simulation technique that aims to improve our understanding of the dynamics of socially-situated and socially-distributed cognition. This makes cognitive social simulation techniques particularly appealing as a means to undertake experiments into team cognition. The current paper reports on the results of an ongoing effort to develop a cognitive social simulation capability that can be used to undertake studies into team cognition using the ACT-R cognitive architecture. This capability is intended to support simulation experiments using a team-based problem solving task, which has been used to explore the effect of different organizational environments on collective problem solving performance. The functionality of the ACT-R-based cognitive social simulation capability is presented and a number of areas of future development work are outlined. The paper also describes the motivation for adopting cognitive architectures in the context of social simulation experiments and presents a number of research areas where cognitive social simulation may be useful in developing a better understanding of the dynamics of team cognition. These include the use of cognitive social simulation to study the role of cognitive processes in determining aspects of communicative behavior, as well as the impact of communicative behavior on the shaping of task-relevant cognitive processes (e.g., the social shaping of individual and collective memory as a result of communicative exchanges). We suggest that the ability to perform cognitive social simulation experiments in these areas will help to elucidate some of the complex interactions that exist between cognitive, social, technological and informational factors in the context of team-based problem-solving activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2004-05-01
In an energy-efficiency study at its refinery near Salt Lake City, Utah, Chevron focused on light hydrocarbons processing. The company found it could recover hydrocarbons from its fuel gas system and sell them. By using process simulation models of special distillation columns and associated reboilers and condensers, Chevron could predict the performance of potential equipment configuration changes and process modifications. More than 25,000 MMBtu in natural gas could be saved annually if a debutanizer upgrade project and a new saturated gas plant project were completed. Together, these projects would save $4.4 million annually.
Viscoelastic properties of chalcogenide glasses and the simulation of their molding processes
NASA Astrophysics Data System (ADS)
Liu, Weiguo; Shen, Ping; Jin, Na
In order to simulate the precision molding process, the viscoelastic properties of chalcogenide glasses at high temperatures were investigated. Thermomechanical analyses were performed to measure and analyze the thermomechanical properties of the chalcogenide glasses. The creep responses of the glasses at different temperatures were obtained. Finite element analysis was applied to simulate the molding processes. The simulation results were consistent with previously reported experimental results. Stress concentration and its evolution during the molding processes were also described using the simulation results.
Leskens, J G; Brugnach, M; Hoekstra, A Y
2014-01-01
Water simulation models are available to support decision-makers in urban water management. To use current water simulation models, special expertise is required. Therefore, model information is prepared prior to work sessions, in which decision-makers weigh different solutions. However, this model information quickly becomes outdated when new suggestions for solutions arise and are therefore limited in use. We suggest that new model techniques, i.e. fast and flexible computation algorithms and realistic visualizations, allow this problem to be solved by using simulation models during work sessions. A new Interactive Water Simulation Model was applied for two case study areas in Amsterdam and was used in two workshops. In these workshops, the Interactive Water Simulation Model was positively received. It included non-specialist participants in the process of suggesting and selecting possible solutions and made them part of the accompanying discussions and negotiations. It also provided the opportunity to evaluate and enhance possible solutions more often within the time horizon of a decision-making process. Several preconditions proved to be important for successfully applying the Interactive Water Simulation Model, such as the willingness of the stakeholders to participate and the preparation of different general main solutions that can be used for further iterations during a work session.
New Computer Simulations of Macular Neural Functioning
NASA Technical Reports Server (NTRS)
Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.
1994-01-01
We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.
NASA Astrophysics Data System (ADS)
Jacquey, Antoine; Cacace, Mauro
2017-04-01
Utilization of the underground for energy-related purposes has received increasing attention in the last decades as a source of carbon-free energy and of safe storage solutions. Understanding the key processes controlling fluid and heat flow around geological discontinuities such as faults and fractures, as well as their mechanical behaviour, is therefore of interest in order to design safe and sustainable reservoir operations. These processes occur in a naturally complex geological setting comprising natural or engineered discrete heterogeneities such as faults and fractures, span a relatively large spectrum of temporal and spatial scales, and interact in a highly non-linear fashion. In this regard, numerical simulators have become necessary in geological studies to model coupled processes and complex geological geometries. In this study, we present a new simulator, GOLEM, which uses multiphysics coupling to characterize geological reservoirs. In particular, special attention is given to discrete geological features such as faults and fractures. GOLEM is based on the Multiphysics Object-Oriented Simulation Environment (MOOSE). The MOOSE framework provides a powerful and flexible platform to solve multiphysics problems implicitly and in a tightly coupled manner on unstructured meshes, which is of interest for the considered non-linear context. Governing equations in 3D for fluid flow, heat transfer (conductive and advective), saline transport, and deformation (elastic and plastic) have been implemented in the GOLEM application. Coupling between rock deformation and fluid and heat flow is handled using the theories of poroelasticity and thermoelasticity. Furthermore, treating material properties such as density and viscosity and transport properties such as porosity as dependent on the state variables (based on the International Association for the Properties of Water and Steam models) increases the coupling complexity of the problem. The GOLEM application therefore aims at integrating more of the physical processes observed in the field or in the laboratory to simulate more realistic scenarios. The use of high-level nonlinear solver technology allows us to tackle these complex multiphysics problems in three dimensions. Basic concepts behind the GOLEM simulator are presented in this study, as well as a few application examples to illustrate its main features.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mackiewicz-Ludtka, G.; Sebright, J.
2007-12-15
The primary goal of this Cooperative Research and Development Agreement (CRADA) between UT-Battelle (Contractor) and Caterpillar Inc. (Participant) was to develop the plasma arc lamp (PAL) infrared (IR) thermal processing technology (1) to enhance surface coating performance by improving the interfacial bond strength between selected coatings and substrates, and (2) to extend this technology base for transitioning of the arc lamp processing to the industrial Participant. Completion of the following three key technical tasks (described below) was necessary in order to accomplish this goal. First, thermophysical property data sets were successfully determined for composite coatings applied to 1010 steel substrates, with a more limited data set successfully measured for free-standing coatings. These data are necessary for the computer modeling simulations and parametric studies to (a) simulate PAL IR processing, facilitating the development of the initial processing parameters, and (b) help develop a better understanding of the basic PAL IR fusing process fundamentals, including predicting the influence of melt pool stirring and heat transfer characteristics introduced during plasma arc lamp infrared (IR) processing. Second, a methodology and a set of procedures were successfully developed, and the plasma arc lamp (PAL) power profiles were successfully mapped as a function of PAL power level for the ORNL PAL. The latter data are also necessary input for the computer model to accurately simulate PAL processing during process modeling simulations, and to facilitate a better understanding of the fusing process fundamentals. Third, several computer modeling codes were evaluated as to their capabilities and accuracy in capturing and simulating the convective mixing that may occur during PAL thermal processing. The results from these evaluation efforts are summarized in this report. The intention of this project was to extend the technology base and provide for transitioning of the arc lamp processing to the industrial Participant.
Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong
2013-01-01
Land-use planning has triggered debates on social and environmental values, in which two key questions arise: how to see different planning simulation results instantaneously and feed the results back to interactively assist planning work, and how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches to land-use planning, and to find a way to overcome the gap between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework of land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. The results showed that the MAS simulation technique, emphasizing the quantitative process, effectively complemented the SA method, emphasizing the qualitative process, realizing an organic combination of qualitative and quantitative land-use planning work and providing a new idea and method for land-use planning and the sustainable management of land resources.
Simulation of Ge Dopant Emission in Indirect-Drive ICF Implosion Experiments
NASA Astrophysics Data System (ADS)
Macfarlane, J. J.; Golovkin, I.; Kulkarni, S.; Regan, S.; Epstein, R.; Mancini, R.; Peterson, K.; Suter, L. J.
2013-10-01
We present results from simulations performed to study the radiative properties of dopants used in inertial confinement fusion indirect-drive capsule implosion experiments on NIF. In Rev5 NIF ignition capsules, a Ge dopant is added to an inner region of the CH ablator to absorb hohlraum x-ray preheat. Spectrally resolved emission from ablator dopants can be used to study the degree of mixing of ablator material into the ignition hot spot. Here, we study the atomic processes that affect the radiative characteristics of these elements using a set of simulation tools to first estimate the evolution of plasma conditions in the compressed target, and then to compute the atomic kinetics of the dopant and the resultant radiative emission. Using estimates of temperature and density profiles predicted by radiation-hydrodynamics simulations, we set up simple 2-D plasma grids where we allow dopant material to be embedded in the fuel, and perform multi-dimensional collisional-radiative simulations using SPECT3D to compute non-LTE atomic level populations and spectral signatures from the dopant. Recently improved Stark-broadened line shape modeling for Ge K-shell lines has been included. The goal is to study the radiative and atomic processes that affect the emergent spectra, including the effects of inner-shell photoabsorption and K α reemission from the dopant.
Workshop on data acquisition and trigger system simulations for high energy physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1992-12-31
This report discusses the following topics: DAQSIM: A data acquisition system simulation tool; Front end and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit & The Design of a Queue for this Circuit; Fast Data Compression & Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in Modsim; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ & Online Processing at the SSC; Planned Enhancements to MODSIM II & SIMOBJECT -- an Overview; DAGAR -- A Synthesis System; Proposed Silicon Compiler for Physics Applications; Timed-LOTOS in a PROLOG Environment: an Algebraic Language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies.
NASA Technical Reports Server (NTRS)
Mocko, David M.; Sud, Y. C.; Einaudi, Franco (Technical Monitor)
2000-01-01
Present-day climate models produce large climate drifts that interfere with the climate signals simulated in modelling studies. The simplifying assumptions of the physical parameterization of snow and ice processes lead to large biases in the annual cycles of surface temperature, evapotranspiration, and the water budget, which in turn causes erroneous land-atmosphere interactions. Since land processes are vital for climate prediction, and snow and snowmelt processes have been shown to affect Indian monsoons and North American rainfall and hydrology, special attention is now being given to cold land processes and their influence on the simulated annual cycle in GCMs. The snow model of the SSiB land-surface model being used at Goddard has evolved from a unified single snow-soil layer interacting with a deep soil layer through a force-restore procedure to a two-layer snow model atop a ground layer separated by a snow-ground interface. When the snow cover is deep, force-restore occurs within the snow layers. However, several other simplifying assumptions such as homogeneous snow cover, an empirical depth related surface albedo, snowmelt and melt-freeze in the diurnal cycles, and neglect of latent heat of soil freezing and thawing still remain as nagging problems. Several important influences of these assumptions will be discussed with the goal of improving them to better simulate the snowmelt and meltwater hydrology. Nevertheless, the current snow model (Mocko and Sud, 2000, submitted) better simulates cold land processes as compared to the original SSiB. This was confirmed against observations of soil moisture, runoff, and snow cover in global GSWP (Sud and Mocko, 1999) and point-scale Valdai simulations over seasonal snow regions. New results from the current snow model SSiB from the 10-year PILPS 2e intercomparison in northern Scandinavia will be presented.
Verifying different-modality properties for concepts produces switching costs.
Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W
2003-03-01
According to perceptual symbol systems, sensorimotor simulations underlie the representation of concepts. It follows that sensorimotor phenomena should arise in conceptual processing. Previous studies have shown that switching from one modality to another during perceptual processing incurs a processing cost. If perceptual simulation underlies conceptual processing, then verifying the properties of concepts should exhibit a switching cost as well. For example, verifying a property in the auditory modality (e.g., BLENDER-loud) should be slower after verifying a property in a different modality (e.g., CRANBERRIES-tart) than after verifying a property in the same modality (e.g., LEAVES-rustling). Only words were presented to subjects, and there were no instructions to use imagery. Nevertheless, switching modalities incurred a cost, analogous to the cost of switching modalities in perception. A second experiment showed that this effect was not due to associative priming between properties in the same modality. These results support the hypothesis that perceptual simulation underlies conceptual processing.
NASA Astrophysics Data System (ADS)
Ohkubo, Tomomasa; Sato, Yuji; Matsunaga, Ei-ichi; Tsukamoto, Masahiro
2018-02-01
Although laser processing is widely used for many applications, the cutting quality of carbon fiber reinforced plastic (CFRP) decreases around the heat-affected zone (HAZ) during laser processing. Carbon fibers are exposed around the HAZ, and tensile strength decreases with increasing HAZ length. Some theoretical studies of thermal conduction that do not consider fluid dynamics have been performed; however, theoretical treatments that include the dynamics of laser ablation are scarce. Using the removed mass and depth observed in experiments, the dynamics of laser ablation of CFRP, involving high-temperature, high-pressure compressive gas, are simulated herein. In this calculation, the mushroom-like shape of the laser ablation is qualitatively reproduced, in agreement with high-speed camera experiments. Considering the removal temperature of the resin and the temperature distribution at each point on the surface, the simulation results suggest that a wide area of the resin is removed when the processing depth is shallow, and that a rounded kerf is generated as the processing depth increases.
A Comparison of Three Approaches to Model Human Behavior
NASA Astrophysics Data System (ADS)
Palmius, Joel; Persson-Slumpi, Thomas
2010-11-01
One way of studying social processes is through the use of simulations. The use of simulations for this purpose has been established as its own field, social simulations, and has been used for studying a variety of phenomena. A simulation of a social setting can serve as an aid for thinking about that social setting, and for experimenting with different parameters and studying the outcomes caused by them. When using the simulation as an aid for thinking and experimenting, the chosen simulation approach will implicitly steer the simulationist towards thinking in a certain fashion in order to fit the model. To study the implications of model choice on the understanding of a setting where human anticipation comes into play, a simulation scenario of a coffee room was constructed using three different simulation approaches: Cellular Automata, Systems Dynamics and Agent-based modeling. The practical implementations of the models were done in three different simulation packages: Stella for Systems Dynamics, CaFun for Cellular Automata and SesAM for Agent-based modeling. The models were evaluated both using Randers' criteria for model evaluation, and through introspection, where the authors reflected upon how their understanding of the scenario was steered by the model choice. Further, the software used for implementing the simulation models was evaluated, and practical considerations for the choice of software package are listed. It is concluded that the models have very different strengths. The Agent-based modeling approach offers the most intuitive support for thinking about and modeling a social setting where the behavior of the individual is in focus. The Systems Dynamics model would be preferable in situations where populations and large groups would be studied as wholes, but where individual behavior is of less concern. The Cellular Automata models would be preferable where processes need to be studied from the basis of a small set of very simple rules. It is further concluded that in most social simulation settings the Agent-based modeling approach would be the probable choice, since the other models do not offer much in the way of support for modeling the anticipatory behavior of humans acting in an organization.
Development of a global aerosol model using a two-dimensional sectional method: 1. Model design
NASA Astrophysics Data System (ADS)
Matsui, H.
2017-08-01
This study develops an aerosol module, the Aerosol Two-dimensional bin module for foRmation and Aging Simulation version 2 (ATRAS2), and implements the module into a global climate model, Community Atmosphere Model. The ATRAS2 module uses a two-dimensional (2-D) sectional representation with 12 size bins for particles from 1 nm to 10 μm in dry diameter and 8 black carbon (BC) mixing state bins. The module can explicitly calculate the enhancement of absorption and cloud condensation nuclei activity of BC-containing particles by aging processes. The ATRAS2 module is an extension of a 2-D sectional aerosol module ATRAS used in our previous studies within a framework of a regional three-dimensional model. Compared with ATRAS, the computational cost of the aerosol module is reduced by more than a factor of 10 by simplifying the treatment of aerosol processes and 2-D sectional representation, while maintaining good accuracy of aerosol parameters in the simulations. Aerosol processes are simplified for condensation of sulfate, ammonium, and nitrate, organic aerosol formation, coagulation, and new particle formation processes, and box model simulations show that these simplifications do not substantially change the predicted aerosol number and mass concentrations and their mixing states. The 2-D sectional representation is simplified (the number of advected species is reduced) primarily by the treatment of chemical compositions using two interactive bin representations. The simplifications do not change the accuracy of global aerosol simulations. In part 2, comparisons with measurements and the results focused on aerosol processes such as BC aging processes are shown.
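The 2-D sectional bookkeeping can be sketched as follows: 12 logarithmically spaced dry-diameter bins spanning 1 nm to 10 μm crossed with 8 BC mixing-state bins. Using BC mass fraction as the mixing-state axis, and the particular bin edges, are illustrative assumptions; only the bin counts and the size range come from the abstract.

```python
import numpy as np

n_size_bins, n_bc_bins = 12, 8
d_min, d_max = 1e-9, 10e-6                                # dry diameter range [m]

size_edges = np.logspace(np.log10(d_min), np.log10(d_max), n_size_bins + 1)
bc_frac_edges = np.linspace(0.0, 1.0, n_bc_bins + 1)      # assumed BC mass-fraction axis

# Number concentration carried per (size bin, BC mixing-state bin).
number_conc = np.zeros((n_size_bins, n_bc_bins))          # particles m^-3

def assign_bin(diameter, bc_mass_fraction):
    """Locate the 2-D bin for a particle of given dry diameter and BC mass fraction."""
    i = int(np.clip(np.searchsorted(size_edges, diameter) - 1, 0, n_size_bins - 1))
    j = int(np.clip(np.searchsorted(bc_frac_edges, bc_mass_fraction) - 1, 0, n_bc_bins - 1))
    return i, j

i, j = assign_bin(100e-9, 0.3)        # a 100 nm particle with 30% BC by mass
number_conc[i, j] += 1.0e6
print("bin:", (i, j), "size-bin edges [m]:", size_edges[i], "-", size_edges[i + 1])
```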
NASA Astrophysics Data System (ADS)
Farid, V. L.; Wonorahardjo, S.
2018-05-01
The implementation of Green Building criteria is relatively new in architectural practice, especially in Indonesia. Consequently, the integration of these criteria into the design process has the potential to change the design process itself. The implementation of the Green Building criteria into the conventional design process is discussed in this paper. The concept of this project is to design a residential unit with a natural air-conditioning system. To achieve this purpose, the Green Building criteria have been applied from the beginning of the design process through to the detailing work at the end of the project. Several studies were performed throughout the design process: (1) a conceptual review, in which professionally proven theories related to Tropical Architecture and passive design are used as references, and (2) computer simulations, such as Computational Fluid Dynamics (CFD) and wind tunnel simulation, used to represent the dynamic response of the surrounding environment to the building. It is hoped that this paper may serve as a reference for designing a green residential building.
Micromechanical Aspects of Hydraulic Fracturing Processes
NASA Astrophysics Data System (ADS)
Galindo-torres, S. A.; Behraftar, S.; Scheuermann, A.; Li, L.; Williams, D.
2014-12-01
A micromechanical model is developed to simulate the hydraulic fracturing process. The model comprises two key components. Firstly, the solid matrix, assumed to be a rock mass with pre-fabricated cracks, is represented by an array of bonded particles simulated by the Discrete Element Model (DEM) [1]. The interaction is governed by the spheropolyhedra method, which was introduced by the authors previously and has been shown to realistically represent many of the features found in fracturing and comminution processes. The second component is the fluid, which is modelled by the Lattice Boltzmann Method (LBM); it was recently coupled with the spheropolyhedra by the authors and validated. An advantage of this coupled LBM-DEM model is the control of many of the parameters of the fracturing fluid, such as its viscosity and the injection rate. To the best of the authors' knowledge, this is the first application of such a coupled scheme for studying hydraulic fracturing [2]. In this first implementation, results are presented for a two-dimensional situation. Fig. 1 shows one snapshot of the coupled LBM-DEM simulation of hydraulic fracturing, in which the elements with broken bonds can be identified and the fracture geometry quantified. The simulation involves a variation of the underground stress, particularly the difference between the two principal components of the stress tensor, to explore the effect on the fracture path. A second study focuses on the fluid viscosity to examine the effect of the time scales of different injection plans on the fracture geometry. The developed tool and the presented results have important implications for future studies of the hydraulic fracturing process and technology. References: 1. Galindo-Torres, S.A., et al., Breaking processes in three-dimensional bonded granular materials with general shapes. Computer Physics Communications, 2012. 183(2): p. 266-277. 2. Galindo-Torres, S.A., A coupled Discrete Element Lattice Boltzmann Method for the simulation of fluid-solid interaction with particles of general shapes. Computer Methods in Applied Mechanics and Engineering, 2013. 265(0): p. 107-119.
Simulation of diffuse-charge capacitance in electric double layer capacitors
NASA Astrophysics Data System (ADS)
Sun, Ning; Gersappe, Dilip
2017-01-01
We use a Lattice Boltzmann Model (LBM) in order to simulate diffuse-charge dynamics in Electric Double Layer Capacitors (EDLCs). Simulations are carried out for both the charge and the discharge processes on 2D systems of complex random electrode geometries (pure random, random spheres and random fibers). The steric effect of concentrated solutions is considered by using the Modified Poisson-Nernst-Planck (MPNP) equations and compared with regular Poisson-Nernst-Planck (PNP) systems. The effects of electrode microstructures (electrode density, electrode filler morphology, filler size, etc.) on the net charge distribution and charge/discharge time are studied in detail. The influence of the applied potential during the discharging process is also discussed. Our studies show how electrode morphology can be used to tailor the properties of supercapacitors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romanov, Gennady; /Fermilab
CST Particle Studio combines electromagnetic field simulation, multi-particle tracking, adequate post-processing and an advanced probabilistic emission model, which is the most important new capability in multipactor simulation. The emission model includes the stochastic properties of emission in the simulation and adds primary electron elastic and inelastic reflection from the surfaces. Simulations of multipactor in coaxial waveguides have been performed to study the effects of these innovations on the multipactor threshold and the range over which multipactor can occur. The results, compared with available previous experiments and simulations, as well as the technique of multipactor simulation with CST PS, are presented and discussed.
A Low Cost Microcomputer System for Process Dynamics and Control Simulations.
ERIC Educational Resources Information Center
Crowl, D. A.; Durisin, M. J.
1983-01-01
Discusses a video simulator microcomputer system used to provide real-time demonstrations to strengthen students' understanding of process dynamics and control. Also discusses hardware/software and simulations developed using the system. The four simulations model various configurations of a process liquid level tank system. (JN)
Margarit, Gerard; Mallorqui, Jordi J.
2008-01-01
This paper uses a complete and realistic SAR simulation processing chain, GRECOSAR, to study the potentialities of Polarimetric SAR Interferometry (POLInSAR) in the development of new classification methods for ships. Its high processing efficiency and scenario flexibility have allowed exhaustive scattering studies to be developed. The results have revealed, first, that vessels' geometries can be described by specific combinations of Permanent Polarimetric Scatterers (PePS) and, second, that each type of vessel could be characterized by a particular spatial and polarimetric distribution of PePS. Such properties have been recently exploited to propose a new Vessel Classification Algorithm (VCA) working with POLInSAR data, which, according to several simulation tests, may provide promising performance in real scenarios. Throughout the paper, explanations of the main steps summarizing the whole research activity carried out with ships and GRECOSAR are provided, as well as examples of the main results and VCA validation tests. Special attention is devoted to the new improvements achieved, which are related to simulations using a new and highly realistic sea surface model. The paper shows that, for POLInSAR data with fine resolution, VCA can help to classify ships with notable robustness under diverse and adverse observation conditions. PMID:27873954
Analysis and Simulation of a Blue Energy Cycle
Sharma, Ms. Ketki; Kim, Yong-Ha; Yiacoumi, Sotira; ...
2016-01-30
The mixing process of fresh water and seawater releases a significant amount of energy and is a potential source of renewable energy. The so-called 'blue energy', or salinity-gradient energy, can be harvested by a device consisting of carbon electrodes immersed in an electrolyte solution, based on the principle of capacitive double layer expansion (CDLE). In this study, we have investigated the feasibility of energy production based on the CDLE principle. Experiments and computer simulations were used to study the process. Mesoporous carbon materials, synthesized at the Oak Ridge National Laboratory, were used as electrode materials in the experiments. Neutron imaging of the blue energy cycle was conducted with cylindrical mesoporous carbon electrodes and 0.5 M lithium chloride as the electrolyte solution. For experiments conducted at 0.6 V and 0.9 V applied potential, a voltage increase of 0.061 V and 0.054 V was observed, respectively. From sequences of neutron images obtained for each step of the blue energy cycle, information on the direction and magnitude of lithium ion transport was obtained. A computer code was developed to simulate the process. Experimental data and computer simulations allowed us to predict energy production.
A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition
NASA Technical Reports Server (NTRS)
Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.
2012-01-01
A simulation framework based on the Memory-Mapped-Files technique was created to operate multiple numerical processes in locked time-steps and send I/O data synchronously across to one-another to simulate system-dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow-dynamics, variable-geometry actuation mechanisms, and flow-controls in the transition from the supersonic to hypersonic conditions and vice-versa. A study of Mode-Transition Control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate this scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls or other types of complex systems.
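The core idea of exchanging synchronous I/O through memory-mapped files can be illustrated with a small sketch. The code below is a hedged stand-in, not the NASA framework itself: the file name, record layout, and busy-wait handshake are assumptions chosen only to show how two processes can advance in locked time steps through a shared mapped file.

```python
import mmap, struct, time

# Hypothetical shared record: (step_counter, exchanged_value). The real
# framework's layout is not described in the abstract; this is illustrative.
RECORD = struct.Struct("id")

def run_partner(path, my_turn_parity, n_steps, dt=0.01):
    """One of two co-simulation processes sharing a memory-mapped file."""
    with open(path, "r+b") as f:
        mm = mmap.mmap(f.fileno(), RECORD.size)
        state = 0.0
        for _ in range(n_steps):
            # Busy-wait until the step counter's parity says it is our turn.
            while True:
                counter, value = RECORD.unpack(mm[:RECORD.size])
                if counter % 2 == my_turn_parity:
                    break
                time.sleep(1e-4)
            # Locked time step: read partner output, advance own model, write back.
            state += dt * value                    # stand-in for real dynamics
            mm[:RECORD.size] = RECORD.pack(counter + 1, state)
        mm.close()
```

Before launching both partners, the shared file would be created with `open(path, "wb").write(bytes(RECORD.size))`; partner A runs with parity 0 and partner B with parity 1, so each waits for the other's write before stepping.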
Growth factor involvement in tension-induced skeletal muscle growth
NASA Technical Reports Server (NTRS)
Vandenburgh, H. H.
1987-01-01
Muscle tissue culture techniques were developed to grow skeletal myofibers which differentiate into more adult-like myofibers. Mechanical simulation studies of these muscle cells in a newly developed mechanical cell simulator can now be performed to study growth processes in skeletal muscle. Conditions in the mechanical cell simulator were defined where mechanical activity can either prevent muscle wasting or stimulate muscle growth. The role of endogenous and exogenous growth factors in tension-induced muscle growth is being investigated under the defined conditions of tissue culture.
Application of simulation models for the optimization of business processes
NASA Astrophysics Data System (ADS)
Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří
2016-06-01
The paper deals with the applications of modeling and simulation tools in the optimization of business processes, especially in solving an optimization of signal flow in security company. As a modeling tool was selected Simul8 software that is used to process modeling based on discrete event simulation and which enables the creation of a visual model of production and distribution processes.
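The study itself uses the commercial, GUI-based Simul8 package; as a hedged illustration of the same discrete-event idea in code, the sketch below models signals arriving at a dispatch desk and queuing for a limited number of operators using the SimPy library. All names, rates, and capacities are made-up illustrative values, not parameters from the security-company model.

```python
import random
import simpy

# Toy discrete-event model of a signal flow (illustrative only; the study
# used Simul8, not SimPy).
RANDOM_SEED, N_OPERATORS, SIM_TIME = 42, 2, 480.0   # minutes

def signal(env, name, operators, handling_mean):
    arrived = env.now
    with operators.request() as req:          # queue for a free operator
        yield req
        wait = env.now - arrived
        yield env.timeout(random.expovariate(1.0 / handling_mean))
        print(f"{name}: waited {wait:.1f} min, done at {env.now:.1f}")

def source(env, operators, interarrival_mean=5.0, handling_mean=8.0):
    i = 0
    while True:
        yield env.timeout(random.expovariate(1.0 / interarrival_mean))
        i += 1
        env.process(signal(env, f"signal {i}", operators, handling_mean))

random.seed(RANDOM_SEED)
env = simpy.Environment()
operators = simpy.Resource(env, capacity=N_OPERATORS)
env.process(source(env, operators))
env.run(until=SIM_TIME)
```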
Signal Processing Studies of a Simulated Laser Doppler Velocimetry-Based Acoustic Sensor
1990-10-17
investigated using spectral correlation methods. Results indicate that it may be possible to extend demonstrated LDV-based acoustic sensor sensitivities using higher order processing techniques. (Author)
A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berres, Anne Sabine; Adhinarayanan, Vignesh; Turton, Terece
2017-05-12
Large simulation data require a lot of time and computational resources to compute, store, analyze, visualize, and run user studies on. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption and the cognitive value of visualizations of the resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are two-fold: using a simple example reduces unnecessary complexity, as we know what to expect from the results, and it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we ran, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results we produced through sampling.
Negotiating the role of the professional nurse: The pedagogy of simulation: a grounded theory study.
Walton, Joni; Chute, Elizabeth; Ball, Lynda
2011-01-01
Simulation is the mainstay of laboratory education in health sciences, yet there is a void of pedagogy, the art and science of teaching. Nursing faculty do not have adequate evidence-based resources related to how students learn through simulation. The research questions that were addressed were as follows: (a) How do students learn using simulation? (b) What is the process of learning with simulations from the students' perspective? (c) What faculty teaching styles promote learning? and (d) How can faculty support students during simulation? Grounded theory methodology was used to explore how senior baccalaureate nursing students learn using simulation. Twenty-six students participated in this research study. Sixteen nursing students who completed two semesters of simulation courses volunteered for in-depth audio-taped interviews. In addition, there were two focus groups with five senior students in each group who validated findings and identified faculty teaching styles and supportive interventions. Negotiating the Role of the Professional Nurse was the core category, which included the following phases: (I) feeling like an imposter, (II) trial and error, (III) taking it seriously, (IV) transference of skills and knowledge, and (V) professionalization. Faculty traits and teaching strategies for teaching with simulation were also identified. A conceptual model of the socialization process was developed to assist faculty in understanding the ways students learn with simulation and ways to facilitate their development. These findings provide a midrange theory for the pedagogy of simulation and will help faculty gain insight and assimilate these findings into their teaching-learning strategies. Published by Elsevier Inc.
Impact of diabatic processes on the tropopause inversion layer formation in baroclinic life cycles
NASA Astrophysics Data System (ADS)
Kunkel, Daniel; Hoor, Peter; Wirth, Volkmar
2015-04-01
Observations of temperature profiles in the extratropical upper troposphere/lower stratosphere (UTLS) show the presence of an inversion layer just above the thermal tropopause, i.e., the tropopause inversion layer (TIL). In recent studies both diabatic and adiabatic processes have been identified to contribute to the formation of this layer. In particular, adiabatic simulations indicate a TIL formation without the explicit simulation of diabatic, i.e. radiative or humidity related, processes after wave breaking during baroclinic life cycles. One goal of this study is to assess the additional contribution of diabatic processes to the formation and strength of the TIL in such life cycles. Moreover, since irreversible stratosphere-troposphere exchange (STE) is another inherent feature of baroclinic life cycles and a consequence of diabatic processes, we study whether there is a relationship between STE and TIL. We use the non-hydrostatic model COSMO in an idealized mid-latitude channel configuration to simulate baroclinic life cycles. In a first step contributions of individual diabatic processes from turbulence, radiation, and cloud microphysics to the formation of the TIL are analyzed. These results are compared to those from adiabatic simulations of baroclinic life cycles in which the TIL forms during the life cycle with the limitation of being less sharp than in observations. In a second step the combined effects of several diabatic processes are studied to further include interactions between these processes as well as to advance towards a more realistic model setup. The results suggest a much more vigorous development of the TIL due to microphysics and the release of latent heat. Moreover, radiative effects can foster an increase in static stability above the thermal tropopause when large gradients of either water vapor or cloud ice are present at the level of the tropopause. By additionally adding sub-grid scale turbulence, a co-location of high static stability and increased turbulent kinetic energy is found in the vicinity of cirrus clouds at the tropopause level. The potential relation between STE and high static stability is further discussed based on results from trajectory calculations and the distribution of passive tracers of tropospheric and stratospheric origin.
Modeling nuclear processes by Simulink
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my
2015-04-29
Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
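For readers without Simulink, the same kind of describing equations can be integrated directly. The sketch below takes one of the examples listed, a radionuclide decay process, as a two-member chain dN1/dt = -λ1 N1, dN2/dt = λ1 N1 - λ2 N2 and solves it with SciPy; the half-lives and initial inventory are arbitrary illustrative numbers, not values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-member radionuclide decay chain N1 -> N2 -> stable (illustrative;
# the paper builds such models as Simulink block diagrams instead).
lam1, lam2 = np.log(2) / 8.0, np.log(2) / 2.4    # decay constants, 1/day (made up)

def decay_chain(t, y):
    n1, n2 = y
    return [-lam1 * n1, lam1 * n1 - lam2 * n2]

sol = solve_ivp(decay_chain, (0.0, 30.0), [1.0e6, 0.0], dense_output=True)
for ti in np.linspace(0.0, 30.0, 7):
    n1, n2 = sol.sol(ti)
    print(f"t = {ti:5.1f} d   N1 = {n1:10.1f}   N2 = {n2:10.1f}")
```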
DOE Office of Scientific and Technical Information (OSTI.GOV)
Provost, G.; Stone, H.; McClintock, M.
2008-01-01
To meet the growing demand for education and experience with the analysis, operation, and control of commercial-scale Integrated Gasification Combined Cycle (IGCC) plants, the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) is leading a collaborative R&D project with participants from government, academia, and industry. One of the goals of this project is to develop a generic, full-scope, real-time IGCC dynamic plant simulator for use in establishing a world-class research and training center, as well as to promote and demonstrate the technology to power industry personnel. The NETL IGCC dynamic plant simulator will combine for the first time a process/gasification simulator and a power/combined-cycle simulator together in a single dynamic simulation framework for use in training applications as well as engineering studies. As envisioned, the simulator will have the following features and capabilities: (1) a high-fidelity, real-time, dynamic model of the process side (gasification and gas cleaning with CO2 capture) and the power-block side (combined cycle) for a generic IGCC plant fueled by coal and/or petroleum coke; (2) full-scope training simulator capabilities, including startup, shutdown, load following and shedding, response to fuel and ambient condition variations, control strategy analysis (turbine vs. gasifier lead, etc.), representative malfunctions/trips, alarms, scenarios, trending, snapshots, data historian, and trainee performance monitoring; and (3) the ability to enhance and modify the plant model to facilitate studies of changes in plant configuration and equipment and to support future R&D efforts. To support this effort, process descriptions and control strategies were developed for key sections of the plant as part of the detailed functional specification, which will form the basis of the simulator development. These plant sections include: Slurry Preparation; Air Separation Unit; Gasifiers; Syngas Scrubbers; Shift Reactors; Gas Cooling, Medium Pressure (MP) and Low Pressure (LP) Steam Generation, and Knockout; Sour Water Stripper; Mercury Removal; Selexol™ Acid Gas Removal System; CO2 Compression; Syngas Reheat and Expansion; Claus Plant; Hydrogenation Reactor and Gas Cooler; Combustion Turbine (CT)-Generator Assemblies; and Heat Recovery Steam Generators (HRSGs) and Steam Turbine (ST)-Generator. In this paper, process descriptions, control strategies, and Process & Instrumentation Diagram (P&ID) drawings for key sections of the generic IGCC plant are presented, along with discussions of some of the operating procedures and representative faults that the simulator will cover. Some of the intended future applications for the simulator are discussed, including plant operation and control demonstrations as well as education and training services such as IGCC familiarization courses.
Atomistic Simulations of Graphene Growth: From Kinetics to Mechanism.
Qiu, Zongyang; Li, Pai; Li, Zhenyu; Yang, Jinlong
2018-03-20
Epitaxial growth is a promising strategy to produce high-quality graphene samples. At the same time, this method has great flexibility for industrial scale-up. To optimize growth protocols, it is essential to understand the underlying growth mechanisms. This is, however, very challenging, as the growth process is complicated and involves many elementary steps. Experimentally, atomic-scale in situ characterization methods are generally not feasible at the high temperature of graphene growth. Therefore, kinetics is the main experimental information used to study growth mechanisms. Theoretically, first-principles calculations routinely provide atomic structures and energetics but have a stringent limit on the accessible spatial and time scales. Such a gap between experiment and theory can be bridged by atomistic simulations that use first-principles atomic details as input and provide the overall growth kinetics, which can be directly compared with experiment, as output. Typically, system-specific approximations should be applied to make such simulations computationally feasible. By feeding kinetic Monte Carlo (kMC) simulations with first-principles parameters, we can directly simulate the graphene growth process and thus understand the growth mechanisms. Our simulations suggest that the carbon dimer is the dominant feeding species in the epitaxial growth of graphene on both Cu(111) and Cu(100) surfaces, which enables us to understand why the reaction is diffusion limited on Cu(111) but attachment limited on Cu(100). When hydrogen is explicitly considered in the simulation, the central role hydrogen plays in graphene growth is revealed, which solves the long-standing puzzle of why H2 should be fed in the chemical vapor deposition of graphene. The simulation results can be directly compared with the experimental kinetic data, if available. Our kMC simulations reproduce the experimentally observed quintic-like behavior of graphene growth on Ir(111). By checking the simulation results, we find that such nonlinearity is caused by lattice mismatch, and the induced growth front inhomogeneity can be universally used to predict growth behaviors in other heteroepitaxial systems. Notably, although experimental kinetics usually gives useful insight into atomic mechanisms, it can sometimes be misleading. Such pitfalls can be avoided via atomistic simulations, as demonstrated in our study of the graphene etching process. Growth protocols can be designed theoretically with computational kinetic and mechanistic information. By contrasting the different activation energies involved in an atom-exchange-based carbon penetration process for monolayer and bilayer graphene, we propose a three-step strategy to grow high-quality bilayer graphene. Based on first-principles parameters, a kinetic pathway toward the high-density, ordered N doping of epitaxial graphene on Cu(111) using a C5NCl5 precursor is also identified. These studies demonstrate that atomistic simulations can unambiguously produce or reproduce the kinetic information on graphene growth, which is pivotal to understanding the growth mechanism and designing better growth protocols. A similar strategy can be used in growth mechanism studies of other two-dimensional atomic crystals.
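The general structure of a kMC loop fed with first-principles barriers can be sketched independently of the lattice model. The snippet below is a generic rejection-free (Gillespie-style) event loop, not the authors' graphene-growth code: the event catalogue, barriers, prefactor, and temperature are hypothetical stand-ins meant only to show rate-weighted event selection and the exponential time increment.

```python
import math, random

# Generic rejection-free kinetic Monte Carlo loop (illustrative only).
KB_T = 8.617e-5 * 1300.0            # eV at an assumed growth temperature (~1300 K)

def rate(barrier_ev, prefactor=1e13):
    """Arrhenius rate from an activation barrier (hypothetical prefactor, 1/s)."""
    return prefactor * math.exp(-barrier_ev / KB_T)

# Hypothetical event catalogue: (name, barrier in eV, change in adatom count)
events = [("attach_dimer", 0.8, +2), ("detach_dimer", 1.6, -2), ("diffuse", 0.4, 0)]

def kmc(n_steps=10, seed=1):
    random.seed(seed)
    t, n_carbon = 0.0, 100
    for _ in range(n_steps):
        rates = [rate(b) for _, b, _ in events]
        total = sum(rates)
        # Pick an event with probability proportional to its rate.
        r = random.random() * total
        acc, chosen = 0.0, events[-1]
        for ev, k in zip(events, rates):
            acc += k
            if r <= acc:
                chosen = ev
                break
        n_carbon += chosen[2]
        # Advance time by an exponentially distributed increment.
        t += -math.log(1.0 - random.random()) / total
        print(f"{chosen[0]:13s}  t = {t:.3e} s   carbon atoms = {n_carbon}")

kmc()
```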
Modelling and simulating reaction-diffusion systems using coloured Petri nets.
Liu, Fei; Blätke, Mary-Ann; Heiner, Monika; Yang, Ming
2014-10-01
Reaction-diffusion systems often play an important role in systems biology when developmental processes are involved. Traditional methods of modelling and simulating such systems require substantial prior knowledge of mathematics and/or simulation algorithms. Such requirements may pose a challenge for biologists who are not equally well trained in mathematics and computer science. Coloured Petri nets, as a high-level and graphical language, offer an attractive alternative that is easily approachable. In this paper, we investigate a coloured Petri net framework integrating deterministic, stochastic and hybrid modelling formalisms and corresponding simulation algorithms for the modelling and simulation of reaction-diffusion processes that may be closely coupled with signalling pathways, metabolic reactions and/or gene expression. Such systems often manifest multiscaleness in time, space and/or concentration. We introduce our approach by means of some basic diffusion scenarios, and test it against an established case study, the Brusselator model. Copyright © 2014 Elsevier Ltd. All rights reserved.
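As a plain-numerics reference point for the case study mentioned above, the sketch below integrates the deterministic 1-D Brusselator reaction-diffusion equations (du/dt = A + u²v − (B+1)u + Du ∇²u, dv/dt = Bu − u²v + Dv ∇²v) with explicit finite differences. It is not the coloured-Petri-net encoding used in the paper, and the parameter values, grid, and initial perturbation are arbitrary choices.

```python
import numpy as np

# Deterministic 1-D Brusselator reaction-diffusion, explicit finite differences
# on a periodic domain (illustrative reference only).
A, B = 1.0, 3.0
Du, Dv = 1.0, 8.0
nx, dx, dt, n_steps = 200, 0.5, 0.005, 20000

rng = np.random.default_rng(0)
u = A + 0.1 * rng.standard_normal(nx)        # perturbed homogeneous steady state
v = B / A + 0.1 * rng.standard_normal(nx)

def laplacian(f):
    return (np.roll(f, 1) + np.roll(f, -1) - 2.0 * f) / dx**2

for _ in range(n_steps):
    ru = A + u * u * v - (B + 1.0) * u       # Brusselator reaction terms
    rv = B * u - u * u * v
    u += dt * (Du * laplacian(u) + ru)
    v += dt * (Dv * laplacian(v) + rv)

print("u range:", u.min(), u.max())
```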
Particle-In-Cell simulations of high pressure plasmas using graphics processing units
NASA Astrophysics Data System (ADS)
Gebhardt, Markus; Atteln, Frank; Brinkmann, Ralf Peter; Mussenbrock, Thomas; Mertmann, Philipp; Awakowicz, Peter
2009-10-01
Particle-In-Cell (PIC) simulations are widely used to understand the fundamental phenomena in low-temperature plasmas. In particular, plasmas at very low gas pressures are studied using PIC methods. The inherent drawback of these methods is that they are very time consuming, since certain stability conditions have to be satisfied. This holds even more for the PIC simulation of high pressure plasmas due to the very high collision rates. The simulations take a very long time to run on standard computers and require computer clusters or supercomputers. Recent advances in the field of graphics processing units (GPUs) provide every personal computer with a highly parallel multiprocessor architecture for very little money. This architecture is freely programmable and can be used to implement a wide class of problems. In this paper we present the concepts of a fully parallel PIC simulation of high pressure plasmas using the benefits of GPU programming.
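The per-step structure of an electrostatic PIC cycle (deposit charge, solve the field, gather, push) can be sketched in a few dozen lines. The serial numpy version below uses normalized units, arbitrary parameters, and no collisions, so it is only a structural illustration, not the authors' GPU code; in an actual GPU implementation each particle-indexed array operation would map to a kernel over threads, and a Monte Carlo collision step would be added for high-pressure conditions.

```python
import numpy as np

# Minimal 1-D electrostatic PIC cycle in normalized units (illustrative only).
L, ng, npart, dt, steps = 2 * np.pi, 64, 20000, 0.1, 200
dx = L / ng
rng = np.random.default_rng(0)

x = rng.uniform(0.0, L, npart)                     # electron positions
v = rng.normal(0.0, 0.1, npart) + 0.2 * np.cos(x)  # small velocity perturbation
qm, weight = -1.0, L / npart                       # charge/mass, macro-particle weight

k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)           # wavenumbers for the Poisson solve

for _ in range(steps):
    # 1) Deposit electron charge on the grid (cloud-in-cell).
    g = x / dx
    j = np.floor(g).astype(int) % ng
    frac = g - np.floor(g)
    ne = np.zeros(ng)
    np.add.at(ne, j, (1.0 - frac) * weight / dx)
    np.add.at(ne, (j + 1) % ng, frac * weight / dx)
    rho = 1.0 - ne                                 # fixed neutralizing ion background

    # 2) Solve Poisson's equation with FFT: d2(phi)/dx2 = -rho, E = -d(phi)/dx.
    rho_k = np.fft.fft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:] ** 2
    E = np.real(np.fft.ifft(-1j * k * phi_k))

    # 3) Gather the field at particle positions (same CIC weights).
    Ep = (1.0 - frac) * E[j] + frac * E[(j + 1) % ng]

    # 4) Push particles (leapfrog) and apply periodic boundaries.
    v += qm * Ep * dt
    x = (x + v * dt) % L

print("mean kinetic energy:", 0.5 * np.mean(v ** 2))
```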
Experimentally modeling stochastic processes with less memory by the use of a quantum processor
Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.
2017-01-01
Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218
Flux canceling in three-dimensional radiative magnetohydrodynamic simulations
NASA Astrophysics Data System (ADS)
Thaler, Irina; Spruit, H. C.
2017-05-01
We aim to study the processes involved in the disappearance of magnetic flux between regions of opposite polarity on the solar surface using realistic three-dimensional (3D) magnetohydrodynamic (MHD) simulations. "Retraction" below the surface driven by magnetic forces is found to be a very effective mechanism of flux canceling of opposite polarities. The speed at which flux disappears increases strongly with initial mean flux density. In agreement with existing inferences from observations we suggest that this is a key process of flux disappearance within active complexes. Intrinsic kG strength concentrations connect the surface to deeper layers by magnetic forces, and therefore the influence of deeper layers on the flux canceling process is studied. We do this by comparing simulations extending to different depths. For average flux densities of 50 G, and on length scales on the order of 3 Mm in the horizontal and 10 Mm in depth, deeper layers appear to have only a mild influence on the effective rate of diffusion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou Yu, E-mail: yzou@Princeton.ED; Kavousanakis, Michail E., E-mail: mkavousa@Princeton.ED; Kevrekidis, Ioannis G., E-mail: yannis@Princeton.ED
2010-07-20
The study of particle coagulation and sintering processes is important in a variety of research studies ranging from cell fusion and dust motion to aerosol formation applications. These processes are traditionally simulated using either Monte-Carlo methods or integro-differential equations for particle number density functions. In this paper, we present a computational technique for cases where we believe that accurate closed evolution equations for a finite number of moments of the density function exist in principle, but are not explicitly available. The so-called equation-free computational framework is then employed to numerically obtain the solution of these unavailable closed moment equations by exploiting (through intelligent design of computational experiments) the corresponding fine-scale (here, Monte-Carlo) simulation. We illustrate the use of this method by accelerating the computation of evolving moments of uni- and bivariate particle coagulation and sintering through short simulation bursts of a constant-number Monte-Carlo scheme.
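The fine-scale simulator referred to above is a constant-number Monte Carlo scheme; its basic step can be sketched as follows. This is only the inner simulator with a constant coagulation kernel and made-up population size, not the equation-free wrapper of the paper: when two particles coagulate, the consumed particle is overwritten with a copy of a randomly chosen survivor so that the particle count stays fixed.

```python
import numpy as np

# Constant-number Monte Carlo coagulation with a constant kernel (illustrative).
rng = np.random.default_rng(0)
N = 2000
volumes = np.ones(N)                 # monodisperse initial population, unit volume

def constant_number_step(volumes):
    i, j = rng.choice(len(volumes), size=2, replace=False)
    volumes[i] = volumes[i] + volumes[j]            # coagulation event
    # Keep N constant: replace the consumed particle with a copy of a
    # randomly selected survivor (including the freshly merged one).
    survivors = np.delete(np.arange(len(volumes)), j)
    volumes[j] = volumes[rng.choice(survivors)]
    return volumes

for _ in range(5 * N):               # a short burst of events
    volumes = constant_number_step(volumes)

moments = [np.mean(volumes ** p) for p in (1, 2, 3)]   # per-particle moments
print("per-particle moments <v>, <v^2>, <v^3>:", moments)
```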
Simulation models and designs for advanced Fischer-Tropsch technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, G.N.; Kramer, S.J.; Tam, S.S.
1995-12-31
Process designs and economics were developed for three grass-roots indirect Fischer-Tropsch coal liquefaction facilities. A baseline and an alternate upgrading design were developed for a mine-mouth plant located in southern Illinois using Illinois No. 6 coal, and one for a mine-mouth plant located in Wyoming using Powder River Basin coal. The alternate design used close-coupled ZSM-5 reactors to upgrade the vapor stream leaving the Fischer-Tropsch reactor. ASPEN process simulation models were developed for all three designs. These results have been reported previously. In this study, the ASPEN process simulation model was enhanced to improve the vapor/liquid equilibrium calculations for the products leaving the slurry bed Fischer-Tropsch reactors. This significantly improved the predictions for the alternate ZSM-5 upgrading design. Another model was developed for the Wyoming coal case using ZSM-5 upgrading of the Fischer-Tropsch reactor vapors. To date, this is the best indirect coal liquefaction case. Sensitivity studies showed that additional cost reductions are possible.
Study of the coupling between real gas effects and rarefied effects on hypersonic aerodynamics
NASA Astrophysics Data System (ADS)
Chen, Song; Hu, Yuan; Sun, Quanhua
2012-11-01
Hypersonic vehicles travel through the atmosphere at very high speed, and the surrounding gas experiences complicated physical and chemical processes. These processes produce real gas effects at high temperature and rarefied gas effects at high altitude, where the two effects are coupled through molecular collisions. In this study, we aim to identify the individual real gas and rarefied gas effects by simulating hypersonic flow over a 2D cylinder, a sphere and a blunted cone using a continuum-based CFD approach and the direct simulation Monte Carlo method. It is found that physical processes such as vibrational excitation and chemical reaction significantly reduce the shock stand-off distance and flow temperature for flows with a small Knudsen number. The calculated skin friction and surface heat flux decrease when the real gas effects are considered in simulations. This trend, however, weakens as the Knudsen number increases. It is concluded that the rarefied gas effects weaken the real gas effects on hypersonic flows.
NASA Astrophysics Data System (ADS)
Lin, Caiyan; Zhang, Zhongfeng; Pu, Zhaoxia; Wang, Fengyun
2017-10-01
A series of numerical simulations is conducted to understand the formation, evolution, and dissipation of an advection fog event over Shanghai Pudong International Airport (ZSPD) with the Weather Research and Forecasting (WRF) model. Using the current operational settings at the Meteorological Center of East China Air Traffic Management Bureau, the WRF model successfully predicts the fog event at ZSPD. Additional numerical experiments are performed to examine the physical processes associated with the fog event. The results indicate that prediction of this particular fog event is sensitive to microphysical schemes for the time of fog dissipation but not for the time of fog onset. The simulated timing of the arrival and dissipation of the fog, as well as the cloud distribution, is substantially sensitive to the planetary boundary layer and radiation (both longwave and shortwave) processes. Moreover, varying forecast lead times also produces different simulation results for the fog event regarding its onset and duration, suggesting a trade-off between more accurate initial conditions and a proper forecast lead time that allows model physical processes to spin up adequately during the fog simulation. The overall outcomes from this study imply that the complexity of physical processes and their interactions within the WRF model during fog evolution and dissipation is a key area of future research.
NASA Technical Reports Server (NTRS)
Changsheng, LI; Frolking, Steve; Frolking, Tod A.
1992-01-01
Simulations of N2O and CO2 emissions from soils were conducted with a rain-event driven, process-oriented model (DNDC) of nitrogen and carbon cycling processes in soils. The magnitude and trends of simulated N2O (or N2O + N2) and CO2 emissions were consistent with the results obtained in field experiments. The successful simulation of these emissions from the range of soil types examined demonstrates that the DNDC will be a useful tool for the study of linkages among climate, soil-atmosphere interactions, land use, and trace gas fluxes.
Low Gravity Freefall Facilities
NASA Technical Reports Server (NTRS)
1981-01-01
Composite of Marshall Space Flight Center's Low-Gravity Free Fall Facilities. These facilities include a 100-meter drop tower and a 100-meter drop tube. The drop tower simulates in-flight microgravity conditions for up to 4.2 seconds for containerless processing experiments, immiscible fluids and materials research, pre-flight hardware design test and flight experiment simulation. The drop tube simulates in-flight microgravity conditions for up to 4.6 seconds and is used extensively for ground-based microgravity convection research in which extremely small samples are studied. The facility can provide deep undercooling for containerless processing experiments that require materials to remain in a liquid phase when cooled below the normal solidification temperature.
Li, Jingrui; Kondov, Ivan; Wang, Haobin; Thoss, Michael
2015-04-10
A recently developed methodology to simulate photoinduced electron transfer processes at dye-semiconductor interfaces is outlined. The methodology employs a first-principles-based model Hamiltonian and accurate quantum dynamics simulations using the multilayer multiconfiguration time-dependent Hartree approach. This method is applied to study electron injection in the dye-semiconductor system coumarin 343-TiO2. Specifically, the influence of electronic-vibrational coupling is analyzed. Extending previous work, we consider the influence of Dushinsky rotation of the normal modes as well as anharmonicities of the potential energy surfaces on the electron transfer dynamics.
NASA Astrophysics Data System (ADS)
De Lucia, Marco; Kempka, Thomas; Afanasyev, Andrey; Melnik, Oleg; Kühn, Michael
2016-04-01
Coupled reactive transport simulations, especially in heterogeneous settings considering multiphase flow, are extremely time consuming and suffer from significant numerical issues compared to purely hydrodynamic simulations. This represents a major hurdle in the assessment of geological subsurface utilization, since it constrains the practical application of reactive transport modelling to coarse spatial discretization or oversimplified geological settings. In order to overcome such limitations, De Lucia et al. [1] developed and validated a one-way coupling approach between geochemistry and hydrodynamics, which is particularly well suited for CO2 storage simulations, while being of general validity. In the present study, the models used for the validation of the one-way coupling approach introduced by De Lucia et al. (2015), and originally performed with the TOUGHREACT simulator, are transferred to and benchmarked against the multiphase reservoir simulator MUFITS [2]. The geological model is loosely inspired by an existing CO2 storage site. Its grid comprises 2,950 elements enclosed in a single layer, but reflecting a realistic three-dimensional anticline geometry. For the purpose of this comparison, homogeneous and heterogeneous scenarios in terms of porosity and permeability were investigated. In both cases, the results of the MUFITS simulator are in excellent agreement with those produced with the fully-coupled TOUGHREACT simulator, while profiting from significantly higher computational performance. This study demonstrates how a computationally efficient simulator such as MUFITS can be successfully included in a coupled process simulation framework, and also suggests ameliorations and specific strategies for the coupling of chemical processes with hydrodynamics and heat transport, aiming at tackling geoscientific problems beyond the storage of CO2. References [1] De Lucia, M., Kempka, T., and Kühn, M. A coupling alternative to reactive transport simulations for long-term prediction of chemical reactions in heterogeneous CO2 storage systems, Geosci. Model Dev., 8, 279-294, 2015, doi:10.5194/gmd-8-279-2015 [2] Afanasyev, A.A. Application of the reservoir simulator MUFITS for 3D modeling of CO2 storage in geological formations, Energy Procedia, 40, 365-374, 2013, doi:10.1016/j.egypro.2013.08.042
NASA Astrophysics Data System (ADS)
Gayler, Sebastian; Wöhling, Thomas; Högy, Petra; Ingwersen, Joachim; Wizemann, Hans-Dieter; Wulfmeyer, Volker; Streck, Thilo
2013-04-01
During the last years, land-surface models have proven to perform well in several studies that compared simulated fluxes of water and energy from the land surface to the atmosphere against measured fluxes at the plot-scale. In contrast, considerable deficits of land-surface models have been identified to simulate soil water fluxes and vertical soil moisture distribution. For example, Gayler et al. (2013) showed that simplifications in the representation of root water uptake can result in insufficient simulations of the vertical distribution of soil moisture and its dynamics. However, in coupled simulations of the terrestrial water cycle, both sub-systems, the atmosphere and the subsurface hydrogeo-system, must fit together and models are needed, which are able to adequately simulate soil moisture, latent heat flux, and their interrelationship. Consequently, land-surface models must be further improved, e.g. by incorporation of advanced biogeophysics models. To improve the conceptual realism in biophysical and hydrological processes in the community land surface model Noah, this model was recently enhanced to Noah-MP by a multi-options framework to parameterize individual processes (Niu et al., 2011). Thus, in Noah-MP the user can choose from several alternative models for vegetation and hydrology processes that can be applied in different combinations. In this study, we evaluate the performance of different Noah-MP model settings to simulate water and energy fluxes across the land surface at two contrasting field sites in South-West Germany. The evaluation is done in 1D offline-mode, i.e. without coupling to an atmospheric model. The atmospheric forcing is provided by measured time series of the relevant variables. Simulation results are compared with eddy covariance measurements of turbulent fluxes and measured time series of soil moisture at different depths. The aims of the study are i) to carve out the most appropriate combination of process parameterizations in Noah-MP to simultaneously match the different components of the water and energy cycle at the field sites under consideration, and ii) to estimate the uncertainty in model structure. We further investigate the potential to improve simulation results by incorporating concepts of more advanced root water uptake models from agricultural field scale models into the land-surface-scheme. Gayler S, Ingwersen J, Priesack E, Wöhling T, Wulfmeyer V, Streck T (2013): Assessing the relevance of sub surface processes for the simulation of evapotranspiration and soil moisture dynamics with CLM3.5: Comparison with field data and crop model simulations. Environ. Earth Sci., 69(2), under revision. Niu G-Y, Yang Z-L, Mitchell KE, Chen F, Ek MB, Barlage M, Kumar A, Manning K, Niyogi D, Rosero E, Tewari M and Xia Y (2011): The community Noah land surface model with multiparameterization options (Noah-MP): 1. Model description and evaluation with local-scale measurements. Journal of Geophysical Research 116(D12109).
State-dependent sensorimotor processing: gaze and posture stability during simulated flight in birds
McArthur, Kimberly L.
2011-01-01
Vestibular responses play an important role in maintaining gaze and posture stability during rotational motion. Previous studies suggest that these responses are state dependent, their expression varying with the environmental and locomotor conditions of the animal. In this study, we simulated an ethologically relevant state in the laboratory to study state-dependent vestibular responses in birds. We used frontal airflow to simulate gliding flight and measured pigeons' eye, head, and tail responses to rotational motion in darkness, under both head-fixed and head-free conditions. We show that both eye and head response gains are significantly higher during flight, thus enhancing gaze and head-in-space stability. We also characterize state-specific tail responses to pitch and roll rotation that would help to maintain body-in-space orientation during flight. These results demonstrate that vestibular sensorimotor processing is not fixed but depends instead on the animal's behavioral state. PMID:21307332
Jiang, Xianan; Waliser, Duane E.; Xavier, Prince K.; ...
2015-05-27
Aimed at reducing deficiencies in representing the Madden-Julian oscillation (MJO) in general circulation models (GCMs), a global model evaluation project on vertical structure and physical processes of the MJO was coordinated. In this paper, results from the climate simulation component of this project are reported. Here, it is shown that the MJO remains a great challenge in these latest generation GCMs. The systematic eastward propagation of the MJO is only well simulated in about one fourth of the total participating models. The observed vertical westward tilt with altitude of the MJO is well simulated in good MJO models but not in the poor ones. Damped Kelvin wave responses to the east of convection in the lower troposphere could be responsible for the missing MJO preconditioning process in these poor MJO models. Several process-oriented diagnostics were conducted to discriminate key processes for realistic MJO simulations. While large-scale rainfall partition and low-level mean zonal winds over the Indo-Pacific in a model are not found to be closely associated with its MJO skill, two metrics, including the low-level relative humidity difference between high- and low-rain events and seasonal mean gross moist stability, exhibit statistically significant correlations with the MJO performance. It is further indicated that increased cloud-radiative feedback tends to be associated with reduced amplitude of intraseasonal variability, which is incompatible with the radiative instability theory previously proposed for the MJO. Finally, results in this study confirm that inclusion of air-sea interaction can lead to significant improvement in simulating the MJO.
Numerical simulation of the processes in the normal incidence tube for high acoustic pressure levels
NASA Astrophysics Data System (ADS)
Fedotov, E. S.; Khramtsov, I. V.; Kustov, O. Yu.
2016-10-01
Numerical simulation of the acoustic processes in an impedance tube at high acoustic pressure levels is one way to address the problem of noise suppression by liners. These studies used a liner specimen consisting of a single cylindrical Helmholtz resonator. The real and imaginary parts of the liner acoustic impedance and the sound absorption coefficient were evaluated for sound pressure levels of 130, 140 and 150 dB. The numerical simulation used experimental data obtained in an impedance tube with normal-incidence waves. At the first stage of the numerical simulation, the linearized Navier-Stokes equations were used; they describe the imaginary part of the liner impedance well regardless of the sound pressure level. These equations were solved by the finite element method in the COMSOL Multiphysics program in an axisymmetric formulation. At the second stage, the complete Navier-Stokes equations were solved by direct numerical simulation in ANSYS CFX in an axisymmetric formulation. As a result, acceptable agreement between numerical simulation and experiment was obtained.
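The post-processing step from wave data to the quantities evaluated above is standard and can be shown compactly. Assuming the complex reflection coefficient R(f) has already been extracted (for example, by the two-microphone transfer-function method used with normal-incidence impedance tubes), the normalized impedance and absorption coefficient follow as Z/(ρc) = (1+R)/(1−R) and α = 1 − |R|². The R values below are made up purely for illustration.

```python
import numpy as np

# From a complex reflection coefficient R(f) to normalized specific impedance
# and sound absorption coefficient (hypothetical R values for illustration).
freqs = np.array([500.0, 1000.0, 1500.0])             # Hz
R = np.array([0.6 + 0.3j, 0.4 + 0.2j, 0.2 - 0.1j])

z_norm = (1.0 + R) / (1.0 - R)        # Z / (rho * c)
alpha = 1.0 - np.abs(R) ** 2          # absorption coefficient

for f, z, a in zip(freqs, z_norm, alpha):
    print(f"{f:7.1f} Hz   Re(Z/rho c) = {z.real:6.3f}   "
          f"Im(Z/rho c) = {z.imag:6.3f}   alpha = {a:5.3f}")
```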
NASA Astrophysics Data System (ADS)
Qi, Chenkun; Gao, Feng; Zhao, Xianchao; Wang, Qian; Ren, Anye
2018-06-01
On the ground, hardware-in-the-loop (HIL) simulation is a good approach for testing the contact dynamics of the spacecraft docking process in space. Unfortunately, due to the time delay in the system, the HIL contact simulation becomes divergent. The traditional first-order phase lead compensation approach still results in a small divergence for the pure time delay, while the serial Smith predictor and phase lead compensation approach recently proposed by the authors leads to over-compensation and an obvious convergence. In this study, a hybrid Smith predictor and phase lead compensation approach is proposed, which can achieve higher simulation fidelity with little convergence. The phase angle of the compensator is analyzed and the stability condition of the HIL simulation system is given. The effectiveness of the proposed compensation approach is tested by simulations of an undamped elastic contact process.
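The abstract does not spell out the hybrid compensator, but its building block, a first-order phase-lead filter C(s) = (1 + aTs)/(1 + Ts) with a > 1, is standard and can be sketched in discrete time using the bilinear (Tustin) transform. The parameters, sample rate, delay, and test signal below are arbitrary assumptions chosen for illustration; this is not the authors' compensation scheme.

```python
import math

# Discrete first-order phase-lead filter C(s) = (1 + a*T*s)/(1 + T*s), a > 1,
# discretized with the bilinear (Tustin) transform (illustrative parameters).
class PhaseLead:
    def __init__(self, a, T, h):
        c = 2.0 / h                    # Tustin substitution s -> c*(1-z^-1)/(1+z^-1)
        self.b0, self.b1 = 1.0 + a * T * c, 1.0 - a * T * c
        self.a0, self.a1 = 1.0 + T * c, 1.0 - T * c
        self.x1 = self.y1 = 0.0

    def step(self, x):
        y = (self.b0 * x + self.b1 * self.x1 - self.a1 * self.y1) / self.a0
        self.x1, self.y1 = x, y
        return y

# Example: lead-filter a delayed sinusoidal force signal sampled at 1 kHz.
h, delay_steps = 1.0e-3, 4             # sample time (s) and pure delay (samples)
comp = PhaseLead(a=5.0, T=2.0e-3, h=h)
buffer = [0.0] * delay_steps
for n in range(20):
    f_true = math.sin(2 * math.pi * 10.0 * n * h)
    buffer.append(f_true)              # the measured force arrives delayed
    f_meas = buffer.pop(0)
    print(f"{n:3d}  true {f_true:+.3f}  delayed {f_meas:+.3f}  "
          f"compensated {comp.step(f_meas):+.3f}")
```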
NASA Astrophysics Data System (ADS)
Caviedes-Voullième, Daniel; García-Navarro, Pilar; Murillo, Javier
2012-07-01
Hydrological simulation of rain-runoff processes is often performed with lumped models which rely on calibration to generate storm hydrographs and study catchment response to rain. In this paper, a distributed, physically-based numerical model is used for runoff simulation in a mountain catchment. This approach offers two advantages. The first is that by using shallow-water equations for runoff flow, there is less freedom to calibrate routing parameters (as compared to, for example, synthetic hydrograph methods). The second is that spatial distributions of water depth and velocity can be obtained. Furthermore, interactions among the various hydrological processes can be modeled in a physically-based approach which may depend on transient and spatially distributed factors. On the other hand, the undertaken numerical approach relies on accurate terrain representation and mesh selection, which also significantly affects the computational cost of the simulations. Hence, we investigate the response of a gauged catchment with this distributed approach. The methodology consists of analyzing the effects that the mesh has on the simulations by using a range of meshes. Next, friction is applied to the model and the response to variations and interaction with the mesh is studied. Finally, a first approach with the well-known SCS Curve Number method is studied to evaluate its behavior when coupled with a shallow-water model for runoff flow. The results show that mesh selection is of great importance, since it may affect the results to a degree as large as that of physical factors, such as friction. Furthermore, results proved to be less sensitive to roughness spatial distribution than to mesh properties. Finally, the results indicate that SCS-CN may not be suitable for simulating hydrological processes together with a shallow-water model.
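For reference, the SCS Curve Number step evaluated above converts a rainfall depth into direct runoff with the standard metric-unit formulas S = 25400/CN − 254 (mm), Ia = 0.2 S, and Q = (P − Ia)²/(P − Ia + S) for P > Ia. The small helper below is only a reminder of those textbook formulas with made-up example values; it says nothing about how the paper couples the method to the shallow-water model.

```python
def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
    """Direct runoff depth (mm) from rainfall depth p_mm via the SCS-CN method.

    Standard metric form: S = 25400/CN - 254, Ia = ia_ratio * S,
    Q = (P - Ia)^2 / (P - Ia + S) when P > Ia, otherwise 0.
    """
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = ia_ratio * s                 # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Example with made-up values: a 60 mm storm on a catchment with CN = 75.
print(f"runoff = {scs_cn_runoff(60.0, 75):.1f} mm")
```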
NASA Astrophysics Data System (ADS)
Aburas, Maher Milad; Ho, Yuek Ming; Ramli, Mohammad Firuz; Ash'aari, Zulfa Hanan
2017-07-01
The creation of an accurate simulation of future urban growth is considered one of the most important challenges in urban studies that involve spatial modeling. The purpose of this study is to improve the simulation capability of an integrated CA-Markov Chain (CA-MC) model using CA-MC based on the Analytical Hierarchy Process (AHP) and CA-MC based on Frequency Ratio (FR), both applied in Seremban, Malaysia, as well as to compare the performance and accuracy between the traditional and hybrid models. Various physical, socio-economic, utilities, and environmental criteria were used as predictors, including elevation, slope, soil texture, population density, distance to commercial area, distance to educational area, distance to residential area, distance to industrial area, distance to roads, distance to highway, distance to railway, distance to power line, distance to stream, and land cover. For calibration, three models were applied to simulate urban growth trends in 2010; the actual data of 2010 were used for model validation utilizing the Relative Operating Characteristic (ROC) and Kappa coefficient methods. Consequently, future urban growth maps of 2020 and 2030 were created. The validation findings confirm that the integration of the CA-MC model with the FR model and employing the significant driving forces of urban growth in the simulation process have resulted in the improved simulation capability of the CA-MC model. This study has provided a novel approach for improving the CA-MC model based on FR, which will provide powerful support to planners and decision-makers in the development of future sustainable urban planning.
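The Markov component of such a model projects land-cover proportions forward with a transition probability matrix; the spatial allocation by the CA rules and the AHP/FR suitability weighting sit on top of that. The sketch below shows only the Markov step with made-up classes and probabilities, not the Seremban transition matrix.

```python
import numpy as np

# Markov-chain projection of land-cover proportions (classes and transition
# probabilities are made up; CA allocation and AHP/FR weighting omitted).
classes = ["urban", "agriculture", "forest"]
p0 = np.array([0.25, 0.45, 0.30])            # proportions at the base year

# Row-stochastic transition matrix for one decadal step (rows sum to 1).
T = np.array([
    [0.95, 0.04, 0.01],
    [0.10, 0.85, 0.05],
    [0.06, 0.04, 0.90],
])

p = p0
for year in (2020, 2030):
    p = p @ T                                # one Markov step per decade
    print(year, {c: round(x, 3) for c, x in zip(classes, p)})
```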
A meta-analysis of outcomes from the use of computer-simulated experiments in science education
NASA Astrophysics Data System (ADS)
Lejeune, John Van
The purpose of this study was to synthesize the findings from existing research on the effects of computer simulated experiments on students in science education. Results from 40 reports were integrated by the process of meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulations on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement of students who use computer-simulated experiments and interactive videodisc simulations as compared to students who used more traditional learning activities. No significant differences in retention, student attitudes toward the subject, or toward the educational method were found. Based on the findings of this study, computer-simulated experiments and interactive videodisc simulations should be used to enhance students' learning in science, especially in cases where the use of traditional laboratory activities are expensive, dangerous, or impractical.
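The quantitative machinery behind such a synthesis is the standardized mean difference combined across studies with inverse-variance weights. The sketch below shows that calculation (Hedges' g per study, then a weighted mean with a 95% interval) on made-up study summaries; it does not reproduce the 40 reports or the effect sizes of this meta-analysis.

```python
import math

# Fixed-effect, inverse-variance synthesis of standardized mean differences
# (Hedges' g). The study summaries are invented for illustration.
studies = [  # (mean_sim, sd_sim, n_sim, mean_trad, sd_trad, n_trad)
    (78.0, 10.0, 30, 72.0, 11.0, 28),
    (81.0, 9.0, 25, 79.0, 9.5, 26),
    (70.0, 12.0, 40, 65.0, 13.0, 41),
]

weights, gs = [], []
for m1, s1, n1, m2, s2, n2 in studies:
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                               # Cohen's d
    j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)      # small-sample correction
    g = j * d                                        # Hedges' g
    var_g = (n1 + n2) / (n1 * n2) + g**2 / (2.0 * (n1 + n2))
    gs.append(g)
    weights.append(1.0 / var_g)

mean_g = sum(w * g for w, g in zip(weights, gs)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))
print(f"weighted mean effect g = {mean_g:.2f} +/- {1.96 * se:.2f} (95% CI half-width)")
```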
Nonlinear vs. linear biasing in Trp-cage folding simulations
NASA Astrophysics Data System (ADS)
Spiwok, Vojtěch; Oborský, Pavel; Pazúriková, Jana; Křenek, Aleš; Králová, Blanka
2015-03-01
Biased simulations have great potential for the study of slow processes, including protein folding. Atomic motions in molecules are nonlinear, which suggests that simulations with enhanced sampling of collective motions traced by nonlinear dimensionality reduction methods may perform better than linear ones. In this study, we compare an unbiased folding simulation of the Trp-cage miniprotein with metadynamics simulations using both linear (principal component analysis) and nonlinear (Isomap) low dimensional embeddings as collective variables. Folding of the mini-protein was successfully simulated in 200 ns simulations with both linear and nonlinear motion biasing. The folded state was correctly predicted as the free energy minimum in both simulations. We found that the advantage of linear motion biasing is that it can sample a larger conformational space, whereas the advantage of nonlinear motion biasing lies in slightly better resolution of the resulting free energy surface. In terms of sampling efficiency, both methods are comparable.
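The contrast drawn here is between a linear embedding (PCA) and a nonlinear one (Isomap) used as collective variables. The sketch below only shows how the two 2-D embeddings are obtained from a matrix of conformational coordinates with scikit-learn; random data stand in for aligned trajectory frames, the neighbor count is an arbitrary choice, and the metadynamics biasing along these variables is not shown.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap

# Linear (PCA) vs nonlinear (Isomap) 2-D embeddings of conformational
# coordinates; random data stand in for trajectory frames.
rng = np.random.default_rng(0)
n_frames, n_coords = 500, 60          # e.g. 20 atoms x 3 Cartesian coordinates
frames = rng.standard_normal((n_frames, n_coords))

cv_linear = PCA(n_components=2).fit_transform(frames)
cv_nonlinear = Isomap(n_neighbors=12, n_components=2).fit_transform(frames)

print("PCA CV range:   ", cv_linear.min(axis=0), cv_linear.max(axis=0))
print("Isomap CV range:", cv_nonlinear.min(axis=0), cv_nonlinear.max(axis=0))
```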
Cirrus Simulations of CRYSTAL-FACE 23 July 2002 Case
NASA Technical Reports Server (NTRS)
Starr, David; Lin, Ruci-Fong; Demoz, Belay; Lare, Andrew
2004-01-01
A key objective of the Cirrus Regional Study of Tropical Anvils and Cirrus Layers - Florida Area Cirrus Experiment (CRYSTAL-FACE) is to understand relationships between the properties of tropical convective cloud systems and the properties and lifecycle of the extended cirrus anvils they produce. We report here on a case study of 23 July 2002 where a sequence of convective storms over central Florida produced an extensive anvil outflow. Our approach is to use a suitably-initialized cloud-system simulation with MM5 to define initial conditions and time-dependent forcing for a simulation of anvil evolution using a two-dimensional fine-resolution (100 m) cirrus cloud model that explicitly accounts for details of cirrus microphysical development (bin or spectra model) and fully interactive radiative processes. The cirrus model follows Lin. Meteorological conditions and observations for the 23 July case are described in this volume. The goals of the present study are to evaluate how well we can simulate a cirrus anvil lifecycle, to evaluate the importance of various physical processes that operate within the anvil, and to evaluate the importance of environmental conditions in regulating anvil lifecycle. CRYSTAL-FACE produced a number of excellent case studies of anvil systems that will allow environmental factors, such as static stability or wind shear in the upper troposphere, to be examined. In the present study, we strive to assess the importance of propagating gravity waves, likely produced by the deep convection itself, and radiative processes, to anvil lifecycle and characteristics.
Catalytic reforming is an important refinery process for the conversion of low-octane naphtha (mostly paraffins) into high-octane motor fuels (isoparaffins, naphthenes and aromatics), light gases and hydrogen. In this study the catalytic reforming process is analyzed under differ...
Kovalchuk, Sergey V; Funkner, Anastasia A; Metsker, Oleg G; Yakovlev, Aleksey N
2018-06-01
An approach to building a hybrid simulation of patient flow is introduced with a combination of data-driven methods for automation of model identification. The approach is described with a conceptual framework and basic methods for combination of different techniques. The implementation of the proposed approach for simulation of the acute coronary syndrome (ACS) was developed and used in an experimental study. A combination of data, text, and process mining techniques and machine learning approaches for the analysis of electronic health records (EHRs) with discrete-event simulation (DES) and queueing theory for the simulation of patient flow was proposed. The performed analysis of EHRs for ACS patients enabled identification of several classes of clinical pathways (CPs) which were used to implement a more realistic simulation of the patient flow. The developed solution was implemented using Python libraries (SimPy, SciPy, and others). The proposed approach enables a more realistic and detailed simulation of the patient flow within a group of related departments. An experimental study shows an improved simulation of patient length of stay for ACS patient flow obtained from EHRs in Almazov National Medical Research Centre in Saint Petersburg, Russia. The proposed approach, methods, and solutions provide a conceptual, methodological, and programming framework for the implementation of a simulation of complex and diverse scenarios within a flow of patients for different purposes: decision making, training, management optimization, and others. Copyright © 2018 Elsevier Inc. All rights reserved.
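The abstract names SimPy as one of the libraries used; the discrete-event side of such a hybrid model can be sketched as follows. This is not the authors' code: the arrival rate, length-of-stay distribution, bed count, and single-department structure are placeholder assumptions for illustration only.

```python
# Minimal discrete-event sketch of a patient flow with SimPy.
# Arrival rate, bed capacity and length of stay are illustrative only.
import random
import simpy

ARRIVAL_MEAN_H = 6.0     # mean inter-arrival time (hours), assumed
LOS_MEAN_H = 72.0        # mean length of stay (hours), assumed
N_BEDS = 10

def patient(env, name, beds, los_log):
    arrival = env.now
    with beds.request() as req:          # wait for a free bed
        yield req
        los = random.expovariate(1.0 / LOS_MEAN_H)
        yield env.timeout(los)           # occupy the bed for the stay
    los_log.append(env.now - arrival)    # waiting time + stay

def generator(env, beds, los_log):
    i = 0
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN_H))
        i += 1
        env.process(patient(env, f"p{i}", beds, los_log))

random.seed(1)
env = simpy.Environment()
beds = simpy.Resource(env, capacity=N_BEDS)
los_log = []
env.process(generator(env, beds, los_log))
env.run(until=24 * 60)                   # simulate 60 days (time unit = hours)
print(f"simulated {len(los_log)} discharges, "
      f"mean LOS {sum(los_log) / len(los_log):.1f} h")
```

In the hybrid setting described by the authors, the placeholder exponential distributions would be replaced by pathway-specific distributions mined from the EHR data.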
Design of penicillin fermentation process simulation system
NASA Astrophysics Data System (ADS)
Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi
2011-10-01
Real-time monitoring of batch processes attracts increasing attention. It can ensure safety and provide products with consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch process fault diagnosis.
NASA Astrophysics Data System (ADS)
Nellist, C.; Dinu, N.; Gkougkousis, E.; Lounis, A.
2015-06-01
The LHC accelerator complex will be upgraded between 2020 and 2022 to the High-Luminosity LHC, to considerably increase the statistics available for the various physics analyses. To operate under these challenging new conditions, and maintain excellent performance in track reconstruction and vertex location, the ATLAS pixel detector must be substantially upgraded and a full replacement is expected. Processing techniques for novel pixel designs are optimised through characterisation of test structures in a clean room and also through simulations with Technology Computer Aided Design (TCAD). A method to study non-perpendicular tracks through a pixel device is discussed. Comparison of TCAD simulations with Secondary Ion Mass Spectrometry (SIMS) measurements to investigate the doping profile of structures and validate the simulation process is also presented.
Lean flammability limit of downward propagating hydrogen-air flames
NASA Technical Reports Server (NTRS)
Patnaik, G.; Kailasanath, K.
1992-01-01
Detailed multidimensional numerical simulations that include the effects of wall heat losses have been performed to study the dynamics of downward flame propagation and extinguishment in lean hydrogen-air mixtures. The computational results show that a downward propagating flame in an isothermal channel has a flammability limit of around 9.75 percent. This is in excellent agreement with experimental results. Also in excellent agreement are the detailed observations of the flame behavior at the point of extinguishment. The primary conclusion of this work is that detailed numerical simulations that include wall heat losses and the effect of gravity can adequately simulate the dynamics of the extinguishment process in downward-propagating hydrogen-air flames. These simulations can be examined in detail to gain understanding of the actual extinction process.
Yamin, Stephanie; Stinchcombe, Arne; Gagnon, Sylvain
2016-06-01
This study sought to predict driving performance of drivers with Alzheimer's disease (AD) using measures of attention, visual processing, and global cognition. Simulated driving performance of individuals with mild AD (n = 20) was contrasted with performance of a group of healthy controls (n = 21). Performance on measures of global cognitive function and specific tests of attention and visual processing were examined in relation to simulated driving performance. Strong associations were observed between measures of attention, notably the Test of Everyday Attention (sustained attention; r = -.651, P = .002) and the Useful Field of View (r = .563, P = .010), and driving performance among drivers with mild AD. The Visual Object and Space Perception Test-object was significantly correlated with the occurrence of crashes (r = .652, P = .002). Tests of global cognition did not correlate with simulated driving outcomes. The results suggest that professionals exercise caution when extrapolating driving performance based on global cognitive indicators. © The Author(s) 2015.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gong, Hong-Yu; Gu, Wei-Min, E-mail: guwm@xmu.edu.cn
2017-04-20
In the classic picture of standard thin accretion disks, viscous heating is balanced by radiative cooling through the diffusion process, and the radiation-pressure-dominated inner disk suffers convective instability. However, recent simulations have shown that, owing to the magnetic buoyancy, the vertical advection process can significantly contribute to energy transport. In addition, in comparing the simulation results with the local convective stability criterion, no convective instability has been found. In this work, following on from simulations, we revisit the vertical structure of radiation-pressure-dominated thin disks and include the vertical advection process. Our study indicates a link between the additional energy transport and the convectively stable property. Thus, the vertical advection not only significantly contributes to the energy transport, but it also plays an important role in making the disk convectively stable. Our analyses may help to explain the discrepancy between classic theory and simulations on standard thin disks.
Acoustic response of cemented granular sedimentary rocks: molecular dynamics modeling.
García, Xavier; Medina, Ernesto
2007-06-01
The effect of cementation processes on the acoustical properties of sands is studied via molecular dynamics simulation methods. We propose numerical methods where the initial uncemented sand is built by simulating the settling process of sediments. Uncemented samples of different porosity are considered by emulating natural mechanical compaction of sediments due to overburden. Cementation is considered through a particle-based model that captures the underlying physics behind the process. In our simulations, we consider samples with different degrees of compaction and cementing materials with distinct elastic properties. The microstructure of cemented sands is taken into account while adding cement at specific locations within the pores, such as grain-to-grain contacts. Results show that the acoustical properties of cemented sands are strongly dependent on the amount of cement, its stiffness relative to the hosting medium, and its location within the pores. Simulation results are in good correspondence with available experimental data and compare favorably with some theoretical predictions for the sound velocity within a range of cement saturation, porosity, and confining pressure.
Use of simulated data sets to evaluate the fidelity of metagenomic processing methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mavromatis, K; Ivanova, N; Barry, Kerrie
2007-01-01
Metagenomics is a rapidly emerging field of research for studying microbial communities. To evaluate methods presently used to process metagenomic sequences, we constructed three simulated data sets of varying complexity by combining sequencing reads randomly selected from 113 isolate genomes. These data sets were designed to model real metagenomes in terms of complexity and phylogenetic composition. We assembled sampled reads using three commonly used genome assemblers (Phrap, Arachne and JAZZ), and predicted genes using two popular gene-finding pipelines (fgenesb and CRITICA/GLIMMER). The phylogenetic origins of the assembled contigs were predicted using one sequence similarity-based (BLAST hit distribution) and two sequence composition-based (PhyloPythia, oligonucleotide frequencies) binning methods. We explored the effects of the simulated community structure and method combinations on the fidelity of each processing step by comparison to the corresponding isolate genomes. The simulated data sets are available online to facilitate standardized benchmarking of tools for metagenomic analysis.
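The construction of such simulated data sets can be pictured with a short sketch (this is not the authors' pipeline): reads of fixed length are drawn at random positions from a set of isolate genome sequences, with per-genome abundances controlling community structure. The genome strings, abundances, read length, and label format below are invented placeholders; keeping the true genome of origin in each read header mirrors the way the known answer is later used to score assembly, gene calling, and binning.

```python
# Sketch: build a simulated metagenome by sampling fixed-length reads
# from isolate genome sequences at chosen relative abundances.
# Genome sequences and abundances are placeholders.
import random

genomes = {                      # hypothetical isolate genomes
    "org_A": "ACGT" * 50_000,
    "org_B": "TTGCA" * 40_000,
    "org_C": "GGATC" * 30_000,
}
abundance = {"org_A": 0.6, "org_B": 0.3, "org_C": 0.1}   # assumed community structure
READ_LEN, N_READS = 800, 10_000                          # Sanger-like read length

random.seed(42)
names = list(genomes)
weights = [abundance[n] for n in names]
reads = []
for i in range(N_READS):
    org = random.choices(names, weights=weights, k=1)[0]
    seq = genomes[org]
    start = random.randrange(0, len(seq) - READ_LEN)
    reads.append((f"read_{i}|{org}", seq[start:start + READ_LEN]))

print(reads[0][0], len(reads[0][1]))   # label records the true origin for benchmarking
```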
Simulation of Ge Dopant Emission in Indirect-Drive ICF Implosion Experiments
NASA Astrophysics Data System (ADS)
Macfarlane, Joseph; Golovkin, I.; Regan, S.; Epstein, R.; Mancini, R.; Peterson, K.; Suter, L.
2012-10-01
We present results from simulations performed to study the radiative properties of dopants used in inertial confinement fusion indirect-drive capsule implosion experiments on NIF. In Rev5 NIF ignition capsules, a Ge dopant is added to an inner region of the CH ablator to absorb hohlraum x-ray preheat. Spectrally resolved emission from ablator dopants can be used to study the degree of mixing of ablator material into the ignition hot spot. Here, we study the atomic processes that affect the radiative characteristics of these elements using a set of simulation tools to first estimate the evolution of plasma conditions in the compressed target, and then to compute the atomic kinetics of the dopant and the resultant radiative emission. Using estimates of temperature and density profiles predicted by radiation-hydrodynamics simulations, we set up simple plasma grids where we allow dopant material to be embedded in the fuel, and perform multi-dimensional collisional-radiative simulations using SPECT3D to compute non-LTE atomic level populations and spectral signatures from the dopant. Recently improved Stark-broadened line shape modeling for Ge K-shell lines has been included. The goal is to study the radiative and atomic processes that affect the emergent spectra, including the effects of inner-shell photoabsorption and Kα reemission from the dopant, and to study the sensitivity of the emergent spectra to the dopant and the hot spot and ablator conditions.
Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems
NASA Technical Reports Server (NTRS)
Allada, Rama Kumar; Lange, Kevin; Anderson, Molly
2011-01-01
Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which become extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic simulations of chemical processes for primary processor technologies, including the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, the Wiped-Film Rotating Disk (WFRD), and post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA), that were developed using the Aspen Custom Modeler and Aspen Plus process simulation tools. The results expand upon previous work for water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.
Park, Chanhun; Nam, Hee-Geun; Kim, Pung-Ho; Mun, Sungyong
2014-06-01
The removal of isoleucine from valine has been a key issue in the stage of valine crystallization, which is the final step in the valine production process in industry. To address this issue, a three-zone simulated moving-bed (SMB) process for the separation of valine and isoleucine has been developed previously. However, the previous process, which was based on a classical port-location mode, had some limitations in throughput and valine product concentration. In this study, a three-zone SMB process based on a modified port-location mode was applied to the separation of valine and isoleucine for the purpose of making a marked improvement in throughput and valine product concentration. Computer simulations and a lab-scale process experiment showed that the modified three-zone SMB for valine separation led to >65% higher throughput and >160% higher valine concentration compared to the previous three-zone SMB for the same separation. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A Parameter Tuning Scheme of Sea-ice Model Based on Automatic Differentiation Technique
NASA Astrophysics Data System (ADS)
Kim, J. G.; Hovland, P. D.
2001-05-01
An automatic differentiation (AD) technique was used to illustrate a new approach to parameter tuning of an uncoupled sea-ice model. The atmospheric forcing fields for 1992, obtained from NCEP data, were used as forcing variables in the study. The simulation results were compared with the observed ice movement provided by the International Arctic Buoy Programme (IABP). All of the numerical experiments were based on a widely used dynamic and thermodynamic model for simulating the seasonal sea-ice change of the main Arctic ocean. We selected five dynamic and thermodynamic parameters for the tuning process, in which the cost function defined by the norm of the difference between observed and simulated ice drift locations was minimized. The selected parameters are the air and ocean drag coefficients, the ice strength constant, the turning angle at the ice-air/ocean interface, and the bulk sensible heat transfer coefficient. The drag coefficients were the major parameters controlling sea-ice movement and extent. The result of the study shows that more realistic simulations of ice thickness distribution were produced by tuning the simulated ice drift trajectories. In the tuning process, the L-BFGS-B quasi-Newton minimization algorithm was used. The derivative information required in the minimization iterations was provided by the AD-processed Fortran code. Compared with a conventional approach, the AD-generated derivative code provided fast and robust computations of derivative information.
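The tuning loop can be sketched as gradient-based minimisation of a drift-misfit cost. The toy example below is not the study's setup: it uses JAX automatic differentiation as a stand-in for the AD-processed Fortran code, a trivially simple linear "model" in place of the sea-ice dynamics, and invented observation values, but it shows the same L-BFGS-B-with-AD-gradients pattern.

```python
# Sketch of AD-assisted parameter tuning: minimise a drift-misfit cost
# with L-BFGS-B, using JAX to supply exact gradients. The linear "model"
# and the data below are only stand-ins for the sea-ice code and buoy data.
import numpy as np
import jax
import jax.numpy as jnp
from scipy.optimize import minimize

obs = jnp.array([[1.0, 2.0], [2.1, 3.9], [3.0, 6.1]])      # "observed" positions (toy data)
forcing = jnp.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])  # toy forcing

def simulate(params, forcing):
    # placeholder for the dynamic/thermodynamic sea-ice model:
    # drag-like parameters simply scale and shift the forcing here
    return params[0] * forcing + params[1]

def cost(params):
    return jnp.sum((simulate(params, forcing) - obs) ** 2)

grad = jax.grad(cost)                       # AD-generated gradient of the cost
x0 = np.array([0.5, 0.0])
res = minimize(lambda p: float(cost(jnp.asarray(p))),
               x0,
               jac=lambda p: np.asarray(grad(jnp.asarray(p))),
               method="L-BFGS-B")
print(res.x, res.fun)
```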
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Timothy C.; Hammond, Glenn E.; Chen, Xingyuan
2016-09-22
Time-lapse electrical resistivity tomography (ERT) is finding increased application for remotely monitoring processes occurring in the near subsurface in three-dimensions (i.e. 4D monitoring). However, there are few codes capable of simulating the evolution of subsurface resistivity and corresponding tomographic measurements arising from a particular process, particularly in parallel and with an open source license. Herein we describe and demonstrate an electrical resistivity tomography module for the PFLOTRAN subsurface flow and reactive transport simulation code, named PFLOTRAN-E4D. The PFLOTRAN-E4D module operates in parallel using a dedicated set of compute cores in a master-slave configuration. At each time step, the master process receives subsurface states from PFLOTRAN, converts those states to bulk electrical conductivity, and instructs the slave processes to simulate a tomographic data set. The resulting multi-physics simulation capability enables accurate feasibility studies for ERT imaging, the identification of the ERT signatures that are unique to a given process, and facilitates the joint inversion of ERT data with hydrogeological data for subsurface characterization. PFLOTRAN-E4D is demonstrated herein using a field study of stage-driven groundwater/river water interaction ERT monitoring along the Columbia River, Washington, USA. Results demonstrate the complex nature of subsurface electrical conductivity changes, in both the saturated and unsaturated zones, arising from river stage fluctuations and associated river water intrusion into the aquifer. Furthermore, the results also demonstrate the sensitivity of surface-based ERT measurements to those changes over time.
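The state-to-conductivity step performed by the master process is not spelled out in the abstract; a common petrophysical choice for such a transform is Archie's law, sketched below. The porosity, saturation range, fluid conductivity, and exponents are assumed values for illustration, not parameters from the study.

```python
# Sketch of a petrophysical transform from simulated subsurface state
# (porosity, water saturation, fluid conductivity) to bulk electrical
# conductivity using Archie's law. All inputs and exponents are assumed.
import numpy as np

def archie_bulk_conductivity(porosity, saturation, sigma_fluid, m=1.8, n=2.0):
    """sigma_bulk = sigma_fluid * porosity**m * saturation**n (Archie, a = 1)."""
    return sigma_fluid * porosity**m * saturation**n

porosity = np.full(1000, 0.25)               # assumed aquifer porosity
saturation = np.linspace(0.4, 1.0, 1000)     # e.g. stage-driven wetting of the vadose zone
sigma_fluid = 0.05                           # S/m, assumed river/groundwater mixture

sigma_bulk = archie_bulk_conductivity(porosity, saturation, sigma_fluid)
print(sigma_bulk.min(), sigma_bulk.max())    # bulk conductivity range fed to the ERT forward model
```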
Ithuralde, Raúl Esteban; Roitberg, Adrián Enrique; Turjanski, Adrián Gustavo
2016-07-20
Intrinsically disordered proteins (IDPs) are a set of proteins that lack a definite secondary structure in solution. IDPs can acquire tertiary structure when bound to their partners; therefore, the recognition process must also involve protein folding. The nature of the transition state (TS), structured or unstructured, determines the binding mechanism. The characterization of the TS has become a major challenge for experimental techniques and molecular simulation approaches, since diffusion, recognition, and binding are coupled to folding. In this work we present atomistic molecular dynamics (MD) simulations that sample the free energy surface of the coupled folding and binding of the transcription factor c-myb to the cotranscription factor CREB binding protein (CBP). This process has been recently studied and became a model to study IDPs. Despite the plethora of available information, we still do not know how c-myb binds to CBP. We performed a set of atomistic biased MD simulations running a total of 15.6 μs. Our results show that c-myb folds very fast upon binding to CBP with no unique pathway for binding. The process can proceed through either structured or unstructured TSs with similar probabilities. This finding reconciles previous seemingly different experimental results. We also performed Gō-type coarse-grained MD of several structured and unstructured models that indicate that coupled folding and binding follows a native contact mechanism. To the best of our knowledge, this is the first atomistic MD simulation that samples the free energy surface of the coupled folding and binding processes of IDPs.
Simulation study on combustion of biomass
NASA Astrophysics Data System (ADS)
Zhao, M. L.; Liu, X.; Cheng, J. W.; Liu, Y.; Jin, Y. A.
2017-01-01
Biomass combustion is the most common energy conversion technology, offering the advantages of low cost, low risk and high efficiency. In this paper, the transformation and transfer of biomass in the process of combustion are discussed in detail. The processes of furnace combustion and gas-phase formation were analyzed by numerical simulation. The results not only help to optimize boiler operation and realize the efficient combustion of biomass, but also provide a theoretical basis for the improvement of burner technology.
Modeling the influence of climate change on watershed systems: Adaptation through targeted practices
NASA Astrophysics Data System (ADS)
Dudula, John; Randhir, Timothy O.
2016-10-01
Climate change may influence hydrologic processes of watersheds (IPCC, 2013) and increased runoff may cause flooding, eroded stream banks, widening of stream channels, increased pollutant loading, and consequently impairment of aquatic life. The goal of this study was to quantify the potential impacts of climate change on watershed hydrologic processes and to evaluate scale and effectiveness of management practices for adaptation. We simulate baseline watershed conditions using the Hydrological Simulation Program Fortran (HSPF) simulation model to examine the possible effects of changing climate on watershed processes. We also simulate the effects of adaptation and mitigation through specific best management strategies for various climatic scenarios. With continuing low-flow conditions and vulnerability to climate change, the Ipswich watershed is the focus of this study. We quantify fluxes in runoff, evapotranspiration, infiltration, sediment load, and nutrient concentrations under baseline and climate change scenarios (near and far future). We model adaptation options for mitigating climate effects on watershed processes using bioretention/raingarden Best Management Practices (BMPs). It was observed that climate change has a significant impact on watershed runoff and carefully designed and maintained BMPs at subwatershed scale can be effective in mitigating some of the problems related to stormwater runoff. Policy options include implementation of BMPs through education and incentives for scale-dependent and site specific bioretention units/raingardens to increase the resilience of the watershed system to current and future climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, Haining; Kim, Seungchul; Lee, Kwang-Ryeol, E-mail: krlee@kist.re.kr
2016-03-28
The initial stage of the oxynitridation process of a Si substrate is of crucial importance in fabricating a high-quality ultrathin gate dielectric layer in advanced MOSFET devices. The oxynitridation reaction on a relaxed Si(001) surface is investigated via reactive molecular dynamics (MD) simulation. A total of 1120 events of a single nitric oxide (NO) molecule reaction at temperatures ranging from 300 to 1000 K are statistically analyzed. The observed reaction kinetics are consistent with previous experimental or calculation results, which shows the viability of the reactive MD technique to study the NO dissociation reaction on Si. We suggest a reaction pathway for NO dissociation that is characterized by the inter-dimer bridge of a NO molecule as the intermediate state prior to NO dissociation. Although the energy of the inter-dimer bridge is higher than that of the intra-dimer one, our suggestion is supported by ab initio nudged elastic band calculations showing that the energy barrier for the inter-dimer bridge formation is much lower. The growth mechanism of an ultrathin Si oxynitride layer is also investigated via consecutive NO reaction simulations. The simulation reveals the mechanism of the self-limiting reaction at low temperature and the time evolution of the depth profile of N and O atoms depending on the process temperature, which would help guide optimization of the oxynitridation process conditions.
Molecular dynamics simulations of β2-microglobulin interaction with hydrophobic surfaces.
Dongmo Foumthuim, Cedrix J; Corazza, Alessandra; Esposito, Gennaro; Fogolari, Federico
2017-11-21
Hydrophobic surfaces are known to adsorb and unfold proteins, a process that has been studied only for a few proteins. Here we address the interaction of β2-microglobulin, a paradigmatic protein for the study of amyloidogenesis, with hydrophobic surfaces. A system with 27 copies of the protein surrounded by a model cubic hydrophobic box is studied by implicit solvent molecular dynamics simulations. Most proteins adsorb on the walls of the box without major distortions in local geometry, whereas free molecules maintain proper structures and fluctuations as observed in explicit solvent molecular dynamics simulations. The major conclusions from the simulations are as follows: (i) the adopted implicit solvent model is adequate to describe protein dynamics and thermodynamics; (ii) adsorption occurs readily and is irreversible on the simulated timescale; (iii) the regions most involved in molecular encounters and stable interactions with the walls are the same as those that are important in protein-protein and protein-nanoparticle interactions; (iv) unfolding following adsorption occurs at regions found to be flexible by both experiments and simulations; (v) thermodynamic analysis suggests a very large contribution from van der Waals interactions, whereas unfavorable electrostatic interactions are not found to contribute much to adsorption energy. Surfaces with different degrees of hydrophobicity may occur in vivo. Our simulations show that adsorption is a fast and irreversible process which is accompanied by partial unfolding. The results and the thermodynamic analysis presented here are consistent with and rationalize previous experimental work.
Kin, Taichi; Nakatomi, Hirofumi; Shono, Naoyuki; Nomura, Seiji; Saito, Toki; Oyama, Hiroshi; Saito, Nobuhito
2017-10-15
Simulation and planning of surgery using a virtual reality model is becoming common with advances in computer technology. In this study, we conducted a literature search to find trends in virtual simulation of surgery for brain tumors. A MEDLINE search for "neurosurgery AND (simulation OR virtual reality)" retrieved a total of 1,298 articles published in the past 10 years. After eliminating studies designed solely for education and training purposes, 28 articles about the clinical application remained. The finding that the vast majority of the articles were about education and training rather than clinical applications suggests that several issues need to be addressed for clinical application of surgical simulation. In addition, 10 of the 28 articles were from Japanese groups. In general, the 28 articles demonstrated clinical benefits of virtual surgical simulation. Simulation was particularly useful in better understanding complicated spatial relations of anatomical landmarks and in examining surgical approaches. In some studies, virtual reality models were used with either surgical navigation systems or augmented reality technology, which projects virtual reality images onto the operating field. Reported problems were difficulties in standardized, objective evaluation of surgical simulation systems; inability to respond to tissue deformation caused by surgical maneuvers; absence of system functionality to reflect tissue features (e.g., hardness and adhesion); and many problems with image processing. The amount of description about image processing tended to be insufficient, indicating that the level of evidence, risk of bias, precision, and reproducibility need to be addressed for further advances and ultimately for full clinical application.
Spatiotemporal stochastic models for earth science and engineering applications
NASA Astrophysics Data System (ADS)
Luo, Xiaochun
1998-12-01
Spatiotemporal processes occur in many areas of earth sciences and engineering. However, most of the available theoretical tools and techniques of space-time data processing have been designed to operate exclusively in time or in space, and the importance of spatiotemporal variability was not fully appreciated until recently. To address this problem, a systematic framework of spatiotemporal random field (S/TRF) models for geoscience/engineering applications is presented and developed in this thesis. The space-time continuity characterization is one of the most important aspects in S/TRF modelling, where the space-time continuity is displayed with experimental spatiotemporal variograms, summarized in terms of space-time continuity hypotheses, and modelled using spatiotemporal variogram functions. Permissible spatiotemporal covariance/variogram models are addressed through permissibility criteria appropriate to spatiotemporal processes. The estimation of spatiotemporal processes is developed in terms of spatiotemporal kriging techniques. Particular emphasis is given to the singularity analysis of spatiotemporal kriging systems. The impacts of covariance functions, trend forms, and data configurations on the singularity of spatiotemporal kriging systems are discussed. In addition, the tensorial invariance of universal spatiotemporal kriging systems is investigated in terms of the space-time trend. The conditional simulation of spatiotemporal processes is proposed with the development of the sequential group Gaussian simulation (SGGS) techniques, which are actually a series of sequential simulation algorithms associated with different group sizes. The simulation error is analyzed with different covariance models and simulation grids. A simulated annealing technique honoring experimental variograms is also proposed, providing a way of conditional simulation without the covariance model fitting that is a prerequisite for most simulation algorithms. The proposed techniques were first applied for modelling of the pressure system in a carbonate reservoir, and then applied for modelling of springwater contents in the Dyle watershed. The results of these case studies as well as the theory suggest that these techniques are realistic and feasible.
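For reference (this is a standard definition, not a formula quoted from the thesis), the experimental spatiotemporal variogram on which such space-time continuity analysis typically rests can be written as

```latex
\hat{\gamma}(\mathbf{h},\tau) \;=\; \frac{1}{2\,N(\mathbf{h},\tau)}
\sum_{i=1}^{N(\mathbf{h},\tau)}
\bigl[\, Z(\mathbf{s}_i+\mathbf{h},\, t_i+\tau) - Z(\mathbf{s}_i,\, t_i) \,\bigr]^{2},
```

where Z(s, t) is the spatiotemporal random field, h and τ are the spatial and temporal lags, and N(h, τ) is the number of data pairs separated by that lag.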
NASA Astrophysics Data System (ADS)
Lee, Hoon Hee; Koo, Cheol Hea; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok
2013-08-01
The conceptual study for a Korean lunar orbiter/lander prototype has been performed at the Korea Aerospace Research Institute (KARI). Across diverse space programs in European countries, a variety of simulation applications has been developed using the SMP2 (Simulation Modelling Platform) standard, which addresses portability and reuse of simulation models by various model users. KARI has not only first-hand experience in developing an SMP-compatible simulation environment but also an ongoing study to apply the SMP2 development process of simulation models to a simulator development project for lunar missions. KARI has tried to extend the coverage of the development domain based on the SMP2 standard across the whole simulation model life-cycle, from software design to its validation, through a lunar exploration project. Figure 1 shows a snapshot from a visualization tool for the simulation of lunar lander motion. In reality, a demonstrator prototype on the right-hand side of the image was made and tested in 2012. In an early phase of simulator development, prior to a kick-off in the near future, the targeted hardware to be modelled was investigated and identified at the end of 2012. The architectural breakdown of the lunar simulator at system level was performed, and an architecture with a hierarchical tree of models from the system down to parts at lower levels has been established. Finally, SMP documents such as Catalogue, Assembly, Schedule and so on were converted using an XML (eXtensible Markup Language) converter. To obtain the benefits of the approaches and design mechanisms suggested in the SMP2 standard as far as possible, object-oriented and component-based design concepts were strictly applied throughout the whole model development process.
Gender Differences in Mental Simulation during Sentence and Word Processing
ERIC Educational Resources Information Center
Wassenburg, Stephanie I.; de Koning, Björn B.; de Vries, Meinou H.; Boonstra, A. Marije; van der Schoot, Menno
2017-01-01
Text comprehension requires readers to mentally simulate the described situation by reactivating previously acquired sensory and motor information from (episodic) memory. Drawing upon research demonstrating gender differences, favouring girls, in tasks involving episodic memory retrieval, the present study explores whether gender differences exist…
The distinguishing signature of Magnetic Penrose Process
NASA Astrophysics Data System (ADS)
Dadhich, Naresh; Tursunov, Arman; Ahmedov, Bobomurat; Stuchlík, Zdeněk
2018-04-01
In this Letter, we wish to point out that the distinguishing feature of the magnetic Penrose process (MPP) is its super-high efficiency, exceeding 100% (established in the mid-1980s for discrete particle accretion), of electromagnetic extraction of the rotational energy of a rotating black hole for a magnetic field of milligauss order. Another similar process, also driven by the electromagnetic field, is the Blandford-Znajek mechanism (BZ), which could be envisaged as the high magnetic field limit of MPP, as it requires a threshold magnetic field of order 10^4 G. Recent simulation studies of fully relativistic magnetohydrodynamic flows have borne out the super-high efficiency signature of the process in the high magnetic field regime, viz. BZ. We would like to make a clear prediction that similar simulation studies of MHD flows in the low magnetic field regime, where BZ would be inoperative, would also show super efficiency.
In situ and in-transit analysis of cosmological simulations
Friesen, Brian; Almgren, Ann; Lukic, Zarija; ...
2016-08-24
Modern cosmological simulations have reached the trillion-element scale, rendering data storage and subsequent analysis formidable tasks. To address this circumstance, we present a new MPI-parallel approach for analysis of simulation data while the simulation runs, as an alternative to the traditional workflow consisting of periodically saving large data sets to disk for subsequent ‘offline’ analysis. We demonstrate this approach in the compressible gasdynamics/N-body code Nyx, a hybrid MPI+OpenMP code based on the BoxLib framework, used for large-scale cosmological simulations. We have enabled on-the-fly workflows in two different ways: one is a straightforward approach consisting of all MPI processes periodically halting the main simulation and analyzing each component of data that they own (‘in situ’). The other consists of partitioning processes into disjoint MPI groups, with one performing the simulation and periodically sending data to the other ‘sidecar’ group, which post-processes it while the simulation continues (‘in-transit’). The two groups execute their tasks asynchronously, stopping only to synchronize when a new set of simulation data needs to be analyzed. For both the in situ and in-transit approaches, we experiment with two different analysis suites with distinct performance behavior: one which finds dark matter halos in the simulation using merge trees to calculate the mass contained within iso-density contours, and another which calculates probability distribution functions and power spectra of various fields in the simulation. Both are common analysis tasks for cosmology, and both result in summary statistics significantly smaller than the original data set. We study the behavior of each type of analysis in each workflow in order to determine the optimal configuration for the different data analysis algorithms.
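The in-transit partitioning can be illustrated, outside of Nyx/BoxLib (which are C++/Fortran), with a minimal mpi4py sketch: the world communicator is split into a simulation group and a 'sidecar' analysis group, and the simulation ranks ship data to the sidecar, which analyzes it while the simulation keeps running. Group sizes, the payload, and the shutdown protocol are assumptions for illustration only.

```python
# Sketch of an "in-transit" workflow with mpi4py: split MPI_COMM_WORLD into
# a simulation group and a sidecar analysis group; the simulation group
# periodically ships data to the sidecar for post-processing.
# Run with e.g.: mpiexec -n 4 python intransit_sketch.py
import numpy as np
from mpi4py import MPI

world = MPI.COMM_WORLD
n_sidecar = 1                                    # assumed number of analysis ranks
is_sidecar = world.rank >= world.size - n_sidecar
comm = world.Split(color=1 if is_sidecar else 0, key=world.rank)  # disjoint groups

if not is_sidecar:
    # --- simulation group ---
    for step in range(3):
        field = np.random.rand(1024) + step      # stand-in for simulation data
        if comm.rank == 0:                       # one rank forwards data per step
            world.send(field, dest=world.size - 1, tag=step)
    if comm.rank == 0:
        world.send(None, dest=world.size - 1, tag=999)   # shutdown signal
else:
    # --- sidecar group: analyse while the simulation keeps running ---
    while True:
        status = MPI.Status()
        data = world.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
        if status.Get_tag() == 999:
            break
        print(f"sidecar: step {status.Get_tag()} mean = {data.mean():.3f}")
```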
Surface-water hydrology and runoff simulations for three basins in Pierce County, Washington
Mastin, M.C.
1996-01-01
The surface-water hydrology in Clear, Clarks, and Clover Creek Basins in central Pierce County, Washington, is described with a conceptual model of the runoff processes and then simulated with the Hydrological Simulation Program-FORTRAN (HSPF), a continuous, deterministic hydrologic model. The study area is currently undergoing a rapid conversion of rural, undeveloped land to urban and suburban land that often changes the flow characteristics of the streams that drain these lands. The complex interactions of land cover, climate, soils, topography, channel characteristics, and ground- water flow patterns determine the surface-water hydrology of the study area and require a complex numerical model to assess the impact of urbanization on streamflows. The U.S. Geological Survey completed this investigation in cooperation with the Storm Drainage and Surface Water Management Utility within the Pierce County Department of Public Works to describe the important rainfall-runoff processes within the study area and to develop a simulation model to be used as a tool to predict changes in runoff characteristics resulting from changes in land use. The conceptual model, a qualitative representation of the study basins, links the physical characteristics to the runoff process of the study basins. The model incorporates 11 generalizations identified by the investigation, eight of which describe runoff from hillslopes, and three that account for the effects of channel characteristics and ground-water flow patterns on runoff. Stream discharge was measured at 28 sites and precipitation was measured at six sites for 3 years in two overlapping phases during the period of October 1989 through September 1992 to calibrate and validate the simulation model. Comparison of rainfall data from October 1989 through September 1992 shows the data-collection period beginning with 2 wet water years followed by the relatively dry 1992 water year. Runoff was simulated with two basin models-the Clover Creek Basin model and the Clear-Clarks Basin model-by incorporating the generalizations of the conceptual model into the construction of two HSPF numerical models. Initially, the process-related parameters for runoff from glacial-till hillslopes were calibrated with numerical models for three catchment sites and one headwater basin where streamflows were continuously measured and little or no influence from ground water, channel storage, or channel losses affected runoff. At one of the catchments soil moisture was monitored and compared with simulated soil moisture. The values for these parameters were used in the basin models. Basin models were calibrated to the first year of observed streamflow data by adjusting other parameters in the numerical model that simulated channel losses, simulated channel storage in a few of the reaches in the headwaters and in the floodplain of the main stem of Clover Creek, and simulated volume and outflow of the ground-water reservoir representing the regional ground-water aquifers. The models were run for a second year without any adjustments, and simulated results were compared with observed results as a measure of validation of the models. The investigation showed the importance of defining the ground-water flow boundaries and demonstrated a simple method of simulating the influence of the regional ground-water aquifer on streamflows. 
In the Clover Creek Basin model, ground-water flow boundaries were used to define subbasins containing mostly glacial outwash soils and not containing any surface drainage channels. In the Clear-Clarks Basin model, ground-water flow boundaries outlined a recharge area outside the surface-water boundaries of the basin that was incorporated into the model in order to provide sufficient water to balance simulated ground-water outflows to the creeks. A simulated ground-water reservoir used to represent regional ground-water flow processes successfully provided the proper water balance of inflows and outfl
[The assessment of simulation practice learning in nursing education as feedback].
dos Santos, Mateus Casanova; Leite, Maria Cecília Lorea
2010-09-01
This paper is a theoretical and reflective work that emerged as an excerpt from a case study with a qualitative, descriptive, and participative approach. It refers to a research project entitled "Study of the Evaluation on Simulation Learning Trigger", carried out by the Morphofunctional Laboratory at the Nursing School of the Federal University of Pelotas, Rio Grande do Sul, Brazil. The goal is to demonstrate the importance of the assessment of simulation practice learning as feedback for the improvement and planning of education. Simulation is an attempt to reproduce the essential features of a real clinical setting. The paper identifies the assessment of learning as a potential curricular space for the reevaluation of the teaching-learning process and educational planning. The interdisciplinarity inherent in health issues needs to be integrated into the processes of thinking, feeling, and executing nursing teaching practices in order to direct them toward completeness, universality in health, and critical, reflective, and self-directed training.
Matsuzaki, Ryosuke; Tachikawa, Takeshi; Ishizuka, Junya
2018-03-01
Accurate simulations of carbon fiber-reinforced plastic (CFRP) molding are vital for the development of high-quality products. However, such simulations are challenging and previous attempts to improve the accuracy of simulations by incorporating the data acquired from mold monitoring have not been completely successful. Therefore, in the present study, we developed a method to accurately predict various CFRP thermoset molding characteristics based on data assimilation, a process that combines theoretical and experimental values. The degree of cure as well as temperature and thermal conductivity distributions during the molding process were estimated using both temperature data and numerical simulations. An initial numerical experiment demonstrated that the internal mold state could be determined solely from the surface temperature values. A subsequent numerical experiment to validate this method showed that estimations based on surface temperatures were highly accurate in the case of degree of cure and internal temperature, although predictions of thermal conductivity were more difficult.
Modeling Negotiation by a Participatory Approach
NASA Astrophysics Data System (ADS)
Torii, Daisuke; Ishida, Toru; Bousquet, François
In a participatory approach used by social scientists, role-playing games (RPGs) are effectively used to understand the real thinking and behavior of stakeholders, but RPGs are not sufficient to handle a dynamic process like negotiation. In this study, a participatory simulation in which user-controlled avatars and autonomous agents coexist is introduced into the participatory approach for modeling negotiation. To establish a modeling methodology for negotiation, we have tackled the following two issues. First, to enable domain experts to concentrate on interaction design for participatory simulation, we have adopted an architecture in which an interaction layer controls agents and have defined three types of interaction descriptions (interaction protocol, interaction scenario, and avatar control scenario) to be written. Second, to enable domain experts and stakeholders to capitalize on participatory simulation, we have established a four-step process for acquiring a negotiation model: 1) surveys and interviews with stakeholders, 2) RPG, 3) interaction design, and 4) participatory simulation. Finally, we discuss our methodology through a case study of agricultural economics in northeast Thailand.
Yamin, Stephanie; Stinchcombe, Arne; Gagnon, Sylvain
2015-01-01
Driving is a multifactorial behaviour drawing on multiple cognitive, sensory, and physical systems. Dementia is a progressive and degenerative neurological condition that impacts the cognitive processes necessary for safe driving. While a number of studies have examined driving among individuals with Alzheimer's disease, less is known about the impact of Dementia with Lewy Bodies (DLB) on driving safety. The present study compared simulated driving performance of 15 older drivers with mild DLB with that of 21 neurologically healthy control drivers. DLB drivers showed poorer performance on all indicators of simulated driving including an increased number of collisions in the simulator and poorer composite indicators of overall driving performance. A measure of global cognitive function (i.e., the Mini Mental State Exam) was found to be related to the overall driving performance. In addition, measures of attention (i.e., the Useful Field of View, UFOV) and space processing (the Visual Object and Space Perception Test, VOSP) correlated significantly with a rater's assessment of driving performance. PMID:26713169
Protein folding simulations: from coarse-grained model to all-atom model.
Zhang, Jian; Li, Wenfei; Wang, Jun; Qin, Meng; Wu, Lei; Yan, Zhiqiang; Xu, Weixin; Zuo, Guanghong; Wang, Wei
2009-06-01
Protein folding is an important and challenging problem in molecular biology. During the last two decades, molecular dynamics (MD) simulation has proved to be a paramount tool and was widely used to study protein structures, folding kinetics and thermodynamics, and structure-stability-function relationships. It was also used to help engineer and design new proteins, and to answer even more general questions such as the minimal number of amino acids or the evolution principles of protein families. Nowadays, MD simulation is still undergoing rapid development. The first trend is toward developing new coarse-grained models and studying larger and more complex molecular systems such as protein-protein complexes and their assembly processes, amyloid-related aggregation, and the structure and motion of chaperones, motors, channels and virus capsids; the second trend is toward building high-resolution models and exploring more detailed and accurate pictures of protein folding and the associated processes, such as folding involving coordination bonds or disulfide bonds, the polarization, charge transfer and protonation/deprotonation processes involved in metal-coupled folding, and ion permeation and its coupling with the kinetics of channels. On these new territories, MD simulations have given many promising results and will continue to offer exciting views. Here, we review several new subjects investigated by using MD simulations as well as the corresponding developments of appropriate protein models. These include but are not limited to the attempt to go beyond the topology-based Gō-like model and characterize the energetic factors in protein structures and dynamics, the study of the thermodynamics and kinetics of disulfide-bond-involved protein folding, the modeling of the interactions between chaperonin and the encapsulated protein and of protein folding under this circumstance, the effort to clarify the important yet still elusive folding mechanism of protein BBL, the development of discrete MD and its application in studying the alpha-beta conformational conversion and oligomer assembly process, and the modeling of protein folding involving metal ions. (c) 2009 IUBMB.
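As one concrete point of reference for the "topology-based Gō-like model" discussed above (the functional form varies between implementations, so this is an assumption rather than a quotation from the review), a widely used native-contact energy is the 10-12 potential

```latex
V \;=\; \sum_{(i,j)\in\text{native}} \varepsilon
\left[\, 5\left(\frac{\sigma_{ij}}{r_{ij}}\right)^{12}
      - 6\left(\frac{\sigma_{ij}}{r_{ij}}\right)^{10} \right]
\;+\; \sum_{(i,j)\notin\text{native}} \varepsilon
\left(\frac{\sigma_{0}}{r_{ij}}\right)^{12},
```

where r_ij is the distance between residues i and j, σ_ij is taken from the native structure (so each native pair has its minimum at the native distance), σ_0 is a short-range repulsion radius for non-native pairs, and ε sets the contact energy scale that the reviewed work seeks to refine with explicit energetic factors.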
Numerical simulation of the casting process of titanium tooth crowns and bridges.
Wu, M; Augthun, M; Wagner, I; Sahm, P R; Spiekermann, H
2001-06-01
The objectives of this paper were to simulate the casting process of titanium tooth crowns and bridges and to predict and control porosity defects. A casting simulation software, MAGMASOFT, was used. The geometry of the crowns, with fine details of the occlusal surface, was digitized by means of a laser measuring technique, then converted and read into the simulation software. Both mold filling and solidification were simulated, the shrinkage porosity was predicted by a "feeding criterion", and the gas pore sensitivity was studied based on the mold filling and solidification simulations. Two types of dental prostheses (a single-crown casting and a three-unit bridge) with various sprue designs were numerically "poured", and only one optimal design for each prosthesis was recommended for a real casting trial. With the numerically optimized design, real titanium dental prostheses (five replicas each) were made on a centrifugal casting machine. All the castings underwent radiographic examination, and no porosity was detected in the cast prostheses. This indicates that numerical simulation is an efficient tool for dental casting design and porosity control. Copyright 2001 Kluwer Academic Publishers
Franc, Jeffrey Michael; Ingrassia, Pier Luigi; Verde, Manuela; Colombo, Davide; Della Corte, Francesco
2015-02-01
Surge capacity, or the ability to manage an extraordinary volume of patients, is fundamental for hospital management of mass-casualty incidents. However, quantification of surge capacity is difficult and no universal standard for its measurement has emerged, nor has a standardized statistical method been advocated. As mass-casualty incidents are rare, simulation may represent a viable alternative to measure surge capacity. Hypothesis/Problem: The objective of the current study was to develop a statistical method for the quantification of surge capacity using a combination of computer simulation and simple process-control statistical tools. Length-of-stay (LOS) and patient volume (PV) were used as metrics. The use of this method was then demonstrated on a subsequent computer simulation of an emergency department (ED) response to a mass-casualty incident. In the derivation phase, 357 participants in five countries performed 62 computer simulations of an ED response to a mass-casualty incident. Benchmarks for ED response were derived from these simulations, including LOS and PV metrics for triage, bed assignment, physician assessment, and disposition. In the application phase, 13 students of the European Master in Disaster Medicine (EMDM) program completed the same simulation scenario, and the results were compared to the standards obtained in the derivation phase. Patient-volume metrics included number of patients to be triaged, assigned to rooms, assessed by a physician, and disposed. Length-of-stay metrics included median time to triage, room assignment, physician assessment, and disposition. Simple graphical methods were used to compare the application phase group to the derived benchmarks using process-control statistical tools. The group in the application phase failed to meet the indicated standard for LOS from admission to disposition decision. This study demonstrates how simulation software can be used to derive values for objective benchmarks of ED surge capacity using PV and LOS metrics. These objective metrics can then be applied to other simulation groups using simple graphical process-control tools to provide a numeric measure of surge capacity. Repeated use in simulations of actual EDs may represent a potential means of objectively quantifying disaster management surge capacity. It is hoped that the described statistical method, which is simple and reusable, will be useful for investigators in this field to apply to their own research.
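A minimal sketch of the kind of process-control comparison described here (not the study's actual tool) would compute a benchmark and control limits for one LOS metric from the derivation-phase runs and then test whether an application-phase group falls inside them. All numbers below are invented placeholders, not study data.

```python
# Sketch: derive simple control limits for a length-of-stay metric from
# derivation-phase simulation runs, then check whether an application-phase
# group meets the benchmark. All values are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(7)
derivation_los = rng.gamma(shape=4.0, scale=15.0, size=62)   # minutes, 62 simulated runs

center = np.median(derivation_los)                           # benchmark median LOS
lcl, ucl = np.percentile(derivation_los, [0.135, 99.865])    # ~3-sigma-equivalent band

application_group_los = 150.0          # median LOS of the tested group (placeholder)
within = lcl <= application_group_los <= ucl
print(f"benchmark median {center:.0f} min, limits [{lcl:.0f}, {ucl:.0f}] -> "
      f"{'meets' if within else 'fails'} the standard")
```

In practice the same comparison would be plotted as a control chart, which is the "simple graphical method" the abstract refers to.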
NASA Astrophysics Data System (ADS)
Sadi, Toufik; Mehonic, Adnan; Montesi, Luca; Buckwell, Mark; Kenyon, Anthony; Asenov, Asen
2018-02-01
We employ an advanced three-dimensional (3D) electro-thermal simulator to explore the physics and potential of oxide-based resistive random-access memory (RRAM) cells. The physical simulation model has been developed recently, and couples a kinetic Monte Carlo study of electron and ionic transport to the self-heating phenomenon while accounting carefully for the physics of vacancy generation and recombination, and trapping mechanisms. The simulation framework successfully captures resistance switching, including the electroforming, set and reset processes, by modeling the dynamics of conductive filaments in the 3D space. This work focuses on the promising yet less studied RRAM structures based on silicon-rich silica (SiOx) RRAMs. We explain the intrinsic nature of resistance switching of the SiOx layer, analyze the effect of self-heating on device performance, highlight the role of the initial vacancy distributions acting as precursors for switching, and also stress the importance of using 3D physics-based models to capture accurately the switching processes. The simulation work is backed by experimental studies. The simulator is useful for improving our understanding of the little-known physics of SiOx resistive memory devices, as well as other oxide-based RRAM systems (e.g. transition metal oxide RRAMs), offering design and optimization capabilities with regard to the reliability and variability of memory cells.
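The kinetic Monte Carlo core of such a simulator can be pictured with a tiny event-selection sketch (a Gillespie-style residence-time step). The event names and rates below are invented for illustration and are not the simulator's actual physics; in the real model the rates would depend on the local field and temperature updated by the coupled electro-thermal solver.

```python
# Sketch of one kinetic Monte Carlo step: pick the next event with
# probability proportional to its rate and advance time by an
# exponentially distributed increment. Events and rates are illustrative.
import math
import random

def kmc_step(rates, rng=random):
    """rates: dict event_name -> rate (1/s). Returns (chosen event, time increment)."""
    total = sum(rates.values())
    r = rng.random() * total
    acc = 0.0
    for event, k in rates.items():            # roulette-wheel selection
        acc += k
        if r < acc:
            chosen = event
            break
    dt = -math.log(1.0 - rng.random()) / total  # residence time
    return chosen, dt

random.seed(3)
rates = {"vacancy_generation": 1e3, "vacancy_recombination": 4e2, "electron_hop": 5e4}
t = 0.0
for _ in range(5):
    event, dt = kmc_step(rates)
    t += dt
    print(f"t = {t:.3e} s  ->  {event}")
```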
NASA Astrophysics Data System (ADS)
El Amri, Abdelouahid; el yakhloufi Haddou, Mounir; Khamlichi, Abdellatif
2017-10-01
Damage mechanisms in hot metal forming processes are accelerated by mechanical stresses arising from variations in thermal and mechanical properties, because the process involves materials with different thermal and mechanical loadings and swelling coefficients. In this work, 3D finite element models (FEM) are developed to simulate the effect of temperature and stresses on the model development, using the general-purpose FE software ABAQUS. An explicit dynamic analysis with a coupled temperature-displacement procedure is used for the model. The purpose of this research was to study thermomechanical damage mechanics in hot forming processes. The important process variables and the main characteristics of various hot forming processes are also discussed.
NASA Astrophysics Data System (ADS)
Willgoose, G. R.; Cohen, S.; Svoray, T.; Sela, S.; Hancock, G. R.
2010-12-01
Numerical models are an important tool for studying landscape processes as they allow us to isolate specific processes and drivers and test various physics and spatio-temporal scenarios. Here we use a distributed, physically based soil evolution model (mARM4D) to describe the drivers and processes controlling soil-landscape evolution at a field site on the fringe between the Mediterranean and desert regions of Israel. This study is an initial effort in a larger project aimed at improving our understanding of the mechanisms and drivers that led to the extensive removal of soils from the loess-covered hillslopes of this region. This specific region is interesting as it is located between the Mediterranean climate region, in which widespread erosion from hillslopes was attributed to human activity during the Holocene, and the arid region, in which extensive removal of loess from hillslopes was shown to have been driven by climatic changes during the late Pleistocene. First, we study the sediment transport mechanisms of the soil-landscape evolution process at our study site. We simulate soil-landscape evolution with only one sediment transport process (fluvial or diffusive) at a time. We find that diffusive sediment transport is likely the dominant process at this site as it resulted in soil distributions that better correspond to current observations. We then simulate several realistic climatic/anthropogenic scenarios (based on the literature) in order to quantify the sensitivity of the soil-landscape evolution process to temporal fluctuations. We find that this site is relatively insensitive to short-term (several thousand years), sharp changes. This suggests that climate, rather than human activity, was the main driver for the extensive removal of loess from the hillslopes.
Fast Simulation of Electromagnetic Showers in the ATLAS Calorimeter: Frozen Showers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barberio, E.; /Melbourne U.; Boudreau, J.
2011-11-29
One of the most time-consuming processes in simulating pp interactions in the ATLAS detector at the LHC is the simulation of electromagnetic showers in the calorimeter. In order to speed up the event simulation, several parametrisation methods are available in ATLAS. In this paper we present a short description of the frozen shower technique, together with some recent benchmarks and a comparison with full simulation. The expected high rate of proton-proton collisions in the ATLAS detector at the LHC requires large samples of simulated (Monte Carlo) events to study various physics processes. A detailed simulation of particle reactions ('full simulation') in the ATLAS detector is based on GEANT4 and is very accurate. However, due to the complexity of the detector, the high particle multiplicity and GEANT4 itself, the average CPU time spent to simulate a typical QCD event in a pp collision is 20 or more minutes on modern computers. During detector simulation, the largest share of the time is spent in the calorimeters (up to 70%), most of which is required for electromagnetic particles in the electromagnetic (EM) part of the calorimeters. This is the motivation for fast simulation approaches which reduce the simulation time without affecting the accuracy. Several fast simulation methods available within the ATLAS simulation framework (the standard Athena-based simulation program) are discussed here, with the focus on the novel frozen shower library (FS) technique. The results obtained with FS are presented here as well.
NASA Astrophysics Data System (ADS)
Fang, Y.; Huang, M.; Liu, C.; Li, H.; Leung, L. R.
2013-11-01
Physical and biogeochemical processes regulate soil carbon dynamics and CO2 flux to and from the atmosphere, influencing global climate change. Integration of these processes into Earth system models (e.g., community land models (CLMs)), however, currently faces three major challenges: (1) extensive efforts are required to modify modeling structures and to rewrite computer programs to incorporate new or updated processes as new knowledge is being generated, (2) the computational cost of simulating biogeochemical processes in land models is prohibitive due to large variations in the rates of biogeochemical processes, and (3) various mathematical representations of biogeochemical processes exist to incorporate different aspects of fundamental mechanisms, but systematic evaluation of the different mathematical representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The new framework consists of a new biogeochemical module, Next Generation BioGeoChemical Module (NGBGC), version 1.0, with a generic algorithm and reaction database so that new and updated processes can be incorporated into land models without the need to manually set up the ordinary differential equations to be solved numerically. The reaction database describes the processes of nutrient flow through terrestrial ecosystems in plants, litter, and soil. This framework facilitates effective comparison studies of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into CLM. To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems. The method presented here could in theory be applied to simulate biogeochemical cycles in other Earth system models.
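The NGBGC data structures are not given in the abstract, so the following is only a hedged sketch of the general idea of a reaction database: each process is a database entry (donor pool, receiver pool, transfer fraction, rate), and the right-hand side of the ODE system is assembled generically from those entries instead of being hand-coded. The pool names, rate constants and first-order rate law below are hypothetical illustrations.

```python
import numpy as np
from scipy.integrate import solve_ivp

# hypothetical reaction database: donor pool, receiver pool, transferred
# fraction and first-order rate constant (per day)
pools = ["litter_C", "soil_C", "CO2"]
reactions = [
    {"from": "litter_C", "to": "soil_C", "frac": 0.45, "k": 0.02},   # humification
    {"from": "litter_C", "to": "CO2",    "frac": 0.55, "k": 0.02},   # litter respiration
    {"from": "soil_C",   "to": "CO2",    "frac": 1.00, "k": 0.001},  # soil respiration
]
idx = {p: i for i, p in enumerate(pools)}

def rhs(t, y):
    """Assemble dy/dt generically from the reaction database, so adding a
    new process means adding an entry rather than rewriting the ODEs."""
    dydt = np.zeros_like(y)
    for r in reactions:
        flux = r["frac"] * r["k"] * y[idx[r["from"]]]   # donor-controlled flux
        dydt[idx[r["from"]]] -= flux
        dydt[idx[r["to"]]] += flux
    return dydt

# ten years with hypothetical initial carbon stocks (g C m-2)
sol = solve_ivp(rhs, (0.0, 3650.0), y0=[1000.0, 5000.0, 0.0])
```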
NASA Astrophysics Data System (ADS)
Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.
2017-12-01
Flood is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent or mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation over complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements in the computational efficiency of full shallow water equation solvers have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet and dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, whenever a registered cell is flooded, register its surrounding cells. The overhead of this additional process is kept small by checking only cells at the wet and dry interface, and the computation time is reduced by skipping the processing of non-flooded areas. This algorithm is easily applied to any type of 2-D flood-inundation model. The proposed ADU method is implemented in a 2-D local inertial equation model for the Yodo River basin, Japan. Case studies for two flood events show that the simulation finishes in a computation time two to ten times shorter while giving the same results as the simulation without the ADU method.
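The paper's implementation is not reproduced here; the sketch below only illustrates the bookkeeping described in the abstract: keep a set of registered ("active") cells, and whenever a registered cell becomes wet, register its neighbours, so the hydraulic solver only ever loops over cells near the wet/dry interface. The grid layout, wet threshold and function names are assumptions for illustration.

```python
import numpy as np

def adu_update(depth, active, wet_tol=1e-3):
    """Sketch of Automatic Domain Updating bookkeeping: whenever a registered
    cell is wet (depth above a small threshold), register its four neighbours.
    'active' is a set of (row, col) indices; only these cells are checked."""
    ny, nx = depth.shape
    newly_active = set()
    for (i, j) in active:
        if depth[i, j] > wet_tol:
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < ny and 0 <= nj < nx and (ni, nj) not in active:
                    newly_active.add((ni, nj))
    active |= newly_active
    return active

# usage idea: start with the cells next to the river channel registered, call
# adu_update(depth, active) after every hydraulic time step, and restrict the
# 2-D solver loop to the 'active' set so dry areas are skipped entirely.
```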
Kinematic Evolution of Simulated Star-Forming Galaxies
NASA Technical Reports Server (NTRS)
Kassin, Susan A.; Brooks, Alyson; Governato, Fabio; Weiner, Benjamin J.; Gardner, Jonathan P.
2014-01-01
Recent observations have shown that star-forming galaxies like our own Milky Way evolve kinematically into ordered thin disks over the last approximately 8 billion years since z = 1.2, undergoing a process of "disk settling." For the first time, we study the kinematic evolution of a suite of four state of the art "zoom in" hydrodynamic simulations of galaxy formation and evolution in a fully cosmological context and compare with these observations. Until now, robust measurements of the internal kinematics of simulated galaxies were lacking as the simulations suffered from low resolution, overproduction of stars, and overly massive bulges. The current generation of simulations has made great progress in overcoming these difficulties and is ready for a kinematic analysis. We show that simulated galaxies follow the same kinematic trends as real galaxies: they progressively decrease in disordered motions (sigma(sub g)) and increase in ordered rotation (V(sub rot)) with time. The slopes of the relations between both sigma(sub g) and V(sub rot) with redshift are consistent between the simulations and the observations. In addition, the morphologies of the simulated galaxies become less disturbed with time, also consistent with observations. This match between the simulated and observed trends is a significant success for the current generation of simulations, and a first step in determining the physical processes behind disk settling.
Lang, Alon; Melzer, Ehud; Bar-Meir, Simon; Eliakim, Rami; Ziv, Amitai
2006-11-01
The continuing development of computer-based medical simulators provides an ideal platform for simulator-assisted training programs for medical trainees. Computer-based endoscopic simulators provide a virtual reality environment for training endoscopic procedures. This study illustrates the use of a comprehensive training model combining endoscopic simulators with simulated (actor) patients (SPs). The aim was to evaluate the effectiveness of a comprehensive simulation workshop from the trainee perspective. Four case studies were developed with emphasis on communication skills. Three workshops with 10 fellows each were conducted. During each workshop the trainees spent half of the time in SP case studies and the remaining half working with computerized endoscopic simulators with continuous guidance by an expert endoscopist. Questionnaires were completed by the fellows at the end of the workshop. Seventy percent of the fellows felt that the endoscopic simulator was close or very close to reality for gastroscopy, and 63% for colonoscopy. Eighty-eight percent thought the close guidance was important for the learning process with the simulator. Eighty percent felt that the case studies were an important learning experience for risk management. Further evaluation of multi-modality simulation workshops in gastroenterologist training is needed to identify how best to incorporate this form of instruction into training for gastroenterologists.
Integrated Multiscale Modeling of Molecular Computing Devices. Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tim Schulze
2012-11-01
The general theme of this research has been to expand the capabilities of a simulation technique, Kinetic Monte Carlo (KMC) and apply it to study self-assembled nano-structures on epitaxial thin films. KMC simulates thin film growth and evolution by replacing the detailed dynamics of the system's evolution, which might otherwise be studied using molecular dynamics, with an appropriate stochastic process.
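The report's own KMC code is not included here; as a hedged, generic illustration of the technique it refers to, the snippet below performs one rejection-free KMC step (BKL/Gillespie style): an event is selected with probability proportional to its rate and the clock advances by an exponentially distributed increment. The hop-rate parameters are illustrative, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmc_step(rates):
    """One rejection-free kinetic Monte Carlo step (BKL/Gillespie): pick an
    event with probability proportional to its rate and advance time by an
    exponentially distributed increment."""
    total = rates.sum()
    event = np.searchsorted(np.cumsum(rates), rng.uniform(0.0, total))
    dt = -np.log(1.0 - rng.uniform()) / total
    return event, dt

# illustrative hop rates for adatoms with 0-3 lateral neighbours:
# rate = nu0 * exp(-(E0 + n*Eb) / kT)
nu0, E0, Eb, kT = 1.0e13, 0.7, 0.3, 0.025
rates = np.array([nu0 * np.exp(-(E0 + n * Eb) / kT) for n in range(4)])
event, dt = kmc_step(rates)
```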
NASA Astrophysics Data System (ADS)
Faber, Tracy L.; Garcia, Ernest V.; Lalush, David S.; Segars, W. Paul; Tsui, Benjamin M.
2001-05-01
The spline-based Mathematical Cardiac Torso (MCAT) phantom is a realistic software simulation designed to simulate single photon emission computed tomographic (SPECT) data. It incorporates a heart model of known size and shape; thus, it is invaluable for measuring the accuracy of acquisition, reconstruction, and post-processing routines. New functionality has been added by replacing the standard heart model with left ventricular (LV) epicardial and endocardial surface points detected from actual patient SPECT perfusion studies. LV surfaces detected by standard post-processing quantitation programs are converted through interpolation in space and time into new B-spline models. Perfusion abnormalities are added to the model based on the results of standard perfusion quantification. The new LV is translated and rotated to fit within the existing atrial and right ventricular models, which are scaled based on the size of the LV. Simulations were created for five different patients with myocardial infarctions who had undergone SPECT perfusion imaging. Shape, size, and motion of the resulting activity map were compared visually to the original SPECT images. In all cases, the size, shape and motion of the simulated LVs matched well with the original images. Thus, realistic simulations with known physiologic and functional parameters can be created for evaluating the efficacy of processing algorithms.
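The MCAT tooling itself is not shown in the abstract; as a hedged sketch of the interpolation step it describes, the snippet below fits a smooth periodic B-spline through detected endocardial contour points of one short-axis slice with SciPy and resamples it densely. The contour data are synthetic placeholders, not patient measurements.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# synthetic stand-in for detected endocardial contour points (mm) on one slice
theta = np.linspace(0.0, 2.0 * np.pi, 24, endpoint=False)
radius = 25.0 + 2.0 * np.sin(3.0 * theta)
x = np.r_[radius * np.cos(theta), radius[0] * np.cos(theta[0])]  # close the contour
y = np.r_[radius * np.sin(theta), radius[0] * np.sin(theta[0])]

# fit a periodic cubic B-spline through the points with light smoothing
tck, u = splprep([x, y], per=True, s=1.0)

# resample the smooth closed contour densely for the surface definition
u_fine = np.linspace(0.0, 1.0, 200)
x_smooth, y_smooth = splev(u_fine, tck)
```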
Fast ray-tracing of human eye optics on Graphics Processing Units.
Wei, Qi; Patkar, Saket; Pai, Dinesh K
2014-05-01
We present a new technique for simulating retinal image formation by tracing a large number of rays from objects in three dimensions as they pass through the optic apparatus of the eye to the retina. Simulating human optics is useful for understanding basic questions of vision science and for studying vision defects and their corrections. Because of the complexity of computing such simulations accurately, most previous efforts used simplified analytical models of the normal eye. This makes them less effective in modeling vision disorders associated with abnormal shapes of the ocular structures, which are hard to represent precisely with analytical surfaces. We have developed a computer simulator that can simulate ocular structures of arbitrary shapes, for instance represented by polygon meshes. Topographic and geometric measurements of the cornea, lens, and retina from keratometer or medical imaging data can be integrated for individualized examination. We utilize parallel processing on modern Graphics Processing Units (GPUs) to efficiently compute retinal images by tracing millions of rays. A stable retinal image can be generated within minutes. We simulated depth of field, accommodation, chromatic aberrations, as well as astigmatism and its correction. We also show an application of the technique in patient-specific vision correction by incorporating geometric models of the orbit reconstructed from clinical medical images.
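The GPU implementation is not reproduced here; the per-ray work it parallelizes reduces, at each optical surface, to applying the vector form of Snell's law. The sketch below refracts one ray at an assumed spherical anterior corneal surface; the radius and refractive indices are textbook-style approximations, not values from the paper.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n (pointing
    toward the incoming ray) using vector Snell's law; returns None on
    total internal reflection."""
    eta = n1 / n2
    cos_i = -np.dot(n, d)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * n

# assumed toy values: ray parallel to the optical axis hitting a spherical
# anterior cornea (radius ~7.8 mm), going from air (n=1.0) into cornea (n~1.376)
d = np.array([0.0, 0.0, 1.0])            # ray travelling along +z
p = np.array([0.0, 1.0, 0.0])            # hit point about 1 mm off-axis
center = np.array([0.0, 0.0, 7.8])       # centre of corneal curvature
normal = (p - center) / np.linalg.norm(p - center)
d_refracted = refract(d, normal, 1.0, 1.376)
```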
NASA Astrophysics Data System (ADS)
Tang, Bingtao; Wang, Qiaoling; Wei, Zhaohui; Meng, Xianju; Yuan, Zhengjun
2016-05-01
Ultra-high strength in sheet metal parts can be achieved with the hot stamping process. To improve crash performance and save vehicle weight, it is necessary to produce components with tailored properties. The use of tailor-welded high-strength steel is a relatively new hot stamping approach for saving weight and obtaining the desired local stiffness and crash performance. The simulation of hot stamping boron steel, especially of tailor-welded blanks (TWBs), is more complex and challenging. Detailed information about the thermal/mechanical properties of tools and sheet materials, heat transfer, and friction between the deforming material and the tools is required. In this study, the boron-manganese steel B1500HS and the high-strength low-alloy steel B340LA are tailor welded and hot stamped. In order to precisely simulate the hot stamping process, the modeling and simulation of hot stamping tailor-welded high-strength steels, including phase transformation modeling, thermal modeling, and thermal-mechanical modeling, is investigated. Meanwhile, the model of the welding zone of the tailor-welded blanks should be sufficiently accurate to describe its thermal, mechanical, and metallurgical parameters. An FE simulation model of TWBs with a thickness combination of 1.6 mm boron steel and 1.2 mm low-alloy steel is established. In order to evaluate the mechanical properties of the hot stamped automotive component (a mini B-pillar), the hardness and microstructure at each region are investigated. The comparisons between simulated results and experimental observations show the reliability of the thermo-mechanical and metallurgical modeling strategies for the TWB hot stamping process.
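The paper's phase transformation model is not reproduced here; a common building block in hot-stamping simulations is the Koistinen-Marburger relation for the martensite fraction formed on quenching below the martensite start temperature. The sketch below evaluates it with indicative parameters often quoted for 22MnB5 boron steel (Ms ≈ 425 °C, k ≈ 0.011 1/K); these are assumptions for illustration, not values from the study.

```python
import numpy as np

def martensite_fraction(T, Ms=425.0, k=0.011):
    """Koistinen-Marburger relation: fraction of austenite transformed to
    martensite after quenching to temperature T (deg C) below Ms. The
    default Ms and k are indicative values often quoted for 22MnB5."""
    T = np.asarray(T, dtype=float)
    return np.where(T < Ms, 1.0 - np.exp(-k * (Ms - T)), 0.0)

# e.g. a fully quenched boron-steel zone vs. a die region held at 400 C
print(martensite_fraction([25.0, 400.0]))   # roughly [0.99, 0.24]
```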
Results from the VALUE perfect predictor experiment: process-based evaluation
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Soares, Pedro; Hertig, Elke; Brands, Swen; Huth, Radan; Cardoso, Rita; Kotlarski, Sven; Casado, Maria; Pongracz, Rita; Bartholy, Judit
2016-04-01
Until recently, the evaluation of downscaled climate model simulations has typically been limited to surface climatologies, including long-term means, spatial variability and extremes. But these aspects are often, at least partly, tuned in regional climate models to match observed climate. The tuning issue is of course particularly relevant for bias-corrected regional climate models. A good performance of a model for these aspects in the present climate therefore does not, in general, imply a good performance in simulating climate change. It is now widely accepted that, to increase our confidence in climate change simulations, it is necessary to evaluate how climate models simulate the relevant underlying processes. In other words, it is important to assess whether downscaling does the right thing for the right reasons. Therefore, VALUE has carried out a broad process-based evaluation study based on its perfect predictor experiment simulations: the downscaling methods are driven by ERA-Interim data over the period 1979-2008, and reference observations are given by a network of 85 meteorological stations covering all European climates. More than 30 methods participated in the evaluation. In order to compare statistical and dynamical methods, only variables provided by both types of approaches could be considered. This limited the analysis to conditioning local surface variables on variables from driving processes that are simulated by ERA-Interim. We considered the following types of processes: at the continental scale, we evaluated the performance of downscaling methods for positive and negative North Atlantic Oscillation, Atlantic ridge and blocking situations. At synoptic scales, we considered Lamb weather types for selected European regions such as Scandinavia, the United Kingdom, the Iberian Peninsula or the Alps. At regional scales we considered phenomena such as the Mistral, the Bora or the Iberian coastal jet. Such process-based evaluation helps to attribute biases in surface variables to underlying processes and ultimately to improve climate models.